
feat: 优化架构 (optimize the architecture)

shuisheng 2 years ago
parent commit de6251d3fe

+ 2 - 3
.vscode/settings.json

@@ -2,8 +2,6 @@
   // 指定行尾序列为\n(LF)或者\r\n(CRLF)或者auto
   // "files.eol": "\n",
 
-  "vue.codeActions.savingTimeLimit": 1000,
-
   // 在保存时格式化
   "editor.formatOnSave": true,
   "editor.formatOnSaveMode": "file",
@@ -44,5 +42,6 @@
     "tsx",
     "vue"
   ],
-  "typescript.tsdk": "node_modules/typescript/lib"
+  "typescript.tsdk": "node_modules/typescript/lib",
+  "vue.codeActions.savingTimeLimit": 1000
 }

+ 0 - 12
README.md

@@ -168,18 +168,6 @@ npm run build
 
 [https://live.hsslive.cn/about/faq](https://live.hsslive.cn/about/faq)
 
-### flv.js
-
-~~Don't install flv.js via npm: after installing it, `import flvJs from 'flv.js'` throws VS Code's TypeScript service into confusion, so download flv.min.js and use it directly instead.~~ Most likely my VS Code was using VS Code's own TypeScript version (5.x); switching back to the workspace TypeScript (the 4.9 version installed in the project) appears to fix it.
-
-### video.js error
-
-Chrome 114.0.5735.133 (Official Build) (arm64), debugging the mobile page at http://localhost:8000/h5/3?liveType=srsHlsPull: with an emulated Android device, clicking play works fine (playing HLS), but with an emulated Apple device (any of them: iPhone 6, 7, 8, X, 12 Pro, and so on), clicking play always fails with `VIDEOJS: ERROR: (CODE:4 MEDIA_ERR_SRC_NOT_SUPPORTED) The media could not be loaded, either because the server or network failed or because the format is not supported.`
-
-Firefox 114.0.2 (64-bit), debugging the mobile page at http://localhost:8000/h5/3?liveType=srsHlsPull: emulated Android and Apple devices both play normally; occasionally a blob decoding error appears, but a refresh fixes it.
-
-Safari 16.5.1 (18615.2.9.11.7), Develop => Responsive Design Mode: emulating any Apple device plays normally, and the behaviour matches a real iPhone (bugs that exist on an actual iPhone also show up when debugging in desktop Safari, but not in desktop Firefox or Chrome, even though they arguably should appear there as well).
-
 ## Team
 
 [https://live.hsslive.cn/about/team](https://live.hsslive.cn/about/team)

+ 35 - 0
Remark.md

@@ -0,0 +1,35 @@
+### Audio crackling issue
+
+If the computer is already playing audio and you then start streaming, add a video source (one that has sound), and drag that video source around, a crackling-like distortion appears. If all of the computer's speaker output is turned off before starting the stream and adding the video source, dragging it causes no crackling.
+
+### video tag attributes
+
+```html
+<!-- x-webkit-airplay: presumably makes this video support AirPlay on iOS -->
+<!-- playsinline / webkit-playsinline: allow inline (small-window) playback in the iOS WeChat browser -->
+<!-- x5-video-player-type: enables the H5 player; a feature of WeChat for Android (X5) -->
+<!-- x5-video-player-fullscreen: fullscreen setting -->
+<!-- x5-video-orientation: declares the orientations the player supports; values are landscape or portraint (portrait). Default is portraint. -->
+<video
+  autoplay
+  webkit-playsinline="true"
+  playsinline
+  x-webkit-airplay="allow"
+  x5-video-player-type="h5"
+  x5-video-player-fullscreen="true"
+  x5-video-orientation="portraint"
+  muted
+></video>
+```
+
+### flv.js
+
+~~Don't install flv.js via npm: after installing it, `import flvJs from 'flv.js'` throws VS Code's TypeScript service into confusion, so download flv.min.js and use it directly instead.~~ Most likely my VS Code was using VS Code's own TypeScript version (5.x); switching back to the workspace TypeScript (the 4.9 version installed in the project) appears to fix it.
+
+### video.js error
+
+Chrome 114.0.5735.133 (Official Build) (arm64), debugging the mobile page at http://localhost:8000/h5/3?liveType=srsHlsPull: with an emulated Android device, clicking play works fine (playing HLS), but with an emulated Apple device (any of them: iPhone 6, 7, 8, X, 12 Pro, and so on), clicking play always fails with `VIDEOJS: ERROR: (CODE:4 MEDIA_ERR_SRC_NOT_SUPPORTED) The media could not be loaded, either because the server or network failed or because the format is not supported.`
+
+Firefox 114.0.2 (64-bit), debugging the mobile page at http://localhost:8000/h5/3?liveType=srsHlsPull: emulated Android and Apple devices both play normally; occasionally a blob decoding error appears, but a refresh fixes it.
+
+Safari 16.5.1 (18615.2.9.11.7), Develop => Responsive Design Mode: emulating any Apple device plays normally, and the behaviour matches a real iPhone (bugs that exist on an actual iPhone also show up when debugging in desktop Safari, but not in desktop Firefox or Chrome, even though they arguably should appear there as well).

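The attribute notes added in Remark.md above also matter when the video element is created from script rather than in a template. A minimal TypeScript sketch (not part of this commit; the helper name is made up for illustration, and the repo's own `createVideo` utility may differ):

```ts
// Create a <video> element configured for inline mobile playback.
// Non-standard attributes (webkit-playsinline, x5-*, x-webkit-airplay)
// have no DOM properties, so they are set via setAttribute.
function createInlineVideo(src: string): HTMLVideoElement {
  const video = document.createElement('video');
  video.src = src;
  video.autoplay = true;
  video.muted = true; // muted autoplay is what most mobile browsers allow
  video.playsInline = true;
  video.setAttribute('webkit-playsinline', 'true');
  video.setAttribute('x-webkit-airplay', 'allow');
  video.setAttribute('x5-video-player-type', 'h5');
  video.setAttribute('x5-video-player-fullscreen', 'true');
  video.setAttribute('x5-video-orientation', 'portraint'); // value as written in the note above
  return video;
}
```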
File diff suppressed because it is too large
+ 130 - 30
pnpm-lock.yaml


+ 18 - 0
remark.md

@@ -1,3 +1,9 @@
+### Audio crackling issue
+
+If the computer is already playing audio and you then start streaming, add a video source (one that has sound), and drag that video source around, a crackling-like distortion appears. If all of the computer's speaker output is turned off before starting the stream and adding the video source, dragging it causes no crackling.
+
+### video tag attributes
+
 ```html
 <!-- x-webkit-airplay: presumably makes this video support AirPlay on iOS -->
 <!-- playsinline / webkit-playsinline: allow inline (small-window) playback in the iOS WeChat browser -->
@@ -15,3 +21,15 @@
   muted
 ></video>
 ```
+
+### flv.js
+
+~~Don't install flv.js via npm: after installing it, `import flvJs from 'flv.js'` throws VS Code's TypeScript service into confusion, so download flv.min.js and use it directly instead.~~ Most likely my VS Code was using VS Code's own TypeScript version (5.x); switching back to the workspace TypeScript (the 4.9 version installed in the project) appears to fix it.
+
+### video.js error
+
+Chrome 114.0.5735.133 (Official Build) (arm64), debugging the mobile page at http://localhost:8000/h5/3?liveType=srsHlsPull: with an emulated Android device, clicking play works fine (playing HLS), but with an emulated Apple device (any of them: iPhone 6, 7, 8, X, 12 Pro, and so on), clicking play always fails with `VIDEOJS: ERROR: (CODE:4 MEDIA_ERR_SRC_NOT_SUPPORTED) The media could not be loaded, either because the server or network failed or because the format is not supported.`
+
+Firefox 114.0.2 (64-bit), debugging the mobile page at http://localhost:8000/h5/3?liveType=srsHlsPull: emulated Android and Apple devices both play normally; occasionally a blob decoding error appears, but a refresh fixes it.
+
+Safari 16.5.1 (18615.2.9.11.7), Develop => Responsive Design Mode: emulating any Apple device plays normally, and the behaviour matches a real iPhone (bugs that exist on an actual iPhone also show up when debugging in desktop Safari, but not in desktop Firefox or Chrome, even though they arguably should appear there as well).

+ 0 - 1
script/config/webpack.dev.ts

@@ -2,7 +2,6 @@ import ForkTsCheckerWebpackPlugin from 'fork-ts-checker-webpack-plugin';
 import portfinder from 'portfinder';
 import { Configuration } from 'webpack';
 import WebpackBar from 'webpackbar';
-import { VueLoaderPlugin } from 'vue-loader';
 
 import { outputStaticUrl, webpackBarEnable } from '../constant';
 import TerminalPrintPlugin from '../TerminalPrintPlugin';

+ 0 - 2
script/config/webpack.prod.ts

@@ -11,7 +11,6 @@ import TerserPlugin from 'terser-webpack-plugin';
 // import { version as vueVersion } from 'vue/package.json';
 import { Configuration } from 'webpack';
 import WebpackBar from 'webpackbar';
-import { VueLoaderPlugin } from 'vue-loader';
 
 import { gzipEnable } from '../constant';
 import { chalkINFO } from '../utils/chalkTip';
@@ -177,7 +176,6 @@ const prodConfig: Configuration = {
     ],
   },
   plugins: [
-    // new VueLoaderPlugin(),
     // 构建进度条
     new WebpackBar(),
     // http压缩

+ 1 - 0
src/constant.ts

@@ -32,4 +32,5 @@ export const mediaTypeEnumMap = {
   [MediaTypeEnum.img]: '图片',
   [MediaTypeEnum.txt]: '文字',
   [MediaTypeEnum.media]: '视频',
+  [MediaTypeEnum.time]: '时间',
 };

+ 2 - 0
src/hooks/use-play.ts

@@ -45,6 +45,7 @@ export function useFlvPlay() {
   }
 
   function startFlvPlay(data: { flvurl: string }) {
+    console.log('startFlvPlay', data.flvurl);
     destroyFlv();
     return new Promise<{ width: number; height: number }>((resolve) => {
       if (mpegts.getFeatureList().mseLivePlayback && mpegts.isSupported()) {
@@ -128,6 +129,7 @@ export function useHlsPlay() {
   );
 
   function startHlsPlay(data: { hlsurl: string }) {
+    console.log('startHlsPlay', data.hlsurl);
     destroyHls();
     const videoEl = createVideo({ muted: appStore.muted, autoplay: true });
     hlsVideoEl.value = videoEl;

+ 14 - 26
src/hooks/use-pull.ts

@@ -13,12 +13,10 @@ import { createVideo, videoToCanvas } from '@/utils';
 
 export function usePull({
   localVideoRef,
-  canvasRef,
   isSRS,
   liveType,
 }: {
   localVideoRef: Ref<HTMLVideoElement[]>;
-  canvasRef: Ref<Element | undefined>;
   isSRS: boolean;
   liveType: liveTypeEnum;
 }) {
@@ -39,14 +37,11 @@ export function usePull({
     }[]
   >([]);
   const videoElArr = ref<HTMLVideoElement[]>([]);
+  const remoteVideo = ref<HTMLElement[]>([]);
   const {
     getSocketId,
     initWs,
-    addTrack,
-    delTrack,
-    canvasVideoStream,
-    lastCoverImg,
-    roomLiveing,
+    roomLiving,
     liveRoomInfo,
     anchorInfo,
     roomNoLive,
@@ -54,12 +49,6 @@ export function usePull({
     localStream,
     liveUserList,
     damuList,
-    maxBitrate,
-    maxFramerate,
-    resolutionRatio,
-    currentMaxFramerate,
-    currentMaxBitrate,
-    currentResolutionRatio,
   } = useWs();
 
   const { flvPlayer, flvVideoEl, startFlvPlay, destroyFlv } = useFlvPlay();
@@ -88,7 +77,7 @@ export function usePull({
       size: { width, height },
     });
     stopDrawingArr.value.push(stopDrawing);
-    canvasRef.value!.appendChild(canvas);
+    remoteVideo.value.push(canvas);
     videoLoading.value = false;
   }
 
@@ -104,7 +93,7 @@ export function usePull({
       size,
     });
     stopDrawingArr.value.push(initCanvas.stopDrawing);
-    canvasRef.value!.appendChild(initCanvas.canvas);
+    remoteVideo.value.push(initCanvas.canvas);
     flvPlayer.value!.on(mpegts.Events.MEDIA_INFO, () => {
       console.log('数据变了');
       size.width = flvVideoEl.value!.videoWidth!;
@@ -114,6 +103,7 @@ export function usePull({
   }
 
   async function handlePlay() {
+    console.log('kkkk21', roomLiveType.value);
     if (roomLiveType.value === liveTypeEnum.srsFlvPull) {
       if (!autoplayVal.value) return;
       await handleFlvPlay();
@@ -134,14 +124,12 @@ export function usePull({
   );
 
   watch(
-    () => roomLiveing.value,
+    () => roomLiving.value,
     (val) => {
       if (val) {
-        flvurl.value = val.live?.live_room?.flv_url!;
-        hlsurl.value = val.live?.live_room?.hls_url!;
-        // if (val && roomLiveType.value === liveTypeEnum.webrtcPull) {
+        flvurl.value = liveRoomInfo.value?.flv_url!;
+        hlsurl.value = liveRoomInfo.value?.hls_url!;
         handlePlay();
-        // }
       }
     }
   );
@@ -171,7 +159,7 @@ export function usePull({
             const video = createVideo({});
             video.setAttribute('track-id', track.id);
             video.srcObject = new MediaStream([track]);
-            canvasRef.value?.appendChild(video);
+            remoteVideo.value.push(video);
             videoElArr.value.push(video);
           });
           stream.value?.getAudioTracks().forEach((track) => {
@@ -179,7 +167,7 @@ export function usePull({
             const video = createVideo({});
             video.setAttribute('track-id', track.id);
             video.srcObject = new MediaStream([track]);
-            canvasRef.value?.appendChild(video);
+            remoteVideo.value.push(video);
             videoElArr.value.push(video);
           });
           videoLoading.value = false;
@@ -193,8 +181,7 @@ export function usePull({
             video.setAttribute('track-id', track.id);
             video.srcObject = new MediaStream([track]);
             // document.body.appendChild(video);
-            // console.log('kkkk', video);
-            canvasRef.value?.appendChild(video);
+            remoteVideo.value.push(video);
             videoElArr.value.push(video);
           });
           stream.value?.getAudioTracks().forEach((track) => {
@@ -202,7 +189,7 @@ export function usePull({
             const video = createVideo({});
             video.setAttribute('track-id', track.id);
             video.srcObject = new MediaStream([track]);
-            canvasRef.value?.appendChild(video);
+            remoteVideo.value.push(video);
             videoElArr.value.push(video);
           });
           videoLoading.value = false;
@@ -310,8 +297,9 @@ export function usePull({
     addVideo,
     handleHlsPlay,
     handleFlvPlay,
+    remoteVideo,
     roomLiveType,
-    roomLiveing,
+    roomLiving,
     autoplayVal,
     videoLoading,
     roomNoLive,

+ 13 - 2
src/hooks/use-push.ts

@@ -70,6 +70,10 @@ export function usePush({
       type: MediaTypeEnum.media,
       txt: '视频',
     },
+    [MediaTypeEnum.time]: {
+      type: MediaTypeEnum.time,
+      txt: '时间',
+    },
   };
 
   const {
@@ -89,6 +93,8 @@ export function usePush({
     currentResolutionRatio,
     addTrack,
     delTrack,
+    sendStartLive,
+    startNewWebRtc,
   } = useWs();
 
   watch(
@@ -105,7 +111,7 @@ export function usePush({
         const video = createVideo({});
         video.setAttribute('track-id', track.id);
         video.srcObject = new MediaStream([track]);
-        localVideoRef.value?.appendChild(video);
+        // localVideoRef.value?.appendChild(video);
         videoElArr.value.push(video);
       });
       stream?.getAudioTracks().forEach((track) => {
@@ -113,7 +119,7 @@ export function usePush({
         const video = createVideo({});
         video.setAttribute('track-id', track.id);
         video.srcObject = new MediaStream([track]);
-        localVideoRef.value?.appendChild(video);
+        // localVideoRef.value?.appendChild(video);
         videoElArr.value.push(video);
       });
     },
@@ -221,6 +227,11 @@ export function usePush({
         }
       }
     }
+    sendStartLive();
+    startNewWebRtc({
+      videoEl: document.createElement('video'),
+      receiver: 'srs',
+    });
   }
 
   /** 结束直播 */

+ 132 - 112
src/hooks/use-ws.ts

@@ -5,27 +5,33 @@ import { fetchRtcV1Play, fetchRtcV1Publish } from '@/api/srs';
 import { WEBSOCKET_URL } from '@/constant';
 import {
   DanmuMsgTypeEnum,
-  IAnswer,
-  ICandidate,
   IDanmu,
-  IHeartbeat,
-  IJoin,
   ILiveRoom,
   ILiveUser,
-  IMessage,
-  IOffer,
-  IOtherJoin,
-  IUpdateJoinInfo,
   IUser,
   LiveRoomTypeEnum,
   liveTypeEnum,
 } from '@/interface';
+import {
+  WSGetRoomAllUserType,
+  WsAnswerType,
+  WsCandidateType,
+  WsGetLiveUserType,
+  WsHeartbeatType,
+  WsJoinType,
+  WsLeavedType,
+  WsMessageType,
+  WsOfferType,
+  WsOtherJoinType,
+  WsRoomLivingType,
+  WsUpdateJoinInfoType,
+} from '@/interface-ws';
 import { WebRTCClass } from '@/network/webRTC';
 import {
   WebSocketClass,
   WsConnectStatusEnum,
   WsMsgTypeEnum,
-  prettierReceiveWebsocket,
+  prettierReceiveWsMsg,
 } from '@/network/webSocket';
 import { AppRootState, useAppStore } from '@/store/app';
 import { useNetworkStore } from '@/store/network';
@@ -41,7 +47,7 @@ export const useWs = () => {
   const roomId = ref('');
   const roomName = ref('');
   const roomNoLive = ref(false);
-  const roomLiveing = ref<IJoin['data']>();
+  const roomLiving = ref(false);
   const liveRoomInfo = ref<ILiveRoom>();
   const anchorInfo = ref<IUser>();
   const isAnchor = ref(false);
@@ -281,7 +287,7 @@ export const useWs = () => {
             ?.replaceTrack(canvasVideoStream.value!.getAudioTracks()[0]);
           const vel = createVideo({});
           vel.srcObject = canvasVideoStream.value!;
-          document.body.appendChild(vel);
+          // document.body.appendChild(vel);
           console.log(
             rtc.peerConnection
               ?.getSenders()
@@ -319,7 +325,7 @@ export const useWs = () => {
           }`
         );
       }
-      const data: IUpdateJoinInfo['data'] = {
+      const data: WsUpdateJoinInfoType['data'] = {
         live_room_id: Number(roomId.value),
         track: {
           audio: appStore.getTrackInfo().audio > 0 ? 1 : 2,
@@ -372,7 +378,7 @@ export const useWs = () => {
           }`
         );
       }
-      const data: IUpdateJoinInfo['data'] = {
+      const data: WsUpdateJoinInfoType['data'] = {
         live_room_id: Number(roomId.value),
         track: {
           audio: appStore.getTrackInfo().audio > 0 ? 1 : 2,
@@ -391,17 +397,15 @@ export const useWs = () => {
     return networkStore.wsMap.get(roomId.value)?.socketIo?.id || '-1';
   }
 
-  function handleHeartbeat(liveId: number) {
+  function handleHeartbeat(socketId: string) {
     loopHeartbeatTimer.value = setInterval(() => {
-      const instance = networkStore.wsMap.get(roomId.value);
-      if (!instance) return;
-      const heartbeatData: IHeartbeat['data'] = {
-        live_id: liveId,
-        live_room_id: Number(roomId.value),
-      };
-      instance.send({
+      const ws = networkStore.wsMap.get(roomId.value);
+      if (!ws) return;
+      ws.send<WsHeartbeatType['data']>({
         msgType: WsMsgTypeEnum.heartbeat,
-        data: heartbeatData,
+        data: {
+          socket_id: socketId,
+        },
       });
     }, 1000 * 5);
   }
@@ -436,22 +440,11 @@ export const useWs = () => {
       let res;
 
       if (isPull.value) {
-        console.log(
-          roomLiveing.value,
-          2222222222,
-          roomLiveing.value!.live!.live_room!.rtmp_url!.replace(
-            'rtmp',
-            'webrtc'
-          )
-        );
         res = await fetchRtcV1Play({
           api: `/rtc/v1/play/`,
           clientip: null,
           sdp: sdp!.sdp!,
-          streamurl: roomLiveing.value!.live!.live_room!.rtmp_url!.replace(
-            'rtmp',
-            'webrtc'
-          ),
+          streamurl: liveRoomInfo.value!.rtmp_url!.replace('rtmp', 'webrtc'),
           tid: getRandomString(10),
         });
       } else {
@@ -465,7 +458,7 @@ export const useWs = () => {
           ),
           tid: getRandomString(10),
         });
-        const data: IUpdateJoinInfo['data'] = {
+        const data: WsUpdateJoinInfoType['data'] = {
           live_room_id: Number(roomId.value),
           track: {
             audio: appStore.getTrackInfo().audio > 0 ? 1 : 2,
@@ -487,6 +480,12 @@ export const useWs = () => {
     }
   }
 
+  function sendStartLive() {
+    networkStore.wsMap.get(roomId.value)?.send({
+      msgType: WsMsgTypeEnum.startLive,
+    });
+  }
+
   function sendJoin() {
     const instance = networkStore.wsMap.get(roomId.value);
     if (!instance) return;
@@ -509,24 +508,20 @@ export const useWs = () => {
         );
       }
     }
-    const joinData: IJoin['data'] = {
-      live_room: {
-        id: Number(roomId.value),
-        name: roomName.value,
-        cover_img: lastCoverImg.value,
-        type: isSRS.value
-          ? LiveRoomTypeEnum.user_srs
-          : LiveRoomTypeEnum.user_wertc,
-        rtmp_url: resUrl,
-      },
-      live: {
-        track_audio: appStore.getTrackInfo().audio > 0 ? 1 : 2,
-        track_video: appStore.getTrackInfo().video > 0 ? 1 : 2,
-      },
-    };
-    instance.send({
+    instance.send<WsJoinType['data']>({
       msgType: WsMsgTypeEnum.join,
-      data: joinData,
+      data: {
+        socket_id: getSocketId(),
+        live_room: {
+          id: Number(roomId.value),
+          name: roomName.value,
+          cover_img: lastCoverImg.value,
+          type: isSRS.value
+            ? LiveRoomTypeEnum.user_srs
+            : LiveRoomTypeEnum.user_wertc,
+          rtmp_url: resUrl,
+        },
+      },
     });
   }
 
@@ -633,7 +628,8 @@ export const useWs = () => {
     if (!ws?.socketIo) return;
     // websocket连接成功
     ws.socketIo.on(WsConnectStatusEnum.connect, () => {
-      prettierReceiveWebsocket(WsConnectStatusEnum.connect);
+      prettierReceiveWsMsg(WsConnectStatusEnum.connect, ws.socketIo);
+      handleHeartbeat(ws.socketIo!.id);
       if (!ws) return;
       ws.status = WsConnectStatusEnum.connect;
       ws.update();
@@ -642,15 +638,15 @@ export const useWs = () => {
 
     // websocket连接断开
     ws.socketIo.on(WsConnectStatusEnum.disconnect, () => {
-      prettierReceiveWebsocket(WsConnectStatusEnum.disconnect, ws);
+      prettierReceiveWsMsg(WsConnectStatusEnum.disconnect, ws);
       if (!ws) return;
       ws.status = WsConnectStatusEnum.disconnect;
       ws.update();
     });
 
     // 收到offer
-    ws.socketIo.on(WsMsgTypeEnum.offer, async (data: IOffer) => {
-      prettierReceiveWebsocket(
+    ws.socketIo.on(WsMsgTypeEnum.offer, async (data: WsOfferType) => {
+      prettierReceiveWsMsg(
         WsMsgTypeEnum.offer,
         `发送者:${data.data.sender},接收者:${data.data.receiver}`,
         data
@@ -673,7 +669,7 @@ export const useWs = () => {
           await rtc.setRemoteDescription(data.data.sdp);
           const sdp = await rtc.createAnswer();
           await rtc.setLocalDescription(sdp!);
-          const answerData: IAnswer = {
+          const answerData: WsAnswerType['data'] = {
             sdp,
             sender: getSocketId(),
             receiver: data.data.sender,
@@ -690,8 +686,8 @@ export const useWs = () => {
     });
 
     // 收到answer
-    ws.socketIo.on(WsMsgTypeEnum.answer, async (data: IOffer) => {
-      prettierReceiveWebsocket(
+    ws.socketIo.on(WsMsgTypeEnum.answer, async (data: WsOfferType) => {
+      prettierReceiveWsMsg(
         WsMsgTypeEnum.answer,
         `发送者:${data.data.sender},接收者:${data.data.receiver}`,
         data
@@ -710,22 +706,22 @@ export const useWs = () => {
     });
 
     // 收到candidate
-    ws.socketIo.on(WsMsgTypeEnum.candidate, (data: ICandidate) => {
-      prettierReceiveWebsocket(
+    ws.socketIo.on(WsMsgTypeEnum.candidate, (data: WsCandidateType['data']) => {
+      prettierReceiveWsMsg(
         WsMsgTypeEnum.candidate,
-        `发送者:${data.data.sender},接收者:${data.data.receiver}`,
+        `发送者:${data.sender},接收者:${data.receiver}`,
         data
       );
       if (isSRS.value) return;
       if (!ws) return;
-      const rtc = networkStore.getRtcMap(`${roomId.value}___${data.socket_id}`);
+      const rtc = networkStore.getRtcMap(`${roomId.value}___${data.sender}`);
       if (!rtc) return;
-      if (data.socket_id !== getSocketId()) {
+      if (data.sender !== getSocketId()) {
         console.log('不是我发的candidate');
         const candidate = new RTCIceCandidate({
-          sdpMid: data.data.sdpMid,
-          sdpMLineIndex: data.data.sdpMLineIndex,
-          candidate: data.data.candidate,
+          sdpMid: data.sdpMid,
+          sdpMLineIndex: data.sdpMLineIndex,
+          candidate: data.candidate,
         });
         rtc.peerConnection
           ?.addIceCandidate(candidate)
@@ -741,9 +737,10 @@ export const useWs = () => {
     });
 
     // 管理员正在直播
-    ws.socketIo.on(WsMsgTypeEnum.roomLiveing, (data: IJoin) => {
-      prettierReceiveWebsocket(WsMsgTypeEnum.roomLiveing, data);
-      roomLiveing.value = data.data;
+    ws.socketIo.on(WsMsgTypeEnum.roomLiving, (data: WsRoomLivingType) => {
+      prettierReceiveWsMsg(WsMsgTypeEnum.roomLiving, data);
+      roomLiving.value = true;
+      roomNoLive.value = false;
       // 如果是srs开播,则不需要等有人进来了才new webrtc,只要Websocket连上了就开始new webrtc
       if (isSRS.value) {
         if (isPull.value) {
@@ -759,18 +756,29 @@ export const useWs = () => {
 
     // 管理员不在直播
     ws.socketIo.on(WsMsgTypeEnum.roomNoLive, (data) => {
-      prettierReceiveWebsocket(WsMsgTypeEnum.roomNoLive, data);
+      prettierReceiveWsMsg(WsMsgTypeEnum.roomNoLive, data);
       roomNoLive.value = true;
+      roomLiving.value = false;
     });
 
     // 当前所有在线用户
-    ws.socketIo.on(WsMsgTypeEnum.liveUser, (data) => {
-      prettierReceiveWebsocket(WsMsgTypeEnum.liveUser, data);
-    });
+    ws.socketIo.on(
+      WsMsgTypeEnum.liveUser,
+      (data: WSGetRoomAllUserType['data']) => {
+        prettierReceiveWsMsg(WsMsgTypeEnum.liveUser, data);
+        const res = data.liveUser.map((item) => {
+          return {
+            id: item.id,
+            // userInfo: item.id,
+          };
+        });
+        liveUserList.value = res;
+      }
+    );
 
     // 收到用户发送消息
-    ws.socketIo.on(WsMsgTypeEnum.message, (data: IMessage) => {
-      prettierReceiveWebsocket(WsMsgTypeEnum.message, data);
+    ws.socketIo.on(WsMsgTypeEnum.message, (data: WsMessageType) => {
+      prettierReceiveWsMsg(WsMsgTypeEnum.message, data);
       if (!ws) return;
       damuList.value.push({
         socket_id: data.socket_id,
@@ -781,58 +789,67 @@ export const useWs = () => {
     });
 
     // 用户加入房间完成
-    ws.socketIo.on(WsMsgTypeEnum.joined, (data: IJoin) => {
-      prettierReceiveWebsocket(WsMsgTypeEnum.joined, data);
-      if (data.data.live) {
-        handleHeartbeat(data.data.live.id!);
-      }
+    ws.socketIo.on(WsMsgTypeEnum.joined, (data: WsJoinType['data']) => {
+      prettierReceiveWsMsg(WsMsgTypeEnum.joined, data);
       joined.value = true;
-      trackInfo.track_audio = data.data.live?.track_audio!;
-      trackInfo.track_video = data.data.live?.track_video!;
+      trackInfo.track_audio = 1;
+      trackInfo.track_video = 1;
       liveUserList.value.push({
-        id: `${getSocketId()}`,
+        id: data.socket_id,
         userInfo: data.user_info,
       });
-      liveRoomInfo.value = data.data.live_room;
-      anchorInfo.value = data.data.anchor_info;
+      liveRoomInfo.value = data.live_room;
+      anchorInfo.value = data.anchor_info;
+      ws.send<WsGetLiveUserType['data']>({
+        msgType: WsMsgTypeEnum.getLiveUser,
+        data: {
+          live_room_id: data.live_room.id!,
+        },
+      });
       // 如果是srs开播,则不需要等有人进来了才new webrtc,只要Websocket连上了就开始new webrtc
-      if (isSRS.value) {
-        if (!isPull.value) {
-          startNewWebRtc({
-            receiver: 'srs',
-            videoEl: localVideo.value,
-          });
-        }
-      }
+      // if (isSRS.value) {
+      //   if (!isPull.value) {
+      //     startNewWebRtc({
+      //       receiver: 'srs',
+      //       videoEl: localVideo.value,
+      //     });
+      //   }
+      // }
     });
 
     // 其他用户加入房间
-    ws.socketIo.on(WsMsgTypeEnum.otherJoin, (data: IOtherJoin) => {
-      prettierReceiveWebsocket(WsMsgTypeEnum.otherJoin, data);
+    ws.socketIo.on(WsMsgTypeEnum.otherJoin, (data: WsOtherJoinType['data']) => {
+      prettierReceiveWsMsg(WsMsgTypeEnum.otherJoin, data);
       liveUserList.value.push({
-        id: data.data.join_socket_id,
-        userInfo: data.data.join_user_info,
+        id: data.join_socket_id,
+        userInfo: data.join_user_info,
       });
       const danmu: IDanmu = {
         msgType: DanmuMsgTypeEnum.otherJoin,
-        socket_id: data.data.join_socket_id,
-        userInfo: data.data.join_user_info,
+        socket_id: data.join_socket_id,
+        userInfo: data.join_user_info,
         msg: '',
       };
       damuList.value.push(danmu);
+      ws.send<WsGetLiveUserType['data']>({
+        msgType: WsMsgTypeEnum.getLiveUser,
+        data: {
+          live_room_id: data.live_room.id!,
+        },
+      });
       // 如果是srs开播,且进来的用户不是srs-webrtc-pull,则不能再new webrtc了
       if (isSRS.value) return;
       if (joined.value) {
-        startNewWebRtc({
-          receiver: data.data.join_socket_id,
-          videoEl: localVideo.value,
-        });
+        // startNewWebRtc({
+        //   receiver: data.join_socket_id,
+        //   videoEl: localVideo.value,
+        // });
       }
     });
 
     // 用户离开房间
     ws.socketIo.on(WsMsgTypeEnum.leave, (data) => {
-      prettierReceiveWebsocket(WsMsgTypeEnum.leave, data);
+      prettierReceiveWsMsg(WsMsgTypeEnum.leave, data);
       if (!ws) return;
       ws.send({
         msgType: WsMsgTypeEnum.leave,
@@ -841,19 +858,20 @@ export const useWs = () => {
     });
 
     // 用户离开房间完成
-    ws.socketIo.on(WsMsgTypeEnum.leaved, (data) => {
-      prettierReceiveWebsocket(WsMsgTypeEnum.leaved, data);
+    ws.socketIo.on(WsMsgTypeEnum.leaved, (data: WsLeavedType['data']) => {
+      prettierReceiveWsMsg(WsMsgTypeEnum.leaved, data);
       networkStore.rtcMap
-        .get(`${roomId.value}___${data.socketId as string}`)
+        .get(`${roomId.value}___${data.socket_id as string}`)
         ?.close();
-      networkStore.removeRtc(`${roomId.value}___${data.socketId as string}`);
+      networkStore.removeRtc(`${roomId.value}___${data.socket_id as string}`);
       const res = liveUserList.value.filter(
-        (item) => item.id !== data.socketId
+        (item) => item.id !== data.socket_id
       );
       liveUserList.value = res;
       damuList.value.push({
-        socket_id: data.socketId,
+        socket_id: data.socket_id,
         msgType: DanmuMsgTypeEnum.userLeaved,
+        userInfo: data.user_info,
         msg: '',
       });
     });
@@ -896,9 +914,11 @@ export const useWs = () => {
     initWs,
     addTrack,
     delTrack,
+    startNewWebRtc,
+    sendStartLive,
     canvasVideoStream,
     lastCoverImg,
-    roomLiveing,
+    roomLiving,
     liveRoomInfo,
     anchorInfo,
     roomNoLive,

+ 95 - 0
src/interface-ws.ts

@@ -0,0 +1,95 @@
+import { DanmuMsgTypeEnum, ILiveRoom, IUser } from './interface';
+
+export interface IWsFormat<T> {
+  /** 用户socket_id */
+  socket_id: string;
+  /** 是否是主播 */
+  is_anchor: boolean;
+  /** 用户信息 */
+  user_info?: IUser;
+  data: T;
+}
+
+export type WsUpdateJoinInfoType = IWsFormat<{
+  live_room_id: number;
+  track?: { audio: number; video: number };
+  rtmp_url?: string;
+}>;
+
+export type WSGetRoomAllUserType = IWsFormat<{
+  liveUser: { id: any; rooms: any[] }[];
+}>;
+
+export type WsRoomLivingType = IWsFormat<{
+  live_room: ILiveRoom;
+}>;
+
+export type WsGetLiveUserType = IWsFormat<{
+  live_room_id: number;
+}>;
+
+export type WsMessageType = IWsFormat<{
+  msgType: DanmuMsgTypeEnum;
+  msg: string;
+  live_room_id: number;
+}>;
+
+export type WsOtherJoinType = IWsFormat<{
+  live_room: ILiveRoom;
+  live_room_user_info: IUser;
+  join_user_info?: IUser;
+  join_socket_id: string;
+}>;
+
+export type WsRoomNoLiveType = IWsFormat<{
+  live_room: ILiveRoom;
+}>;
+
+export type WsJoinType = IWsFormat<{
+  socket_id: string;
+  live_room: ILiveRoom;
+  anchor_info?: IUser;
+  user_info?: IUser;
+}>;
+
+export type WsLeavedType = IWsFormat<{
+  socket_id: string;
+  user_info?: IUser;
+}>;
+
+export type WsStartLiveType = IWsFormat<{
+  socket_id: string;
+  user_info: IUser;
+  data: any;
+}>;
+
+export interface IRoomLiving {
+  live_room: ILiveRoom;
+}
+
+export type WsOfferType = IWsFormat<{
+  sdp: any;
+  sender: string;
+  receiver: string;
+  live_room_id: number;
+}>;
+
+export type WsAnswerType = IWsFormat<{
+  sdp: any;
+  sender: string;
+  receiver: string;
+  live_room_id: number;
+}>;
+
+export type WsHeartbeatType = IWsFormat<{
+  socket_id: string;
+}>;
+
+export type WsCandidateType = IWsFormat<{
+  live_room_id: number;
+  candidate: string;
+  sdpMid: string | null;
+  sdpMLineIndex: number | null;
+  receiver: string;
+  sender: string;
+}>;

+ 19 - 0
src/interface.ts

@@ -327,6 +327,7 @@ export enum MediaTypeEnum {
   txt,
   img,
   media,
+  time,
   webAudio,
 }
 
@@ -380,6 +381,24 @@ export interface IOtherJoin {
   };
 }
 
+export interface IRoomNoLive {
+  data: {
+    live_room: ILiveRoom;
+  };
+}
+
+export interface IStartLive {
+  socket_id: string;
+  user_info: IUser;
+  data: any;
+}
+
+export interface IRoomLiving {
+  data: {
+    live_room: ILiveRoom;
+  };
+}
+
 export interface IJoin {
   socket_id: string;
   is_anchor: boolean;

+ 11 - 11
src/network/webRTC.ts

@@ -393,17 +393,17 @@ export class WebRTCClass {
         console.log('准备发送candidate', event.candidate.candidate);
         const roomId = this.roomId.split('___')[0];
         const receiver = this.roomId.split('___')[1];
-        const data: ICandidate['data'] = {
-          candidate: event.candidate.candidate,
-          sdpMid: event.candidate.sdpMid,
-          sdpMLineIndex: event.candidate.sdpMLineIndex,
-          sender: networkStore.wsMap.get(roomId)?.socketIo?.id || '',
-          receiver,
-          live_room_id: Number(roomId),
-        };
-        networkStore.wsMap
-          .get(roomId)
-          ?.send({ msgType: WsMsgTypeEnum.candidate, data });
+        networkStore.wsMap.get(roomId)?.send<ICandidate['data']>({
+          msgType: WsMsgTypeEnum.candidate,
+          data: {
+            candidate: event.candidate.candidate,
+            sdpMid: event.candidate.sdpMid,
+            sdpMLineIndex: event.candidate.sdpMLineIndex,
+            sender: networkStore.wsMap.get(roomId)?.socketIo?.id || '',
+            receiver,
+            live_room_id: Number(roomId),
+          },
+        });
       } else {
         console.log('没有候选者了');
       }

+ 18 - 6
src/network/webSocket.ts

@@ -1,5 +1,6 @@
 import { Socket, io } from 'socket.io-client';
 
+import { IWsFormat } from '@/interface-ws';
 import { useNetworkStore } from '@/store/network';
 import { useUserStore } from '@/store/user';
 
@@ -38,7 +39,7 @@ export enum WsMsgTypeEnum {
   /** 用户发送消息 */
   message = 'message',
   /** 房间正在直播 */
-  roomLiveing = 'roomLiveing',
+  roomLiving = 'roomLiving',
   /** 房间不在直播 */
   roomNoLive = 'roomNoLive',
   /** sendBlob */
@@ -52,11 +53,15 @@ export enum WsMsgTypeEnum {
   offer = 'offer',
   answer = 'answer',
   candidate = 'candidate',
+  startLive = 'startLive',
 }
 
-export function prettierReceiveWebsocket(...arg) {
+export function prettierReceiveWsMsg(...arg) {
   console.warn('【websocket】收到消息', ...arg);
 }
+export function prettierSendWsMsg(...arg) {
+  console.warn('【websocket】发送消息', ...arg);
+}
 
 export class WebSocketClass {
   socketIo: Socket | null = null;
@@ -82,7 +87,14 @@ export class WebSocketClass {
   }
 
   // 发送websocket消息
-  send = ({ msgType, data }: { msgType: WsMsgTypeEnum; data?: any }) => {
+  send = <T extends unknown>({
+    // Written as <T extends unknown> rather than <T> so that eslint does not parse what follows the arrow function's <T> as JSX syntax
+    msgType,
+    data,
+  }: {
+    msgType: WsMsgTypeEnum;
+    data?: T;
+  }) => {
     if (!this.socketIo?.connected) {
       console.error(
         '【websocket】未连接成功,不发送websocket消息!',
@@ -91,13 +103,13 @@ export class WebSocketClass {
       );
       return;
     }
-    console.warn('【websocket】发送消息', msgType, data);
+    prettierSendWsMsg(msgType, data);
     const userStore = useUserStore();
-    const sendData = {
+    const sendData: IWsFormat<any> = {
       socket_id: this.socketIo.id,
       is_anchor: this.isAnchor,
       user_info: userStore.userInfo,
-      data,
+      data: data || {},
     };
     this.socketIo?.emit(msgType, sendData);
   };

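The `<T extends unknown>` comment in the `send` method above refers to a TypeScript parsing quirk: in a file treated as TSX, a bare `<T>` before an arrow function can be read as the start of a JSX tag. A small sketch (not from this commit) of the same pattern, using the `WsHeartbeatType` payload from src/interface-ws.ts as the type argument; the import paths simply mirror the ones shown in this diff:

```ts
import { WsHeartbeatType } from '@/interface-ws';
import { WsMsgTypeEnum } from '@/network/webSocket';

// <T extends unknown> (or <T,>) keeps the generic parameter from being
// parsed as a JSX tag by TSX-aware tooling; a bare <T> would be ambiguous.
const send = <T extends unknown>(msg: { msgType: WsMsgTypeEnum; data?: T }) => {
  console.log(msg.msgType, msg.data);
};

// Typed usage, mirroring the heartbeat message sent from use-ws.ts:
send<WsHeartbeatType['data']>({
  msgType: WsMsgTypeEnum.heartbeat,
  data: { socket_id: 'abc123' },
});
```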
+ 1 - 0
src/store/app/index.ts

@@ -23,6 +23,7 @@ export type AppRootState = {
     muted?: boolean;
     videoEl?: HTMLVideoElement;
     txtInfo?: { txt: string; color: string };
+    timeInfo?: { color: string };
     rect?: { top: number; left: number };
     scaleInfo?: { scaleX: number; scaleY: number };
   }[];

+ 77 - 62
src/views/pull/index.vue

@@ -22,6 +22,7 @@
             </div>
           </div>
         </div>
+        <div class="other">在线人数:{{ liveUserList.length }}</div>
       </div>
       <div
         ref="containerRef"
@@ -47,10 +48,15 @@
             }"
           ></div>
           <div
-            ref="canvasRef"
+            ref="remoteVideoRef"
             class="media-list"
             :class="{ item: appStore.allTrack.length > 1 }"
           ></div>
+          <!-- <div
+            ref="remoteVideoRef"
+            class="media-list"
+            :class="{ item: appStore.allTrack.length > 1 }"
+          ></div> -->
           <VideoControls></VideoControls>
         </div>
       </div>
@@ -64,6 +70,7 @@
           v-for="(item, index) in giftGoodsList"
           :key="index"
           class="item"
+          @click="handlePay()"
         >
           <div
             class="ico"
@@ -171,8 +178,10 @@
         </div>
       </div>
     </div>
-    <RechargeCpt v-if="showRecharge"></RechargeCpt>
-    <!-- </template> -->
+    <RechargeCpt
+      :show="showRecharge"
+      @close="(v) => (showRecharge = v)"
+    ></RechargeCpt>
   </div>
 </template>
 
@@ -207,7 +216,7 @@ const showSidebar = ref(true);
 const topRef = ref<HTMLDivElement>();
 const bottomRef = ref<HTMLDivElement>();
 const danmuListRef = ref<HTMLDivElement>();
-const canvasRef = ref<HTMLDivElement>();
+const remoteVideoRef = ref<HTMLDivElement>();
 const containerRef = ref<HTMLDivElement>();
 const localVideoRef = ref<HTMLVideoElement[]>([]);
 const {
@@ -219,6 +228,7 @@ const {
   sendDanmu,
   addVideo,
   videoLoading,
+  remoteVideo,
   roomNoLive,
   damuList,
   liveUserList,
@@ -227,7 +237,6 @@ const {
   anchorInfo,
 } = usePull({
   localVideoRef,
-  canvasRef,
   liveType: route.query.liveType as liveTypeEnum,
   isSRS: [
     liveTypeEnum.srsWebrtcPull,
@@ -236,6 +245,52 @@ const {
   ].includes(route.query.liveType as liveTypeEnum),
 });
 
+onUnmounted(() => {
+  closeWs();
+  closeRtc();
+});
+
+watch(
+  () => remoteVideo.value,
+  (newVal) => {
+    newVal.forEach((item) => {
+      remoteVideoRef.value?.appendChild(item);
+    });
+  },
+  {
+    deep: true,
+    immediate: true,
+  }
+);
+
+onMounted(() => {
+  setTimeout(() => {
+    scrollTo(0, 0);
+  }, 100);
+  getGoodsList();
+  if (
+    [
+      liveTypeEnum.srsHlsPull,
+      liveTypeEnum.srsFlvPull,
+      liveTypeEnum.srsWebrtcPull,
+    ].includes(route.query.liveType as liveTypeEnum)
+  ) {
+    showSidebar.value = false;
+  }
+  if (topRef.value && bottomRef.value && containerRef.value) {
+    const res =
+      bottomRef.value.getBoundingClientRect().top -
+      (topRef.value.getBoundingClientRect().top +
+        topRef.value.getBoundingClientRect().height);
+    height.value = res;
+  }
+  initPull();
+});
+
+function handlePay() {
+  window.$message.info('敬请期待~');
+}
+
 async function getGoodsList() {
   try {
     giftLoading.value = true;
@@ -255,8 +310,9 @@ async function getGoodsList() {
 }
 
 function handleRecharge() {
+  console.log(showRecharge.value);
   if (!loginTip()) return;
-  showRecharge.value = !showRecharge.value;
+  showRecharge.value = true;
 }
 
 function handleJoin() {
@@ -278,32 +334,6 @@ watch(
     }, 0);
   }
 );
-
-onUnmounted(() => {
-  closeWs();
-  closeRtc();
-});
-
-onMounted(() => {
-  getGoodsList();
-  if (
-    [
-      liveTypeEnum.srsHlsPull,
-      liveTypeEnum.srsFlvPull,
-      liveTypeEnum.srsWebrtcPull,
-    ].includes(route.query.liveType as liveTypeEnum)
-  ) {
-    showSidebar.value = false;
-  }
-  if (topRef.value && bottomRef.value && containerRef.value) {
-    const res =
-      bottomRef.value.getBoundingClientRect().top -
-      (topRef.value.getBoundingClientRect().top +
-        topRef.value.getBoundingClientRect().height);
-    height.value = res;
-  }
-  initPull();
-});
 </script>
 
 <style lang="scss" scoped>
@@ -355,26 +385,7 @@ onMounted(() => {
         display: flex;
         flex-direction: column;
         justify-content: center;
-        font-size: 12px;
-        .top {
-          display: flex;
-          align-items: center;
-          .item {
-            display: flex;
-            align-items: center;
-            margin-right: 20px;
-            .ico {
-              display: inline-block;
-              margin-right: 4px;
-              width: 10px;
-              height: 10px;
-              border-radius: 50%;
-            }
-          }
-        }
-        .bottom {
-          margin-top: 10px;
-        }
+        font-size: 14px;
       }
     }
     .container {
@@ -419,16 +430,16 @@ onMounted(() => {
             width: 100%;
             height: 100%;
           }
-          &.item {
-            :deep(video) {
-              width: 50%;
-              height: initial !important;
-            }
-            :deep(canvas) {
-              width: 50%;
-              height: initial !important;
-            }
-          }
+          // &.item {
+          //   :deep(video) {
+          //     width: 50%;
+          //     height: initial !important;
+          //   }
+          //   :deep(canvas) {
+          //     width: 50%;
+          //     height: initial !important;
+          //   }
+          // }
         }
 
         .controls {
@@ -481,6 +492,9 @@ onMounted(() => {
       box-sizing: border-box;
       margin: 5px 0;
       height: 100px;
+      > :last-child {
+        position: absolute;
+      }
       .item {
         display: flex;
         align-items: center;
@@ -494,6 +508,7 @@ onMounted(() => {
         &:hover {
           background-color: #ebe0ce;
         }
+
         .ico {
           position: relative;
           width: 45px;

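The pull-page change above is the core of this commit's architecture tweak: usePull no longer receives a canvasRef and appends DOM nodes itself; it collects elements in a `remoteVideo` ref and the component appends them in a deep watcher. A condensed sketch (not the repo's actual code) of that pattern:

```ts
import { ref, watch, type Ref } from 'vue';

// Hook side: only collect the remote media elements, never touch the DOM.
export function useRemoteMedia() {
  const remoteVideo = ref<HTMLElement[]>([]);
  const addRemote = (el: HTMLElement) => remoteVideo.value.push(el);
  return { remoteVideo, addRemote };
}

// Component side: own the container and append whatever the hook collected.
// appendChild on an element that is already a child just moves it, so
// re-running the callback on every change is harmless.
export function mountRemoteMedia(
  remoteVideo: Ref<HTMLElement[]>,
  container: Ref<HTMLDivElement | undefined>
) {
  watch(
    () => remoteVideo.value,
    (els) => els.forEach((el) => container.value?.appendChild(el)),
    { deep: true, immediate: true }
  );
}
```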
+ 22 - 6
src/views/pull/recharge/index.vue

@@ -6,6 +6,7 @@
       title="充值"
       preset="card"
       class="container"
+      @update:show="handleOnClose"
     >
       <div>
         充值金额(最低充值{{ minMoney }}元,最高充值{{ maxMoney }}元)
@@ -36,13 +37,13 @@
 </template>
 
 <script lang="ts" setup>
-import { nextTick, reactive, ref } from 'vue';
+import { nextTick, reactive, ref, watch } from 'vue';
 
 import { fetchFindByTypeGoods } from '@/api/goods';
 import QrPayCpt from '@/components/QrPay/index.vue';
 import { GoodsTypeEnum } from '@/interface';
 
-const showModal = ref(true);
+const showModal = ref(false);
 const maxMoney = 200;
 const minMoney = 0.1;
 const money = ref(minMoney);
@@ -54,6 +55,24 @@ const goodsInfo = reactive({
   liveRoomId: -1,
 });
 
+const props = defineProps({
+  show: { type: Boolean, default: false },
+});
+
+const emits = defineEmits(['close']);
+
+watch(
+  () => props.show,
+  (v) => {
+    showModal.value = v;
+  }
+);
+
+function handleOnClose(v) {
+  emits('close', v);
+  showQrPay.value = false;
+}
+
 async function startPay() {
   if (money.value < minMoney) {
     window.$message.warning(`最少充值${minMoney}元!`);
@@ -71,7 +90,4 @@ async function startPay() {
 }
 </script>
 
-<style lang="scss" scoped>
-.recharge-wrap {
-}
-</style>
+<style lang="scss" scoped></style>

+ 144 - 52
src/views/pushByCanvas/index.vue

@@ -264,7 +264,15 @@ import {
 } from '@vicons/ionicons5';
 import { fabric } from 'fabric';
 import { UploadFileInfo } from 'naive-ui';
-import { markRaw, onMounted, onUnmounted, reactive, ref, watch } from 'vue';
+import {
+  Raw,
+  markRaw,
+  onMounted,
+  onUnmounted,
+  reactive,
+  ref,
+  watch,
+} from 'vue';
 import { useRoute } from 'vue-router';
 import * as workerTimers from 'worker-timers';
 
@@ -304,6 +312,7 @@ const fabricCanvas = ref<fabric.Canvas>();
 const localVideoRef = ref<HTMLVideoElement>();
 const audioCtx = ref<AudioContext>();
 const remoteVideoRef = ref<HTMLVideoElement[]>([]);
+const timeCanvasDom = ref<Raw<fabric.Text>[]>([]);
 const isSRS = route.query.liveType === liveTypeEnum.srsPush;
 const wrapSize = reactive({
   width: 0,
@@ -319,8 +328,8 @@ const {
   endLive,
   sendDanmu,
   keydownDanmu,
-  lastCoverImg,
   canvasVideoStream,
+  lastCoverImg,
   isLiving,
   allMediaTypeList,
   currentResolutionRatio,
@@ -357,7 +366,6 @@ onMounted(() => {
   setTimeout(() => {
     scrollTo(0, 0);
   }, 100);
-  // initNullAudio();
   initUserMedia();
   initCanvas();
   handleCache();
@@ -383,7 +391,7 @@ function onPageVisibility() {
     if (isLiving.value) {
       const delay = 1000 / 60; // 16.666666666666668
       workerTimerId.value = workerTimers.setInterval(() => {
-        fabricCanvas.value?.renderAll();
+        renderAll();
       }, delay);
     }
   } else {
@@ -422,12 +430,19 @@ function initUserMedia() {
     });
 }
 
+function renderAll() {
+  timeCanvasDom.value.forEach((item) => {
+    item.text = new Date().toLocaleString();
+  });
+  fabricCanvas.value?.renderAll();
+}
+
 function clearFrame() {
   window.cancelAnimationFrame(requestAnimationFrameId.value);
 }
 
 function renderFrame() {
-  fabricCanvas.value?.renderAll();
+  renderAll();
   requestAnimationFrameId.value = window.requestAnimationFrame(renderFrame);
 }
 
@@ -456,7 +471,7 @@ function initNullAudio() {
   gainNode.connect(audioContext.destination);
   const destination = audioContext.createMediaStreamDestination();
 
-  const audioTrack: AppRootState['allTrack'][0] = {
+  const webAudioTrack: AppRootState['allTrack'][0] = {
     id: getRandomEnglishString(8),
     audio: 1,
     video: 2,
@@ -469,22 +484,16 @@ function initNullAudio() {
     hidden: true,
     muted: false,
   };
-  const res = [...appStore.allTrack, audioTrack];
+  const res = [...appStore.allTrack, webAudioTrack];
   appStore.setAllTrack(res);
-  const webAudioItem = resourceCacheStore.list.find(
-    (item) => item.type === MediaTypeEnum.webAudio
-  );
-  if (!webAudioItem) {
-    resourceCacheStore.setList([...resourceCacheStore.list, audioTrack]);
-  }
   const vel = createVideo({});
-  // vel.style.width = `1px`;
-  // vel.style.height = `1px`;
+  vel.style.width = `1px`;
+  vel.style.height = `1px`;
   vel.style.position = 'fixed';
   vel.style.bottom = '0';
   vel.style.right = '0';
-  // vel.style.opacity = '0';
-  // vel.style.pointerEvents = 'none';
+  vel.style.opacity = '0';
+  vel.style.pointerEvents = 'none';
   vel.srcObject = destination.stream;
   document.body.appendChild(vel);
 }
@@ -493,6 +502,7 @@ let streamTmp: MediaStream;
 let vel;
 
 function handleMixedAudio() {
+  console.log('handleMixedAudiohandleMixedAudio');
   const allAudioTrack = appStore.allTrack.filter((item) => item.audio === 1);
   if (audioCtx.value) {
     const gainNode = audioCtx.value.createGain();
@@ -532,6 +542,10 @@ function handleMixedAudio() {
 }
 
 function handleStartLive() {
+    // WARN: initNullAudio must not be skipped. Otherwise, when the stream starts without any audio, SRS reports the audio as "Stream #0:0: Audio: aac, 44100 Hz, stereo, 128 kb/s",
+    // which makes the live stream very slow to load. The normal audio should be "Stream #0:0: Audio: aac (LC), 48000 Hz, stereo, fltp".
+    // Running initNullAudio before starting the stream makes the audio normal.
+  initNullAudio();
   if (!audioCtx.value) {
     audioCtx.value = new AudioContext();
   }
@@ -582,7 +596,7 @@ function autoCreateVideo({
   rect?: { left: number; top: number };
   muted?: boolean;
 }) {
-  console.warn('autoCreateVideo', id);
+  console.warn('autoCreateVideoautoCreateVideo', id);
   const videoEl = createVideo({});
   if (muted !== undefined) {
     videoEl.muted = muted;
@@ -670,42 +684,62 @@ function changeCanvasAttr({
       videoRatio.value;
     fabricCanvas.value.setWidth(resolutionWidth);
     fabricCanvas.value.setHeight(resolutionHeight);
-    // fabricCanvas.value.forEachObject((canvas) => {
-    //   canvas.setCoords();
-    // });
-    appStore.allTrack.forEach((item) => {
-      if (item.canvasDom) {
-        // 分辨率变小了,将图片变小
-        if (newHeight < oldHeight) {
-          const ratio = newHeight / oldHeight;
-          const ratio1 = (item.canvasDom.scaleX || 1) * ratio;
-          const ratio2 = oldHeight / newHeight;
-          console.log(
-            ratio,
-            ratio1,
-            '分辨率变小了,将图片变小-----',
-            item.canvasDom
-          );
-          item.canvasDom.scale(ratio1);
-          item.canvasDom.left = item.canvasDom.left! / ratio2;
-          item.canvasDom.top = item.canvasDom.top! / ratio2;
-        } else {
-          // 分辨率变大了,将图片变大
-          const ratio = newHeight / oldHeight;
-          const ratio1 = (item.canvasDom.scaleX || 1) * ratio;
-          const ratio2 = oldHeight / newHeight;
-          console.log(
-            ratio,
-            ratio1,
-            '分辨率变大了,将图片变大-----',
-            item.canvasDom
-          );
-          item.canvasDom.scale(ratio1);
-          item.canvasDom.left = item.canvasDom.left! / ratio2;
-          item.canvasDom.top = item.canvasDom.top! / ratio2;
-        }
+    fabricCanvas.value.forEachObject((item) => {
+      // 分辨率变小了,将图片变小
+      if (newHeight < oldHeight) {
+        const ratio = newHeight / oldHeight;
+        const ratio1 = (item.scaleX || 1) * ratio;
+        const ratio2 = oldHeight / newHeight;
+        console.log(ratio, ratio1, '分辨率变小了,将图片变小-----', item);
+        item.scale(ratio1);
+        item.left = item.left! / ratio2;
+        item.top = item.top! / ratio2;
+        fabricCanvas.value?.renderAndReset();
+      } else {
+        // 分辨率变大了,将图片变大
+        // const ratio = newHeight / oldHeight;
+        // const ratio1 = (item.scaleX || 1) * ratio;
+        // const ratio2 = oldHeight / newHeight;
+        // console.log(ratio, ratio1, '分辨率变大了,将图片变大-----', item);
+        // item.scale(ratio1);
+        // item.left = item.left! / ratio2;
+        // item.top = item.top! / ratio2;
       }
     });
+    // appStore.allTrack.forEach((item) => {
+    //   console.log('当前类型', item.type);
+    //   if (item.canvasDom) {
+    //     // 分辨率变小了,将图片变小
+    //     if (newHeight < oldHeight) {
+    //       const ratio = newHeight / oldHeight;
+    //       const ratio1 = (item.canvasDom.scaleX || 1) * ratio;
+    //       const ratio2 = oldHeight / newHeight;
+    //       console.log(
+    //         ratio,
+    //         ratio1,
+    //         '分辨率变小了,将图片变小-----',
+    //         item.canvasDom
+    //       );
+    //       item.canvasDom.scale(ratio1);
+    //       item.canvasDom.left = item.canvasDom.left! / ratio2;
+    //       item.canvasDom.top = item.canvasDom.top! / ratio2;
+    //     } else {
+    //       // 分辨率变大了,将图片变大
+    //       const ratio = newHeight / oldHeight;
+    //       const ratio1 = (item.canvasDom.scaleX || 1) * ratio;
+    //       const ratio2 = oldHeight / newHeight;
+    //       console.log(
+    //         ratio,
+    //         ratio1,
+    //         '分辨率变大了,将图片变大-----',
+    //         item.canvasDom
+    //       );
+    //       item.canvasDom.scale(ratio1);
+    //       item.canvasDom.left = item.canvasDom.left! / ratio2;
+    //       item.canvasDom.top = item.canvasDom.top! / ratio2;
+    //     }
+    //   }
+    // });
     changeCanvasStyle();
   }
 }
@@ -903,6 +937,23 @@ async function handleCache() {
         fabricCanvas.value.add(canvasDom);
         obj.canvasDom = canvasDom;
       }
+    } else if (item.type === MediaTypeEnum.time) {
+      obj.timeInfo = item.timeInfo;
+      if (fabricCanvas.value) {
+        const canvasDom = markRaw(
+          new fabric.Text(new Date().toLocaleString(), {
+            top: item.rect?.top || 0,
+            left: item.rect?.left || 0,
+            fill: item.timeInfo?.color,
+          })
+        );
+        timeCanvasDom.value.push(canvasDom);
+        handleMoving({ canvasDom, id: obj.id });
+        handleScaling({ canvasDom, id: obj.id });
+        canvasDom.scale(item.scaleInfo?.scaleX || 1);
+        fabricCanvas.value.add(canvasDom);
+        obj.canvasDom = canvasDom;
+      }
     } else if (item.type === MediaTypeEnum.img) {
       queue.push(handleImg());
     }
@@ -924,6 +975,7 @@ async function addMediaOk(val: {
   deviceId: string;
   mediaName: string;
   txtInfo?: { txt: string; color: string };
+  timeInfo?: { color: string };
   imgInfo?: UploadFileInfo[];
   mediaInfo?: UploadFileInfo[];
 }) {
@@ -1093,6 +1145,46 @@ async function addMediaOk(val: {
     // @ts-ignore
     addTrack(txtTrack);
 
+    console.log('获取文字成功', fabricCanvas.value);
+  } else if (val.type === MediaTypeEnum.time) {
+    const timeTrack: AppRootState['allTrack'][0] = {
+      id: getRandomEnglishString(8),
+      audio: 2,
+      video: 1,
+      mediaName: val.mediaName,
+      type: MediaTypeEnum.time,
+      track: undefined,
+      trackid: undefined,
+      stream: undefined,
+      streamid: undefined,
+      hidden: false,
+      muted: false,
+    };
+    if (fabricCanvas.value) {
+      const canvasDom = markRaw(
+        new fabric.Text(new Date().toLocaleString(), {
+          top: 0,
+          left: 0,
+          fill: val.timeInfo?.color,
+        })
+      );
+      timeCanvasDom.value.push(canvasDom);
+      handleMoving({ canvasDom, id: timeTrack.id });
+      handleScaling({ canvasDom, id: timeTrack.id });
+      timeTrack.timeInfo = val.timeInfo;
+      // @ts-ignore
+      timeTrack.canvasDom = canvasDom;
+      fabricCanvas.value.add(canvasDom);
+    }
+
+    const res = [...appStore.allTrack, timeTrack];
+    // @ts-ignore
+    appStore.setAllTrack(res);
+    // @ts-ignore
+    resourceCacheStore.setList(res);
+    // @ts-ignore
+    addTrack(timeTrack);
+
     console.log('获取文字成功', fabricCanvas.value);
   } else if (val.type === MediaTypeEnum.img) {
     const imgTrack: AppRootState['allTrack'][0] = {

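The WARN comment about initNullAudio in the diff above explains why a silent audio track must exist before going live: without it SRS sees a degenerate AAC stream and the live page loads slowly. A minimal Web Audio sketch of producing such a silent but active track (not the repo's initNullAudio, which also registers the track in appStore and attaches it to a hidden video element):

```ts
// Produce a silent audio track that still emits samples continuously,
// so the published stream carries a normal AAC audio track.
function createSilentAudioTrack(): MediaStreamTrack {
  const ctx = new AudioContext();
  const oscillator = ctx.createOscillator();
  const gain = ctx.createGain();
  gain.gain.value = 0; // fully silent, but samples keep flowing
  const destination = ctx.createMediaStreamDestination();
  oscillator.connect(gain);
  gain.connect(destination);
  oscillator.start();
  return destination.stream.getAudioTracks()[0];
}
```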
+ 21 - 0
src/views/pushByCanvas/mediaModal/index.vue

@@ -42,6 +42,14 @@
             </div>
           </div>
         </template>
+        <template v-if="props.mediaType === MediaTypeEnum.time && timeInfo">
+          <div class="item">
+            <div class="label">颜色</div>
+            <div class="value">
+              <n-color-picker v-model:value="timeInfo.color" />
+            </div>
+          </div>
+        </template>
         <template v-if="props.mediaType === MediaTypeEnum.img">
           <div class="item">
             <div class="label">图片</div>
@@ -109,6 +117,7 @@ const emits = defineEmits(['close', 'ok']);
 
 const inputOptions = ref<{ label: string; value: string }[]>([]);
 const txtInfo = ref<{ txt: string; color: string }>();
+const timeInfo = ref<{ color: string }>();
 const imgInfo = ref<UploadFileInfo[]>();
 const mediaInfo = ref<UploadFileInfo[]>();
 const currentInput = ref<{
@@ -160,6 +169,7 @@ function handleOk() {
     txtInfo: txtInfo.value,
     imgInfo: imgInfo.value,
     mediaInfo: mediaInfo.value,
+    timeInfo: timeInfo.value,
   });
 }
 
@@ -227,6 +237,17 @@ async function init() {
     setTimeout(() => {
       inputInstRef.value?.focus();
     }, 100);
+  } else if (props.mediaType === MediaTypeEnum.time) {
+    currentInput.value = {
+      ...currentInput.value,
+      type: MediaTypeEnum.time,
+    };
+    timeInfo.value = { color: 'rgba(255,215,0,1)' };
+    mediaName.value = `时间-${
+      appStore.allTrack
+        .filter((item) => item.type === MediaTypeEnum.time)
+        .filter((item) => !item.hidden).length + 1
+    }`;
   } else if (props.mediaType === MediaTypeEnum.img) {
     currentInput.value = {
       ...currentInput.value,

+ 1 - 1
src/views/pushByCanvas/selectMediaModal/index.vue

@@ -27,7 +27,7 @@ import { onMounted } from 'vue';
 
 import { MediaTypeEnum } from '@/interface';
 
-const props = withDefaults(
+withDefaults(
   defineProps<{
     allMediaTypeList: {
       [index: string]: {

Some files were not shown because too many files changed in this diff