
HTML5 Video 與 WebRTC

HTML5 Video

1. Canvas streaming

Reference: https://github.com/EasonWang01/Node.js-stream-video/tree/master/Desktop/Node.js-stream-video-master

The idea is to capture frames with a canvas and send the canvas data over a WebSocket, so only video (no audio) is transmitted. On the client, the data URL used for display keeps changing, so although this is streaming, the picture flickers.
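A minimal sketch of that canvas-over-WebSocket approach (the element ids, frame rate, and endpoint below are illustrative assumptions, not taken from the repo above):

```javascript
// Sender: draw the source <video> onto a canvas and ship each frame as a data URL.
const video = document.querySelector("#source");
const canvas = document.createElement("canvas");
const ctx = canvas.getContext("2d");
const ws = new WebSocket("ws://localhost:8080");

ws.onopen = () => {
  setInterval(() => {
    canvas.width = video.videoWidth;
    canvas.height = video.videoHeight;
    ctx.drawImage(video, 0, 0);
    ws.send(canvas.toDataURL("image/jpeg")); // one JPEG frame per message
  }, 100); // ~10 fps
};

// Receiver: swap each data URL into an <img>;
// replacing the src every frame is exactly what causes the flicker.
// const img = document.querySelector("#remote");
// ws.onmessage = e => { img.src = e.data; };
```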

2. Recording video in HTML5 and downloading it

Tutorial

https://developers.google.com/web/updates/2016/01/mediarecorder

Source code

https://github.com/webrtc/samples/blob/gh-pages/src/content/getusermedia/record/js/main.js

It works by accessing the webcam with navigator.mediaDevices.getUserMedia and then recording the stream with new MediaRecorder.

The data MediaRecorder produces should be stored in a Blob.

```javascript
const haveLoadedMetadata = stream => {
  const video = document.querySelector("#localVideo");
  video.srcObject = stream;
  video.play();
  return new Promise(resolve => (video.onloadedmetadata = () => resolve(stream)));
};

const constraints = { audio: true, video: { width: 400, height: 200 } };

navigator.mediaDevices
  .getUserMedia(constraints)
  .then(mediaStream => haveLoadedMetadata(mediaStream))
  .then(mediaStream => {
    const options = { mimeType: "video/webm; codecs=vp9" };
    const recorder = new MediaRecorder(mediaStream, options);
    recorder.ondataavailable = e => {
      // ondataavailable fires after recorder.stop() is called
      // (or periodically, if start() is given a timeslice)
      console.log(e);
    };
    recorder.start();
    setTimeout(() => {
      recorder.stop();
    }, 2000);
  })
  .catch(err => {
    console.log(err.name + ": " + err.message);
  });
```
    
    
Alternatively, use the third-party msr module:

```javascript
import MediaStreamRecorder from "msr";

const multiStreamRecorder = new MediaStreamRecorder.MultiStreamRecorder(
  mediaStream // from getUserMedia
);
multiStreamRecorder.ondataavailable = function (blob) {
  // POST/PUT the "Blob" using FormData/XHR2, or push it over a WebSocket
  ws.send(blob.video);
};
multiStreamRecorder.start(3000); // one blob every 3 seconds
```

Then convert the collected Blob chunks into the target format:

```javascript
// blob1 is the array of Blob chunks collected in ondataavailable
const superBuffer = new Blob(blob1, { type: "video/webm" });
```

Finally, convert it into a URL-usable form and assign it to the video element's src.
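For example (a sketch; `chunks` stands for the array of Blobs collected in `ondataavailable`, and the element id is an assumption):

```javascript
// Turn recorded chunks into a single Blob and a blob: URL the <video> tag can play.
function makeObjectURL(chunks) {
  const superBuffer = new Blob(chunks, { type: "video/webm" });
  return URL.createObjectURL(superBuffer);
}

// Browser usage:
// document.querySelector("#recorded").src = makeObjectURL(recordedChunks);
//
// The same URL also works for a download link:
// const a = document.createElement("a");
// a.href = makeObjectURL(recordedChunks);
// a.download = "recording.webm";
// a.click();
```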

See also the MediaSource-based version:

https://github.com/webrtc/samples/blob/gh-pages/src/content/getusermedia/record/js/main.js

3. getUserMedia streaming with WebSocket

A later idea: capture the video in 10-second chunks and send them to the client one after another, which gives a streaming effect. But the client still has to point the video element at a new Blob for each chunk, and every change of the video src makes the picture flicker.

Client
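The original client snippet is not in this page; a sketch of what it could look like (endpoint and codec are assumptions). Passing a timeslice to start() makes ondataavailable fire periodically, one Blob per interval:

```javascript
const ws = new WebSocket("ws://localhost:8080");
ws.binaryType = "arraybuffer";

navigator.mediaDevices
  .getUserMedia({ video: true, audio: true })
  .then(stream => {
    const recorder = new MediaRecorder(stream, { mimeType: "video/webm; codecs=vp9" });
    recorder.ondataavailable = e => {
      if (e.data.size > 0 && ws.readyState === WebSocket.OPEN) {
        ws.send(e.data); // one ~10s webm chunk per message
      }
    };
    recorder.start(10000); // timeslice: emit a chunk every 10 seconds
  });

// On the receiving side, each incoming chunk becomes a new Blob / object URL,
// and swapping it into video.src is what causes the flicker described below.
```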

Server
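The server can be a plain relay; a sketch using the third-party ws package (an assumption — the page does not say which library was used):

```javascript
// Node.js relay: forward every binary chunk to every other connected client.
const WebSocket = require("ws");
const wss = new WebSocket.Server({ port: 8080 });

wss.on("connection", socket => {
  socket.on("message", chunk => {
    wss.clients.forEach(client => {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(chunk);
      }
    });
  });
});
```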

But because the video src changes, the picture flickers after every update.

4. Later came the MediaSource API

https://developer.mozilla.org/zh-TW/docs/Web/API/MediaSource

MediaSource streaming example

Note: mediaSource.addEventListener("sourceopen", ...) only fires after video.src = URL.createObjectURL(mediaSource); has been set.

https://stackoverflow.com/a/52379544
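A minimal sketch of the MediaSource flow (following the pattern in the Stack Overflow answer above; fetching "video.webm" is an illustrative stand-in for the real chunk source):

```javascript
const video = document.querySelector("video");
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource); // sourceopen fires only after this line

mediaSource.addEventListener("sourceopen", () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp9"');
  fetch("video.webm")
    .then(res => res.arrayBuffer())
    .then(buf => {
      sourceBuffer.addEventListener("updateend", () => {
        if (!sourceBuffer.updating && mediaSource.readyState === "open") {
          mediaSource.endOfStream();
        }
      });
      sourceBuffer.appendBuffer(buf); // append further chunks here for real streaming
    });
});
```

Because new data is appended to the same SourceBuffer instead of replacing video.src, the flicker problem from the earlier approaches goes away.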

Async version with WebSocket added:

For the socket.io version, see Bitbucket.

WebRTC streaming

Working WebRTC examples:

Debug

chrome://webrtc-internals

Terminology:

WebRTC flow:

https://developer.mozilla.org/en-US/docs/Web/API/WebRTC_API/Signaling_and_video_calling

Examples:

https://shanetully.com/2014/09/a-dead-simple-webrtc-example/

https://github.com/shanet/WebRTC-Example

These two links are a very good, simple example — the first is the article, the second the code.

Client flow
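The client-side steps in those examples roughly follow this shape (a sketch, not the repo's exact code; signalingSend stands in for whatever sends messages over the signaling channel):

```javascript
const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }]
});

// 1. Add the local media tracks to the connection.
navigator.mediaDevices
  .getUserMedia({ video: true, audio: true })
  .then(stream => {
    stream.getTracks().forEach(track => pc.addTrack(track, stream));
  });

// 2. Trickle ICE candidates to the peer via the signaling channel.
pc.onicecandidate = e => {
  if (e.candidate) signalingSend({ ice: e.candidate }); // signalingSend is assumed
};

// 3. Render the remote stream when tracks arrive.
pc.ontrack = e => {
  document.querySelector("#remoteVideo").srcObject = e.streams[0];
};

// 4. Caller: create and send an offer; the callee answers symmetrically.
pc.createOffer()
  .then(offer => pc.setLocalDescription(offer))
  .then(() => signalingSend({ sdp: pc.localDescription }));
```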

Server

It simply broadcasts every message it receives to the connected clients.
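Sketched with the third-party ws package (an assumption about the library), such a signaling server is only a few lines:

```javascript
const WebSocket = require("ws");
const wss = new WebSocket.Server({ port: 9090 });

// Relay every signaling message (SDP offers/answers, ICE candidates)
// to all other connected clients, without inspecting the payload.
wss.on("connection", socket => {
  socket.on("message", message => {
    wss.clients.forEach(client => {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(message.toString());
      }
    });
  });
});
```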

video.src vs srcObject

Older versions of the Media Source specification required using createObjectURL() to create an object URL then setting src to that URL. Now you can just set srcObject to the MediaStream directly.

https://developer.mozilla.org/en-US/docs/Web/API/HTMLMediaElement/srcObject
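MDN suggests feature-detecting srcObject and using the old object-URL path only as a fallback; as a small helper (the function name is mine):

```javascript
// Attach a MediaStream to a <video>, preferring the modern srcObject property.
function attachStream(video, stream) {
  if ("srcObject" in video) {
    video.srcObject = stream;
  } else {
    video.src = URL.createObjectURL(stream); // deprecated path, older browsers only
  }
}
```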

Possible errors

  1. Failed to execute 'appendBuffer' on 'SourceBuffer': This SourceBuffer has been removed from the parent media source

Fix: shorten the recording timeslice (in milliseconds) passed to recorder.start(2000);
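Independently of the timeslice, guarding each appendBuffer call avoids hitting a detached or busy SourceBuffer (a defensive sketch, not from the original code):

```javascript
// Queue incoming chunks and append only when the SourceBuffer is attached and idle.
const queue = [];

function appendNext(mediaSource, sourceBuffer) {
  if (queue.length === 0 || sourceBuffer.updating) return; // nothing to do / busy
  if (mediaSource.readyState !== "open") return; // buffer was detached; stop appending
  sourceBuffer.appendBuffer(queue.shift());
}

// Browser wiring (illustrative):
// sourceBuffer.addEventListener("updateend", () => appendNext(mediaSource, sourceBuffer));
// ws.onmessage = e => { queue.push(e.data); appendNext(mediaSource, sourceBuffer); };
```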

Video attributes

  1. video.autoplay = true; is equivalent to the autoplay attribute in HTML: &lt;video autoplay&gt;

WebRTC recording

https://stackoverflow.com/questions/16571044/how-to-record-webcam-and-audio-using-webrtc-and-a-server-based-peer-connection

React Native WebRTC

https://github.com/react-native-webrtc/react-native-webrtc

From version 0.6 onward the following configuration is required, otherwise the Android app crashes on startup:

https://github.com/react-native-webrtc/react-native-webrtc/issues/885#issuecomment-723116643
