# HTTP Live Streaming

https://www.rfc-editor.org/rfc/rfc8216

## Introduction

> ...delivering continuous and long-form video over the Internet. It allows a receiver to adapt the bit rate of the media to the current network conditions in order to maintain uninterrupted playback at the best possible quality.

## m3u8

The playlist format defined by HLS. [IPTV](https://github.com/iptv-org/iptv) uses the same format, and players such as PotPlayer can open a .m3u8 playlist directly.

Some sites gate access to the .m3u8 behind extra conditions such as a key or a cookie, for example [LineTV](https://www.linetv.tw/drama/13194/eps/1), while [GIMY](https://gimy.app/eps/533-7-1041.html) and [楓林網](https://9maple.org/video/26833-1-1.html) apparently do not. YouTube does not seem to use HLS; I have not yet figured out what mechanism it uses. (A minimal sketch that downloads the segments listed in a plain playlist is appended at the end of this note.)

#### Example

```
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXTINF:9.009,
http://media.example.com/first.ts
#EXTINF:9.009,
http://media.example.com/second.ts
#EXTINF:3.003,
http://media.example.com/third.ts
```

## Format

* MPEG-2 Transport Streams
* [Fragmented MPEG-4](https://stackoverflow.com/questions/35177797/what-exactly-is-fragmented-mp4fmp4-how-is-it-different-from-normal-mp4)
* Packed Audio
* WebVTT

## Sample code

Both the frontend and the backend already have libraries or frameworks that handle video streaming, so very little code needs to be written.

#### Frontend

The HTML5 [video](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/video) element is all that is needed, but [native HLS support is spotty](https://caniuse.com/?search=video); fortunately there are libraries that fill the gap.

##### [Videojs](https://github.com/videojs/video.js)

```html
<head>
  <link href="http://vjs.zencdn.net/6.2.8/video-js.css" rel="stylesheet">
</head>
<body>
  <video id="player" class="video-js vjs-default-skin" controls preload="auto">
    <source src="path_to_m3u8" type="application/x-mpegURL">
  </video>
  <script src="http://vjs.zencdn.net/6.2.8/video.js"></script>
  <script src="https://cdnjs.cloudflare.com/ajax/libs/videojs-contrib-hls/5.12.2/videojs-contrib-hls.min.js"></script>
  <script>
    var p = videojs('player', {width: 320, height: 240});
    p.play();
  </script>
</body>
```

##### [HLSjs](https://github.com/video-dev/hls.js/)

```html
<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<video id="video" width="320" height="240"></video>
<script>
  if (Hls.isSupported()) {
    var video = document.getElementById('video');
    var hls = new Hls();
    hls.on(Hls.Events.MEDIA_ATTACHED, function () {
      console.log('video and hls.js are now bound together!');
    });
    hls.on(Hls.Events.MANIFEST_PARSED, function (event, data) {
      console.log('manifest loaded, found ' + data.levels.length + ' quality levels');
      video.play();
    });
    hls.loadSource('/video/test.m3u8');
    // bind them together
    hls.attachMedia(video);
  }
</script>
```

These libraries are built on top of [Media Source Extensions](https://developer.mozilla.org/en-US/docs/Web/API/Media_Source_Extensions_API); I have not looked into the details.

#### Backend

The backend does not need to handle HLS itself; it only has to serve the corresponding files. Taking [flask](https://github.com/pallets/flask) as an example, `send_from_directory` is just a thin wrapper around [Werkzeug](https://werkzeug.palletsprojects.com/en/2.2.x/) with no special logic.

```python
from flask import Flask, render_template, send_from_directory

app = Flask(__name__)
video_dir = 'video'  # directory holding the .m3u8 playlist and .ts segments


@app.route('/')
def index():
    return render_template('index.html')


@app.route('/video/<string:file_name>')
def stream(file_name):
    return send_from_directory(directory=video_dir, path=file_name)
```

## Misc

As far as my limited understanding goes, [DASH](https://en.wikipedia.org/wiki/Dynamic_Adaptive_Streaming_over_HTTP) is very similar to HLS: it is also a playlist that splits the media into many small files. Vimeo uses DASH; reading this [code](https://github.com/AbCthings/vimeo-audio-video/blob/master/vimeo-audio-and-video.py) is a good way to get the general idea. Roughly, the flow is to fetch the audio and the video separately based on the data in master.json (which plays the same role as a .m3u8), and then merge them into a complete media file with FFmpeg (a rough sketch of this flow is also appended at the end of this note).

## Code

https://github.com/mingpepe/sample-code/tree/main/hls_web
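
Besides the sample repository linked above, here is a minimal Python sketch of the playlist handling mentioned in the m3u8 section: it fetches a media playlist and downloads every segment URI it lists. It assumes an unencrypted playlist with no key or cookie checks, and the playlist URL is a hypothetical placeholder, not taken from any of the sites above.

```python
# Minimal sketch: download the segments listed in an HLS media playlist.
# Assumes a plain, unencrypted playlist; the URL below is a hypothetical placeholder.
import os
import urllib.parse
import urllib.request

PLAYLIST_URL = 'http://media.example.com/playlist.m3u8'  # hypothetical URL
OUT_DIR = 'segments'


def fetch(url):
    with urllib.request.urlopen(url) as resp:
        return resp.read()


def main():
    os.makedirs(OUT_DIR, exist_ok=True)
    playlist = fetch(PLAYLIST_URL).decode('utf-8')
    for line in playlist.splitlines():
        line = line.strip()
        # Lines starting with '#' are tags (#EXTM3U, #EXTINF, ...); everything else is a segment URI.
        if not line or line.startswith('#'):
            continue
        segment_url = urllib.parse.urljoin(PLAYLIST_URL, line)
        data = fetch(segment_url)
        name = os.path.join(OUT_DIR, os.path.basename(urllib.parse.urlparse(segment_url).path))
        with open(name, 'wb') as f:
            f.write(data)
        print(f'saved {name} ({len(data)} bytes)')


if __name__ == '__main__':
    main()
```

For MPEG-2 Transport Stream segments the downloaded files can usually just be concatenated; playlists that carry an `EXT-X-KEY` tag or require extra request headers need additional handling that this sketch leaves out.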
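
For the DASH flow described in Misc, here is a rough sketch under the assumption that the audio and video stream URLs have already been extracted from master.json; the URLs and file names are hypothetical, and the actual master.json parsing is left to the linked script.

```python
# Rough sketch of the DASH flow described above: download audio and video
# separately, then mux them with FFmpeg. URLs and file names are placeholders;
# real master.json parsing is omitted (see the linked script).
import subprocess
import urllib.request

AUDIO_URL = 'https://example.com/audio.mp4'  # hypothetical
VIDEO_URL = 'https://example.com/video.mp4'  # hypothetical


def download(url, path):
    with urllib.request.urlopen(url) as resp, open(path, 'wb') as f:
        f.write(resp.read())


def main():
    download(AUDIO_URL, 'audio.mp4')
    download(VIDEO_URL, 'video.mp4')
    # Mux without re-encoding; requires ffmpeg on PATH.
    subprocess.run(
        ['ffmpeg', '-y', '-i', 'video.mp4', '-i', 'audio.mp4', '-c', 'copy', 'output.mp4'],
        check=True,
    )


if __name__ == '__main__':
    main()
```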