hls.js

The problem of drawing frame images on the canvas

Open yzydeveloper opened this issue 1 year ago • 7 comments

Is your feature request related to a problem? Please describe.

None

Describe the solution you'd like

hls.js uses MediaSource combined with a video element to display video, but is there a way to obtain YUV data for each frame? The frames could then be played back with Canvas and AudioContext.

Additional context

No response

yzydeveloper avatar May 01 '24 14:05 yzydeveloper

@mangui

yzydeveloper avatar May 01 '24 15:05 yzydeveloper

HLS.js does not provide methods for interacting with HTMLMediaElement that are already available as part of the Web API.

robwalch avatar May 03 '24 15:05 robwalch

HLS.js does not provide methods for interacting with HTMLMediaElement that are already available as part of the Web API.

Is there any other way for hls.js to play on browsers that do not support MSE? For example, using canvas and AudioContext.

yzydeveloper avatar May 03 '24 15:05 yzydeveloper

Is there any other way for hls.js to play on browsers that do not support mse?

HLS.js only uses MSE.

robwalch avatar May 05 '24 01:05 robwalch

Drawing a VideoFrame via canvas and playing audio frames (PCM data) through Web Audio is not directly related to hls.js (or any other MSE-based streaming library), and it is achievable in various ways depending on your actual needs. The only caveat is that you only get YUV data once a frame has been rendered, which makes this suitable for post-processing.

  1. The WebCodecs API provides a way to directly grab the currently rendered VideoFrame from a video element (const frame = new VideoFrame(htmlVideoElement)); you can then use VideoFrame.copyTo together with VideoFrame.format and VideoFrame.allocationSize to get YUV data for most content (normally 8-bit 4:2:0 content should be fine).

  2. Building on (1), WebGPU also allows you to import a VideoFrame directly as a texture; you can then use a simple matrix to convert RGB back to YUV for custom processing in a shader, or render the texture straight into a canvas. For a plain 2D canvas, you should also be able to draw an ImageBitmap directly.

  3. For audio data, the ScriptProcessorNode or AudioWorklet of the Web Audio API should be enough; the data provider can be the HTMLVideoElement itself.
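A minimal sketch of the video path in (1): grab the currently rendered frame from a `<video>` element and copy its raw YUV bytes out. This assumes WebCodecs support and 8-bit I420 content; the `rgbToYuv601` helper is the kind of "simple matrix" mentioned in (2), included here for illustration.

```javascript
// Expected buffer size for 8-bit I420: full-resolution Y plane plus
// quarter-resolution U and V planes.
function i420AllocationSize(width, height) {
  return width * height + 2 * Math.ceil(width / 2) * Math.ceil(height / 2);
}

// BT.601 full-range RGB -> YUV conversion (one possible matrix choice).
function rgbToYuv601(r, g, b) {
  const y = 0.299 * r + 0.587 * g + 0.114 * b;
  const u = -0.168736 * r - 0.331264 * g + 0.5 * b + 128;
  const v = 0.5 * r - 0.418688 * g - 0.081312 * b + 128;
  return [Math.round(y), Math.round(u), Math.round(v)];
}

// Browser-only: capture and copy the frame currently shown by the element.
async function grabYuv(videoElement) {
  const frame = new VideoFrame(videoElement); // snapshot of the current frame
  if (frame.format === 'I420') {
    const buffer = new Uint8Array(frame.allocationSize());
    await frame.copyTo(buffer); // planes laid out Y, then U, then V
    frame.close();              // release decoder-backed memory promptly
    return buffer;
  }
  frame.close();
  return null; // e.g. RGBA or 10-bit content needs extra handling
}
```

Calling `frame.close()` as soon as you are done matters: decoded frames hold GPU/decoder memory, and leaking them can stall the decoder.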
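And a sketch of the audio path in (3), tapping PCM samples from the playing element via Web Audio. The worklet module name `pcm-tap-processor` and its file are hypothetical placeholders you would register yourself; the float-to-int16 helper shows a common way to repackage the Float32 samples Web Audio delivers.

```javascript
// Convert Web Audio's Float32 samples in [-1, 1] to signed 16-bit PCM.
function floatTo16BitPCM(float32) {
  const out = new Int16Array(float32.length);
  for (let i = 0; i < float32.length; i++) {
    const s = Math.max(-1, Math.min(1, float32[i])); // clamp out-of-range samples
    out[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
  }
  return out;
}

// Browser-only: route the element's audio through a custom AudioWorkletNode.
async function tapAudio(videoElement) {
  const ctx = new AudioContext();
  await ctx.audioWorklet.addModule('pcm-tap-processor.js'); // your worklet file
  const source = ctx.createMediaElementSource(videoElement);
  const tap = new AudioWorkletNode(ctx, 'pcm-tap-processor');
  tap.port.onmessage = (e) => {
    const int16 = floatTo16BitPCM(e.data); // Float32Array posted by the worklet
    // ...feed int16 into your own processing/playback pipeline
  };
  source.connect(tap).connect(ctx.destination); // keep audible passthrough
}
```

Note that `createMediaElementSource` reroutes the element's audio through the graph, so connect back to `ctx.destination` if you still want it audible.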

kedanielwu avatar May 06 '24 03:05 kedanielwu

Can I obtain a buffer for processing during hls.js decoding?

yzydeveloper avatar May 06 '24 08:05 yzydeveloper

Can I obtain a buffer for processing during hls.js decoding?

First of all, hls.js and similar libraries do not provide "decoding" functionality; video decoding is not normally exposed to the JS context. MSE, on the other hand, is not a standalone "decoder" either; you can think of it as a source provider that lets web developers customize the way media data is streamed to the browser.

From your previous description I think you are on the wrong track. If the device/OS does not support MSE, it likely does not support any of the newer decoding/rendering APIs either, so relying on native HLS support is probably your only choice.

And if you really just want to control the decoding/rendering process:

simple solution: no

complex solution: yes, you can build a custom MSE and a custom HTMLVideoElement using WebCodecs or even WASM, as long as you follow the MSE spec, then modify hls.js to use your custom modules. In that case hls.js will push remuxed fMP4 segments to your MSE interface, and you can do your work after that (e.g. use WebCodecs to decode, output YUV data, and manage your own frame buffer before sending frames to the canvas).
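The decode-and-buffer half of that pipeline might look like the sketch below. It assumes WebCodecs is available and that you have already demuxed the fMP4 segments into a codec configuration plus encoded samples (the demuxer itself, e.g. mp4box.js, is out of scope); the codec string shown is just an example.

```javascript
// Your "own frame buffer": keep decoded frames ordered by presentation
// timestamp, since decode order can differ (B-frames).
class FrameQueue {
  constructor() { this.frames = []; }
  push(frame) {
    this.frames.push(frame);
    this.frames.sort((a, b) => a.timestamp - b.timestamp);
  }
  shift() { return this.frames.shift(); }
  get length() { return this.frames.length; }
}

const queue = new FrameQueue();

// Browser-only: decode EncodedVideoChunks demuxed from the fMP4 segments.
function startDecoder(config /* e.g. { codec: 'avc1.64001f', ... } */) {
  const decoder = new VideoDecoder({
    output: (frame) => queue.push(frame), // VideoFrame, ready for the canvas
    error: (e) => console.error('decode error', e),
  });
  decoder.configure(config);
  return decoder; // call decoder.decode(chunk) for each demuxed sample
}

// Render loop: draw the next due frame into a 2D canvas context.
function renderLoop(ctx) {
  const frame = queue.shift();
  if (frame) {
    ctx.drawImage(frame, 0, 0); // VideoFrame is a valid CanvasImageSource
    frame.close();
  }
  requestAnimationFrame(() => renderLoop(ctx));
}
```

A real implementation would also pace `renderLoop` against the audio clock rather than drawing every frame as soon as it arrives.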

kedanielwu avatar May 06 '24 08:05 kedanielwu