It bugs me that the native HTML video element does not support adaptive streaming, so I end up using Vimeo all the time. But the wagtailmedia package might be the place to prepare a video for adaptive streaming, in combination with HLS.js on the frontend side. Here are the required steps:
Backend
- Upload a video file via the existing React-based upload form
- Convert the uploaded video file into multiple variants with different resolutions and/or bit rates (a possible FFmpeg-based approach is sketched after this list)
- Split each variant of the video file into small segments
- Create a master playlist file that references the different variants of the video, and create individual playlist files that reference the segments of each variant
- Store the video segment files and the playlist files as media
- Store information about the video file in the database, such as the hierarchy and locations of all segments and playlist files
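A minimal sketch of the conversion, segmenting and playlist steps, assuming the ffmpeg binary is available on the host and is called via subprocess; the rendition ladder, segment length and output paths are illustrative only, not a proposed API:

```python
import subprocess
from pathlib import Path

# Illustrative rendition ladder: (name, height, video bitrate, audio bitrate)
RENDITIONS = [
    ("360p", 360, "800k", "96k"),
    ("720p", 720, "2800k", "128k"),
    ("1080p", 1080, "5000k", "192k"),
]


def create_hls_variants(source: Path, out_dir: Path) -> Path:
    """Transcode `source` into HLS variants and return the master playlist path.

    Assumes ffmpeg is on PATH; bitrates and paths are examples only.
    """
    out_dir.mkdir(parents=True, exist_ok=True)
    master_lines = ["#EXTM3U"]

    for name, height, v_bitrate, a_bitrate in RENDITIONS:
        playlist = out_dir / f"{name}.m3u8"
        subprocess.run(
            [
                "ffmpeg", "-y", "-i", str(source),
                # Scale to the target height, keep the aspect ratio (width divisible by 2)
                "-vf", f"scale=-2:{height}",
                "-c:v", "libx264", "-b:v", v_bitrate,
                "-c:a", "aac", "-b:a", a_bitrate,
                # Split into ~6 second segments and write a VOD variant playlist
                "-hls_time", "6", "-hls_playlist_type", "vod",
                "-hls_segment_filename", str(out_dir / f"{name}_%03d.ts"),
                str(playlist),
            ],
            check=True,
        )
        # Reference the variant playlist from the master playlist
        bandwidth = int(v_bitrate.rstrip("k")) * 1000
        master_lines.append(f"#EXT-X-STREAM-INF:BANDWIDTH={bandwidth}")
        master_lines.append(playlist.name)

    master = out_dir / "master.m3u8"
    master.write_text("\n".join(master_lines) + "\n")
    return master
```

The stored master.m3u8 and the per-variant playlists are exactly what HLS.js needs on the frontend; the database would only have to remember where they ended up.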
Frontend
- Serve the video files via a template tag that returns the master playlist (see the sketch after this list)
- Not sure whether these playlists have to be files or whether they could be generated on the fly via template tags as well
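A minimal sketch of such a template tag, assuming a hypothetical `master_playlist_url` attribute on the media item that points at the stored master.m3u8 (no such attribute exists in wagtailmedia today):

```python
# templatetags/hls_tags.py -- hypothetical module name
from django import template

register = template.Library()


@register.inclusion_tag("media/hls_video.html")
def hls_video(media):
    """Render a <video> element wired up to HLS.js for the given media item.

    `media.master_playlist_url` is an assumed attribute pointing at master.m3u8.
    """
    return {"master_url": media.master_playlist_url}
```

The included `media/hls_video.html` template would then contain the video element plus a small script that creates an `Hls()` instance, calls `loadSource(master_url)` and attaches it to the element; a page template would simply use `{% load hls_tags %}{% hls_video page.video %}`.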
Encoder
An encoder might be the bottleneck here, as FFmpeg is not available on some hosts (or at least not on my preferred host, to be honest). Are there any Python packages that could natively replace hosted encoding services that support HLS, like Zencoder, Coconut, Mux and AWS Transcoder?
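One way to keep this question open is a pluggable encoder backend. A rough sketch, assuming a hypothetical WAGTAILMEDIA_HLS_BACKEND setting and illustrative encoder classes; the hosted branch would wrap whichever provider's HTTP API gets chosen:

```python
import shutil

from django.conf import settings
from django.core.exceptions import ImproperlyConfigured


class FFmpegEncoder:
    """Local backend; would wrap the FFmpeg subprocess sketch above."""


class RemoteEncoder:
    """Hosted backend; would call a service such as Zencoder, Coconut or Mux over HTTP."""

    def __init__(self, api_key):
        self.api_key = api_key


def get_hls_encoder():
    # WAGTAILMEDIA_HLS_BACKEND and WAGTAILMEDIA_HLS_API_KEY are hypothetical settings.
    backend = getattr(settings, "WAGTAILMEDIA_HLS_BACKEND", "ffmpeg")
    if backend == "ffmpeg":
        if shutil.which("ffmpeg") is None:
            raise ImproperlyConfigured(
                "ffmpeg binary not found; configure a hosted encoding backend instead"
            )
        return FFmpegEncoder()
    if backend == "remote":
        return RemoteEncoder(api_key=settings.WAGTAILMEDIA_HLS_API_KEY)
    raise ImproperlyConfigured(f"Unknown HLS backend: {backend}")
```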
Laziness
The conversion and splitting of the video file and the generation of the playlists could be done on demand, like Wagtail image renditions, following Django's laziness philosophy. For longer video files on weaker machines that might take a long time.
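A rough sketch of what lazy generation could look like, assuming a custom media model and an illustrative HLSRendition bookkeeping model (neither exists in wagtailmedia today), and reusing the create_hls_variants helper sketched above:

```python
from pathlib import Path

from django.db import models
from wagtailmedia.models import AbstractMedia  # wagtailmedia supports custom media models

# create_hls_variants: see the FFmpeg sketch in the Backend section above


class Media(AbstractMedia):
    def get_hls_master(self):
        """Return the master playlist path, generating the renditions on first
        access, analogous to how Wagtail image renditions are created lazily."""
        rendition, created = HLSRendition.objects.get_or_create(media=self)
        if created or not rendition.master_playlist:
            # First request: run the (slow) conversion now; for long videos this
            # should rather be pushed to a background task queue.
            out_dir = Path(self.file.path).parent / "hls" / str(self.pk)
            rendition.master_playlist = str(
                create_hls_variants(Path(self.file.path), out_dir)
            )
            rendition.save()
        return rendition.master_playlist


class HLSRendition(models.Model):
    """Illustrative bookkeeping model for a generated set of HLS playlists and segments."""

    media = models.OneToOneField(Media, on_delete=models.CASCADE, related_name="hls")
    master_playlist = models.CharField(max_length=500, blank=True)
```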
A client-side JavaScript encoder as part of the React upload form might be the solution to both problems here, encoder availability and processing time.
Overall, this could be a killer feature of the Wagtail CMS. I offer my help here with anything apart from React if there is a reliable encoder that could be integrated.