Hi,
I'm very new to Xamarin.Forms and .NET and need some help.
I send an HTTPS request to a server and get a multipart/x-mixed-replace response with frames of type image/x-h264.
I read the content length of each frame, extract the content and store it in a byte array.
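The read-the-content-length-then-extract loop described above could look roughly like the sketch below. This is illustrative only: the boundary string and header layout are assumptions (they depend on the server), and real code would read from the network response stream rather than an in-memory buffer.

```python
import io

def read_part_headers(stream):
    """Read the header lines of one multipart part, up to the blank line."""
    headers = {}
    while True:
        line = stream.readline().decode("ascii").strip()
        if not line:                    # blank line ends the header block
            return headers
        name, _, value = line.partition(":")
        headers[name.strip().lower()] = value.strip()

def iter_frames(stream, boundary=b"--myboundary"):
    """Yield the raw body bytes of each part of a multipart/x-mixed-replace stream."""
    while True:
        line = stream.readline()
        if not line:
            return                      # end of stream
        if line.strip() != boundary:
            continue                    # skip until the next boundary marker
        headers = read_part_headers(stream)
        length = int(headers["content-length"])
        yield stream.read(length)       # one H.264 frame as a byte array

# Synthetic two-frame stream for illustration (real data comes from the socket).
raw = (b"--myboundary\r\n"
       b"Content-Type: image/x-h264\r\n"
       b"Content-Length: 4\r\n"
       b"\r\n"
       b"\x00\x00\x01A"
       b"\r\n--myboundary\r\n"
       b"Content-Type: image/x-h264\r\n"
       b"Content-Length: 3\r\n"
       b"\r\n"
       b"\x00\x01B")
frames = list(iter_frames(io.BytesIO(raw)))
print(len(frames))  # 2
```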
What do I have to do next with the frames stored in byte arrays? How can I convert them to images?
Is this approach right/wrong?
Is there an easier way?
Thanks in advance
Colin
Answers
@Colin_123
Please check the following link:
https://stackoverflow.com/questions/41003182/convert-a-byte-array-to-image-in-xamarin-forms
Unfortunately, it's not that easy. The H.264 live video stream is encoded, and not every frame represents a whole image. I've tried to use FFmpegInterop but I can't get it to work with live streams.
Getting an H.264 live stream working involves two steps. First, receive the data from the network using whatever protocol your live stream uses, such as RTSP; the data is normally buffered locally as a byte array (a local streaming cache). Then the byte array needs to be decoded by a decoder. On Android and Windows you can use FFmpeg, but on iOS it seems you can use the phone's hardware decoder directly.
I was able to get the first part resolved, but I'm stuck on the second part on iOS: decoding.
Has anyone succeeded with this part?
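For the decode step above, the cached byte array has to be split into packets the decoder can accept; for an Annex B H.264 stream that means splitting on the 00 00 01 / 00 00 00 01 start codes between NAL units. A minimal sketch (assuming Annex B format; real code would also handle units split across network reads):

```python
def split_nal_units(buf):
    """Split an Annex B H.264 byte stream on its start codes into NAL units."""
    starts, j, n = [], 0, len(buf)
    while j < n - 2:
        if buf[j] == 0 and buf[j + 1] == 0:
            if buf[j + 2] == 1:                              # 3-byte start code
                starts.append((j, 3)); j += 3; continue
            if j < n - 3 and buf[j + 2] == 0 and buf[j + 3] == 1:  # 4-byte start code
                starts.append((j, 4)); j += 4; continue
        j += 1
    units = []
    for k, (pos, sc) in enumerate(starts):
        end = starts[k + 1][0] if k + 1 < len(starts) else n
        units.append(buf[pos + sc:end])                      # payload without start code
    return units

# Toy stream: SPS (type 7), PPS (type 8), IDR slice (type 5) with dummy payloads.
stream = (b"\x00\x00\x00\x01\x67AA"
          b"\x00\x00\x01\x68B"
          b"\x00\x00\x00\x01\x65CC")
units = split_nal_units(stream)
print([u[0] & 0x1F for u in units])  # [7, 8, 5] -- NAL unit types
```

The NAL unit type (low five bits of the first payload byte) is also how you can tell keyframes (IDR, type 5) from the predicted frames that don't stand alone as whole images.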
For UWP I tried FFmpegInterop, but its methods didn't work with my live stream for some reason. I then edited one of the FFmpegInterop files and wrote my own class that uses FFmpeg to convert the video. I got a video out of that solution, but there were small green artefacts at the bottom of the image. I didn't try Android or iOS.
https://www.ffmpeg.org/doxygen/3.4/group__lavc__encdec.html
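The link above documents FFmpeg's send/receive decoding API (`avcodec_send_packet` / `avcodec_receive_frame`). If wiring up libavcodec directly is too heavy for prototyping, the same conversion can be done by shelling out to the `ffmpeg` command-line tool. A hedged sketch with placeholder file names, assuming `ffmpeg` is on the PATH and the cached stream has been written to a file:

```python
import subprocess  # only needed if you actually run the command

def build_decode_command(h264_path, out_pattern="frame%04d.png"):
    """Build an ffmpeg command that decodes a raw H.264 elementary stream
    into one PNG image per frame."""
    return [
        "ffmpeg",
        "-f", "h264",     # input is a raw Annex B elementary stream, no container
        "-i", h264_path,
        out_pattern,      # image2 muxer is inferred from the %04d.png pattern
    ]

cmd = build_decode_command("cache.h264")
print(" ".join(cmd))
# To actually run it (requires ffmpeg installed):
# subprocess.run(cmd, check=True)
```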
@Colin_123 Got you, thank you very much. I have to use FFmpeg on iOS but can't find a way to make it work; I tried a native binding, but something went wrong and I can't figure it out.