As part of my university thesis, I want to develop a system that captures video frames and streams them live over WiFi or LTE. The streaming bandwidth should be as low as possible, so I want to use H.264 encoding to compress the captured frames before transmitting them to the client. On the client side, the H.264-encoded data will be decoded and displayed, either on a PC or on a similar Cyclone V SoC with an H.264 decoder and a monitor. In short, I am looking to build an embedded video streaming server on an Altera Cyclone V SoC.
I could not find any good resources so far, or details about anyone who has already done this on the Cyclone V SoC. The i.MX6 processors, by contrast, have very good GStreamer support that uses the i.MX6's hardware encoding features seamlessly; however, those processors suffer from latency issues when streaming live video. I would like a streaming latency of < 500 ms. The use case is FPV (first-person view) video for drones.
So, my idea is to capture video through the HPS, run a hardware H.264 encoder in the FPGA, and then send the compressed data out over WiFi/LTE. The following chain explains the flow of data:
Video capture -> HPS -> FPGA (h.264 encoding) -> HPS(compressed data from FPGA) -> WiFi/LTE
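For anyone wondering what I have in mind: until the FPGA encoder block exists, I believe the chain above could be prototyped end-to-end on the HPS alone with a stock GStreamer pipeline. A sketch, assuming GStreamer 1.x, a V4L2 capture device at /dev/video0, and the software x264enc element as a stand-in for the FPGA encoder (the resolution, bitrate, host address, and port below are placeholders, not values from my design):

```shell
# Sender (on the Cyclone V HPS): capture, encode with a software
# H.264 encoder tuned for low latency, packetize as RTP, send over UDP.
# x264enc is only a placeholder for the eventual FPGA encoder.
gst-launch-1.0 v4l2src device=/dev/video0 \
  ! video/x-raw,width=1280,height=720,framerate=30/1 \
  ! x264enc tune=zerolatency speed-preset=ultrafast bitrate=1000 \
  ! rtph264pay config-interval=1 pt=96 \
  ! udpsink host=192.168.1.10 port=5000

# Receiver (on the PC): depacketize RTP, decode, display.
# sync=false avoids adding clock-sync delay on top of network latency.
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
  ! rtph264depay ! avdec_h264 ! autovideosink sync=false
```

Once the FPGA encoder is in place, the x264enc stage would presumably be replaced by a custom GStreamer element (or an appsrc fed with compressed frames read back from the FPGA), with the rest of the pipeline unchanged.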
I need some advice from the Cyclone V SoC experts as to whether this is a feasible design. I think using GStreamer is a good option, given its ease of use and available documentation. Has anyone done this before? I would really appreciate it if you could share some links or resources for this use case.
Also, are there any GStreamer plugins that make use of the Cyclone V SoC architecture for media applications?