Streaming

RTSP
https://github.com/bluenviron/mediamtx and a streaming script, which can be used with https://github.com/blakeblackshear/frigate for TensorFlow-based object detection, or with Deep SORT (Object_tracking: tracks any pixel cluster) for multiple-object tracking. See LXF293, Linux Format magazine, Sept. 2022, for Docker usage and the MQTT install. MediaMTX (formerly rtsp-simple-server) is a simple, ready-to-use, zero-dependency RTSP server and RTSP proxy that allows multiple users to publish, read and proxy live video and audio streams.
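A minimal publish/read sketch, assuming a MediaMTX instance already running on its default RTSP port 8554; the path name `mystream` and the test-pattern source are arbitrary choices, not anything the server requires:

```shell
# publish a synthetic test pattern to the local MediaMTX instance
ffmpeg -re -f lavfi -i testsrc=size=640x480:rate=25 \
       -c:v libx264 -preset ultrafast -tune zerolatency \
       -f rtsp rtsp://localhost:8554/mystream

# read it back from another terminal (any RTSP client works)
ffplay rtsp://localhost:8554/mystream
```

Swap the lavfi test source for a real camera input (`-f v4l2 -i /dev/video0`) once the loop works.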

Stream the Raspberry Pi's RTSP stream over a reverse SSH link to the desktop PC with ffmpeg. If both the home desktop and the Raspberry Pi are behind NAT, use a jump server between them. See ssh for local and reverse port forwarding of applications over an encrypted SSH tunnel, and the Luke Smith POSIX piping tutorial on YouTube for what to run on the target PC.

Alternatively, run each Raspberry Pi with its own RTSP server and connect to it over TCP with mplayer from the desktop PC: the client PC connects to the Raspberry Pi target and the RTSP server running on it.

Stream the Raspberry Pi's RTSP stream over a reverse SSH link to the desktop PC. The latency is nearly the same as over a direct TCP link. See https://pastebin.com/mLSbqdjk

The client PC connects to localhost, tunneling through to the Raspberry Pi (the RTSP server). If both the home desktop and the Raspberry Pi are behind NAT, use a jump server between them. See ipfs for an alternative solution.
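A sketch of the jump-server arrangement, assuming an RTSP server listening on the Pi's port 8554; the hostnames, user name and stream path are placeholders:

```shell
# On the Raspberry Pi (behind NAT): push its RTSP port up to the jump server
ssh -N -R 8554:localhost:8554 user@jump.example.com

# On the desktop PC (also behind NAT): pull that port down from the jump server
ssh -N -L 8554:localhost:8554 user@jump.example.com

# The desktop now plays the Pi's stream via its own localhost
mplayer rtsp://localhost:8554/cam
```

Both forwards bind to the jump server's loopback by default, so the stream is never exposed on the jump server's public interface.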

socat
Don't start the reverse SSH from the same usual SSH login into the Raspberry Pi back to the desktop PC, as that creates a tunnel inside a tunnel and confuses socat. See superuser.com 1 and superuser 2, and ssh for UDP over SSH using netcat. For screen sharing to Twitch with only FFmpeg (http://ffmpeg.org/ffmpeg.html), see the live-stream recording from the art channel. Further references: https://github.com/kkroening/ffmpeg-python/blob/master/examples/README.md#tensorflow-streaming (TensorFlow streaming), https://phoboslab.org/log/2013/09/html5-live-video-streaming-via-websockets (HTML5 live video over WebSockets), https://trac.ffmpeg.org/wiki/StreamingGuide (RTSP streaming guide).

https://pastebin.com/GPtLpNj0 MIPI is problematic on Raspberry Pi clones, and the Raspberry Pi itself maxes out at 1080p streaming in any case. Use USB cameras streaming MJPEG. Collect multiple ~10 Mbit USB camera streams over Cat5 cable to a single desktop PC: use the socat command to create a USB-to-Ethernet IP bridge from each embedded device, such as an Orange Pi. The idea is to avoid having the low-power clone process the stream; it just passes it through to the PC, which uses its Nvidia 1080 GPU to convert the MJPEG to MP4 for each USB camera.
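A hedged sketch of the pass-through idea: socat exposes the camera's MJPEG bytes on a TCP port and the desktop transcodes with NVENC. The device path, hostname, port and the use of ffmpeg with stream-copy on the sending side are all assumptions:

```shell
# On the Orange Pi: pass the camera's MJPEG straight through to TCP,
# no transcoding on the low-power board (-c copy avoids re-encoding)
ffmpeg -f v4l2 -input_format mjpeg -i /dev/video0 -c copy -f mjpeg - \
  | socat -u - TCP-LISTEN:5000,reuseaddr

# On the desktop PC: collect the stream and let the Nvidia GPU do the H.264 work
socat -u TCP:orangepi.local:5000 - \
  | ffmpeg -f mjpeg -i - -c:v h264_nvenc cam1.mp4
```

One listener/transcoder pair per camera; the PC side scales with GPU encoder sessions rather than CPU cores.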

Zoom
Run the RTSP server on the jump server. Each of, say, eight clients uses SSH local port forwarding to send its camera feed to the server. Using socat and a single reverse SSH session, open eight mplayer instances (executed from each client PC) and tile all the mplayer windows into eight blocks. This is essentially what the Zoom company does: GPL, BSD and FFmpeg code wrapped into a proprietary GUI app. With ssh, RTSP, mplayer and ffmpeg you get the same functionality but with at least some privacy, since not all your conversations are being recorded by them. Use ffmpeg to stream the screen to the object-tracking app.
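The eight-player layout above can be sketched as a loop over one forwarded port per client. The base port and stream path are arbitrary assumptions, and the mplayer line is left commented so the sketch runs without a live stream:

```shell
# One locally forwarded port per client; base port and path are arbitrary.
BASE_PORT=9000
NUM_CLIENTS=8
for i in $(seq 1 "$NUM_CLIENTS"); do
    port=$((BASE_PORT + i))
    echo "client $i -> rtsp://localhost:$port/cam"
    # mplayer "rtsp://localhost:$port/cam" &   # uncomment with live RTSP feeds
done
```

Tiling the resulting windows into eight blocks is left to the window manager.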

drone streaming
https://github.com/stephendade/Rpanion-server See the depth-perception GitHub repo on OAK-based sense-and-avoid with a Raspberry Pi on a drone.

analogue video
https://www.youtube.com/watch?v=RCQNu01sKIs

Tall Paul Tech
https://www.youtube.com/watch?v=e2VcxhH_k7Y

OBS Ninja streaming
https://github.com/stream-labs/streamlabs-obs/

https://github.com/steveseguin/obsninja Direct peer to peer browser based video streaming.

https://www.youtube.com/watch?v=GVTMZ5PqOAA and https://obsproject.com/forum/threads/obs-raspberry-pi-build-instructions.115739/ build instructions

https://www.youtube.com/watch?v=VRzAsEBZ3Vw All the steps to set up OBS Studio for live streaming on Raspberry Pi 4.

https://www.pyimagesearch.com/2019/09/02/opencv-stream-video-to-web-browser-html-page/ Flask streaming

https://github.com/Chocobozzz/PeerTube/blob/develop/FAQ.md#should-i-have-a-big-server-to-run-peertube and https://worldofvids.com video upload server, one of many for peer to peer video streaming

Streampi
https://www.youtube.com/watch?v=jj5zoLIEn3s and https://stream-pi.com/
 * https://github.com/rnayabed/streampi_server
 * https://github.com/rnayabed/streampi_client
 * Motion: a simple motion filter in Golang.
 * usb streaming with raspberry: https://www.reddit.com/r/raspberry_pi/comments/kfle91/for_those_asking_how_to_use_android_as_rp_monitor/

UDP streaming
https://medium.com/@fromtheast/fast-camera-live-streaming-with-udp-opencv-de2f84c73562

https://github.com/ancabilloni/udp_camera_streaming

https://putuyuwono.wordpress.com/2015/05/29/multi-thread-multi-camera-capture-using-opencv/ and github https://github.com/ancabilloni/vision_projects/tree/master/multithreading_cameras
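The linked write-ups do this in Python with OpenCV; the same idea at the command line, with ffmpeg pushing MPEG-TS over raw UDP (the addresses, port and device path are placeholders), looks like:

```shell
# Sender: encode the camera with low-latency settings and push MPEG-TS over UDP
ffmpeg -f v4l2 -i /dev/video0 \
       -c:v libx264 -preset ultrafast -tune zerolatency \
       -f mpegts udp://192.168.1.50:5000

# Receiver: play with buffering disabled to keep latency down
ffplay -fflags nobuffer udp://192.168.1.50:5000
```

UDP drops frames rather than stalling, which is usually the right trade-off for a live camera feed.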

ngrok
https://github.com/HackerShackOfficial/Smart-Security-Camera uses ngrok for streaming over the internet (Raspberry Pi); alternatively, do a reverse SSH tunnel instead.

Jan Schmidt(thaytan)
https://github.com/thaytan/gst-rpicamsrc gst-rpicamsrc is a GStreamer wrapper around the raspivid/raspistill functionality of the RaspberryPi, providing a GStreamer source element capturing from the Rpi camera.
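A typical pipeline, modelled on the examples in the repo README; the receiver address and the caps values (resolution, framerate, bitrate) are assumptions to adapt:

```shell
# Capture H.264 from the Pi camera with rpicamsrc and send it as RTP over UDP
gst-launch-1.0 rpicamsrc bitrate=1000000 \
  ! 'video/x-h264,width=1280,height=720,framerate=30/1' \
  ! h264parse ! rtph264pay config-interval=1 pt=96 \
  ! udpsink host=192.168.1.50 port=5000
```

Because rpicamsrc produces hardware-encoded H.264 directly, the Pi's CPU stays nearly idle.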

https://github.com/thaytan/gst-tutorial-lca2018 This repository contains some examples for the GStreamer Tutorial I am presenting at Linux.conf.au 2018 in Sydney.

https://www.youtube.com/watch?time_continue=794&v=ZphadMGufY8

https://www.youtube.com/watch?v=MCRKfXipAkU

https://www.youtube.com/watch?v=LjLBD_-s4aQ

https://www.youtube.com/watch?v=wclBfVbu_Ds

https://www.youtube.com/watch?v=0HxibICR8zo Mqtt messenger

raspvid
https://www.youtube.com/watch?v=sYGdge3T30o

ezwifi broadcast
https://github.com/bortek/EZ-WifiBroadcast Ez-WifiBroadcast is a radically simple digital data transmission system. A bidirectional data link is established using commercial off-the-shelf hardware like WiFi modules and a pair of Raspberry pi computers. Coupled with special software this unique system allows transmission of low latency HD video, telemetry and control data between endpoints. In comparison to a classical wireless connection Ez-WifiBroadcast tries to imitate the famous properties of an analog link like graceful signal degradation and no association between the endpoints.

ezwifi wiki FAQ: What is the minimum hardware I need to try this? Two Raspberry Pis, two WiFi dongles, one Pi cam and two micro SD cards. Please read the rest of the wiki and do your homework when you make your purchases! Some Pis aren't compatible, most WiFi dongles won't work, and you need big enough micro SD cards.

Issues: the problem with Ez-WifiBroadcast is that most of the WiFi dongles it used are no longer available. TP-Link has unfortunately upgraded the WiFi chipsets in its dongles and routers to more proprietary MediaTek/Realtek chips, which lack the right drivers for injection mode. You can boost the range by upgrading the WiFi router; a cantenna might be a cheap and interesting option (especially after pairing it with ArduPilot's AntennaTracker). Latency is configurable in APStreamline, and the trade-off is that jitter increases as latency is reduced.

esp32
esp32 camera

summer of code
https://discuss.ardupilot.org/t/introducing-apstreamline/31723 The APSync project currently offers a basic video streaming solution for the Raspberry Pi camera. APStreamline aims to complement this project by adding several useful features:
 * automatic quality selection based on bandwidth and packet-loss estimates
 * selection of network interfaces to stream the video
 * options to record the live-streamed video feed to the companion computer
 * manual control over resolution and framerates
 * multiple camera support using RTSP
 * hardware-accelerated H.264 encoding for the Raspberry Pi
 * camera settings configurable through the APWeb GUI
 * compatible with the Pi camera and several USB cameras such as the Logitech C920

Due to the limited range of 2.4GHz Wi-Fi, the Quality-of-Service (QoS) progressively gets worse as the robot moves further away from the receiving computer. This project aims to fix that problem by dynamically adjusting the video quality in real time. Over UDP we can obtain QoS estimates from the RTCP packets sent back by the receiver. These RTCP packets provide helpful QoS information (such as RTT and packet loss) which can be used to automatically change the bitrate and resolution of the video delivered by the sender.
 * uav submarine and Raspberry_pi
 * https://github.com/shortstheory/APWeb
 * https://github.com/shortstheory/adaptive-streaming
 * https://github.com/peterbarker/APWeb/pull/2
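The adaptive idea above can be sketched as a step-down/step-up rule driven by the RTCP packet-loss estimate. The thresholds and bitrate bounds here are illustrative assumptions, not APStreamline's actual values:

```shell
# Pick a new encoder bitrate (kbps) from the current one and the packet-loss
# percentage reported via RTCP. All thresholds are illustrative assumptions.
adjust_bitrate() {
    current_kbps=$1
    loss_percent=$2                        # integer percentage, e.g. 7 for 7%
    if [ "$loss_percent" -gt 5 ]; then
        new=$((current_kbps / 2))          # heavy loss: back off hard
    elif [ "$loss_percent" -gt 1 ]; then
        new=$((current_kbps * 3 / 4))      # mild loss: back off gently
    else
        new=$((current_kbps * 5 / 4))      # clean link: probe upwards
    fi
    # clamp to sane encoder limits
    [ "$new" -lt 250 ] && new=250
    [ "$new" -gt 4000 ] && new=4000
    echo "$new"
}

adjust_bitrate 2000 7    # heavy loss -> prints 1000
adjust_bitrate 2000 0    # clean link -> prints 2500
```

In the real project this decision would feed back into the encoder element's bitrate property rather than just being printed.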

sokillink
https://diydrones.com/profiles/blogs/sokillink-all-in-one-wireless-link

https://sokil.aero/products/sokillink

latency free
http://unmannedbuild.yconst.com/2017/01/28/controlling-wifibroadcast-with-an-rc-switch/ Wifibroadcast is a project aimed at the live transmission of HD video (and other) data using WiFi radios. One prominent use case is transmitting camera images for a first-person view (FPV) of remote-controlled aircraft. In contrast to a normal WiFi connection, wifibroadcast tries to mimic the advantageous properties of an analog link (graceful signal degradation, unidirectional data flow, no association between devices).

https://docs.emlid.com/navio2/common/dev/wbc/ See Emlid.com, linked from http://diydrones.com/profiles/blogs/new-emlid-raspbian-for-raspberry-pi-and-navio-navio2

Youtube streaming
http://hackaday.com/2016/11/25/low-cost-video-streaming-with-a-webcam-and-raspberry-pi/ and http://videos.cctvcamerapros.com/raspberry-pi/ip-camera-raspberry-pi-youtube-live-video-streaming-server.html (FFmpeg install script).
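A sketch of the ffmpeg invocation such scripts wrap, assuming a V4L2 webcam and ALSA microphone; STREAM_KEY is a placeholder for the key from the YouTube dashboard:

```shell
# Push webcam video plus microphone audio to YouTube Live over RTMP.
# YouTube wants H.264 + AAC in an FLV container with regular keyframes.
ffmpeg -f v4l2 -i /dev/video0 -f alsa -i default \
       -c:v libx264 -preset veryfast -b:v 2500k -pix_fmt yuv420p -g 50 \
       -c:a aac -b:a 128k -ar 44100 \
       -f flv rtmp://a.rtmp.youtube.com/live2/STREAM_KEY
```

The `-g 50` keyframe interval (2 s at 25 fps) matches YouTube's recommendation of a keyframe at most every few seconds.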

Tower app streaming
https://github.com/DroidPlanner/Tower/wiki/Custom-video-stream I am reporting how to stream Raspberry Pi camera video to the Tower video widget. Last weekend I attended a drone meeting and brought my IRIS, which a friend had given me. It flew well and I was very satisfied with my first flight. Other people brought brand-new drones like the Solo, Phantom and Inspire; I had a chance to fly them and felt something was missing in my old IRIS. It was FPV! I live in Japan, and because of regulations we can't use a 5.8GHz video transmitter without a licence; furthermore, a Solo costs twice what it does in the USA. However, I noticed a video widget (small screen) in the Android "Tower" app, which is said to accept a custom video stream. In that article the author used Ubuntu as the source; since I had a Raspberry Pi 2 with a Pi camera, I tried using it as an FPV camera server. The following describes how to set up the video streaming; I hope it works in your environment. Tower release notes: Google Cardboard support thanks to the integration of the DronePro app by Shawn Fraser (https://play.google.com/store/apps/details?id=meavydev.DronePro); improved Mission Editor; support for the Reset ROI mission item; UX update for the telemetry connection preferences; removal of the DroneShare account integration pending rehabilitation of the site API; material design update for the navigation drawer thanks to jandrop; improved Spanish, German, Chinese and Portuguese translations; numerous bug fixes and under-the-hood improvements.
 * http://theiopage.blogspot.jp/2013/04/enabling-hardware-h264-encoding-with.html?m=1
 * http://www.ideasonboard.org/uvc/ Support for FPV video streaming using UVC Devices by Guilherme Blanco . This allows anyone to experiment with and/or send a digital video stream to Tower for viewing through the video widget https://github.com/DroidPlanner/Tower/wiki/Custom-video-stream
 * https://doc.arcgis.com/en/arcgis-online/reference/what-is-agol.htm
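A sketch of the sender side on the Pi, piping raspivid's hardware H.264 into a GStreamer RTP payloader. The tablet address and UDP port are assumptions; check the Tower wiki for the port its video widget actually listens on:

```shell
# Stream the Pi camera as RTP/H.264 over UDP to the device running Tower
raspivid -n -t 0 -w 1280 -h 720 -fps 30 -b 2000000 -o - \
  | gst-launch-1.0 fdsrc ! h264parse \
      ! rtph264pay config-interval=1 pt=96 \
      ! udpsink host=192.168.1.20 port=5600
```

`config-interval=1` re-sends the SPS/PPS headers every second, so the widget can join a stream that is already running.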

From the DIY Drones post "Streaming raspi camera video to Tower video widget":

http://airsoc.com/articles/view/id/573f5c733139446e038b4567/streaming-raspi-camera-video-to-tower-video-widget?ev=10&evp=tl

http://uavmatrix.com/Blog/13 GStreamer app (https://en.wikipedia.org/wiki/GStreamer) on Raspberry Pi
 * http://uavmatrix.com/Blog/17 New complete RPI2 image and software script package (UAVcast) ready for implementation. Read the linked post for further information. If you succeed, you should be able to fly your aircraft over the cellular network with a simultaneous live video feed and telemetry. Imagine flying your drone with a USB gamepad for as long as the battery holds.
 * https://github.com/UAVmatrix/UAVcast
 * https://github.com/thaytan/gst-rpicamsrc

zcopter
https://www.kickstarter.com/projects/215849561/airstring-gsm-telemetry-for-your-drone GSM telemetry over mobile.

Globalarc
http://diydrones.com/profiles/blogs/how-to-control-your-drone-from-anywhere-on-the-planet-using-3dr

https://globalarc.us/pricing

flytpod wi-fi
It has an inbuilt High Power WiFi, RC Receiver Support, External 3G/4G dongle provisions and a Serial Port for Telemetry using external Radio. This enables seamless communication between the Drone and ground applications running on mobile devices or laptop. It also allows for communication with cloud apps as well as between multiple Drones in a fleet (Swarm).

FlytPOD is the most advanced flight computer available in the market today. It has the complete flight stack for reliable navigation and can be attached to any type of drone. This includes inertial navigation sensors, barometer, GPS, communication system and several payload interfaces. It also contains a powerful processor for developing custom high level applications for agriculture, industrial inspection, surveys, search & rescue and delivery, etc.

It boasts a Samsung Exynos 5422 octa-core processor with 2 GB RAM, a powerful integrated GPU and 32 GB storage (upgradable). This allows onboard processing of computationally intensive algorithms like image processing, obstacle avoidance and mapping the environment. It includes an external safety switch, an RGB LED for system status, USB 3.0, USB 2.0, HDMI and user-configurable I/Os. It has a 3-axis accelerometer, 3-axis magnetometer, 3-axis gyroscope, and an external dual GPS and magnetometer. The dual GPS system helps position the drone accurately, and data from all the above sensors ensures reliable flight. This makes it easy to integrate a variety of custom payloads and provides a unified interface for the drone developer to monitor system status.

http://diydrones.com/profiles/blogs/flight-computer-with-built-in-openwrt-wifi-router

http://flytbase.com/flytpod/ We have tested it to 250 meters with a ping time of 5 ms, using an ASUS router on the ground. Baud rate wouldn't be a good measure to rate it on, but we have tested file transfer successfully at 250 meters at ~100 kilobytes per second. Within 100 meters the performance is great with video and telemetry data. With 20 dBm onboard output power and a decent ground router we expect the range to exceed 500 meters; we are planning to test at 500 meters soon.

pandaboard
https://diydrones.com/forum/topics/i-achieved-a-640x360-live-streaming-over-cellular-network-with-a code download scripts


 * EmbeddedPc: Pandaboard, the base for my UAV experiment. The transmission is incredibly stable and the video is really smooth. Here is what I used:
 * Logitech C920, a good-quality camera.
 * Pandaboard: has an embedded video encoder that takes the raw YUV video from the camera and encodes it to H.264. You can set the encoding quality/bitrate and other parameters.
 * HSDPA USB dongle: 60 hours of high-speed connection for 10€/month; I have a 2 Mbps upload speed with this little beast. Ping increases under load and reaches 250 ms; when idle it's as low as 50 ms.
 * The traffic is sent from the Pandaboard to a VPN that I connect to with my client; this is only a way to bypass firewalls. 864x480 works but isn't as smooth as 360p, and I consider 360p streaming more than enough for this; 1280x720 lags considerably. If there is interest, maybe I can publish my code and a video of how it works. I've never tried radio-frequency video transmission, as I was worried about the range.

See FpGa IP cores for converting PAL/NTSC analogue video into YUYV format. This enables the low-lux (0.003 lux) Watec CCD camera to be used at night (fit a laser or infrared LED illuminator to a gimbal running Alexmos or Martinez code).

tcp
Another project http://diydrones.com/profiles/blogs/homebrew-tcp-ip-digital-secured-video-fpv-telemetry-control

Streaming sites
https://www.bitchute.com/

3g modem on linux
Raspberry_pi

Links
 * https://github.com/webview/webview alternative to a browser for viewing pages. Imagezmq.
 * http://diydrones.com/profiles/blogs/andruav-video-streaming-capabilities and https://andruav.com/arcs/andruavweb.html The video shows how you can use Andruav for streaming video from your drone(s) while simultaneously receiving video on different types of Andruav GCS. Although the video shows devices in one place, the same scenario works with devices in different locations, even different continents, as long as they have an internet connection via ADSL, 3G or 4G. Video quality depends on the sending mobile, the network and the receiving device. The streaming protocol handles network bandwidth intelligently and adapts video quality to the available bandwidth; the processing power of the receiving device determines video smoothness and glitches.
 * ImageProcessing: a PAL/NTSC-to-USB converter turns analogue PAL from a Watec 0.003 lux camera into digital USB webcam format.
 * Hi3518E IP camera, Raspberry_pi, v4l2 (Video4Linux), modprobe.
 * See Ffmpeg for protocols and command-line options.
 * https://github.com/aler9/gst-darknet GStreamer element to use Darknet (YOLO, AlexeyAB's neural network framework) inside GStreamer.
 * https://samy.pl/pwnat/ (https://github.com/samyk/pwnat)
 * drlex bitrate calculator and https://wiki.jmk.hu/wiki/Main_Page for the AMZ instance creation script.