This post shows how to broadcast data over 802.11 devices with a true unidirectional data flow. There is no need to be associated with a network (thus no risk of being disassociated) and no acknowledgements are sent from the receiver back to the transmitter. Please note that this is ongoing work and just a proof of concept.
EDIT: For a complete overview, refer to this page: https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmission-of-live-video-data/
——
My plan for this spring: Fly FPV! There are already a lot of devices on the market for transmitting analog video signals over 2.4GHz or 5.8GHz. But to me that always seemed a bit outdated. Of course, several people thought so too and tried to send their video data over Wifi. A good example for doing so: https://sparkyflight.wordpress.com/2014/02/22/raspberry-pi-camera-latency-testing-part-2/
“Sparky flight” took a Raspberry Pi, encoded the video stream as h264 and sent it over Wifi to a PC. He was able to get a latency down to 85ms glass-to-glass, which is quite nice! But all Wifi solutions share the same problem: when you lose your Wifi connection, you are immediately blind. That is of course not acceptable in an FPV setup. This is where the old analog link has a really big advantage. When you are getting out of range, the signal degrades slowly. You would still have time to react and turn your plane around.
So I thought: Wouldn’t it be possible to achieve the same advantage of a slowly degrading link over Wifi? Not out of the box, but I’ll show you how.
The basic approach is: The video transmitter sends its data as a true broadcaster into the air, regardless of who is listening. The receiver listens all the time and receives the data whenever it is in range of the transmitter. When it starts getting out of range it will no longer receive every packet, but still some of them. This behaviour is comparable to that of an analog signal path.
The main problem is that the Wifi standard does not support such a mode. Devices always need to know to whom they are sending their data. This relationship is created by the “Wifi association”. If your PC is associated with your router, both devices know to whom they are talking. One of the reasons for this association is to make the data transfer more reliable: the receiver of a packet always acknowledges the reception to the transmitter, and if no acknowledgement has been received, the transmitter has the chance to re-transmit the packet. Once the devices lose their association, they cannot exchange data anymore.
The Wifi ad-hoc mode comes pretty close to an unassociated “broadcast style” way of transmitting data. Unfortunately, it seems as if modern 802.11 standards no longer support ad-hoc mode properly. If you buy an 802.11ac card and put it into ad-hoc mode, it most likely falls back to 11 Mbps, since the standard does not require ac rates to be supported in ad-hoc mode 😦
To solve this issue I wrote two small programs that serve as a raw transmitter and raw receiver. They are pretty much hacked together out of a program called “packetspammer” http://wireless.kernel.org/en/users/Documentation/packetspammer
You find my programs here: https://bitbucket.org/befi/wifibroadcast
After compiling the sources with “make”, you’ll have two programs called “tx” and “rx”. Both take as a mandatory argument a wlan interface that has been put into monitor mode. The tx program reads data from stdin and sends it into the air in raw 802.11 packets. On the other side, the rx program listens on a device and outputs the received data to stdout. The packets of the transmitter are recognized by their fake MAC address (13:22:33:44:55:66). The packets contain only a valid 802.11 header (so that they are not rejected by the wifi card); the rest of the packet is filled with raw data.
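To make the packet format a bit more concrete, here is a minimal sketch in C of how such a raw frame could be assembled: a small radiotap header, an 802.11 data header carrying the fake address, and the raw payload behind it. This is only an illustration of the idea, the exact byte layout in tx.c may differ:

#include <stdint.h>
#include <string.h>

/* Minimal radiotap header: version, pad, length, present flags.
   No optional fields are requested here. */
static const uint8_t radiotap_hdr[] = {
    0x00, 0x00,             /* version, pad */
    0x08, 0x00,             /* header length = 8 bytes (little-endian) */
    0x00, 0x00, 0x00, 0x00  /* present flags: none */
};

/* IEEE 802.11 data frame header carrying the fake MAC address
   (13:22:33:44:55:66) that the receiver filters on. */
static const uint8_t ieee80211_hdr[] = {
    0x08, 0x00,                         /* frame control: data frame */
    0x00, 0x00,                         /* duration */
    0x13, 0x22, 0x33, 0x44, 0x55, 0x66, /* address 1 */
    0x13, 0x22, 0x33, 0x44, 0x55, 0x66, /* address 2 */
    0x13, 0x22, 0x33, 0x44, 0x55, 0x66, /* address 3 */
    0x00, 0x00                          /* sequence control */
};

/* Assemble one frame: radiotap header, 802.11 header, then raw payload. */
static size_t build_frame(uint8_t *buf, const uint8_t *payload, size_t len)
{
    size_t off = 0;
    memcpy(buf + off, radiotap_hdr, sizeof(radiotap_hdr));
    off += sizeof(radiotap_hdr);
    memcpy(buf + off, ieee80211_hdr, sizeof(ieee80211_hdr));
    off += sizeof(ieee80211_hdr);
    memcpy(buf + off, payload, len);
    return off + len;
}

A buffer built like this can then be handed to the card through the monitor-mode interface, for example with libpcap’s pcap_inject().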
The following is an example of how to use the programs:
Receiver:
sudo ifconfig wlan0 down
sudo iwconfig wlan0 mode monitor
sudo iwconfig wlan0 channel 1
sudo ifconfig wlan0 up
sudo ./rx wlan0

Transmitter:
sudo ifconfig wlan0 down
sudo iwconfig wlan0 mode monitor
sudo iwconfig wlan0 channel 1
sudo iwconfig wlan0 rate 54M
sudo ifconfig wlan0 up
sudo ./tx wlan0

Everything you type into the tx shell should now appear on the rx shell. The tx program also accepts parameters for the maximum packet length and the number of retransmissions. A retransmission count of 3, for example, will cause the transmitter to send each packet three times, which increases the chance that the receiver receives at least one of them correctly. To avoid the data being delivered three times, each packet contains a 32-bit sequence number; packets whose sequence number has already been received are ignored. I admit that this type of redundancy is rather stupid. The problem is that most Wifi cards completely discard packets with a wrong FCS (frame check sequence), so the more classical approaches to redundancy (Hamming codes, …) are not so easy to use. There is definitely still some work to do here! Feel free to participate 🙂
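The duplicate filtering on the receiver side boils down to remembering the last sequence number that was delivered. A small sketch of that idea in C (not the exact code from rx.c, and ignoring wrap-around of the 32-bit counter):

#include <stdint.h>
#include <stdio.h>
#include <unistd.h>

static uint32_t last_delivered;  /* highest sequence number written out so far */
static int got_first;            /* set after the first packet has been seen */

/* Called for every received packet: write the payload to stdout only the
   first time a given sequence number shows up, drop the retransmissions. */
static void handle_packet(uint32_t seq, const uint8_t *data, size_t len)
{
    if (got_first && seq <= last_delivered)
        return;  /* duplicate caused by a retransmission */
    got_first = 1;
    last_delivered = seq;
    write(STDOUT_FILENO, data, len);
}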
Writing text over a true broadcast connection is nice, but how about video? Actually it is really simple. GStreamer is a nice tool for this purpose. My test setup looks as follows:
(usb webcam) <—> (raspberry pi) <—> (wifi dongle) ~~~~airgap~~~~> (wifi dongle)<—>(PC)
On the raspberry pi I execute:
gst-launch-1.0 -v v4l2src ! 'video/x-raw, width=640, height=480, framerate=30/1' ! omxh264enc target-bitrate=500000 control-rate=variable periodicty-idr=10 ! h264parse ! fdsink | sudo ./tx -r 2 -f 1024 wlan0

In words: receive raw video from V4L2 at 640×480 and 30 fps, encode it as h264 at 500 kbit/s with an increased keyframe rate (this helps if packets get dropped), and write the raw video data (without a container) directly to stdout. The video data is then piped into the tx program with two retransmissions and a packet length of 1024 bytes.
And to receive and display it on my PC:
sudo ./rx wlan0 | gst-launch-1.0 fdsrc ! h264parse ! avdec_h264 ! xvimagesink sync=false

In words: receive data on interface wlan0 and pipe it into GStreamer. GStreamer receives the data, parses the raw h264 stream, decodes it and displays the video.
The video quality is ok, though maybe a bit too low for actual flying. With the settings above the latency is quite ok, maybe between 100 and 200ms. I noticed that the latency got bigger when I increased the encoder bitrate, but I still need to look into that. I think that by using the original Raspi cam it should be possible to achieve the ~100ms of the “Sparky Flight” guy.
Dropped packets turn out to behave as expected: the video image is partly disturbed but continues to run. A rough estimate: up to a packet loss of 5-10% the video should still be usable to rescue your plane. See below an example of a transmitted video with a packet loss rate of approximately 2.5%:
Unfortunately I wasn’t able to change the transmit power of the injected packets. There is a field in the radiotap header which I have set, but it seems to be ignored. Otherwise my solution would be perfect for FPV. You could (as a Bolivian, of course 😉 ) buy one of these cheap Wifi cards with 1W output power and have an extremely cheap long-distance video link (and since this is a true unidirectional link, you would only need a single high-power card in your plane). You could also use the Raspi to gather and transmit GPS information or battery capacity live. Of course this would then be realized as a side channel and rendered onto the image on the receiving device, in contrast to those (shitty) analog devices that write directly onto the transmitted image…
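For illustration, here is a sketch of a radiotap header that asks for a specific power via the “dBm TX power” field (bit 10 of the present flags). As described above, many drivers seem to simply ignore this field, so take it as a sketch rather than something verified to work:

#include <stdint.h>

/* Radiotap header requesting a transmit power of 20 dBm via the
   "dBm TX power" field (bit 10 of the present flags). */
static const uint8_t radiotap_hdr_txpower[] = {
    0x00, 0x00,             /* version, pad */
    0x09, 0x00,             /* header length = 9 bytes (little-endian) */
    0x00, 0x04, 0x00, 0x00, /* present flags: bit 10 = dBm TX power */
    0x14                    /* requested TX power: 0x14 = 20 dBm */
};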
If you are interested in participating, please share your experiences in the comments, take my code, modify it, improve it 🙂 My gut feeling is that only a few small things are left to do for a true digital (possibly HD) FPV system with low-cost equipment.
My next step: Make some range experiments…