Wifibroadcast – Analog-like transmission of live video data (befinitiv.wordpress.com)
129 points by infodroid on Dec 18, 2015 | hide | past | favorite | 39 comments


Does this allow encryption of the video stream? Graceful signal degradation is a great feature, but I don't see how that would work when there is encryption.


Encryption would work just fine if you used a stream cipher, like ChaCha20.

(With a stream cipher, a single bit error in the encrypted data results in a single bit error in the decrypted data).
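To illustrate the bit-error locality property, here's a minimal sketch using a toy hash-based keystream as a stand-in for a real stream cipher like ChaCha20 (the construction and all names here are illustrative, not a vetted cipher):

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy counter-mode keystream (stand-in for ChaCha20):
    # block i = SHA256(key || nonce || i)
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "little")).digest()
        counter += 1
    return out[:length]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

key, nonce = b"k" * 32, b"n" * 8
plaintext = b"live video frame data"
ct = xor(plaintext, keystream(key, nonce, len(plaintext)))

# Flip a single bit in the ciphertext (simulating RF noise)
corrupted = bytearray(ct)
corrupted[5] ^= 0x01
pt2 = xor(bytes(corrupted), keystream(key, nonce, len(ct)))

# Exactly one bit differs between original and recovered plaintext
diff = sum(bin(a ^ b).count("1") for a, b in zip(plaintext, pt2))
print(diff)  # 1
```

Since the keystream depends only on the key, nonce, and position, a flipped ciphertext bit maps to exactly one flipped plaintext bit.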


If the receiver misses a chunk of data, the stream cipher is not able to resynchronize (as opposed to a video stream, which resynchronizes on I-frames).


Yes, to allow resynchronization you have to send the stream position at the start of each packet.
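As a sketch of that idea (toy SHA-256 keystream standing in for a real cipher; names are illustrative): each packet carries its absolute stream offset in the clear, so the receiver can decrypt any packet independently and tolerate drops.

```python
import hashlib

BLOCK = 32  # SHA-256 output size

def ks_block(key: bytes, index: int) -> bytes:
    # Toy keystream block (stand-in for a real stream cipher's block function)
    return hashlib.sha256(key + index.to_bytes(8, "little")).digest()

def crypt_at(key: bytes, offset: int, data: bytes) -> bytes:
    # XOR data with the keystream starting at an absolute byte offset,
    # so any packet can be (de)crypted without the packets before it.
    out = bytearray()
    for i, byte in enumerate(data):
        pos = offset + i
        out.append(byte ^ ks_block(key, pos // BLOCK)[pos % BLOCK])
    return bytes(out)

key = b"s" * 32
chunks = [b"packet-0 payload", b"packet-1 payload", b"packet-2 payload"]

# Sender: each packet carries its stream offset in the clear
packets, off = [], 0
for c in chunks:
    packets.append((off, crypt_at(key, off, c)))
    off += len(c)

# Receiver: packet 1 was lost, but 0 and 2 still decrypt fine
received = [packets[0], packets[2]]
decoded = [crypt_at(key, o, ct) for o, ct in received]
print(decoded)  # [b'packet-0 payload', b'packet-2 payload']
```

A real implementation would use the packet sequence number as (part of) a CTR or ChaCha20 counter rather than an explicit byte offset.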


A block cipher doesn't make errors all that big. Look at the two layer error correction on a CD. That would play extremely well with a block cipher if you wanted to.


Of course it is possible to encrypt an unreliable unidirectional bitstream. (Tons of systems do this, eg. satellite video links.)

Contrary to what other commenters suggest, a proper stream cipher like ChaCha20 is not even needed. You could just use CTR mode, which turns any block cipher (like AES) into a stream cipher and prevents ciphertext bit errors from creating more plaintext bit errors. Also, transmit the counter every once in a while so that dropped packets don't prevent you from decrypting subsequent packets.


Encryption won't work for this scenario as it reintroduces the problem of all or no data. Degraded encrypted data can't be decrypted successfully (yet?)


A stream cipher like ChaCha20 would work. Most stream ciphers (including ChaCha20 and RC4) work by generating a pseudorandom bitstream based on the key, which is then XOR'd with the cleartext to produce the ciphertext. Since neither the cleartext nor the ciphertext is used to generate the pseudorandom bits, any individual bit flip in the ciphertext will only result in that bit being flipped in the cleartext, the same as if the data were not encrypted.


From my armchair position, I'd think that this is what error correcting codes are for, aren't they?


OFDM 802.11 rates already have a significant amount of Forward Error Correction. 802.11g uses a convolutional encoder and Viterbi decoder [1], and 802.11 HT rates (n, ac, ad) can also use Low-Density Parity-Check (LDPC) codes [2]. The problem with Forward Error Correction is that it can't deal with too many sequential errors, so many modems use something known as an interleaver. An interleaver reorders bits as they are sent over the air, so instead of sending LSB to MSB or vice versa you are sending bits in a pseudorandom yet mutually known order. This prevents link-quality problems such as interference from corrupting long runs of consecutive bits (to the benefit of the FEC decoder). The problem with an interleaver is that it drives latency up: if you interleave 256 bits out of 2048 bits, you can't decode a block of data until you've received all of those bits. So the 288-bit interleaver that 802.11 uses won't cause many problems if you're streaming, but if you interleave data across multiple packets you will notice a spike in video latency.

[1] https://en.wikipedia.org/wiki/Viterbi_algorithm [2] https://en.wikipedia.org/wiki/Low-density_parity-check_code
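The interleaving idea can be sketched in a few lines (a simplified software model, not the actual 802.11 interleaver, which permutes bits within an OFDM symbol by a fixed formula):

```python
import random

def make_permutation(n: int, seed: int) -> list:
    # Pseudorandom but mutually known bit order, shared by sender and receiver
    order = list(range(n))
    random.Random(seed).shuffle(order)
    return order

def interleave(bits, order):
    out = [0] * len(bits)
    for i, b in enumerate(bits):
        out[order[i]] = b
    return out

def deinterleave(bits, order):
    return [bits[order[i]] for i in range(len(bits))]

n = 16
order = make_permutation(n, seed=42)
bits = [1] * n
tx = interleave(bits, order)

# A burst error wipes out four consecutive bits on the air
for i in range(4, 8):
    tx[i] = 0

rx = deinterleave(tx, order)
# After deinterleaving, the four errors are typically scattered rather
# than consecutive, which is what a Viterbi/LDPC FEC stage can cope with.
error_positions = [i for i, b in enumerate(rx) if b == 0]
print(error_positions)
```

The latency cost mentioned above falls out of this model directly: you can't run `deinterleave` until all `n` transmitted bits have arrived.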


Error correction (or redundant parity data as est mentioned in another reply) just kicks the can down the road.

Let's say you're using a 20/40 erasure encoding. You break a piece of data up into 20 pieces and create 20 extra parity pieces. Now you only need 20 out of the 40 to recreate the original data.
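As a minimal stand-in for that 20/40 scheme (which would need something like Reed-Solomon), here's the degenerate k+1 case: k data pieces plus one XOR parity piece, recoverable from any single loss. All names here are illustrative:

```python
def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode(pieces):
    # k data pieces + 1 XOR parity piece: any single loss is recoverable.
    # (A real 20/40 code would use Reed-Solomon to survive 20 losses.)
    parity = pieces[0]
    for p in pieces[1:]:
        parity = xor_bytes(parity, p)
    return pieces + [parity]

def recover(received):
    # received: list of (index, piece) with exactly one share missing;
    # XOR of the survivors reconstructs the missing one.
    k_plus_1 = len(received) + 1
    present = {i for i, _ in received}
    missing = (set(range(k_plus_1)) - present).pop()
    acc = None
    for _, piece in received:
        acc = piece if acc is None else xor_bytes(acc, piece)
    return missing, acc

data = [b"aaaa", b"bbbb", b"cccc"]
shares = encode(data)          # 3 data shares + 1 parity share
# Lose share 1 in transit; rebuild it from the remaining three
received = [(i, s) for i, s in enumerate(shares) if i != 1]
idx, rebuilt = recover(received)
print(idx, rebuilt)  # 1 b'bbbb'
```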

Are we encoding the encrypted data? Ok well we need at least 20 good pieces, and that's to decode the original data. This method doesn't allow for seamless degradation but allows for some data loss in the transmission (while effectively doubling the amount we're trying to push in the first place).

Let's say we're breaking up the original data, creating parity pieces, and encrypting each little piece. The receiver could then decrypt each piece it got and use it, and just throw away any piece it couldn't decrypt. This could potentially work, but parity pieces are only useful for recreating the original file, which does nothing for graceful degradation. So redundancy is more important than parity in this scenario.

But if we make the encrypted pieces small enough, say each packet body, then that could probably work, at the cost of being resource intensive. Encrypt/decrypt every packet; if successful, insert it into the feed, else throw the packet away. This would work a lot like the existing technology, just with a middle step of decrypting each packet body.
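That decrypt-or-drop loop can be sketched with a toy per-packet encrypt-then-MAC scheme (hash-based keystream plus truncated HMAC tag; an illustrative sketch, not a vetted AEAD):

```python
import hashlib, hmac

def seal(key: bytes, seq: int, payload: bytes) -> bytes:
    # Toy per-packet encrypt-then-MAC: keystream keyed by packet sequence
    # number, 8-byte truncated HMAC tag over header + ciphertext.
    seq_b = seq.to_bytes(8, "little")
    ks = hashlib.sha256(key + b"enc" + seq_b).digest()
    ct = bytes(b ^ ks[i % 32] for i, b in enumerate(payload))
    tag = hmac.new(key, seq_b + ct, hashlib.sha256).digest()[:8]
    return seq_b + ct + tag

def open_packet(key: bytes, packet: bytes):
    seq_b, ct, tag = packet[:8], packet[8:-8], packet[-8:]
    expect = hmac.new(key, seq_b + ct, hashlib.sha256).digest()[:8]
    if not hmac.compare_digest(tag, expect):
        return None  # corrupted: drop this packet, keep the stream going
    ks = hashlib.sha256(key + b"enc" + seq_b).digest()
    return bytes(b ^ ks[i % 32] for i, b in enumerate(ct))

key = b"k" * 32
good = seal(key, 0, b"frame slice A")
bad = bytearray(seal(key, 1, b"frame slice B"))
bad[10] ^= 0xFF  # corrupt one ciphertext byte in transit

results = [open_packet(key, bytes(p)) for p in (good, bad)]
print(results)  # [b'frame slice A', None]
```

The video decoder then sees exactly the behavior described above: intact packets flow through, corrupted ones simply vanish from the feed.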


> But, if we make the encrypted pieces small enough, say each packet body, then that could probably work

So in other words you can use what's basically the default mode of encryption, CBC. Each encrypted byte only depends on the adjacent 32 bytes, so you can allow errors through and they affect a couple pixels instead of a single pixel.


I wasn't aware of CBC at the time of writing this comment, thanks for pointing it out to me.

It seems, though, that if you lose any 32 bytes you've lost the trail of encryption, as you can't decrypt any subsequent pieces.

After reading other comments, I think the only reliable solution is ChaCha20, where each packet can be encrypted/decrypted independently of the others.


https://upload.wikimedia.org/wikipedia/commons/2/2a/CBC_decr...

Let's assume CBC with AES. It encrypts in 16 byte blocks. If you slightly corrupt one block, you will fail to decrypt it entirely, and it will slightly corrupt the block after, but everything else will be fine.
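A toy CBC model makes the propagation extent visible. Caveat: the "block cipher" here is just XOR with the key so the demo stays self-contained; with real AES the corrupted block would be fully garbled rather than flipped in one bit, but the *extent* of the damage (only blocks i and i+1) is the same:

```python
BLOCK = 16

def E(block, key):
    # Toy invertible "block cipher" (stand-in for AES): XOR with the key
    return bytes(b ^ k for b, k in zip(block, key))

D = E  # XOR is its own inverse

def cbc_encrypt(pt, key, iv):
    prev, out = iv, []
    for i in range(0, len(pt), BLOCK):
        b = pt[i:i + BLOCK]
        c = E(bytes(x ^ y for x, y in zip(b, prev)), key)
        out.append(c)
        prev = c
    return out

def cbc_decrypt(cts, key, iv):
    prev, out = iv, []
    for c in cts:
        out.append(bytes(x ^ y for x, y in zip(D(c, key), prev)))
        prev = c
    return out

key, iv = b"K" * 16, b"I" * 16
pt = b"0123456789abcdef" * 4            # four 16-byte blocks
cts = cbc_encrypt(pt, key, iv)

corrupt = [bytearray(c) for c in cts]
corrupt[1][0] ^= 0x01                   # flip one bit in ciphertext block 1
rec = cbc_decrypt([bytes(c) for c in corrupt], key, iv)

# Only blocks 1 and 2 are damaged; blocks 0 and 3 come back clean.
clean = [rec[i] == pt[i * 16:(i + 1) * 16] for i in range(4)]
print(clean)  # [True, False, False, True]
```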

There are modes of encryption where losing one bit will corrupt all subsequent bits.

There are also modes like GCM or stream ciphers like ChaCha20 where one corrupted bit will not corrupt any other bits at all.

In short: There are many options, and half of them are suitable for this.


How much security do you lose if you encrypt each frame individually?


encrypt with a redundant parity data?


How would you implement analog parity? Parity doesn't translate well as a concept into the analog space.

You can take an analog signal and "quantize" it into sixteen possible values so that you can apply a parity algorithm that returns sensible results and doesn't fail with expected noise, but you're digitizing the signal.


I don't understand what analog has to do with it? The video is digitized first, then error-correction (parity?) information added before transmission, so all parity would be related to the digital bitstream -- unless I missed something?


Even then I think it would still be all-or-nothing. That may increase the chances of "all" over "nothing" but not allow graceful degradation of the video.


Hierarchical coding: send a low quality version of the video with high redundancy, and a higher quality refinement of the video with lesser redundancy.


Just encrypt each packet individually in CBC or CTR mode with a new random IV per packet.

Hopefully things are structured such that video frames divide evenly into 802.11 packets.


This is an awesome project. I'm currently building a fixed-wing drone with dual wifibroadcast links, downstream for (stereo) video and upstream for control.


> Note: Before using wifibroadcast you have to check if the regulatories of your country allow such a use of wifi hardware.

Any guesses as to whether this would be legal in the US?


Monitor mode is a MAC level concept, and doesn't alter any of the RF characteristics that would make operating this device illegal. (This is not legal advice. Consult with an attorney, etc etc)


Here's a page with info on operating at 2.4 GHz: http://www.afar.net/tutorials/fcc-rules


From the advantages quoted in the article, this seems useful for recording wifi-enabled security camera feeds too.


Cool stuff. Good to see them recognize and use the advantages of analog. I could tell them that digital transmission of data has always been done with analog circuits, but analog's invisible ubiquity is beside the point. ;)


Amen. Now if someone could get DVDs or streaming services to offer fast-forward and rewind that works as well as a VCR we'd be caught up to the 1980s!


Hell yes! Or even hardware acceleration for starting at random parts of videos on my PC so they don't break up and stall as often. This is the digital era; it's supposed to work more reliably than old analog tech. I used to be able to stop my rewind or fast-forward on a VCR within a few seconds of the target moment with clean playback. Still not reliable with digital streaming. (sighs)

Note: I do like how, even with errors, it still takes under 20 seconds for me to get to any random part of the vid. That's an improvement over the rewind/fast-forward speeds. :)


Modern video codecs use data in frames before and sometimes after the current frame to compress more:

https://en.wikipedia.org/wiki/Group_of_pictures

whereas analog media stored full resolution versions of every frame.

Seeking is much easier when you can pick any random location and have all the data right there ready to use, and you don't have to backtrack and try to recreate things from previous frames.


That's true. Hence me asking for "hardware acceleration" to improve the speed to real-time if possible.


Isn't a lot of that related to how the video is encoded/decoded though? I have only a basic understanding of how it works through working with video (not developing or tweaking actual codecs) but when I'm doing VJ-type stuff in Resolume (for example) I use a setup where all clips are encoded with a keyframe every frame. It makes for some huge clips but I can scrub or jump around or manipulate clips with no delay at all.

On the opposite end you've got streaming video or even many common formats and settings for ripped/downloaded video. Those do a keyframe every (x) frames and then between keyframes the file only contains the data for changes to the keyframe. This gets you smaller files so your downloads are quicker and your streams look nicer while using less bandwidth.

I know there's a lot more to it but at least with locally stored files, I thought it mostly had to do with keyframes. With streaming I'd imagine it's related more to how the stream is managed to download in chunks and maximize quality versus bandwidth (rather than focusing on quick scrubbing or quick access to random points of the video).

I'm interested in this stuff so if there's something I'm missing or flat-out wrong about, feel free to educate me.


Oh I'm not a subject matter expert on video compression. I just know the difference between using a general-purpose CPU and the ASIC version of decoders is huge. It's why you have all these weak, low-power SOC's that can do 720/1080p etc. That's the hardware acceleration.

I doubt it's designed for the random access I'm describing, though. So one designed for that might solve the problem. It might also need to be integrated with a good storage subsystem if that causes any difficulties. Cool, though, that yours lets people jump around at will. :)


> Note: I do like how, even with errors, it still takes under 20 seconds for me to get to any random part of the vid. That's an improvement over the rewind/fast-forward speeds. :)

Don't forget not having to rewind after watching/before watching. Glad that whole class of annoyance is gone, much more annoying than poor random access :)


That's what this meant:

"Note: I do like how, even with errors, it still takes under 20 seconds for me to get to any random part of the vid. That's an improvement over the rewind/fast-forward speeds. :)"


This isn't analog video. It's one-way broadcast digital video. Like current broadcast TV.


I know, man. It's a semi-joking, semi-serious post where I point out that people step from one mental model (reliable/p2p/digital) to another (lossy/continuous/analog) to come up with a good solution to a problem. In this case, kind of reinventing one, but with a medium that has plenty of cheap HW supporting it. It's technically digital, but mostly like analog in terms of the average person's experience with its attributes.


I don't think that most people need to be reminded that digital data is transmitted with electricity.


Not electricity but analog circuits. Many people think they have one or the other while they have a mix of both leaning heavily toward digital. So many misconceptions about the topic.



