Originally Posted by Sean W on the Dean blog article
Alright, people, sit down a second and let's do some math. Read the article: you need a consistent stream of 2 megabits per second to transmit standard-definition video. I have a Comcast cable modem, advertised at 6 megs a second, and *sometimes*, depending on the time of day, I get 2 megs a second --- and sometimes I get barely one. Fiber will give you more, to be sure, but no DSL line is good enough.
So let's say you have a fast connection. What does that 2 megs a second give you? That's 2,000,000 bits per second, or 250,000 bytes per second --- 250KB per second of incoming data, and that's assuming none of it goes to packet headers, sync data, or any of the other protocol overhead a realistic network connection needs.
What does 250KB per second give you? Let's say they want you to see a framerate of at least 25 frames per second. (Don't even ask about 60 FPS.) That works out to 10KB of data per frame, on average.
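If you want to check my arithmetic, here it is as a quick Python sketch. The numbers are just my assumptions from above (the 2 Mbps figure from the article and a 25 FPS target), nothing from their actual system:

```python
# Back-of-envelope: how many bytes per frame does a steady 2 Mbps stream buy you?
bits_per_second = 2_000_000              # the "consistent" 2 megabit stream from the article
bytes_per_second = bits_per_second / 8   # 250,000 bytes/s, ignoring all protocol overhead
frames_per_second = 25                   # the modest framerate target above

bytes_per_frame = bytes_per_second / frames_per_second
print(f"{bytes_per_frame:,.0f} bytes per frame")   # -> 10,000 bytes, i.e. about 10KB
```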
Now let's assume they really *do* have a magical compression scheme that will consistently give them 90% compression for every frame. It's possible --- wavelet compression, for example, can give you pretty good images at 90%, and might even be suitable for video if you alternate several delta frames with an occasional sync frame. 90% compression means that every 10KB of compressed data expands out to 100KB of uncompressed data --- or 100,000 bytes.
So what can you do with 100,000 uncompressed bytes per frame? Well, you *can't* do a full frame of data every 1/25 of a second, even below standard definition. True standard def is around 640x480; let's be generous and assume a quarter-res 320x240 picture. At 3 bytes per pixel for RGB color, that's 320*240*3 = 230,400 bytes per frame. Under good compression (90%) that's still 23KB of data, which is bigger than the 10KB per frame that the network connection has room for. That means that for every full frame sent in their scheme, there have to be at least two delta frames --- frames that mostly reuse pieces of full frames they've already sent and only encode what changed.
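And here's the rest of my back-of-envelope as a sketch, same caveats: the 320x240 picture, 3-byte RGB pixels, and a flat 90% compression ratio are all just the generous assumptions above, not anything they've published:

```python
import math

# Does one full 320x240 RGB frame fit in the 10KB-per-frame budget?
width, height, bytes_per_pixel = 320, 240, 3
raw_frame = width * height * bytes_per_pixel      # 230,400 bytes uncompressed
compression = 0.90                                # assume their codec really hits 90% every time
compressed_frame = raw_frame * (1 - compression)  # ~23,040 bytes, i.e. ~23KB

budget_per_frame = 2_000_000 / 8 / 25             # 10,000 bytes, from the bandwidth math above

# One full (key) frame eats up multiple frame slots' worth of bandwidth,
# so the frames in between have to be cheap delta frames.
slots_per_keyframe = math.ceil(compressed_frame / budget_per_frame)
print(f"full frame: {compressed_frame:,.0f} bytes vs. budget: {budget_per_frame:,.0f} bytes")
print(f"each full frame costs {slots_per_keyframe} frame slots, "
      f"so at least {slots_per_keyframe - 1} delta frames per full frame")
```

Plug in 640x480 for real standard def and the gap gets several times worse.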
So there you have your answer. If their system works at all (which is possible), your steady two-megabit connection --- if you even have one --- will give you 320x240 resolution with no more than every third frame being a full one. That's lower quality than traditional TV, and much, much lower quality than hi-def or anything you're used to looking at on your PC. And that's assuming there's a *lot* of good-quality magic in their code.
And that doesn't even get into issues like network latency, lag, packet sizes, and QoS.
Maybe they've got some secret sauce to get past those bandwidth limits. But I've been coding for 25 years, I have a CompSci degree, and I've heard claims like this before. They're easy to make because most of the public doesn't really understand things like information theory and the limits of compression. My math says their system may work for some users but not for all, and is likely to be crappy quality for most. Maybe I'm wrong. I hope so. But I doubt it.