:lol:

Quote:
Originally Posted by shidoshi
It isn't that technology isn't evolving. We have RPLCD, which is getting better every year, and DLP, which is just about to move from single-chip to 3-chip, and TI is even improving the single-chip method. The problem with the thought that plasmas can become burn-in proof is that we've been using phosphor-based tech for quite a while now, and those old dinosaur rear-projection CRTs, even though they've been around for decades, STILL burn in.

Quote:
I'm not saying that you're wrong, and I certainly don't claim to know a lot about plasma. I just have a problem believing that something can't change. There have been too many times when we (as in mankind) were absolutely certain we knew the limits of something, only to be proven wrong. Do I know what they could do to make plasma suffer less burn-in? Not at all. But with the way technology has been advancing, I really hesitate to say that we know all there is to know about the technology and what its limits are.
I mean, news is now breaking about the first face transplant - that sure as hell wasn't something I was expecting to see anytime soon. I believe, in my heart, that if we can transplant faces, we can make better plasma TV technology.
You can take measures to prevent it, like lowering the contrast and brightness, not watching the same thing all the time, etc. The problem is that most TV is still in 4:3. I watch 4:3 in its original aspect ratio on my Sony, black bars and all, about 90% of the time, because burn-in isn't an issue. I got pretty sick of stretch modes making everyone look fat on my old Mits. :p
