NVIDIA IS BANGING the 3D drum with their new glasses called GeForce 3D Vision, but they are entirely missing the point. In typical fashion they are going about it not only the wrong way, but trying to counter the direction of the entire consumer electronics industry.
The technology they are using is active shutter glasses, something that went out of style when sane companies realised the costs involved. This is not to say it doesn't work; it does. But they are jacking up costs for consumers in a way that doesn't make sense, halving the effective frame rate, and introducing a host of other problems as well.
Stepping back a little, there are several types of 3D technology that can be used. Some put the technology in the glasses, others put it in the monitor itself. Some require active components, others do not. Each has ups and downs, and depending on the situation, may or may not make sense for the specific application at hand.
You may be familiar with the old red and blue glasses, one of the first types of consumer 3D. It worked somewhere between badly and really badly, was monochrome, and had a lot of ghosting. The technical term for this is anaglyph, and it is dirt cheap to implement. Since these came out, things have gotten a lot better.
The next most common version is polarised glasses, with the two lenses either linearly polarised at 90 degrees to each other or circularly polarised in opposite directions. You have probably seen these before at Imax or other 3D cinemas. They work pretty well, but usually require two projectors in a theatre or a few additional layers in a monitor. This adds cost, sometimes a lot, but it is a one-time event, and the glasses cost marginally more than anaglyph glasses.
Closely following that is the active glasses technology. These are basically glasses that have an LCD shutter over each eye, each turning black every other frame so each eye sees only half the frames. They also work pretty well, but the constant flicker can lead to headaches as your eyes try to compensate for the on/off light, along with low frame rates for gamers and sync problems.
Active glasses need a transmitter that is synced to the frame rate, usually an IR transmitter. If you are out of range, turn your head, or have any obstructions, they may stutter or simply not work. Worse yet, they are battery operated, so you have to replace batteries or charge them, and in general spend time and effort keeping them working. The cost of active glasses is many times that of passive glasses, tens of dollars versus dollars or even cents for polarised or anaglyph. The more glasses you need, the faster the expense climbs.
Next up, we have screens that are 3D in and of themselves. There are several technologies used to do this and they are usually lumped together, but they all have two traits in common. The first is that they use multiple screen layers, like the polarised monitors, adding to cost; the second is that you don't need glasses at all. The drawback is that they have a very narrow viewing angle, so the sweet spot is generally good for only one person, a handful at best. While they sound nice in theory, implementation troubles have confined them to a shrinking niche.
The last one, Dolby 3D Digital Cinema, is not applicable to PCs because current monitors don't work in a compatible way. This is a shame because it is by far the best of breed, and you will be seeing more of it in the future. The short story is that it slices light into narrow spectral bands, with each eye getting an interleaved set of the colours. The bands are close enough together that you don't see anything other than a very clear and clean picture. It works amazingly well, and if you get a chance to see a movie using it, it is well worth the extra few dollars.
With that not-so brief overview in mind, Nvidia chose the active 3D glasses route, quite possibly the worst of the modern lot as far as the consumer is concerned, especially if the end result is gaming. Why? For several reasons, from cost to headaches.
First there is cost. To buy the setup, you need to purchase a transmitter and glasses, and sync them up to the video. This isn't terribly complex, and the drivers will likely be bundled with NV's video card drivers. Positioning the transmitter, basically a simple IR LED, isn't that tough either: you put it on the monitor or near it, like a Nintendo Wii sensor bar.
The problem here? Well, the transmitter blinks in sync with the monitor it is attached to, and since it is kept very simple to hold costs down, it uses a set frequency. It has to be made reasonably powerful, enough for viewing in a normal TV environment, say 10-20 feet, and with a decent broadcast angle so you can cover the couch and a chair or two on either side.
Think about that at a Lan party, or with more than one PC in the same room. Twenty-five transmitters blinking on and off within the same broadcast area is not a terribly bright idea, don't you think? If gamers are not the target audience, I am not sure what kind of creature is, and the first Lan party where they try and use this will end up in hilarity. To quote the philosopher B. Bunny, what maroons.
Back to cost. A decent 22-inch 1680 by 1050 monitor costs about $159 right now, so the panel itself is not that expensive. Adding a second layer or a few coatings to the monitor won't add that much cost if the volume is there. iZ3D and Zalman have been at the forefront of this technology, with iZ3D selling its 22-inch version for $349 even without huge Samsung-esque volumes. Basically, they can be had now, quite cheaply. Scaling up the size is only a matter of adding a bigger coating layer to an existing panel, not exactly rocket science. They also require no syncing, no transmitter, and are entirely passive. Currently, the only cost is a $200 or so monitor premium, and extra glasses have an MSRP starting at $0.60 retail.
The shutter glasses, on the other hand, are fairly cheap in absolute terms, adding an estimated $20-30 per pair to the cost. If you have a family of four, this basically evens out the cost between an inherently 3D PC monitor and a bunch of shutter glasses plus transmitter. Batteries are probably included, but the next set won't be.
So far it is a wash, unless you step on a pair of glasses, drop them, or have a dog with a taste for LCDs. Each additional set jacks up the cost by an estimated $20-30 for the NV active glasses, $1 or less for the passive polarised ones, going up to $10 for the real fancy versions. Not a good start.
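A back-of-the-envelope sketch makes the scaling obvious. The numbers below are the article's estimates, not vendor price lists: roughly a $200 monitor premium plus $0.60 passive glasses on one side, about $25 per pair of active shutter glasses on the other.

```python
# Rough cost comparison of passive (polarised monitor) vs active
# (shutter glasses) 3D as the number of viewers grows.
# All prices are the article's ballpark estimates, not quotes.

def passive_cost(viewers, monitor_premium=200.0, glasses_each=0.60):
    """One-time monitor premium plus cheap polarised glasses per viewer."""
    return monitor_premium + viewers * glasses_each

def active_cost(viewers, pair_each=25.0):
    """No monitor premium assumed, but every viewer needs shutter glasses."""
    return viewers * pair_each

for viewers in (1, 4, 10, 25):
    p, a = passive_cost(viewers), active_cost(viewers)
    print(f"{viewers:2d} viewers: passive ${p:7.2f} vs active ${a:7.2f}")
```

Under these assumptions active is cheaper for a viewer or two, the two roughly even out around a family-sized group, and every viewer past that point widens the gap in passive's favour, before a single broken or chewed pair is replaced.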
Then there are the monitors that you need for the shutter glasses. Nvidia seems to have once again taken the moronic route and required 120Hz monitors. While this does reduce flicker and headaches a bit, it is almost assuredly done because of frame rates. Remember, shutter glasses halve the effective frame rate, so if you have a standard 60Hz monitor, you are stuck with gaming at 30 FPS, aka chunky and barely playable. The other option is to shell out many dollars for a fast panel. Requiring 120Hz panels effectively blows out the cost of shutter glasses systems. Dumb dumb dumb.
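The arithmetic behind that halving is simple: each shutter blacks out on alternate frames, so each eye only ever sees half the panel's refreshes. A quick sketch:

```python
def per_eye(panel_hz):
    """Per-eye frame rate and frame interval for active shutter glasses.

    Each LCD lens blacks out on alternate frames, so each eye sees
    only half of the panel's refreshes."""
    rate = panel_hz / 2.0          # frames per second reaching one eye
    interval = 1000.0 / rate       # milliseconds between those frames
    return rate, interval

for hz in (60, 120):
    rate, interval = per_eye(hz)
    print(f"{hz}Hz panel -> {rate:.0f} FPS per eye, "
          f"one frame every {interval:.1f} ms")
```

A 60Hz panel leaves each eye at 30 FPS, which is why Nvidia has to mandate 120Hz panels just to get back to an ordinary 60 FPS per eye.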
Last up is headaches. The shutter glasses blink each eye on and off 30 or 60 times a second, depending on whether you have a 60Hz or 120Hz panel, and this can lead to shooting pains in the eyes and migraine headaches. Think 60Hz CRT monitor under fluorescent lighting, blink blink blink blink... I know they will say it doesn't happen, but what do you expect them to say? They won't even tell you which GPUs are faulty after six months.
This is a real problem, and after using the glasses for a bit, you will realise why no rational company has deployed the technology over the polarised or Dolby systems. Things have come a long way since I got my first pair of StereoTek glasses on the Atari ST in the early 90s, but the fundamental problems remain. You will get a headache if you use it for long periods, like say raiding in WoW. Once again, what were these people thinking, if they were at all?
So, there has to be an upside to the glasses, right? Sure, they work fairly well for short periods of time, and the price tag isn't all that crushing on the store shelf. As long as you have a 120Hz monitor, you don't need to add much at all. The best part, for Nvidia, is that it can sell the glasses even though it doesn't have a monitor line. This is the real reason they picked the technology: it is the one that makes them the most money. I never said the upsides were for you. The consumer loses on this one if you take the Nvidia route.
If you are even marginally following the consumer electronics world, you will know that these Nvidia glasses will debut at CES in just over a week. So will 3D monitors from just about everyone under the sun. I have seen press releases from iZ3D, Samsung, and about a dozen others out there touting their new 3D monitors for CES. Basically everyone in the world that makes monitors is going to make a 3D version, and prices are going to plummet.
All of these are going to be polarised-glasses versions, incompatible with the Nvidia way. Guess which system will have the higher volume? Guess which one will crater prices faster? Guess which one will be compatible with the majority of peripherals? Guess which one will use cheap glasses that you can buy everywhere? Guess which one won't need batteries? Guess which one will have easy clip-ons for existing eyeglasses? Guess which one won't give you blinding headaches? Shall I go on?
The answer to all of the questions is not Nvidia. Nvidia is using a technology that is not only inferior in several key ways, but also is bucking the prevailing consumer electronics trend. When you have almost every panel maker on the planet going one way, and Nvidia the other, one side is going to be left out. Call me crazy, but I am willing to bet that side is not going to be Samsung, Sony, Panasonic, Mitsubishi, Casio and the rest.
Why are they foisting a second-rate technology that halves a game's frame rate on consumers? Because they can make money directly from it. They are once again selling you short to line their pockets. Would you expect anything less? µ