I'm not entirely convinced about that, to be honest. From when I started with personal computing in the ZX Spectrum age until about now, it's all been about increasing colors (from black and white to 4 colors, then 16, 128, 256, ... 16M, now HDR and extended RGB), increasing resolution (from 160x240 to 640x480 to 1920x1080, now 4K and growing), improving sound (we reached high-fidelity standards a while ago), and polycount.

Now, granted, there's the Oculus. I'm just not convinced that VR helmets are the wave of the future, simply because of the lack of convenience. I'm not sure how much room there is for further miniaturization while improving tracking quality and reducing latency. First and foremost, however, it's a bad idea to isolate the player from his surroundings: you occasionally want to grab your glass of cola (or other carbonated sugar water), which is a terrible, terrible idea to do in the presence of pricey electronics while you are effectively blind and disoriented.

What we'd really need are autostereoscopic monitors. They're possible; I saw them at a computer fair 30 years ago already. But they are rather costly to produce and, by necessity, cut the horizontal resolution in half. Also, you need to keep your head level. So we'd really need true 3D image projection, but the prototypes demonstrated so far (laser projection into an aerosol cloud, laser projection onto a fast-rotating worm gear) do not promise the large, brilliant, high-resolution screens to which we have become accustomed; unless some breakthrough comes along, I expect nothing but economic failure. It's not enough for a technology to do something new; it also needs to be leaps and bounds better to justify the investment on the consumer side (which is why PhysX largely failed).

Looking at graphics since DirectX 9 came along, the top-of-the-line game titles used to be massive improvements over titles two years older, up until about 2005.
Over the last ten years, the noticeable growth in visual quality has slowed to ever smaller improvements; I mean improvements that are clearly recognizable by an ignorant layman. Looking at Deus Ex: Human Revolution from five years ago, it still looks reasonably good, except that the locations are relatively small and the plastic faces come straight from the uncanny valley. So I'm not sure we should expect massive improvements in the visual field. Richer worlds, bigger worlds, all right. But if you just look at a low-end Windows 95 PC from the mid-90s, with its tiny 14" monitor and basic Sound Blaster card: in essence, everything we're still using today is already there. If the concept has held up well for 20 years (or even 30 years, if we go back to the original Apple computers (and their cheaper Atari rip-offs)), chances are that the concept will still be valid in 2035 ... even if by then we may all have hybrid tablet/mobile devices with some sort of desk interface/docking station for "real work".