Have you ever watched an old sci-fi movie, even a classic like Star Wars, and thought, "Wait a minute, computers don't look like that!"?
Welcome to Our Graphics Will Suck in the Future!
The obvious answer is that movies like Star Wars, Blade Runner, and Alien (and its sequel) were made at a time when home electronics such as video game consoles and home computers existed, but not even the most powerful supercomputers could render smooth, rounded shapes, let alone animated 3D models (outside of wireframe vector graphics). Fully rendered 3D consumer graphics would not reach Western homes until the second half of the 1990s, and they looked something like this:
Virtua On screengrab from the SEGA Saturn, 1995
Very blocky by today's standards. In fact, most consumer computers of the era, be they PCs, home computers, or video game consoles, could display text or graphics but not a lot of both. So either games had lots of text with few graphical flourishes, or they had colorful characters and backgrounds with little or no text. The reasons go beyond the power of the processors of the time; storage media had a lot to do with it as well. The switch from solid-state or magnetic media to optical media allowed for bigger software packages and improved graphics. Even movies like TRON, set in a virtual world inside a computer network (a form of the internet, although it was not called that at the time), relied on hand-drawn animation, with computer graphics reserved for a handful of shots.
Today movies shower us with bright displays full of fast-moving graphics and animations, but these exist not for the benefit of the characters in the fiction but for the viewers, since a) they convey a lot of information in a few frames, and b) they fit viewers' expectations of what the future should look like. However, most business and military applications, even today, lack the kind of focus on graphical fidelity you find in modern games. The first reason is cost. If you need 100 machines to run basic business applications such as word processors, spreadsheets, and database software, why would you spend upwards of $1,000 per machine when a $400 machine can do all that and still run the most graphically demanding software the machine would ever need, a web browser, with ease?
Military users are even more constrained. It takes years to set the requirements and acquire the equipment, equipment that must operate under strains that would snap your iPad in two. So while these systems are digital, in some cases their functions are so streamlined that they work almost like analog systems, and they run years behind their commercial counterparts.
And then you have the odd case of sequels and prequels to the very movies above. Going back to the well decades after the original means attempting to emulate the look and feel of the earlier iterations. That means paying careful attention to details such as computer displays; otherwise your prequel is going to look more advanced than the product it canonically precedes.