Back when I was in Uni, so late 80s or early 90s, my dad was Project Manager on an Air Force project for a new F-111 flight simulator, when Australia upgraded the avionics on their F-111 fighter/bombers.
The sim cockpit had a spherical dome screen and a pair of Silicon Graphics Reality Engines. One of them projected an image across the entire screen at a relatively low resolution. The other projector was on a turret that pan/tilted with the pilot's helmet, and projected a high resolution image, but only in a circle perhaps 1.5m across directly in front of where the helmet was aimed.
It was super fun being the project manager's kid, and getting to "play with it" on weekends sometimes. You could see what was happening while wearing the helmet and sitting in the seat if you tried - mostly by intentionally pointing your eyes in a different direction to your head - but when you were "flying around" it was totally believable, and it _looked_ like everything was high resolution. It was also fun watching other people fly it, and being able to see where they were looking, and where they weren't looking while the enemy was sneaking up on them.
Somewhere between '93 and '95 my father took me abroad to Germany and we visited a gaming venue. It was packed with typical arcade machines, games where you sit in a cart holding a pistol and shoot things on the screen while the cart moves all over the place simulating a bumpy ride, etc.
But the highlight was a full 3D experience shooter.
You got yourself into a tiny ring, with a 3D headset and a single puck held in your hand. Rotate the puck and you move. Push the button and you shoot. Look around with your head. Most memorable part - you could duck to avoid shots!
The game itself, as I remember it, was full wireframe, akin to Q3DM17 (The Longest Yard) minus jump pads, but the layout was kind of similar. The player held a dart gun - you had a single shot and had to wait until the projectile decayed or connected with the other player.
I'm not entirely sure if the game was multiplayer or not.
I often come back to that memory because shortly after, within that same time frame, my father took me to a computer fair where I had the opportunity to play Doom/Hexen with the VFX1 (or whatever it was called), and it was supposed to revolutionize the world the way AI is supposed to now.
Then there was a P5 glove with jaw dropping demo videos of endless possibilities of 3D modelling with your hands, navigating a mech like you were actually inside, etc.
That sounds like you're describing Dactyl Nightmare. [1] I played a version where you were attacking pterodactyls instead of other players, but it was more or less identical. That experience is what led me to believe that VR would eventually take over. I still, more or less, believe it even though it's yet to happen.
I think the big barriers remain price and experiences that focus more on visual fidelity than gameplay. An even bigger problem is that chasing high-end visual fidelity tends to result in motion sickness and other side effects in a substantial chunk of people. But I'm sticking to my guns there - one day VR will win.
It is precisely that! My version was wireframe and I can't recall the dragon, but everything else is exactly like I remembered it!
For me this serves as an example of how these hype cycles repeat.
A few years later the VFX1 was the hype, years later the Oculus, etc.
But 3D graphics in general - as seen in video games - are similar: minus the recent Lumen, it's still stuff from Graphics Gems from the 80s-90s, just on silicon.
The same thing is happening now, to some degree, with AI.
I expect part of it is that the contemporary recommendations for VR are extremely meaty - something like 2160x2160 per eye at 120 Hz, with stereoscopic rendering meaning you're rendering every frame twice.
That's more than 1.1 billion pixels per second. At 24 bits a pixel that's something like 26 Gb/s of raw data. And that's just bandwidth - you also need to hit that 120 Hz frame deadline, in an environment where hiccups or input lag can cause physical discomfort for the user. And then even if you remote everything, you need the headset to have enough juice to decompress and render all of this and still hit those throughputs.
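The napkin math above can be checked directly (assumed figures: 2160x2160 per eye, 120 Hz, stereo, 24 bits per pixel - none of these are from a specific headset spec):

```python
# Rough VR bandwidth estimate from the assumed figures above.
width, height = 2160, 2160   # assumed per-eye resolution
eyes = 2                     # stereoscopic: each frame rendered twice
hz = 120                     # assumed refresh rate
bits_per_pixel = 24          # 8 bits per RGB channel, uncompressed

pixels_per_second = width * height * eyes * hz
gbits_per_second = pixels_per_second * bits_per_pixel / 1e9

print(f"{pixels_per_second / 1e9:.2f} billion pixels/s")  # ~1.12
print(f"{gbits_per_second:.1f} Gb/s raw")                 # ~26.9
```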
I'm napkin-mathing all of this, and I'm sure there have been lots of breakthroughs to help along these lines, but it's definitely not a straightforward problem to solve. Of course it's arguable I'm also just falling victim to the same fidelity-over-experience trap I was just criticizing.
I played that game in Berlin in the late 90s. There were four such pods, iirc, and you could see the other players. The frame rate was about 5 frames per second, so it was borderline unplayable, but it was fun nevertheless.
Later, I found out that it was a game called "Dactyl Nightmare" that ran on Amiga hardware:
The booth depicted in the 1000CS image looks exactly how I recall it, and the screenshot is very close to how I remember the game (minus the dragon, and mine was fully wireframe), map layout included. It has that Q3DM17 vibe I was talking about.
Isn't this crazy, that we had this tech in ~'91 and it's still not just there yet?
On a similar note - around that time, mid 90s, my father also took me to CeBIT. One building was almost fully occupied by Intel or IBM and they had different sections dedicated to all sorts of cool stuff. One I won't forget was straight out of Minority Report, only many years earlier.
They had a whole section dedicated to showcasing a "smart watch". Imagine Casio G-Shock but with Linux. You could navigate options by twisting your wrist (up or down the menu) and you would press the screen or button to select an option.
They had different scenarios built in the form of an amusement park. One was a restaurant you would walk into with your watch: it would talk to a relay at the door and download the menu for you, so you could twist your wrist to select your meal and order it without any human interaction - and leave without interaction as well, because the relay at the door would charge you based on your prior selection.
Or - and that was straight out of Minority Report - a scenario of an airport, where you would disembark at your location and walk past a big screen that would talk to your watch and display travel information for you, prompting question if you'd like to order a taxi to your destination, based on your data.
I remember a guy I know went to Japan/Asia around 1985ish and came back with a watch. It had hands, but also a small LCD display. You could draw numbers on the face with your finger, like 6 then X then 3 then =, and the LCD would show the values, and finally 18.
This is completely uninteresting now, but it was 40 years ago.
It was a really interesting and weird time growing up when Japan was the king of tech. I had a friend whose dad was often over there and brought all sorts of weird stuff back. There was this NES/Famicom game where you played with a sort of gyroscope. I have no idea how you were supposed to play the game, but I found the gyroscope endlessly fascinating. Then of course there were the pirated cartridges with 100-in-1 type games. Oh, then we found the box full of his dad's "special" games. Ah, good times.
There were some licensed games in Japan that they'd never release in the West, and also a relatively large scene for unlicensed/'bootleg' games. Fun, slightly related factoid: the Game Genie was an unlicensed hardware mod, and its maker actually got sued by Nintendo, and won.
I somehow suspect in modern times they'd have lost.
Everything you described and more is available from modern home VR devices you can purchase right now.
Mecha, planes, Skyrim, cinema screens. In VR, with custom controllers or a regular controller if you want that. Go try it! It's out and it's cheap and it's awesome. Set IPD FIRST.
William Gibson's 1984 novel Neuromancer, about 2 AIs with the same creator, locked in conflict, is actually prophetic. About Microsoft Bob and Clippy in the 1990s.
That's really cool. My first job out of college was implementing an image generator for the simulator for the landing signal officer on the USS Nimitz, also using SGI hardware. I would have loved to have seen the final product in person but sadly never had the chance.