Who here sees one of those great sci-fi user interface scenes and is instantly so geeked out by the potential that they could hardly form a thought more coherent than 'WANT'?
I really hope there are more of you out there and it's not just some sort of seizure I keep having.
The thing is, we should have this already . . . and quite a bit more. In fact, we could've had a lot of this for years (and would have, if we had an actual consumer-driven market . . . but that's for another article!). We have to assemble a couple of ideas first, but when we're done I think you'll agree that we've all been done a grave disservice.
Let's start with two pretty well-known commodities . . . Google Glass and the Oculus Rift. As devices they're pretty different technologically, but at their little hearts of hearts they're both expressions of an improving ability to project an interface right in front of somebody's eyes.
While one is just a tiny translucent screen and the other is a good half of the headgear needed for some Hellraiser Cosplay (is that a thing?), the better features of both will likely be merged as technology advances, and we'll end up with a range of advanced visual interfaces. Sure, we might have to offload the processing to a local network or to something we wear on our bodies or backs . . . but that's where we're going to end up the moment there are some real resources dedicated to those technologies.
As that happens, the need to project visual information anywhere beyond one's eyes or audible information beyond one's own ears becomes less a necessity and more an extravagance.
If we'd dedicated more resources to these technologies years ago, they'd be old news by now. I know that we're not talking about having your own J.A.R.V.I.S. or anything, but let's at least let that sink in for a moment . . .
You should have been reading this while you danced around your room, with a symphony of personalized interfaces whirling around you.
I hope you're a little upset, because I am. We've been waiting far too long. The delay of ubiquitous jetpacks and flying cars was bad enough . . . this we *deserve*, right?
And we're just getting started.
Let's add something to the mix. Haptics. Not 'ooh, my phone vibrates!' (hush, you!), but full-fledged, wearable ('feel'-able) interfaces.
Our fingers and limbs are amazingly precise parts of us; our motor control is nothing short of fantastic . . . but to really choose a precise point in space, they need to FEEL something. We struggle with keyboards on slick screens because they're not ergonomically optimized for us humans. Similarly, we're not very good at interacting with things that aren't actually there . . . for some reason we never needed to evolve that until now (presumably having been more worried about getting eaten by things that are).
I don't know about you, but I'm not sure how I feel about a world of purely holographic interfaces. The mimes would have too much of an advantage.
Again, these all (albeit more vaguely) point in the same direction: they allow us to replicate sensations without having to actually create whatever we're interacting with . . . be they keyboards, puppies, or gravity.
In combination, you have everything from the ability to feel a floating keyboard to the opportunity to simulate scrambling for a handhold while an unfairly big tentacle has a hold of your boot. The floating keyboard would be on the low end of the cumbersome spectrum, but there are no real technological advances required to make a very light exoskeleton that could cover the full range of motion and offer some degree of physical feedback (proper interaction with imaginary physical objects with your hands gets bonus awesome if there's a bridge across your shoulders).
Again, there are no real fancy technological advances here . . . just variations on a theme. We would have this and more if we focused our efforts elsewhere.
You should have been listening to this being read to you in your favorite voice while your hands and eyes designed a great work of art.
I'm going to add one more thing to the mix . . . something that isn't on as many radars.
Sure, they were just trying to make a better welder's helmet . . . but they did SO much more (and to their credit, they see the potential).
If you fast-forward to 2 minutes you'll see the bit where they managed to merge the views of several different cameras in order to provide a view that no human eye should ever be able to see.
If you fast-forward to 3 minutes you'll see where they show us how to make MODs for reality.
If you're really not hooked yet, at 5 minutes they're talking about using the same technique to enhance people's views while driving, or just to make the world more beautiful.
So. . . let's weave this all together. . . what do we have?
This article should never have been written.
We should have all been too busy playing an amazing live LARP/MMO together with our favorite people. In this game our powers and skills should require us to make selections in ways that cleverly support distributed processing of tasks like protein folding. Then those who didn't want to work/game anymore could go home and do other things, while others could grind away cancer.
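To make that idea a little more concrete: the trick would be dressing real work units up as in-game choices, so that every "spell" a player casts quietly evaluates a candidate solution. Here's a minimal, entirely hypothetical sketch of that loop — all of the names are invented for illustration, and the "protein fold scoring" is a toy stand-in for whatever expensive computation you'd actually distribute:

```python
import random

# Hypothetical sketch: each in-game "power selection" doubles as a vote on a
# real work unit (e.g., a candidate protein fold). No real distributed-
# computing API is assumed; every name here is made up for illustration.

WORK_UNITS = [
    {"id": "fold-001", "energy": None},
    {"id": "fold-002", "energy": None},
    {"id": "fold-003", "energy": None},
]

def present_as_spells(work_units):
    """Dress pending work units up as in-game choices."""
    return [f"Cast rune {u['id']}" for u in work_units]

def player_selects(spells):
    """Stand-in for the player's choice; a real game would use actual input."""
    return random.randrange(len(spells))

def score_fold(unit):
    """Stand-in for the expensive computation the selection triggers."""
    return sum(ord(c) for c in unit["id"]) % 100  # toy 'energy' score

def play_one_turn(work_units):
    """One turn of play: the player picks a spell, the work unit gets scored."""
    spells = present_as_spells(work_units)
    choice = player_selects(spells)
    unit = work_units[choice]
    unit["energy"] = score_fold(unit)
    return unit

result = play_one_turn(WORK_UNITS)
print(f"{result['id']} scored energy {result['energy']}")
```

The point of the sketch is just the shape of it: the player never sees the work unit, only the spell, and the game's reward structure decides which candidates get evaluated first.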
. . . or something better. Seriously, Tony Stark has it BORING. That's a tiny little dream.
In my research I came across this lovely video, and it was the first to offer up a couple of real interface challenges. I do not have words for the awesomeness that is contained within . . . I wouldn't do it justice.
I know. . . right?
There are only two things that aren't logical extensions of these technologies. Scent . . . and being able to connect to loved ones who can't speak back to us.
Sure, the second one might be really hard, but I say we shoot for it anyway . . . even if we fail who knows what we'll discover along the way.
So . . . how about you? What would we be able to do if we took these technologies a little further? What other technologies can you weave in to make it better?