Losing all controllers: Kinect
For quite a while, we wanted our computers to get smarter and faster.
Our computers are capable of incredible processing feats, compared to even ten years ago.
Now, what we want is for them to be easier to use.
That’s coming, in several big ways. In a recent post, I wrote about IBM’s Watson, whose language-processing capabilities point toward a day when we can have normal verbal conversations with our computers.
Ever since we got past keyboard-only computer games, we’ve relied on various kinds of controllers, from Atari joysticks to Wiimotes.
In November of this year, Microsoft is changing that with the Kinect.
It’s a device (formerly known as Project Natal) that connects to your Xbox…and then reads you.
No wires, nothing to hold.
It looks at you, to see how you are moving.
It looks at your face…it will recognize individuals.
It listens to your voice…even detecting emotions.
If it works as advertised, and as the videos suggest, it will revolutionize the way we interact with computers (eventually).
Here’s a change for you: no passwords to remember. Just as the Kinect can recognize individuals at home, facial recognition could identify you at work…and the Kinect suggests it can be done inexpensively.
Imagine this one: you walk in front of a wall-size screen. What you see is like your reflection in a mirror…only it isn’t you. You interact with other characters “through the mirror”. Their images do exactly what they do: sit down, stand up, wave. No controllers, no button-mashing combinations to learn.
You may have friends, and have no idea what race or gender they are.
Yes, we have online avatars now, but this ratchets it up to a whole different level.
This positional analysis that the Kinect is able to do (for only $150) suggests a lot of medical and sports uses. Your doctor needs to know the range of motion in your right knee? You could do it at home and get an accurate number. You want to improve your tennis forehand? Complete analysis…and recommendations.
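To give a sense of how simple the underlying geometry is: if a device can track your joints as 3D positions, a range-of-motion number is just the angle formed at a joint. This is only a sketch of the idea with made-up coordinates…the `joint_angle` function and the sample positions are mine, not anything from Microsoft:

```python
import math

def joint_angle(a, b, c):
    """Angle (in degrees) at joint b, given 3D positions of joints a, b, c.
    For example: hip, knee, ankle to estimate knee flexion."""
    # Vectors pointing from the middle joint out to its two neighbors
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(ba[i] * bc[i] for i in range(3))
    mag = math.sqrt(sum(v * v for v in ba)) * math.sqrt(sum(v * v for v in bc))
    return math.degrees(math.acos(dot / mag))

# Hypothetical tracked positions (in meters): hip, knee, ankle
hip, knee, ankle = (0.0, 1.0, 0.0), (0.0, 0.5, 0.05), (0.0, 0.0, 0.0)
print(joint_angle(hip, knee, ankle))  # a nearly straight leg, close to 180°
```

A straight limb comes out as 180 degrees, a right-angle bend as 90…the kind of number a doctor could actually chart over time.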
You might wonder what happens if you are playing a fighting game and you can’t physically get that nunchuk thing right. It would be easy to have it amplify your moves. It could recognize that when you move your hand a little, it’s the equivalent of an expert moving a hand a lot.
Imagine that one…a disabled person could make movements that the system could reinterpret. Lift your left hand, and it could show you taking a step with your left leg. A paralyzed person could appear to walk…which could eliminate another prejudice in virtual interactions.
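At its core, that amplification is just a gain applied to the tracked movement, with a cap so it can’t run away. A toy sketch…the `amplify` function, the gain, and the cap are all hypothetical, not part of any announced Kinect feature:

```python
def amplify(delta, gain=4.0, cap=1.0):
    """Scale a small tracked movement into a full-size in-game move.
    delta: per-frame (dx, dy) of the player's hand, in meters.
    gain: hypothetical amplification factor; cap limits the output."""
    return tuple(max(-cap, min(cap, d * gain)) for d in delta)

# A slight 5 cm hand movement becomes a 20 cm in-game swing
print(amplify((0.05, 0.0)))
```

The same mapping could go further than scaling: a small left-hand lift could be remapped to drive an avatar’s left leg entirely.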
There could also be real therapeutic uses, especially for those with phantom limbs.
In a sense, this could be like the holodeck on Star Trek: The Next Generation…and you wouldn’t need to put on a costume first.
It will require special games…and those are being produced. One of them is a Star Wars game: put your hands in the right position, and you see a light sabre. There is one with animals…you can call your pet, and it comes…and knows you are you.
Is it possible it won’t work as advertised? Sure.
But if it does, we may mark this as a turning point in our interactions with computers.
Why am I mentioning this if it isn’t coming out until November?
It certainly may sell out…and it’s limited to one per household.
I’d consider ordering one now. Even if you don’t want it, it may make a great gift for someone who has an Xbox…although you could get one of those, too. 🙂
This post by Bufo Calvin originally appeared in The Measured Circle blog.