Um, this is actually what my SHIFZ-partner Chris and I have been talking about frequently over the past few years: a robot remote-controlled by a Brain-Computer Interface (BCI).
The folks at the University of Washington's Neural Systems Group have put together a nice demonstration of a simple application of that principle: a small humanoid robot paired with a BCI that lets the user select an object to be picked up by concentrating on its virtual representation, and choose the destination table it should be placed on in the same way.
The brain activity measured via EEG only controls a small set of parameters/"choices", while the avatar handles the actual motion through a variety of pre-programmed skills.
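Just to make that split between "brain picks, robot acts" concrete, here is a minimal sketch of the idea. Everything in it is hypothetical (made-up class and function names, a random stand-in for the EEG classifier), not the actual UW system:

```python
# Minimal sketch of the control principle: the BCI only supplies discrete
# high-level choices, while the robot executes pre-programmed skills on its
# own. All names here are hypothetical, not the real UW implementation.

from dataclasses import dataclass
import random


@dataclass
class Choice:
    label: str          # e.g. "red block" or "table A"
    confidence: float   # classifier confidence from the EEG decoder


def decode_selection(options):
    """Stand-in for the EEG classifier: in the real system this would return
    whichever highlighted option the user is concentrating on."""
    return Choice(label=random.choice(options), confidence=0.9)


class HumanoidRobot:
    """Wraps the robot's pre-programmed skills behind simple commands."""

    def pick_up(self, obj):
        print(f"[robot] walking to and grasping {obj}")

    def place_on(self, table):
        print(f"[robot] carrying the object to {table} and setting it down")


if __name__ == "__main__":
    robot = HumanoidRobot()
    # Step 1: the user picks an object by focusing on its on-screen avatar.
    obj = decode_selection(["red block", "green block"])
    robot.pick_up(obj.label)
    # Step 2: the destination table is chosen the same way.
    dest = decode_selection(["table A", "table B"])
    robot.place_on(dest.label)
```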
(via MAKE:blog)