AlterEgo is a non-invasive, wearable, peripheral neural interface that lets humans converse in natural language with machines, artificial intelligence assistants, services, and other people without voicing anything: without opening their mouths, and without any externally observable movement, simply by articulating words internally. Feedback to the user is delivered as audio via bone conduction, which closes the interaction loop without disrupting the user's normal auditory perception. The result is a human-computer interaction that is subjectively experienced as entirely internal to the user, like speaking to oneself. AlterEgo aims to tightly couple humans and computers, so that computing, the Internet, and AI weave into the human personality as an internal "second self" that augments human cognition and abilities.
The wearable system captures peripheral neural signals when internal speech articulators are volitionally and neurologically activated during a user's internal articulation of words. This lets a user transmit and receive streams of information to and from a computing device or another person without any observable action, discreetly, without unplugging the user from her environment, and without invading the user's privacy.
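To make the sense-decode-respond loop concrete, here is a minimal, hypothetical sketch in Python. Nothing in it is from the AlterEgo implementation: the channel count, window size, `capture_window`, `classify`, and `respond` are all placeholder assumptions standing in for the real signal acquisition, word decoder, and bone-conduction audio path.

```python
"""A minimal, hypothetical sketch of the closed-loop pipeline described above.

Assumptions (not from the source): the electrode channel count, window size,
the classify() decoder, and the respond() audio path are all placeholders.
"""

import numpy as np

N_CHANNELS = 7        # assumed number of surface electrodes on the face/neck
WINDOW_SAMPLES = 250  # assumed samples per decoding window


def capture_window() -> np.ndarray:
    """Stand-in for reading one window of peripheral neuromuscular signals.

    A real system would stream from an acquisition device; here we return
    synthetic noise so the sketch runs end to end.
    """
    return np.random.randn(N_CHANNELS, WINDOW_SAMPLES)


def classify(window: np.ndarray) -> str:
    """Placeholder decoder mapping a signal window to an internally
    articulated word. A trained sequence model would go here."""
    vocabulary = ["yes", "no", "next", "call"]
    # Toy rule: derive an index from the window's energy, just to close the loop.
    return vocabulary[int(np.sum(window ** 2)) % len(vocabulary)]


def respond(text: str) -> None:
    """Placeholder for the bone-conduction audio channel back to the user."""
    print(f"[bone-conduction audio] {text}")


def run_once() -> None:
    """One pass of the closed loop: sense -> decode -> act -> audio feedback."""
    window = capture_window()
    word = classify(window)
    reply = f"heard '{word}'"  # a real assistant or service would act on the word
    respond(reply)


if __name__ == "__main__":
    run_once()
```

The point of the sketch is only the loop structure: signals are captured while the user internally articulates, decoded into words, acted upon, and the response is returned privately as audio, with no observable action required of the user.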