A new hyperdimensional computing theory could improve the way robots ‘remember’ information, potentially changing the way artificial intelligence works.
The theory aims to integrate a robot’s perceptions with its motor capabilities, improving how AI systems perform.
The University of Maryland writes: “The Houston Astros’ José Altuve steps up to the plate on a 3-2 count, studies the pitcher and the situation, gets the go-ahead from third base, tracks the ball’s release, swings … and gets a single up the middle. Just another trip to the plate for the three-time American League batting champion. Could a robot get a hit in the same situation? Not likely.
It explains: “Altuve has honed natural reflexes, years of experience, knowledge of the pitcher’s tendencies, and an understanding of the trajectories of various pitches. What he sees, hears, and feels seamlessly combines with his brain and muscle memory to time the swing that produces the hit. The robot, on the other hand, needs to use a linkage system to slowly coordinate data from its sensors with its motor capabilities. And it can’t remember a thing. Strike three!”
Using hyperdimensional computing to help robots ‘remember’
In the new hyperdimensional computing theory, the operating system of a robot would be based on hyperdimensional binary vectors, which represent disparate, discrete things. The examples given by the University of Maryland are: a single image, a concept, a sound or an instruction; sequences made up of discrete things; and groupings of discrete things and sequences.
The hyperdimensional binary vectors can account for all of these types of information by binding each modality together into long vectors of 1s and 0s of equal dimension.
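This binding step can be sketched in a few lines. The example below uses elementwise XOR, a common binding convention in the hyperdimensional computing literature; the exact encoding used in the Maryland work may differ, so treat this as an illustrative sketch rather than their implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality

def random_hbv():
    """A random hyperdimensional binary vector: D bits of 1s and 0s."""
    return rng.integers(0, 2, size=D, dtype=np.uint8)

def bind(a, b):
    """Bind two HBVs with elementwise XOR; the result keeps the same dimension."""
    return np.bitwise_xor(a, b)

# One HBV per modality: e.g. an image percept and a sound percept.
image_hv = random_hbv()
sound_hv = random_hbv()

# Binding combines both modalities into a single vector of the same length.
record = bind(image_hv, sound_hv)

# XOR binding is its own inverse: unbinding with one modality's
# vector recovers the other exactly.
recovered = bind(record, image_hv)
assert np.array_equal(recovered, sound_hv)
```

Because every vector, bound or not, has the same dimension, the same operations apply uniformly to raw percepts and to composite records.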
How does hyperdimensional computing theory work?
A hyperdimensional computing framework can turn a sequence of “instants” into new hyperdimensional binary vectors (HBVs) and group existing HBVs together, all while keeping the same vector length. This is a way of creating semantically significant ‘memories’: as more information is encoded, sensor signals become vectors, vectors become memory, and learning happens by clustering similar vectors.
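A minimal sketch of how a sequence of “instants” could become one fixed-length memory, using two standard tricks from the hyperdimensional computing literature: permutation (a cyclic shift) to encode position, and majority-vote bundling to group vectors. The function names and the clustering-by-similarity check are illustrative assumptions, not the paper’s actual code:

```python
import numpy as np

rng = np.random.default_rng(1)
D = 10_000  # hypervector dimensionality

def random_hbv():
    return rng.integers(0, 2, size=D, dtype=np.uint8)

def permute(v, n):
    """Cyclic shift: encodes an instant's position in the sequence."""
    return np.roll(v, n)

def bundle(vectors):
    """Bitwise majority vote: groups HBVs into one HBV of the same length."""
    return (np.sum(vectors, axis=0) * 2 >= len(vectors)).astype(np.uint8)

def hamming_similarity(a, b):
    """Fraction of matching bits; unrelated HBVs score near 0.5."""
    return float(np.mean(a == b))

# Three 'instants' (e.g. successive perceptions) become one sequence memory.
instants = [random_hbv() for _ in range(3)]
memory = bundle([permute(v, i) for i, v in enumerate(instants)])

# The memory stays measurably similar to each instant it contains
# (well above the 0.5 chance level)...
print(hamming_similarity(memory, permute(instants[0], 0)))

# ...and near chance level for a vector it has never seen,
# which is what makes clustering by similarity possible.
print(hamming_similarity(memory, random_hbv()))
```

Because the bundled memory keeps the same dimension as its parts, it can itself be bundled into higher-level memories, which is the sense in which learning proceeds by clustering.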