HDC starts by mapping data into a high-dimensional space, similar to how populations of neurons in the brain represent information. For nominal data (e.g., categories), HDC uses an Item Memory (IM) that maps each category to a randomly chosen atomic hypervector (HV). These HVs are pseudo-orthogonal: randomly drawn vectors in high-dimensional spaces are nearly orthogonal, and they approach exact orthogonality as the dimensionality increases. This mapping allows the brain-inspired algorithm to process and compare information efficiently.
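To make this concrete, here is a minimal sketch of an Item Memory in Python/NumPy; the dimensionality, category names, and seed are illustrative assumptions rather than values from any specific HDC library.

```python
import numpy as np

# Minimal Item Memory (IM) sketch: each category gets a random bipolar HV.
D = 10_000                       # hypervector dimensionality (illustrative)
rng = np.random.default_rng(0)

categories = ["red", "green", "blue"]
item_memory = {c: rng.choice([-1, 1], size=D) for c in categories}

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Randomly drawn HVs are pseudo-orthogonal: their cosine similarity is close
# to 0 and concentrates ever more tightly around 0 as D grows.
print(cosine(item_memory["red"], item_memory["green"]))  # ~0.0
```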
Encoding: A Linear Approach
In the case of ordinal or discrete data (e.g., discretized feature levels), HDC uses a Continuous Item Memory (CIM) that maps the levels linearly to atomic HVs. This approach preserves the natural ordering of the levels, ensuring that neighboring levels are mapped to more similar HVs than levels that are further apart.
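One common way to build such a CIM is to start from a random HV for the lowest level and flip a fixed block of components per step, so that about half the components have been flipped by the highest level; the sketch below assumes that scheme with illustrative sizes.

```python
import numpy as np

# Continuous Item Memory (CIM) sketch: level m is derived from level m-1 by
# flipping the next block of D/(2*(M-1)) components, so adjacent levels stay
# similar while the extreme levels become nearly orthogonal.
D, M = 10_000, 10
rng = np.random.default_rng(0)

low = rng.choice([-1, 1], size=D)
cim = [low.copy()]
flips_per_level = D // (2 * (M - 1))      # total flips span half the vector
for m in range(1, M):
    hv = cim[-1].copy()
    start = (m - 1) * flips_per_level
    hv[start:start + flips_per_level] *= -1   # flip the next block
    cim.append(hv)

# Closer levels share more components and are therefore more similar.
sim = lambda a, b: (a @ b) / D
print(sim(cim[0], cim[1]), sim(cim[0], cim[9]))  # high vs. near-zero similarity
```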
Initial Prototype Construction: Building the Foundation
The next step in HDC is initial prototype construction: creating a set of prototype HVs that represent the categories or classes in the data, typically by bundling (element-wise addition of) the encoded HVs of all training samples belonging to each class. These prototypes serve as reference points for subsequent computations.
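A minimal sketch of this bundling step, assuming a hypothetical `encode` function that turns a raw sample into an HV (here just a random placeholder standing in for a real IM/CIM-based encoder):

```python
import numpy as np

D = 10_000
rng = np.random.default_rng(0)

def encode(sample):
    # Placeholder encoder for illustration only; a real encoder would bind
    # and bundle IM/CIM vectors derived from the sample's features.
    return rng.choice([-1, 1], size=D)

# Toy labeled training data (label, raw sample).
train_data = [("spam", "buy now"), ("ham", "see you at noon")]

# Class prototypes are the bundled (summed) HVs of each class's samples.
prototypes = {}
for label, sample in train_data:
    hv = encode(sample)
    prototypes[label] = prototypes.get(label, np.zeros(D)) + hv
```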
Training: Learning and Adaptation
In the training phase, HDC adapts the prototype representations to the given data in a supervised manner. A common strategy is iterative retraining: each training sample is classified against the current prototypes, and whenever a sample is misclassified, its encoded HV is added to the correct prototype and subtracted from the wrongly predicted one. This pulls each prototype closer to the samples of its own class and away from easily confused classes, improving the accuracy of the resulting classifier.
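The following sketch shows one such retraining epoch under those assumptions; the `prototypes` dictionary and the pre-encoded training pairs are taken as given, and cosine similarity is used as the comparison metric.

```python
import numpy as np

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def retrain_epoch(prototypes, encoded_train):
    # One pass over (label, encoded HV) pairs: on a mistake, reinforce the
    # correct prototype and weaken the wrongly predicted one.
    for label, hv in encoded_train:
        pred = max(prototypes, key=lambda c: cosine(prototypes[c], hv))
        if pred != label:
            prototypes[label] += hv
            prototypes[pred] -= hv
    return prototypes
```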
Inference: Making Sense of Complex Data
The final step in HDC is inference: using the trained prototypes to classify or predict new data points. Each query is encoded with the same scheme used during training and compared against the class prototypes (e.g., via cosine similarity or Hamming distance), and the label of the most similar prototype is returned. Because both encoding and comparison reduce to simple, highly parallel vector operations, this step remains fast and energy efficient.
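A corresponding inference sketch, reusing the assumed `encode` function and the trained `prototypes` dictionary from the steps above:

```python
import numpy as np

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def classify(sample, prototypes, encode):
    # Encode the query once, then return the label of the nearest prototype.
    query_hv = encode(sample)
    return max(prototypes, key=lambda c: cosine(prototypes[c], query_hv))
```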
Summary: A Brain-Inspired Approach to Efficient Artificial Intelligence
Hyperdimensional computing offers a unique approach to artificial intelligence, inspired by the brain’s efficient processing of complex information. By mapping data into high-dimensional spaces, preserving natural orderings, and adapting prototype representations through supervised learning, HDC provides an energy-efficient and lightweight solution for various applications, including text classification, speech recognition, and more. While HDC is still a relatively new technology, its potential to demystify complex AI concepts and provide practical solutions makes it an exciting area of research to watch.