
peterovermann / triadicmemory


Cognitive Computing with Associative Memory

License: MIT License

C 3.35% Mathematica 94.18% Python 0.75% Julia 0.20% Scheme 0.73% Odin 0.33% JavaScript 0.42% Makefile 0.04%
neural-networks artificial-intelligence associative-memory cognitive-science sparse-distributed-memory hyperdimensional-computing vector-symbolic-architectures content-addressable-storage machine-learning self-supervised-learning


Triadic Memory: Cognitive Computing with Associative Memory Algorithms

This repository is a collection of associative memory algorithms operating on sparse distributed representations (SDRs), the data structure believed to underlie information processing in the brain.

Our objective is to build a library of highly efficient software components, enabling practical applications of associative content-addressable memories.

The Triadic Memory algorithm, a new kind of tridirectional associative memory, was discovered in 2021. Dyadic Memory, which is based on the same algorithmic idea, is a hetero-associative memory also known as Sparse Distributed Memory.

Machine learning applications can be realized by creating circuits from the algorithmic components in this repository. An example is the Deep Temporal Memory algorithm, a recurrent neural network based on multiple Triadic Memory instances and feedback lines.

Resources

  • Usage and application examples
  • Performance benchmarks for different implementations and computer systems
  • Discussion of Triadic Memory at Numenta's HTM Forum
  • Triadic Memory paper

Implementations

Triadic Memory

Triadic Memory is an associative memory that stores ordered triples of sparse binary hypervectors (also called sparse distributed representations, or SDRs).

As a content-addressable triple store, Triadic Memory is naturally suited for storing semantic information.

Triadic Memory supports tridirectional queries: any part of an SDR triple can be recalled from the other two parts. The algorithm learns new information in a single shot, and stored data can be recalled from incomplete or noisy input.

A Triadic Memory has the capacity to store (n/p)^3 random triples of hypervectors with dimension n and sparse population p. At a typical sparsity of 1 percent (for example, n = 1000 and p = 10), it can therefore store and perfectly retrieve (1000/10)^3 = one million triples. The usable capacity is even higher, as associative memories are inherently tolerant of noise and errors.
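The storage and recall mechanics can be sketched compactly. The following is an illustrative Python sketch under assumed names (the repository's reference implementations are in Mathematica and C): hypervectors are given as lists of their p active bit positions, the memory is an n x n x n array of synaptic counters, storing increments one counter per combination of active bits, and recall of the third part sums counters over the two known parts.

```python
import numpy as np

class TriadicMemory:
    def __init__(self, n, p):
        self.n, self.p = n, p
        # n x n x n array of synaptic counters.
        self.counters = np.zeros((n, n, n), dtype=np.uint8)

    def store(self, x, y, z):
        # One-shot learning: increment the counter of every (x_i, y_j, z_k)
        # combination of active bit positions.
        self.counters[np.ix_(x, y, z)] += 1

    def query_z(self, x, y):
        # Tridirectional recall (z from x and y shown here): sum the counters
        # over all active (x_i, y_j) pairs and keep the p strongest positions.
        sums = self.counters[np.ix_(x, y)].sum(axis=(0, 1))
        return sorted(np.argsort(sums)[-self.p:].tolist())

# Tiny demonstration with n = 100, p = 4.
mem = TriadicMemory(100, 4)
x, y, z = [3, 17, 42, 80], [5, 9, 60, 71], [1, 8, 33, 90]
mem.store(x, y, z)
print(mem.query_z(x, y))  # -> [1, 8, 33, 90]
```

Because recall sums evidence over many bit pairs, the query still succeeds when some input bits are missing, which is the noise tolerance described above.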

The original Mathematica code can be found here. The plain C implementation can be compiled as a command line program or as a library. It's also a good starting point for people wanting to port the algorithm to another programming language.

Dyadic Memory

Dyadic Memory realizes a hetero-associative memory for sparse hypervectors which has the functionality of a Sparse Distributed Memory (SDM) as proposed by Pentti Kanerva in 1988.

The present, highly efficient algorithm was discovered in 2021 and is based on a neural network with one hidden layer and combinatorial connectivity. The original implementation was written in the Mathematica language and consists of just 10 lines of code.

The memory stores and retrieves hetero-associations x -> y of sparse binary hypervectors (SDRs) x and y.

The plain C implementation best illustrates the algorithm in procedural language. This version works with vector dimensions up to n = 20,000 and can be used in an asymmetric configuration where x and y have different dimensions.

Deep Temporal Memory

A temporal memory processes a stream of SDRs, at each step predicting the following item from previously seen information. It can also be used for learning separate, terminated sequences.

Temporal Memory algorithms are based on circuits of two or more Triadic Memory instances with at least one feedback loop, resembling the architecture of recurrent neural networks.

The Elementary Temporal Memory uses two Triadic Memory units arranged in the form of an Elman network.

The Deep Temporal Memory algorithm is a circuit of hierarchically arranged Triadic Memory units with multiple feedback loops. It can recognize longer and more complex temporal patterns than the elementary version based on just two memory units.
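As a much-simplified illustration of the temporal idea (not the actual Elman-style or Deep Temporal Memory circuits), a single triple store can predict the next element of a sequence from the two preceding ones: triples (x[t-2], x[t-1], x[t]) are stored during learning, and the third position is queried during prediction. All names here are assumptions, and the exact-address store below stands in for a real Triadic Memory, sacrificing its noise tolerance for brevity.

```python
import numpy as np

class TripleStore:
    # Simplified content-addressable triple store: exact (x, y) addressing
    # with counter-based recall of z.
    def __init__(self, n, p):
        self.n, self.p, self.rows = n, p, {}

    def store(self, x, y, z):
        row = self.rows.setdefault((tuple(x), tuple(y)), np.zeros(self.n))
        row[list(z)] += 1

    def query_z(self, x, y):
        row = self.rows.get((tuple(x), tuple(y)))
        if row is None:
            return []
        return sorted(np.argsort(row)[-self.p:].tolist())

def learn_sequence(mem, seq):
    # Store every overlapping triple (x[t-2], x[t-1]) -> x[t].
    for a, b, c in zip(seq, seq[1:], seq[2:]):
        mem.store(a, b, c)

def predict_next(mem, a, b):
    return mem.query_z(a, b)

# Tiny demo with 3-of-10 sparse codes for the symbols A..D.
codes = {"A": [0, 1, 2], "B": [3, 4, 5], "C": [6, 7, 8], "D": [1, 5, 9]}
mem = TripleStore(10, 3)
learn_sequence(mem, [codes[s] for s in "ABCD"])
print(predict_next(mem, codes["B"], codes["C"]))  # -> [1, 5, 9], the code for D
```

The deep variant replaces this single store with a hierarchy of Triadic Memory units and feedback loops, which is what lets it capture longer and more complex temporal patterns.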

Trained with a dataset from the SPMF project, Deep Temporal Memory achieves a prediction accuracy of 99.5 percent.

A plain C implementation can be found here.
