How RAG Will Usher In the Next Generation of LLMs and Generative AI
Retrieval-augmented generation may provide a big step forward in addressing many of the issues that keep enterprises from adopting AI.
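To make the idea concrete, here is a minimal, self-contained sketch of the RAG loop: retrieve the documents most relevant to a query, then hand them to a language model as context. The bag-of-words retriever and the `generate()` stub below are deliberately simplistic stand-ins for a real embedding model and LLM call; they are hypothetical names for illustration, not any particular library's API.

```python
import math
from collections import Counter

# Toy document store. In practice this would be a vector index built
# over your enterprise documents.
DOCS = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support hours are 9am to 5pm Eastern, Monday through Friday.",
    "Enterprise plans include single sign-on and audit logging.",
]

def embed(text):
    # Stand-in "embedding": a bag-of-words term-frequency vector.
    # A real system would use a learned embedding model here.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, k=2):
    # Rank documents by similarity to the query; keep the top k.
    q = embed(query)
    return sorted(DOCS, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def generate(prompt):
    # Placeholder for an actual LLM call; it echoes the prompt so the
    # sketch runs end to end.
    return "[the LLM's grounded answer would go here]\n" + prompt

def answer(query):
    # The "augmented" part of RAG: retrieved text goes into the prompt,
    # so the model answers from your data rather than from memory alone.
    context = "\n".join(retrieve(query))
    return generate(f"Answer using only this context:\n{context}\n\nQuestion: {query}")

print(answer("How long do customers have to return an item?"))
```

Because the model is grounded in retrieved text, its answers can draw on current, private data and are easier to audit, which are precisely the properties enterprises tend to care about.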
Training dense networks requires large numbers of GPUs or TPUs and can take days or even weeks, resulting in large carbon footprints and spiraling costs. We believe the solution lies in the brain's efficiency and power to learn, which arise from the sparse connections and activations of its neurons.
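To give a flavor of what sparse activations look like in code, here is a small illustrative sketch (my own example, not Numenta's implementation) of a k-winners-take-all step in PyTorch, where only the strongest units in each layer stay active:

```python
import torch

def k_winners(x, k):
    # Keep only the k largest activations in each row; zero the rest.
    # This mimics cortex, where only a small fraction of neurons fire
    # at any given moment.
    topk = torch.topk(x, k, dim=-1)
    mask = torch.zeros_like(x).scatter_(-1, topk.indices, 1.0)
    return x * mask

x = torch.randn(4, 128)              # a batch of dense activations
sparse = k_winners(x, k=13)          # roughly 10% of units stay active
print((sparse != 0).float().mean())  # prints ~0.10
```

Because most units are zero, downstream layers touch far fewer connections, which is where the efficiency gains come from.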
From releasing a book to publishing scientific papers, Numenta had quite a busy 2021. If you haven’t had a chance to catch up on what we’ve been up to, I’ve rounded up our top content of the past year.
Six years ago, we wrote a blog post about Classic AI, Simple Neural Networks, and Biological Neural Networks. Fast forward to today and it’s no surprise that the terms have continued to evolve. In this blog post, we’ll revisit these approaches, look at how they hold up today, and compare them to each other. We’ll also explore how each approach might address the same real-world problem.
How can we take a step towards the brain’s efficiency without sacrificing accuracy? One strategy is to invoke sparsity. Today I’m excited to share progress in that direction – a 10x parameter reduction in BERT with no loss of accuracy on the GLUE benchmark.
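The complementary trick to sparse activations is weight sparsity: most of a layer's weights are held at zero, so only a small fraction of connections exist at all. Below is a generic PyTorch sketch of a statically masked linear layer; it is an illustration of the general idea under my own assumptions, not the actual mechanism behind the BERT result.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseLinear(nn.Module):
    # A linear layer in which a fixed random binary mask zeroes out a
    # given fraction of the weights. Illustrative only; not Numenta's
    # exact implementation.
    def __init__(self, in_features, out_features, sparsity=0.9):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        mask = (torch.rand(out_features, in_features) > sparsity).float()
        self.register_buffer("mask", mask)  # saved with the model, not trained

    def forward(self, x):
        # Only the unmasked ~10% of weights ever contribute.
        return F.linear(x, self.linear.weight * self.mask, self.linear.bias)

layer = SparseLinear(768, 768, sparsity=0.9)
print(layer.mask.mean())  # ~0.10: about 10% of weights survive
```

Masking 90% of the weights leaves a tenth of the parameters in those layers, which is the spirit of the 10x reduction described above; the GLUE experiments test whether accuracy survives it.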
In our new pre-print titled “Going Beyond the Point Neuron: Active Dendrites and Sparse Representations for Continual Learning”, we investigated how to augment neural networks with properties of real neurons, specifically active dendrites and sparse representations.
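For a rough sense of what an active dendrite might look like computationally, here is a toy PyTorch layer in which each unit has several dendritic "segments" that match against a context vector, and the best-matching segment gates the unit's feedforward response. The shapes, segment count, and sigmoid gating are my assumptions for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class ActiveDendriteLayer(nn.Module):
    # Illustrative sketch: feedforward units modulated by dendritic
    # segments that respond to a separate context signal.
    def __init__(self, in_dim, out_dim, context_dim, num_segments=8):
        super().__init__()
        self.feedforward = nn.Linear(in_dim, out_dim)
        # One weight vector per (unit, segment) pair.
        self.segments = nn.Parameter(
            torch.randn(out_dim, num_segments, context_dim) * 0.02)

    def forward(self, x, context):
        y = self.feedforward(x)                        # (batch, out_dim)
        # Response of every segment to the context vector.
        resp = torch.einsum("bc,osc->bos", context, self.segments)
        best = resp.max(dim=2).values                  # strongest segment per unit
        return y * torch.sigmoid(best)                 # dendritic gating

layer = ActiveDendriteLayer(64, 32, context_dim=16)
out = layer(torch.randn(5, 64), torch.randn(5, 16))
print(out.shape)  # torch.Size([5, 32])
```

The intuition is that context decides which units stay active, letting different tasks recruit different subnetworks – one ingredient for learning continually without overwriting old knowledge.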
Our research meetings are the cornerstone of everything we do. They’re where we share hypotheses, review papers, and often invite other researchers to present their work. Here are our most popular research meetings from the past 12 months – just in case you missed them!