I built a Hyperdimensional Computing (HDC) encoder and decoder, plus all the code needed to replace the encoder and decoder of a Llama model with the HDC versions and then train the Llama model on them. Everything is MIT licensed. Here is a video where I break it all down: https://youtu.be/4VsZpGaPK4g. I'm happy to answer questions about this project or help anyone out where I can. I'm not a super developer or anything, and I don't have access to enough compute to train this on a large dataset.
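For anyone unfamiliar with HDC, here's a minimal sketch of the core idea (my own illustration, not the project's actual code): tokens map to random high-dimensional bipolar vectors, and decoding picks the nearest codebook entry by cosine similarity, which tolerates a lot of noise. The vocabulary, dimension, and seed below are all arbitrary assumptions.

```python
import numpy as np

DIM = 10_000  # hypervector dimension (arbitrary, typically thousands)
rng = np.random.default_rng(0)

# Each token gets a random bipolar (+1/-1) hypervector as its code.
vocab = ["the", "cat", "sat"]
codebook = {tok: rng.choice([-1, 1], size=DIM) for tok in vocab}

def encode(token):
    """Look up the token's hypervector."""
    return codebook[token]

def decode(hv):
    """Return the vocab token whose hypervector is most similar (cosine)."""
    best_tok, best_sim = None, -np.inf
    for tok, vec in codebook.items():
        sim = (hv @ vec) / (np.linalg.norm(hv) * np.linalg.norm(vec))
        if sim > best_sim:
            best_tok, best_sim = tok, sim
    return best_tok

# Corrupt 25% of the components; decoding still recovers the token
# because random high-dimensional codes are nearly orthogonal.
noisy = encode("cat").copy()
flip = rng.choice(DIM, size=DIM // 4, replace=False)
noisy[flip] *= -1
print(decode(noisy))  # -> cat
```

In the project, a codebook like this would stand in for the learned embedding and output layers of the Llama model, with the transformer trained on top of the fixed HDC codes.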
Ever wondered how neural networks actually work under the hood?
In my latest video, I break down the core mathematical concepts behind neural networks in a way that's easy for IT professionals to understand. We'll explore:
- Neurons as logic gates
- Weighted sums and activation functions
- Gradient descent and backpropagation
No complex equations or jargon, just clear explanations and helpful visuals!
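If you want a taste of the first idea on that list before watching, here's a toy example of my own (not from the video): a single neuron computing a weighted sum with a step activation can behave like an AND gate, with the weights and bias chosen by hand.

```python
# A single neuron with a step activation acting as a logic gate
# (toy perceptron; weights and bias hand-picked for AND).
def neuron(x1, x2, w1, w2, bias):
    z = x1 * w1 + x2 * w2 + bias  # weighted sum of inputs
    return 1 if z > 0 else 0      # step activation function

def AND(a, b):
    return neuron(a, b, 1.0, 1.0, -1.5)

print([AND(0, 0), AND(0, 1), AND(1, 0), AND(1, 1)])  # -> [0, 0, 0, 1]
```

Training replaces the hand-picked weights with ones found by gradient descent, which is where the rest of the video picks up.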