
Journey Through Embeddings: CBOW with negative sampling

In a previous post, we explored the fundamentals of Word2Vec and implemented the Continuous Bag of Words (CBOW) model, which efficiently predicts target words from their surrounding context. While CBOW is powerful, scaling it to large vocabularies becomes computationally expensive because the softmax must normalize over every word in the vocabulary. This is where negative sampling comes in: a clever optimization […]


Mastering Backpropagation: Math and Implementation

Artificial intelligence (AI) has rapidly advanced, transforming industries and pushing technological boundaries. Central to this progress is backpropagation, the essential algorithm that enables neural networks to learn and adapt. Often considered the heartbeat of AI, backpropagation drives the learning process, allowing machines to improve with experience. In this blog post, we’ll demystify backpropagation, exploring its […]


Unlocking Neural Networks with JAX: optimizers with optax

In our previous exploration within the “Unlocking in JAX” series, we covered the foundational concepts of neural networks and their implementation using JAX. Building on that knowledge, this post shifts focus to a crucial component that significantly enhances learning efficiency: optimizers. Optax, a gradient processing and optimization library tailored for JAX, provides the tools necessary […]


Unlocking Neural Networks with JAX

Welcome back to our Unlocking with JAX series! Today, we’re getting hands-on with neural networks by building a multilayer perceptron (MLP) using JAX. JAX sits perfectly between the lower-level details of CUDA and the higher-level abstractions offered by frameworks like Keras, offering both clarity and control. This balance helps immensely in understanding the inner workings […]
