Categories
All (19)
CIFAR10 (1)
activation function (1)
algorithm (1)
animation (1)
attention (1)
books (1)
career (2)
classifier (1)
confidence interval (1)
confusion matrix (1)
convolutional neural network (2)
data science (1)
data visualization (1)
deep learning (3)
deeplearning (1)
embedding (1)
feature engineering (1)
gradient descent (1)
graphs (1)
hypothesis testing (1)
inferential statistics (1)
llm (1)
machine learning (4)
machine learning framework (1)
model evaluation (1)
neural networks (1)
polars (1)
precision (1)
pytorch (1)
recall (1)
research engineer (1)
research scientist (1)
sampling (1)
seq2seq (1)
statistics (2)
supervised learning (1)
tokenization (1)


LLM From Scratch Part 1 - Data Preparation

In Part 1 of the LLM series, we cover data preparation, which includes tokenization, sampling, and embeddings.
Mar 1, 2025
Vidyasagar Bhargava
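
As a taste of the data preparation covered in the post, here is a minimal, illustrative tokenizer sketch; the series itself may use a different tokenization scheme.

    # Minimal sketch: split raw text into tokens and map them to integer IDs.
    # Illustrative only; the post may use a different tokenizer.
    text = "LLMs learn from tokenized text."
    tokens = text.replace(".", " .").split()            # naive whitespace tokenizer
    vocab = {tok: idx for idx, tok in enumerate(sorted(set(tokens)))}
    token_ids = [vocab[tok] for tok in tokens]
    print(tokens)       # ['LLMs', 'learn', 'from', 'tokenized', 'text', '.']
    print(token_ids)    # integer IDs that can later be fed to an embedding layer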

AI Research Scientist/Engineer Program

A guide to becoming an AI Research Scientist/Engineer focused on NLP and LLMs at companies like OpenAI, Meta, and Anthropic.
Aug 16, 2024
Vidyasagar Bhargava

Introduction to Graph Neural Networks

A graph is a data structure made up of nodes and edges. A node can be a person, place, or thing, and the edges define the relationships between nodes. Edges can be directed or undirected, depending on directional dependencies.
Aug 6, 2024
Vidyasagar Bhargava
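
As a small illustration of the structure described above, here is a sketch of an undirected graph stored as an adjacency list in plain Python; the post itself may use a dedicated graph library.

    # Sketch: an undirected graph as an adjacency list (dict of sets).
    edges = [("Alice", "Bob"), ("Bob", "Paris"), ("Alice", "Paris")]
    graph = {}
    for u, v in edges:
        graph.setdefault(u, set()).add(v)
        graph.setdefault(v, set()).add(u)   # add both directions: undirected
    print(graph["Alice"])                   # neighbours of 'Alice': {'Bob', 'Paris'}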

Sequence to Sequence Learning with Neural Networks

A sequence-to-sequence model takes an input sequence of items such as letters, words, or images and outputs another sequence of items. These models have achieved a lot of success in tasks like language translation, text summarization, and image captioning.
May 4, 2024
Vidyasagar Bhargava
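
To make the encoder-decoder idea concrete, here is a minimal PyTorch sketch: the encoder compresses the input sequence into a hidden state and the decoder generates the output sequence from it. The layer sizes are made up; the post's model will differ.

    # Minimal encoder-decoder sketch with GRUs (illustrative sizes only).
    import torch
    import torch.nn as nn

    vocab_size, emb_dim, hidden_dim = 100, 16, 32

    class Encoder(nn.Module):
        def __init__(self):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.rnn = nn.GRU(emb_dim, hidden_dim, batch_first=True)

        def forward(self, src):                  # src: (batch, src_len)
            _, hidden = self.rnn(self.embed(src))
            return hidden                        # summary of the input sequence

    class Decoder(nn.Module):
        def __init__(self):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.rnn = nn.GRU(emb_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, vocab_size)

        def forward(self, tgt, hidden):          # tgt: (batch, tgt_len)
            output, _ = self.rnn(self.embed(tgt), hidden)
            return self.out(output)              # logits over the output vocabulary

    src = torch.randint(0, vocab_size, (2, 7))   # toy source batch
    tgt = torch.randint(0, vocab_size, (2, 5))   # toy target batch
    logits = Decoder()(tgt, Encoder()(src))
    print(logits.shape)                          # torch.Size([2, 5, 100])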


Accelerate computation on Mac using PyTorch and GPU support

Running an experiment with PyTorch tensors on the CPU versus leveraging GPU support on an M1 Mac to see how much speed we gain.
May 1, 2024
Vidyasagar Bhargava
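
The experiment boils down to selecting the "mps" device when it is available; a rough timing sketch looks like this (matrix sizes and iteration counts are arbitrary):

    # Compare matrix multiplication on the CPU vs the Apple-silicon GPU ("mps").
    import time
    import torch

    device = torch.device("mps") if torch.backends.mps.is_available() else torch.device("cpu")

    for dev in ("cpu", device):
        x = torch.randn(2000, 2000, device=dev)
        start = time.time()
        for _ in range(20):
            y = x @ x                  # matrix multiplication on the chosen device
        _ = y.sum().item()             # forces any asynchronous GPU work to finish
        print(dev, f"{time.time() - start:.3f} s")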

Polars for Feature Engineering

Polars is a high-performance DataFrame library, designed to provide fast and efficient data processing capabilities. Inspired by the reigning pandas library, Polars takes things to another level, offering a seamless experience for working with large datasets that might not fit into memory.
Jan 3, 2024
Vidyasagar Bhargava
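
A small taste of Polars-style feature engineering with expressions; the column names below are made up for illustration.

    # Derive new feature columns with Polars expressions (hypothetical columns).
    import polars as pl

    df = pl.DataFrame({"price": [10.0, 20.0, 15.0], "quantity": [2, 1, 4]})
    df = df.with_columns([
        (pl.col("price") * pl.col("quantity")).alias("revenue"),
        pl.col("price").rank().alias("price_rank"),
    ])
    print(df)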


Introduction to Apple’s Machine Learning Framework - MLX

Apple’s machine learning research team recently released a Machine Learning framework called MLX, a NumPy-like array framework designed for efficient and flexible machine learning on Apple silicon.
Dec 24, 2023
Vidyasagar Bhargava
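
A tiny sketch of MLX's NumPy-like API (requires the mlx package and Apple silicon); the post goes into much more detail.

    # MLX arrays look like NumPy arrays, but computation is lazy.
    import mlx.core as mx

    a = mx.array([1.0, 2.0, 3.0])
    b = mx.ones([3])
    c = a * b + 2        # builds a lazy computation graph...
    mx.eval(c)           # ...which is materialized on demand
    print(c)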

Choosing Activation Functions

An activation function decides whether a neuron should be activated or not, which helps the neural network use important information while suppressing irrelevant data points.
Jun 8, 2023
Vidyasagar Bhargava
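
A quick sketch comparing three common activation functions on the same inputs; choosing between them is exactly what the post discusses.

    # Common activation functions applied element-wise to the same inputs.
    import torch

    x = torch.linspace(-3, 3, 7)
    print(torch.relu(x))       # zeroes out negative inputs
    print(torch.sigmoid(x))    # squashes inputs into (0, 1)
    print(torch.tanh(x))       # squashes inputs into (-1, 1)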


Statistical Learning Theory

A framework for analysing the inside of the black box of machine learning algorithms.
Feb 1, 2023
Vidyasagar Bhargava

Best Data Science Books

Listing the best data science books available on the market right now.
Jan 16, 2023
Vidyasagar Bhargava

Understanding Confidence Interval

Confidence intervals are a useful tool for expressing the uncertainty around an estimate.
Dec 5, 2022
Vidyasagar Bhargava
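
As a minimal sketch, here is a 95% confidence interval for a sample mean under an approximately normal sampling distribution (z value of about 1.96); the data below are made up.

    # 95% confidence interval for the mean of a small made-up sample.
    import math

    sample = [4.1, 5.0, 4.7, 5.3, 4.9, 5.1, 4.6, 4.8]
    n = len(sample)
    mean = sum(sample) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))  # sample std dev
    margin = 1.96 * sd / math.sqrt(n)                               # z * standard error
    print(f"95% CI: ({mean - margin:.2f}, {mean + margin:.2f})")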

Successfully Delivering Machine Learning Projects

Principles and process for democratizing ML projects
Sep 7, 2022
Vidyasagar Bhargava

Alexnet

AlexNet is one of the first deep convolutional architectures, introduced by Alex Krizhevsky and his colleagues in 2012. It was designed to classify images for the ImageNet LSVRC-2010 competition, where it achieved state-of-the-art results. It is a simple yet powerful network architecture that helped pave the way for the groundbreaking deep learning research we see today. You can read more about the model in the original research paper here
May 12, 2022
Vidyasagar Bhargava

LeNet-5

LeNet-5 was introduced by Yann LeCun, Leon Bottou, Yoshua Bengio, and Patrick Haffner in 1998 in the paper Gradient-Based Learning Applied to Document Recognition. LeNet is a classic convolutional neural network that employs convolutions, pooling, and fully connected layers. It was used for handwritten digit recognition on the MNIST dataset.
May 8, 2022
Vidyasagar Bhargava
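
For reference, a sketch of the LeNet-5 layer layout in PyTorch (the original used 32x32 inputs, tanh activations, and average pooling); details in the post may differ.

    # LeNet-5-style stack: two conv/pool blocks followed by three linear layers.
    import torch
    import torch.nn as nn

    lenet5 = nn.Sequential(
        nn.Conv2d(1, 6, kernel_size=5), nn.Tanh(), nn.AvgPool2d(2),   # 32x32 -> 28x28 -> 14x14
        nn.Conv2d(6, 16, kernel_size=5), nn.Tanh(), nn.AvgPool2d(2),  # 14x14 -> 10x10 -> 5x5
        nn.Flatten(),
        nn.Linear(16 * 5 * 5, 120), nn.Tanh(),
        nn.Linear(120, 84), nn.Tanh(),
        nn.Linear(84, 10),                                            # 10 digit classes
    )
    print(lenet5(torch.randn(1, 1, 32, 32)).shape)                    # torch.Size([1, 10])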

Hypothesis Testing

Hypothesis testing is a way to test an assumption about a population parameter.
Apr 4, 2022
Vidyasagar Bhargava
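
As a minimal example, here is a one-sample t-test with SciPy of whether a population mean equals 5.0; the data are made up, and the post covers the reasoning behind the test.

    # One-sample t-test: H0 is that the population mean equals 5.0.
    from scipy import stats

    sample = [4.1, 5.0, 4.7, 5.3, 4.9, 5.1, 4.6, 4.8]
    t_stat, p_value = stats.ttest_1samp(sample, popmean=5.0)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")   # reject H0 if p is below 0.05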

Most Popular Programming Languages 2004-2021

Using animation, let's see how different programming languages have risen over the last couple of decades.
Feb 10, 2022
Vidyasagar Bhargava


Linear regression using gradient descent

In this post we will implement the gradient descent algorithm from scratch for linear regression using NumPy.
Dec 10, 2021
Vidyasagar Bhargava
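
A condensed sketch of the idea, fitting a slope and intercept with plain NumPy gradient descent on synthetic data; the post walks through each step in detail.

    # Gradient descent for simple linear regression on synthetic data.
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 10, size=100)
    y = 3.0 * X + 2.0 + rng.normal(0, 1, size=100)   # ground truth: slope 3, intercept 2

    w, b, lr = 0.0, 0.0, 0.01
    for _ in range(2000):
        error = w * X + b - y
        w -= lr * 2 * np.mean(error * X)   # d(MSE)/dw
        b -= lr * 2 * np.mean(error)       # d(MSE)/db
    print(round(w, 2), round(b, 2))        # should land close to 3 and 2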

Fbeta-Measure

A generalization of the F-measure that adds a configuration parameter called beta.
Mar 12, 2021
Vidyasagar Bhargava
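
The formula itself is short: precision and recall are combined, with beta controlling how much recall counts relative to precision (beta = 1 recovers the usual F1).

    # F-beta: beta > 1 favours recall, beta < 1 favours precision, beta = 1 gives F1.
    def fbeta(precision, recall, beta):
        return (1 + beta**2) * precision * recall / (beta**2 * precision + recall)

    print(fbeta(0.8, 0.4, beta=1))     # ~0.533 (F1)
    print(fbeta(0.8, 0.4, beta=2))     # ~0.444 (recall weighted more)
    print(fbeta(0.8, 0.4, beta=0.5))   # ~0.667 (precision weighted more)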


Nearest Neighbour Classifier

Nearest neighbor classifiers classify unlabeled examples by assigning them the class of the most similar labeled examples.
Mar 6, 2020
Vidyasagar Bhargava
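
A tiny sketch of the idea on made-up 2-D points: label a new example with the majority class among its k closest labelled neighbours.

    # k-nearest-neighbour prediction on toy 2-D data (k = 3).
    from collections import Counter
    import math

    labelled = [((1.0, 1.0), "red"), ((1.2, 0.8), "red"),
                ((5.0, 5.0), "blue"), ((5.2, 4.8), "blue")]

    def knn_predict(point, k=3):
        nearest = sorted(labelled, key=lambda item: math.dist(point, item[0]))[:k]
        votes = Counter(label for _, label in nearest)
        return votes.most_common(1)[0][0]

    print(knn_predict((1.1, 1.1)))   # 'red', since the closest points are red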