Grada

Grada is an interactive tool for observing real-time changes while training a multilayer perceptron, built entirely from scratch without libraries like TensorFlow, PyTorch, or NumPy. It started as an extension of Karpathy's micrograd with a scalar-based engine and has since been reengineered around a custom tensor-based engine. The scalar version is still available on the scalar-value branch.
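
To give a rough idea of what "tensor-based" means compared with micrograd's scalar-valued engine, here is a minimal illustrative sketch in JavaScript. The names (Tensor, matmul, backward) and the structure are assumptions for the sake of the example and are not taken from Grada's actual source; the point is only that gradients flow through whole matrices in one backward pass instead of one scalar at a time.

```js
// Hypothetical sketch of a tensor-valued autograd node; not Grada's real API.
class Tensor {
  constructor(data, children = [], op = "") {
    this.data = data;                                  // 2-D array of numbers
    this.grad = data.map((row) => row.map(() => 0));   // accumulated gradient
    this.children = children;                          // parents in the graph
    this.op = op;
    this._backward = () => {};
  }

  static matmul(a, b) {
    const rows = a.data.length, inner = b.data.length, cols = b.data[0].length;
    const out = Array.from({ length: rows }, (_, i) =>
      Array.from({ length: cols }, (_, j) => {
        let s = 0;
        for (let k = 0; k < inner; k++) s += a.data[i][k] * b.data[k][j];
        return s;
      })
    );
    const result = new Tensor(out, [a, b], "matmul");
    result._backward = () => {
      // dL/dA = dL/dOut * B^T  and  dL/dB = A^T * dL/dOut
      for (let i = 0; i < rows; i++)
        for (let k = 0; k < inner; k++)
          for (let j = 0; j < cols; j++) {
            a.grad[i][k] += result.grad[i][j] * b.data[k][j];
            b.grad[k][j] += a.data[i][k] * result.grad[i][j];
          }
    };
    return result;
  }

  backward() {
    // Seed the output gradient with ones, then walk the graph in reverse
    // topological order so every parent's gradient is accumulated once.
    this.grad = this.data.map((row) => row.map(() => 1));
    const topo = [];
    const visited = new Set();
    const build = (t) => {
      if (visited.has(t)) return;
      visited.add(t);
      t.children.forEach(build);
      topo.push(t);
    };
    build(this);
    topo.reverse().forEach((t) => t._backward());
  }
}

// Example: y = X * W, then backpropagate through the whole matrix at once.
const X = new Tensor([[1, 2], [3, 4]]);
const W = new Tensor([[0.5], [-1]]);
const y = Tensor.matmul(X, W);
y.backward();
console.log(y.data);  // [[-1.5], [-2.5]]
console.log(W.grad);  // [[4], [6]]  (column sums of X)
```

Working at this level means a whole layer's weights can be updated with a single matrix operation and one backward call, rather than thousands of scalar nodes, which is presumably what makes visualizing a full MLP in real time practical.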

With a simple drag-and-drop interface, you can easily construct neural networks and watch how training affects parameters and outputs in real time. Grada also features a component for handwritten digit recognition, enabling you to test your model interactively by drawing digits and visualizing predictions.

The app opens with a quick manual that walks you through its features.

Live Demo

Available here

Resources

https://www.youtube.com/watch?v=VMj-3S1tku0

https://github.com/karpathy/micrograd

https://github.com/ixartz/handwritten-digit-recognition-tensorflowjs

https://cs.stanford.edu/people/karpathy/convnetjs/demo/mnist.html

https://people.ece.ubc.ca/bradq/ELEC502Slides/ELEC502-Part5VectorizedBackpropagation.pdf (***)

Contributing

Contributions, issues, and feature requests are welcome.
