
Neural Networks Library from Scratch

The goal of this project is to create a small, modular library implementing classic neural networks. The classes and functions are inspired by the architecture of PyTorch and TensorFlow.

I started this project because, while I regularly use libraries like PyTorch and TensorFlow, I wanted to implement a neural network myself from scratch.

Features

Layers

  • ✅ Linear Layer

Activation Functions

  • ✅ ReLU
  • ✅ Sigmoid
  • ✅ Tanh
  • ✅ Softmax
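As a rough sketch of how these four activations might look in a from-scratch NumPy implementation (the function names and signatures here are illustrative, not necessarily the library's actual API):

```python
import numpy as np

def relu(x):
    # ReLU: zero out negative entries
    return np.maximum(0.0, x)

def sigmoid(x):
    # logistic function, squashes to (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # hyperbolic tangent, squashes to (-1, 1)
    return np.tanh(x)

def softmax(x):
    # subtract the row-wise max for numerical stability before exponentiating
    shifted = x - np.max(x, axis=-1, keepdims=True)
    exps = np.exp(shifted)
    return exps / np.sum(exps, axis=-1, keepdims=True)
```

Softmax is the only one that couples the entries of its input: each row is normalized so its outputs sum to 1, which is why it typically sits on the final layer of a classifier.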

Loss Functions

  • ✅ MSE
  • ✅ CrossEntropy
  • ✅ BinaryCrossEntropy
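A minimal NumPy sketch of the three losses (again, names and conventions here are assumptions for illustration; cross-entropy is written for softmax probabilities with integer class labels):

```python
import numpy as np

def mse(y_pred, y_true):
    # mean squared error over all elements
    return np.mean((y_pred - y_true) ** 2)

def binary_cross_entropy(y_pred, y_true, eps=1e-12):
    # clip predictions away from 0 and 1 to avoid log(0)
    p = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(p) + (1.0 - y_true) * np.log(1.0 - p))

def cross_entropy(probs, targets, eps=1e-12):
    # probs: (batch, classes) softmax outputs; targets: integer class indices
    p = np.clip(probs[np.arange(len(targets)), targets], eps, 1.0)
    return -np.mean(np.log(p))
```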

Optimizers

  • ✅ SGD
  • ✅ Adam
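To show how such a PyTorch-style modular design can fit together, here is a minimal sketch of a Linear layer with a manual backward pass and a plain SGD optimizer. The class names, the cached-input convention, and the `forward`/`backward`/`step` signatures are illustrative assumptions, not the library's actual API:

```python
import numpy as np

class Linear:
    """Fully connected layer: y = x @ W + b, with a manual backward pass."""
    def __init__(self, in_features, out_features):
        # small random weights, zero bias
        self.W = np.random.randn(in_features, out_features) * 0.01
        self.b = np.zeros(out_features)

    def forward(self, x):
        self.x = x  # cache the input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        # accumulate parameter gradients and return the gradient w.r.t. the input
        self.dW = self.x.T @ grad_out
        self.db = grad_out.sum(axis=0)
        return grad_out @ self.W.T

class SGD:
    """Vanilla stochastic gradient descent over a list of layers."""
    def __init__(self, layers, lr=0.01):
        self.layers, self.lr = layers, lr

    def step(self):
        for layer in self.layers:
            layer.W -= self.lr * layer.dW
            layer.b -= self.lr * layer.db
```

The separation mirrors PyTorch's design: layers own their parameters and gradients, while the optimizer only knows how to update them, so swapping SGD for Adam doesn't touch the layer code.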

Examples of classification (see more in the example directory)

Multiclass classification

Multiclass classification on the MNIST dataset

Binary classification

Example of regression

Linear regression