A simple JavaScript neural network implementation from scratch, designed to learn logical gates (AND, OR, XOR) and recognize handwritten digits (MNIST dataset).
- Single-layer Perceptron: For simple linearly separable gates (AND, OR)
- Multi-Layer Perceptron (MLP): For complex patterns like XOR and MNIST digit recognition
- Custom Matrix Library: Basic matrix operations for neural network computations
- Pre-built Training Models: Ready-to-use training configurations for different tasks
- Web API: Express server with training endpoint and static file serving
Install dependencies:

```bash
npm install
```

Run the default example (MNIST digit recognition):

```bash
npm start
```

Start the web API server:

```bash
npm run start:api
```

Project structure:

```
src/
├── index.js                # Entry point
├── sl-neural-network.js    # Single-layer perceptron
├── ml-neural-network.js    # Multi-layer perceptron
├── server.js               # Express API server
├── training/
│   ├── and-gate-model.js   # AND gate training
│   ├── xor-gate-model.js   # XOR gate training
│   └── mnist-model.js      # MNIST digit recognition training
└── utils/
    ├── matrix.js           # Matrix operations library
    └── training-metrics.js # Training metrics tracking
models/
└── mnist-model.json        # Pre-trained MNIST model
public/
└── index.html              # Web interface
```
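The exact API of the custom matrix library in `utils/matrix.js` is not shown here; as an illustration only, the kind of operation such a library provides (and that the feedforward pass relies on) is plain matrix multiplication:

```javascript
// Illustrative sketch of a matrix operation of the kind utils/matrix.js
// provides (not the project's actual API): naive matrix multiplication
// on arrays of rows.
function multiply(a, b) {
  return a.map((row) =>
    b[0].map((_, j) => row.reduce((sum, v, k) => sum + v * b[k][j], 0))
  );
}

console.log(multiply([[1, 2], [3, 4]], [[5], [6]])); // [[17], [39]]
```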
The AND gate is a simple linearly separable problem that can be solved with a single-layer perceptron.
```js
import { getTrainedModelForANDGateLogic } from "./training/and-gate-model.js";

// Get a trained AND gate model
const nn = getTrainedModelForANDGateLogic();

// Make predictions
console.log(nn.predict(0, 0)); // ~0 (expected: 0)
console.log(nn.predict(0, 1)); // ~0 (expected: 0)
console.log(nn.predict(1, 0)); // ~0 (expected: 0)
console.log(nn.predict(1, 1)); // ~1 (expected: 1)
```

XOR is not linearly separable and requires a hidden layer to learn the pattern.
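That claim can be checked empirically with a small brute-force sketch, independent of this project's code: a coarse grid of single-neuron weights contains settings that solve AND, but none that solve XOR.

```javascript
// A single threshold unit classifies (x1, x2) by the sign of
// w1*x1 + w2*x2 + b. Search a coarse weight grid for each gate.
const matches = (truthTable, w1, w2, b) =>
  truthTable.every(([x1, x2, t]) => (w1 * x1 + w2 * x2 + b > 0 ? 1 : 0) === t);

const AND = [[0, 0, 0], [0, 1, 0], [1, 0, 0], [1, 1, 1]];
const XOR = [[0, 0, 0], [0, 1, 1], [1, 0, 1], [1, 1, 0]];

const grid = [];
for (let w1 = -2; w1 <= 2; w1 += 0.5)
  for (let w2 = -2; w2 <= 2; w2 += 0.5)
    for (let b = -2; b <= 2; b += 0.5) grid.push([w1, w2, b]);

console.log(grid.some(([w1, w2, b]) => matches(AND, w1, w2, b))); // true
console.log(grid.some(([w1, w2, b]) => matches(XOR, w1, w2, b))); // false
```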
```js
import { getTrainedModelForXORGate } from "./training/xor-gate-model.js";

// Get a trained XOR gate model
const nn = getTrainedModelForXORGate();

// Make predictions (returns a 2D matrix)
console.log(nn.feedforward([0, 0])); // ~[[0]] (expected: 0)
console.log(nn.feedforward([0, 1])); // ~[[1]] (expected: 1)
console.log(nn.feedforward([1, 0])); // ~[[1]] (expected: 1)
console.log(nn.feedforward([1, 1])); // ~[[0]] (expected: 0)
```

Recognizes handwritten digits (0-9) using a 784→64→10 neural network architecture.
```js
import { getTrainedModelForMNIST } from "./training/mnist-model.js";

// Get trained model and test data
const { nn, testSet } = getTrainedModelForMNIST();

// Test with a sample from the test set
const output = nn.feedforward(testSet[0].input);
console.log("Expected Output:", testSet[0].output); // One-hot encoded label
console.log("Neural Network Output:", output); // Network predictions

// Find the predicted digit (index of highest value in the flattened output)
const flatOutput = output.flat();
const predicted = flatOutput.indexOf(Math.max(...flatOutput));
console.log("Predicted Digit:", predicted);
```

The project includes an Express server for training models via HTTP.
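Building on the MNIST example above, overall test-set accuracy can be sketched by comparing the argmax of each prediction against the one-hot label. The `argmax` and `accuracy` helpers (and the stub network used to demonstrate the shapes) are illustrative, not part of the library:

```javascript
// Index of the largest value in a (possibly nested) output array.
// Illustrative helper, not part of the project's API.
const argmax = (arr) => {
  const flat = arr.flat();
  return flat.indexOf(Math.max(...flat));
};

// Fraction of test samples whose predicted class matches the one-hot label.
const accuracy = (nn, testSet) =>
  testSet.filter((s) => argmax(nn.feedforward(s.input)) === argmax(s.output))
    .length / testSet.length;

// Toy stand-in network that always predicts class 2, to show the shapes.
const stubNn = { feedforward: () => [[0.1], [0.2], [0.9]] };
const stubSet = [
  { input: [0, 0], output: [0, 0, 1] }, // label 2 -> correct
  { input: [1, 0], output: [1, 0, 0] }, // label 0 -> wrong
];
console.log(accuracy(stubNn, stubSet)); // 0.5
```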
Start the server:

```bash
npm run start:api
```

Endpoints:

- `GET /` - Serves the web interface from `public/index.html`
- `GET /train` - Trains the MNIST model and returns training metrics

```bash
curl http://localhost:3000/train
```

Using the single-layer perceptron directly:

```js
import { SlNeuralNetwork } from "./sl-neural-network.js";

const nn = new SlNeuralNetwork();

// Train the network
for (let i = 0; i < 20000; i++) {
  nn.train(input1, input2, target);
}

// Make predictions
const result = nn.predict(input1, input2);
```

Using the multi-layer perceptron directly:

```js
import { MlNeuralNetwork } from "./ml-neural-network.js";

// Create network: inputNodes, hiddenNodes, outputNodes
const nn = new MlNeuralNetwork(2, 4, 1);

// Adjust learning rate (default: 0.1)
nn.learningRate = 0.05;

// Train the network
for (let i = 0; i < 50000; i++) {
  nn.train(inputArray, targetArray);
}

// Make predictions
const output = nn.feedforward(inputArray);
```

Single-Layer Perceptron:

- Input: 2 values
- Output: 1 value (sigmoid activated, 0-1)
- Activation: Sigmoid
- Use case: AND, OR gates
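What such a perceptron computes can be sketched in a few lines: a weighted sum of the inputs plus a bias, squashed by a sigmoid. The weights below are hand-picked to realize AND and are illustrative, not the trained model's values:

```javascript
// Sigmoid squashes any real number into (0, 1).
const sigmoid = (x) => 1 / (1 + Math.exp(-x));

// A single-layer perceptron with two inputs: weighted sum + bias -> sigmoid.
const predict = (w1, w2, bias, x1, x2) => sigmoid(w1 * x1 + w2 * x2 + bias);

// With these weights the output only clears 0.5 when both inputs are 1,
// which is exactly the AND gate.
console.log(predict(10, 10, -15, 1, 1) > 0.5); // true
console.log(predict(10, 10, -15, 0, 1) > 0.5); // false
```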
Multi-Layer Perceptron:

- Layers: Input → Hidden → Output
- Activation: Sigmoid
- Training: Backpropagation with gradient descent
- Use case: XOR gate, MNIST, complex patterns
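The core of backpropagation with gradient descent can be sketched for a single sigmoid neuron: compute the output, scale the error by the sigmoid's derivative, and nudge the weight and bias against the gradient. Names and numbers here are illustrative, not the library's internals:

```javascript
const sigmoid = (x) => 1 / (1 + Math.exp(-x));
// Sigmoid's derivative, expressed in terms of its output y.
const dsigmoid = (y) => y * (1 - y);

// One gradient-descent step (the delta rule) for a one-input neuron
// trained on squared error.
function step(w, b, x, target, lr) {
  const y = sigmoid(w * x + b);
  const gradient = (y - target) * dsigmoid(y); // d(error)/d(pre-activation)
  return { w: w - lr * gradient * x, b: b - lr * gradient };
}

let w = 0.5, b = 0.0;
const x = 1, target = 1, lr = 0.5;
const before = Math.abs(target - sigmoid(w * x + b));
for (let i = 0; i < 100; i++) ({ w, b } = step(w, b, x, target, lr));
const after = Math.abs(target - sigmoid(w * x + b));
console.log(after < before); // true: repeated steps shrink the error
```

The MLP applies this same update per layer, propagating each layer's error backwards through the weights.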
Dependencies:

- `express`: Web server for API endpoints
- `mnist`: MNIST dataset for digit recognition training
License: ISC