
simple 2-layered xor perceptron

a numpy implementation of a small 2 -> 2 -> 1 neural network that solves the xor problem, with documented code

overview

this small neural net is implemented without any deep learning frameworks like pytorch or tensorflow and classifies the 4 xor samples. xor (exclusive-or) is a boolean operation that is true when exactly one of its two inputs is true. taking true as 1 and false as 0 gives the four samples and their labels (listed in the truth table at the end of this readme). the xor problem is not linearly separable (which becomes clearer if you plot the samples), so solving it requires a hidden layer and non-linear activation functions
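
as a quick illustration (not part of the repository's code), the four samples can be plotted with numpy and matplotlib to see that no single straight line separates the two classes:

```python
import numpy as np
import matplotlib.pyplot as plt

# the four xor samples and their labels
x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])

# scatter the two classes; no single straight line separates the o's from the x's
plt.scatter(x[y == 0, 0], x[y == 0, 1], marker="o", label="label 0")
plt.scatter(x[y == 1, 0], x[y == 1, 1], marker="x", label="label 1")
plt.xlabel("input 1")
plt.ylabel("input 2")
plt.legend()
plt.show()
```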

implementation details

  • built using only numpy for computations and matplotlib for visualization
  • custom implementation of (sketched after this list):
    • feed-forward logic
    • backpropagation algorithm
    • sigmoid activation function
    • loss calculation
  • total trainable parameters: 9
  • modular design for clear separation of components
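
for reference, below is a minimal sketch of a 2 -> 2 -> 1 network trained on the xor samples. it is an assumption of how the pieces fit together (mean squared error loss, plain gradient descent, random weight initialization), not the repository's actual model.py, and all names are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# xor samples and labels
x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # shape (4, 2)
y = np.array([[0], [1], [1], [0]], dtype=float)              # shape (4, 1)

rng = np.random.default_rng()
w1 = rng.normal(size=(2, 2))   # hidden-layer weights (2 inputs -> 2 hidden units)
b1 = np.zeros((1, 2))          # hidden-layer biases
w2 = rng.normal(size=(2, 1))   # output-layer weights (2 hidden -> 1 output)
b2 = np.zeros((1, 1))          # output-layer bias
lr = 0.5                       # learning rate
# total trainable parameters: 4 + 2 + 2 + 1 = 9

for epoch in range(10000):
    # feed-forward
    h = sigmoid(x @ w1 + b1)     # hidden activations, shape (4, 2)
    out = sigmoid(h @ w2 + b2)   # predictions, shape (4, 1)

    # mean squared error loss
    loss = np.mean((out - y) ** 2)
    if epoch % 1000 == 0:
        print(f"epoch {epoch}: loss {loss:.4f}")

    # backpropagation: gradients of the loss w.r.t. each parameter
    d_out = 2 * (out - y) / len(x) * out * (1 - out)  # output pre-activation grad
    d_w2 = h.T @ d_out
    d_b2 = d_out.sum(axis=0, keepdims=True)
    d_h = d_out @ w2.T * h * (1 - h)                  # hidden pre-activation grad
    d_w1 = x.T @ d_h
    d_b1 = d_h.sum(axis=0, keepdims=True)

    # gradient descent step
    w2 -= lr * d_w2
    b2 -= lr * d_b2
    w1 -= lr * d_w1
    b1 -= lr * d_b1

print(np.round(out, 3))  # should approach [0, 1, 1, 0]
```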

results

training progression:

training plot

initial predictions:

initial state

final predictions:

final state

how to use

requirements:

  • numpy
  • matplotlib

usage:

open model.py and run it (e.g. `python model.py` from the repository root)

note: due to random weight initialization, you might need to run the model multiple times to avoid local minima.

the model classifies the following xor truth table:

(0,0) -> 0
(1,0) -> 1
(0,1) -> 1
(1,1) -> 0
