Single-layer perceptrons are the foundation of neural networks; they are effective on linearly separable classes.
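The claim above can be illustrated with a minimal sketch (an assumed implementation, not code from any repository listed here): a single-layer perceptron trained with the classic perceptron rule learns the AND function, which is linearly separable, perfectly.

```python
# Minimal single-layer perceptron sketch (hypothetical helper names).
# Classic perceptron update: w += lr * (target - prediction) * x.

def train_perceptron(samples, labels, lr=0.1, epochs=20):
    n = len(samples[0])
    w = [0.0] * n   # weights, one per input feature
    b = 0.0         # bias term
    for _ in range(epochs):
        for x, t in zip(samples, labels):
            # Step activation: fire (1) if the weighted sum exceeds 0.
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = t - y
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# AND is linearly separable, so the perceptron convergence theorem
# guarantees a perfect separator is found.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]
w, b = train_perceptron(X, y)
```

On a non-separable problem such as XOR, the same loop would never converge, which is why the description restricts the claim to linearly separable classes.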
🧬 This repository contains implementations of various bio-inspired optimization algorithms, along with example notebooks and resources for demonstration.
Emoji and MNIST digit classification
Predictive model that segments customers into groups based on common characteristics, so companies can market to each group effectively and appropriately.
Practice
Textbook chapter on how neural network models and Principal Component Analysis (PCA) can be used together to better understand data.
A lab assignment on training feed-forward neural networks and optimization algorithms.
Training of a single-layer perceptron by the delta rule.
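The delta rule mentioned above (also known as the Widrow–Hoff or LMS rule) can be sketched as follows; this is a hedged illustration with assumed function names, not code from the linked repository. Unlike the step-activation perceptron rule, the delta rule performs gradient descent on the squared error of a linear output.

```python
# Delta rule (Widrow-Hoff / LMS) sketch for a single linear unit.
# Update: w += lr * (target - net) * x, where net = w.x + b.

def delta_rule_fit(samples, targets, lr=0.05, epochs=200):
    n = len(samples[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(samples, targets):
            net = sum(wi * xi for wi, xi in zip(w, x)) + b
            err = t - net                      # error on the *linear* output
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Fit the linearly separable OR targets; thresholding the linear
# output at 0.5 then recovers the correct class for every input.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
t = [0, 1, 1, 1]
w, b = delta_rule_fit(X, t)
```

Because the error is computed on the raw linear output rather than a thresholded prediction, the delta rule converges to a least-squares fit even when the classes are not perfectly separable.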
🗨️ This repository contains a collection of notebooks and resources for various NLP tasks using different architectures and frameworks.
Language recognition using Perceptrons