The perceptron, also known as the single-layer perceptron, is a type of machine learning model that can be traced back to the 1950s and 1960s. It is a simple algorithm for binary classification, much like logistic regression. The reason we discuss perceptrons is that their core structural components, namely the linear transformation $\mathbf{w} \cdot \mathbf{x}+b$ and the activation function $\sigma$, are the fundamental building blocks of modern neural networks.
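As a concrete illustration of this building block, here is a minimal sketch of the perceptron computation $\sigma(\mathbf{w} \cdot \mathbf{x} + b)$ in NumPy; the step activation and the numeric values are illustrative assumptions, not values taken from the text.

```python
import numpy as np

def step(z):
    """Heaviside step activation: 1 if z >= 0, else 0."""
    return np.where(z >= 0, 1, 0)

def perceptron_forward(x, w, b):
    """Single-layer perceptron output: sigma(w . x + b)."""
    return step(np.dot(w, x) + b)

# Illustrative values (assumed for this sketch)
x = np.array([1.0, 0.5, -0.2])   # 3 inputs
w = np.array([0.4, -0.6, 0.9])   # 3 weights
b = 0.1                          # bias
print(perceptron_forward(x, w, b))  # -> 0 or 1 (binary class label)
```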

Perceptron ● Origin

The original perceptron was developed for classification tasks. A perceptron with 3 inputs and 3 weights is visualized below.

[Figure: a perceptron with 3 inputs and 3 weights, shown as an expanded view (left) and a simplified view (right).]

Application to Iris Dataset:

The perceptron can be used to classify the flowers in the Iris dataset based on their features (e.g., sepal length, sepal width, petal length, and petal width). Since the original perceptron only handles binary classification, you would typically use it to distinguish between two classes, such as "Setosa" vs. "Versicolor" or "Versicolor" vs. "Virginica."

In the case of the Iris dataset:

A perceptron would learn the weights and bias during training by reducing the classification error, updating them whenever a training example is misclassified. Once trained, the perceptron can classify new flowers based on their features.
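As a sketch of this workflow, the example below (assuming scikit-learn and its bundled Iris data are available) keeps only the Setosa and Versicolor classes and fits scikit-learn's `Perceptron`, whose error-driven updates adjust the weights and bias only on misclassified samples.

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split

# Load Iris and keep only two classes for binary classification:
# class 0 = Setosa, class 1 = Versicolor
X, y = load_iris(return_X_y=True)
mask = y < 2
X, y = X[mask], y[mask]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Fit the perceptron; weights and bias are updated on misclassified samples
clf = Perceptron(max_iter=1000, tol=1e-3, random_state=0)
clf.fit(X_train, y_train)

print("learned weights:", clf.coef_)       # one weight per feature
print("learned bias:", clf.intercept_)     # the bias term b
print("test accuracy:", clf.score(X_test, y_test))
```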

The full equation corresponding to the diagram above is given as follows: