2.6 Perceptron Convergence and Linear Separability
The perceptron algorithm converges only when the dataset is linearly separable. If the data cannot be separated by a straight line or hyperplane, the algorithm will not converge.

Algorithm: The perceptron convergence theorem states that, for any linearly separable dataset, the perceptron learning rule will converge to a solution in a finite number of iterations. The perceptron updates its weight vector w using the following rule (stated here in a common formulation, with labels y in {-1, +1} and learning rate η):

    w ← w + η · y_i · x_i

The update is performed only when the perceptron misclassifies a data point, i.e. when sign(w · x_i) ≠ y_i. The theorem guarantees that:

- If the data is linearly separable, the perceptron algorithm will converge in a finite number of steps.
- If the data is not linearly separable, the perceptron will continue updating indefinitely.

Linear Separability
A dataset is linearly separable if we can separate the two classes with a straight line (in two dimensions) or a hyperplane (in higher dimensions).
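The behaviour described above can be sketched in a few lines of code. The following is a minimal, hypothetical example (the dataset and function names are illustrative, not from the text): weights are updated only on misclassified points, and training stops as soon as a full pass makes no mistakes, which is exactly the finite-step convergence the theorem promises for separable data.

```python
# Minimal perceptron sketch (illustrative example).
# Labels are in {-1, +1}; weights change only on a misclassification.

def perceptron_train(X, y, lr=1.0, max_epochs=100):
    """Train a perceptron; returns (weights, bias, epochs_used)."""
    n_features = len(X[0])
    w = [0.0] * n_features
    b = 0.0
    for epoch in range(1, max_epochs + 1):
        errors = 0
        for xi, yi in zip(X, y):
            activation = sum(wj * xj for wj, xj in zip(w, xi)) + b
            # Update rule w <- w + lr * y_i * x_i, applied only
            # when the sign of the prediction is wrong (or zero).
            if yi * activation <= 0:
                w = [wj + lr * yi * xj for wj, xj in zip(w, xi)]
                b += lr * yi
                errors += 1
        if errors == 0:
            # Converged: an entire pass with no misclassifications.
            return w, b, epoch
    # Epochs exhausted: data may not be linearly separable.
    return w, b, max_epochs

# Toy linearly separable data: class +1 lies above the line x1 + x2 = 1.
X = [(0.0, 0.0), (0.0, 1.5), (1.5, 0.0), (1.5, 1.5)]
y = [-1, 1, 1, 1]
w, b, epochs = perceptron_train(X, y)
```

On this toy dataset the loop terminates after a handful of epochs; replacing the labels with a non-separable assignment (e.g. an XOR-style pattern) would instead run until `max_epochs` is hit, illustrating the second half of the theorem.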