The problems we had with the perceptron were:
- it only converged for linearly separable problems
- it stopped at decision boundaries that did not look like they would generalize well

We need an algorithm that takes a more balanced approach:
- finds a "middle ground" decision boundary
- can make decisions even when the data is not separable
Consider binary classification with a positive (y=1) class vs a negative (y=0) class.
For a probability p of belonging to the positive class, the odds ratio is given by

r = p / (1 - p)

- if r > 1 we are more likely to be in the positive class
- if r < 1 we are more likely to be in the negative class

To make this more symmetric we take the log of r, the log-odds (logit): log(r) = log(p / (1 - p)).
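
As a quick sanity check, here is a minimal Python sketch (the probability values are just illustrative) that computes the odds ratio and the log-odds for a few values of p, showing how taking the log makes the two classes symmetric around 0:

```python
import math

def odds(p):
    """Odds ratio r = p / (1 - p) for the positive class."""
    return p / (1 - p)

def log_odds(p):
    """Log-odds (logit): log(p / (1 - p))."""
    return math.log(odds(p))

# Illustrative probabilities, mirrored around p = 0.5
for p in [0.2, 0.5, 0.8]:
    print(f"p = {p:.1f}  odds = {odds(p):.3f}  log-odds = {log_odds(p):+.3f}")

# p = 0.8 gives odds = 4 and log-odds = +1.386,
# p = 0.2 gives odds = 0.25 and log-odds = -1.386,
# p = 0.5 gives odds = 1 and log-odds = 0:
# the log of the odds treats the two classes symmetrically.
```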


