Maximum likelihood estimate

logistic_regression

#1

We use MLE to estimate the coefficients in linear and logistic regression. As far as I understand, the basic intuition is: "which parameter value is most likely given the data?" It is mathematically defined as follows:

L(parameter | data) = P(data | parameter)
where L() is the likelihood function and P() is the probability.

I’m having trouble understanding the above equation because it seems to go against that intuition. Shouldn’t it be P(parameter | data)?
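
To make my reading of the equation concrete, here is a minimal Python sketch with made-up coin-flip data: it evaluates L(p | data) = P(data | p) over a grid of candidate values for the heads probability p and picks the value that makes the observed data most probable.

```python
import numpy as np
from scipy.stats import binom

# Hypothetical data: 10 coin flips, 7 of them heads.
n_flips, n_heads = 10, 7

# Likelihood of the observed data for each candidate parameter value p:
# L(p | data) = P(data | p) = Binomial(n_heads; n_flips, p)
candidate_p = np.linspace(0.01, 0.99, 99)
likelihood = binom.pmf(n_heads, n_flips, candidate_p)

# The MLE is the parameter value under which the data are most probable.
p_hat = candidate_p[np.argmax(likelihood)]
print(p_hat)  # approximately 0.7
```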


#2

In linear regression we use the least-squares method, while logistic regression uses the maximum likelihood method to estimate the coefficients.
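
As a rough sketch of what "maximum likelihood for logistic regression" means in practice (not any particular library's internals, and with made-up toy data), the coefficients can be found by minimising the negative log-likelihood with a generic numerical optimiser:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(beta, X, y):
    """Negative log-likelihood of a logistic regression model."""
    p = 1.0 / (1.0 + np.exp(-(X @ beta)))   # predicted P(y = 1 | x)
    eps = 1e-12                              # guard against log(0)
    return -np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

# Toy data: an intercept column plus one feature.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = (rng.random(100) < 1 / (1 + np.exp(-(0.5 + 2.0 * x)))).astype(float)
X = np.column_stack([np.ones_like(x), x])

# There is no closed form, so the coefficients are found numerically.
result = minimize(neg_log_likelihood, x0=np.zeros(2), args=(X, y))
print(result.x)  # estimated intercept and slope
```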


#3

The least-squares method is a special case of maximum likelihood estimation: if the errors are assumed to be i.i.d. Gaussian, maximising the likelihood (equivalently, minimising the negative log-likelihood) reduces to minimising the sum of squared residuals, i.e. the least-squares expression.
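
To illustrate with a small sketch (assuming i.i.d. Gaussian errors and simulated data), the same linear model is fitted twice below, once by ordinary least squares and once by numerically minimising the Gaussian negative log-likelihood; the coefficient estimates agree up to optimiser tolerance.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated data with Gaussian noise.
rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 1.0 + 3.0 * x + rng.normal(scale=0.5, size=200)
X = np.column_stack([np.ones_like(x), x])

# Ordinary least squares: minimise the sum of squared residuals.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Gaussian negative log-likelihood: up to constants it is
# n * log(sigma) + sum(residuals^2) / (2 * sigma^2),
# so the same beta minimises both objectives.
def neg_log_likelihood(params, X, y):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    resid = y - X @ beta
    return 0.5 * np.sum(resid**2) / sigma**2 + len(y) * log_sigma

result = minimize(neg_log_likelihood, x0=np.zeros(3), args=(X, y))
print(beta_ols, result.x[:2])  # essentially identical coefficients
```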