Saturday, April 23, 2016

The Equivalence of Logistic Regression and Maximum Entropy
Modeling
Mingjie Qian∗
University of Illinois at Urbana-Champaign
(Dated: October 26, 2013)
Abstract
In this technical note, we show the equivalence of maximum entropy modeling and logistic
regression. The equivalence rests on the fact that the optimization problem of logistic
regression, namely maximizing the log-likelihood of the model parameters given the exponential
form of the posterior probability functions, is precisely the dual problem of maximum entropy
modeling. It is the maximum likelihood estimation (MLE) technique that bridges maximum entropy
modeling and logistic regression.


According to Eq.(5), the Lagrange dual function of the maximum entropy model is the negative
log-likelihood:

L(p*, Λ, γ*) = −L_p̃(p_Λ(y|x)).
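The relation can be checked numerically: the dual objective is just the negated log-likelihood of the conditional model under the observed data. A minimal sketch, assuming a hypothetical toy dataset of model probabilities and labels (not taken from the note):

```python
import numpy as np

# Hypothetical toy data: 4 samples, 2 classes.
# Each row holds the model's conditional probabilities p_Lambda(y|x).
p = np.array([[0.8, 0.2],
              [0.3, 0.7],
              [0.6, 0.4],
              [0.9, 0.1]])
y = np.array([0, 1, 0, 0])  # observed labels

# Log-likelihood of the model parameters under the empirical distribution:
# sum of log p_Lambda(y_j | x_j) over the samples.
log_lik = np.log(p[np.arange(len(y)), y]).sum()

# The Lagrange dual of maximum entropy, at the optimum, is its negation.
dual_value = -log_lik
print(dual_value)
```

Maximizing the log-likelihood and minimizing the dual are therefore the same problem, which is the point of the equivalence.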
Some comments: we usually have two systems. In the first system, we know the analytical
form of the solution function, and we aim to optimize the objective function directly. In the
second system, we do not know the form of the solution; instead, we have a collection of
constraints (e.g., the balance equations) and seek to optimize the objective function subject
to them. System 2 provides the primal problem, and system 1 plays the role of the dual
problem. Since system 1 is an unconstrained optimization problem, it is easier to
solve.
In the next section, we will show that maximum likelihood estimation given

p_Λ(y|x) = exp(Σ_i λ_i f_i(x, y)) / Z_Λ(x)

is actually the basic idea of multi-class logistic regression.
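With indicator-style features f_i(x, y) that pair each input coordinate with a class label, the exponential form above reduces to a softmax over per-class linear scores, which is exactly the multi-class logistic regression posterior. A sketch with hypothetical weights and input (the names `Lambda` and `x` are illustrative, not from the note):

```python
import numpy as np

def maxent_posterior(Lambda, x):
    """p_Lambda(y|x) = exp(sum_i lambda_i f_i(x, y)) / Z_Lambda(x).

    Lambda: (num_classes, num_features) weight matrix, one row of lambdas
    per class; x: (num_features,) input vector.
    """
    scores = Lambda @ x           # per-class sum_i lambda_i f_i(x, y)
    scores -= scores.max()        # stabilize the exponentials
    expd = np.exp(scores)
    return expd / expd.sum()      # normalization by Z_Lambda(x)

# Hypothetical 3-class, 2-feature example.
Lambda = np.array([[ 1.0, -0.5],
                   [ 0.2,  0.8],
                   [-1.0,  0.3]])
x = np.array([0.5, 1.5])
p = maxent_posterior(Lambda, x)
print(p, p.sum())
```

The output is a valid probability distribution over the classes, and fitting `Lambda` by maximum likelihood on labeled data is precisely multi-class logistic regression training.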
