Closed
Description
This concerns the current Python implementation of LMNN.
I have read both the NIPS '05 and JMLR '08 versions of the LMNN paper, and I find that the optimization objective in LMNN includes a hinge loss over impostors, namely
$$[1+||L(x_i - x_j)||^2-||L(x_i - x_l)||^2]_+$$
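For concreteness, that push term can be sketched directly from the formula. This is only my own minimal illustration of the paper's hinge loss (the function name `lmnn_push_loss` and the triplet format are assumptions, not metric-learn's API):

```python
import numpy as np

def lmnn_push_loss(L, X, triplets):
    """Hinge loss over impostor triplets (i, j, l), where j is a target
    neighbor of i and l is an impostor. Sketch of the paper's push term,
    not the metric-learn implementation."""
    loss = 0.0
    for i, j, l in triplets:
        d_ij = np.sum((L @ (X[i] - X[j])) ** 2)  # ||L(x_i - x_j)||^2
        d_il = np.sum((L @ (X[i] - X[l])) ** 2)  # ||L(x_i - x_l)||^2
        loss += max(0.0, 1.0 + d_ij - d_il)      # [1 + d_ij - d_il]_+
    return loss
```

An impostor far outside the margin contributes zero, which is where the question below comes from: the max(0, ·) should show up somewhere in the implementation.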
But the current implementation is

```python
objective = total_active * (1-reg)
objective += G.flatten().dot(L.T.dot(L).flatten())
```

I guess the second line is computing the trace of M * C, but I cannot figure out where the hinge loss is.
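For reference, the flattened dot product in that second line is indeed a trace: for any matrices of matching shape, `G.flatten().dot(M.flatten())` equals `trace(G.T @ M)`, since both sum the elementwise products. A quick check (assuming nothing about metric-learn's internals, just the identity itself):

```python
import numpy as np

rng = np.random.default_rng(0)
G = rng.standard_normal((3, 3))  # stand-in gradient matrix
L = rng.standard_normal((3, 3))  # stand-in transformation
M = L.T @ L                      # the learned Mahalanobis matrix

# Flattened inner product equals the trace form <G, M> = tr(G^T M)
a = G.flatten().dot(M.flatten())
b = np.trace(G.T @ M)
assert np.isclose(a, b)
```

So the line computes a linear term ⟨G, LᵀL⟩; the hinge (the max with zero) would have to enter through how G and `total_active` are built, which is what the question is asking about.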