Preventing a crash for LSML in verbose mode #62

Closed

Conversation

searchivarius

If LSML is created with verbose=True, the code crashes here https://github.com/searchivarius/metric-learn/blob/fac5ed5d4f0ae24d274e0956f4b7d2d6d09d2841/metric_learn/lsml.py#L95 due to the uninitialized variable l_best. I believe this is one way to fix the problem.
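
For context, the failure mode can be reproduced with a simplified sketch of the descent loop (placeholder names; this is not the actual metric-learn code):

```python
def fit_sketch(loss, grad, M, step_sizes, max_iter=10, verbose=True):
    """Simplified gradient-descent loop mirroring the structure of LSML's fit."""
    s_best = loss(M)  # best loss so far is initialized before the loop
    for it in range(1, max_iter + 1):
        g = grad(M)
        M_best = None
        for step in step_sizes:
            M_new = M - step * g
            s_new = loss(M_new)
            if s_new < s_best:  # l_best is only assigned inside this branch
                l_best, s_best, M_best = step, s_new, M_new
        if verbose:
            # If no candidate step improved the loss on the first iteration,
            # l_best was never assigned and this line raises UnboundLocalError.
            print('iter', it, 'cost', s_best, 'best step', l_best)
        if M_best is None:
            break
        M = M_best
    return M
```

With verbose=False the unassigned variable is never read, which is why the crash only shows up in verbose mode.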

@Callidior
Contributor

It's an interesting case if the program gets there with l_best uninitialized, because it means that not a single gradient descent step could improve the loss. So maybe l_best = 0 is the more appropriate initialization?

@searchivarius
Author

@Callidior I am torturing your software with substantially non-metric data :-) Whichever initialization is best in this case is up to you.

@perimosocordiae
Contributor

l_best records the best step length, so I agree that it makes sense to initialize it to zero.
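
In terms of the sketch above, the proposed change amounts to initializing l_best before the descent loop (again a sketch with placeholder names, not the actual committed patch):

```python
def fit_sketch_fixed(loss, grad, M, step_sizes, max_iter=10, verbose=True):
    """Same loop as above, with l_best defaulting to zero (no step taken)."""
    s_best = loss(M)
    l_best = 0  # proposed default: length of the best step found so far
    for it in range(1, max_iter + 1):
        g = grad(M)
        M_best = None
        for step in step_sizes:
            M_new = M - step * g
            s_new = loss(M_new)
            if s_new < s_best:
                l_best, s_best, M_best = step, s_new, M_new
        if verbose:
            # Always well-defined now, even if no candidate step improved the loss.
            print('iter', it, 'cost', s_best, 'best step', l_best)
        if M_best is None:
            break
        M = M_best
    return M
```

Reporting a step of zero on iterations where no candidate improves the loss also keeps the verbose output truthful: no step was actually taken.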

@searchivarius
Author

Hi guys, sure, feel free to make this modification.

perimosocordiae added a commit that referenced this pull request May 24, 2017
@perimosocordiae
Contributor

Just pushed a fix. Thanks, @searchivarius!
