
Tips if INLP projections go to near zero? #4

Open
cjlovering opened this issue Jun 25, 2021 · 1 comment
cjlovering commented Jun 25, 2021

When I run INLP, sometimes it ends up zeroing out the entire representation, rather than only removing the relevant information.

The first iteration is a reasonable projection (with average size 0.001), but thereafter, the average is near 0, like 1e-19. Visually, it seems that the representation after the first iteration could still be further collapsed without zeroing it out. Does anything come to mind?
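For context, here is a minimal sketch of the kind of iterative nullspace-projection loop being described, printing the mean representation norm after each step. The toy data, classifier settings, and the projection-product approximation are all illustrative, not code from this repository:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

def nullspace_projection(W):
    """Projection matrix onto the nullspace of the probe's weight rows."""
    _, S, Vt = np.linalg.svd(W, full_matrices=False)
    B = Vt[S > 1e-10]                  # orthonormal basis of the rowspace
    return np.eye(W.shape[1]) - B.T @ B

def inlp_norms(X, y, n_iters=5):
    """Run a toy INLP loop; return the mean representation norm per iteration."""
    P = np.eye(X.shape[1])
    norms = []
    for _ in range(n_iters):
        clf = SGDClassifier(max_iter=1000, tol=1e-3, random_state=0)
        clf.fit(X @ P, y)
        # Composing projections like this only approximates the intersection
        # of the nullspaces, but it mirrors the iterative scheme.
        P = nullspace_projection(clf.coef_) @ P
        norms.append(np.linalg.norm(X @ P, axis=1).mean())
    return norms

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 32))
y = (X[:, 0] > 0).astype(int)          # toy "protected" label
print(inlp_norms(X, y))                # each iteration removes roughly one direction
```

On data like this, each iteration should shave off roughly one direction out of 32, so the mean norm shrinks gradually; a collapse to ~1e-19 after one step would point at the classifier or its regularization rather than the projection itself.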

@yanaiela (Owner)

I don't remember encountering such a problem.
It may be related to the classifier you're using and its regularization coefficient: if the coefficient is too large, the penalty may drive the weights to near zero.
If you are indeed using such a regularizer, try changing its value (as a first step, try removing it completely).
If that doesn't work, try other linear classifiers (we mainly experimented with SGDClassifier, LogisticRegression, and SVC).

Do follow up if it still doesn't work.
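To make that suggestion concrete: in scikit-learn the regularization knob differs per classifier, so weakening it looks like the following (the specific values are illustrative guesses, not recommendations from this thread):

```python
from sklearn.linear_model import SGDClassifier, LogisticRegression
from sklearn.svm import SVC

# SGDClassifier: alpha is the L2 penalty coefficient. It must stay > 0
# with the default 'optimal' learning rate, so use a tiny value instead of 0.
sgd = SGDClassifier(alpha=1e-8, max_iter=1000, tol=1e-3)

# LogisticRegression / SVC: C is the *inverse* regularization strength,
# so a large C means almost no regularization.
logreg = LogisticRegression(C=1e6, max_iter=1000)
svc = SVC(kernel="linear", C=1e6)
```

A linear kernel is needed for SVC here, since INLP requires the probe's weight matrix (`coef_`) to compute the nullspace projection.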
