When I run INLP, it sometimes zeroes out the entire representation rather than removing only the relevant information.
The first iteration produces a reasonable projection (average magnitude around 0.001), but after that the average drops to nearly zero, around 1e-19. Visually, it looks like the representation after the first iteration could be collapsed further without being zeroed out. Does anything come to mind?
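For reference, a minimal NumPy sketch of the kind of check being described: one direction is removed per iteration via a nullspace projection, and the mean magnitude of the representation is logged. The least-squares direction here is a hypothetical stand-in for whatever linear classifier INLP actually fits; in a healthy run the magnitude should shrink gradually, not crash to ~1e-19 after one step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 points in 10-d, with a label correlated with one direction.
X = rng.normal(size=(200, 10))
y = (X[:, 0] + 0.1 * rng.normal(size=200) > 0).astype(float)

for it in range(3):
    # Stand-in for a linear classifier: least-squares separating direction.
    w, *_ = np.linalg.lstsq(X, y - y.mean(), rcond=None)
    w = w / np.linalg.norm(w)

    # Nullspace projection: remove the component of X along w.
    P = np.eye(X.shape[1]) - np.outer(w, w)
    X = X @ P

    # Monitor the average magnitude after each projection.
    print(f"iteration {it}: mean |x| = {np.abs(X).mean():.4g}")
```

Each iteration removes only a rank-1 subspace, so the mean magnitude should stay on the same order rather than collapsing to numerical zero.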
I don't remember encountering such a problem.
It may be related to the classifier you're using and its regularization coefficient. If the coefficient is too large, the loss may drive the weights to near zero.
If you are indeed using such a regularizer, try changing its value (as a first step, try removing it completely).
If that doesn't work, try other linear classifiers (we mainly played around with SGDClassifier, LogisticRegression and SVC).
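Concretely, a sketch of what weakening the regularizer looks like for the classifiers mentioned (assuming scikit-learn; the exact values below are illustrative, not tuned): LogisticRegression and LinearSVC use `C`, the *inverse* regularization strength, so a large `C` means almost no penalty, while SGDClassifier uses `alpha` directly, so a tiny `alpha` weakens it.

```python
from sklearn.linear_model import LogisticRegression, SGDClassifier
from sklearn.svm import LinearSVC

# C is inverse regularization strength: large C ~ almost no regularization.
clf_lr = LogisticRegression(C=1e6, max_iter=1000)
clf_svc = LinearSVC(C=1e6, max_iter=5000)

# alpha multiplies the penalty term directly: tiny alpha ~ weak regularization.
clf_sgd = SGDClassifier(alpha=1e-8)
```

If the collapse disappears with the penalty weakened, that points to the regularizer shrinking the probe weights toward zero as the cause.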