I'm running ruptures on biological data: I'm trying to detect genomic duplication breakpoints from depth-of-coverage variation in genome mapping data. I tried to use:
import ruptures as rpt

algo = rpt.Pelt(model="rbf").fit(d_subset['norm'].values)
refined_result = algo.predict(pen=X)
I varied the penalty from 1 up to values in the thousands, but no matter the penalty I still end up with more than 2,000 change points on a dataset of 12,000 values.
Is my data too noisy to reduce the number of change points, or am I misunderstanding how the penalty should be set? I expected higher penalty values to result in longer computing times and fewer predicted change points.
I tried other algorithms such as BottomUp and Window and got some nice predictions on smoothed (mean value over sliding windows of a thousand points) and normalised data, but I would like to see whether I can get more precise ones with the Pelt algorithm.
Thanks for the amazing package you designed!