
Assertion error when evaluating a groundtruth with zeros and ones against a prediction that is only made of zeros #76

Open
nesrnesr opened this issue May 14, 2022 · 3 comments

Comments

@nesrnesr

Hello, thank you for the implementation. However, as the title indicates, when I try to evaluate a prediction that labels the full time series as normal against a ground truth that contains anomalies, the code raises an assertion error. Shouldn't the score simply be 0, since the prediction failed to detect any of the anomalies?

@nesrnesr nesrnesr changed the title When the groundtruth is 0 and 1 and the predictions are fully 0, an assertion error is raised Assertion error when evaluating a groundtruth with zeros and ones against a prediction that is only made of zeros May 14, 2022

sbuse commented Aug 17, 2022

Unfortunately, I have the same issue, and I think nesrnesr is absolutely right. If the prediction does not include any ones, the precision should simply be zero. Luckily, the case is easy to catch and fix.
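For illustration, here is a minimal sketch of such a guard. The wrapper name `safe_ts_precision` is hypothetical, and the `from prts import ts_precision` import path is an assumption about how the function is exposed; adjust both to this repository's actual layout:

```python
# Hypothetical guard, not the library's actual fix.
from prts import ts_precision  # import path is an assumption


def safe_ts_precision(real, pred):
    """Return 0.0 when `pred` contains no positive labels, instead of
    letting ts_precision hit its internal assertion."""
    if not any(pred):
        # With no predicted anomalies there are no predicted ranges to
        # score, so precision is 0 by convention.
        return 0.0
    return ts_precision(real, pred)
```

With this wrapper, `safe_ts_precision([0, 1, 0], [0, 0, 0])` returns `0.0` rather than raising.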


sbuse commented Aug 17, 2022

Try something like `ts_precision([0, 1, 0], [0, 0, 0])` to reproduce it.
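Expanded into a self-contained reproduction (again assuming the `prts` import path):

```python
from prts import ts_precision  # import path is an assumption

real = [0, 1, 0]  # ground truth: one anomalous point
pred = [0, 0, 0]  # prediction: no anomalies at all

# As reported in this issue, this call raises an AssertionError
# instead of returning the expected precision of 0.0.
score = ts_precision(real, pred)
```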

@nesrnesr (Author)

@sbuse If you are in a hurry, you can use the official C++ implementation instead: https://github.com/IntelLabs/TSAD-Evaluator
