
Is there a confusion matrix in flair for evaluating relation extraction models? #2907

Closed
geheim01 opened this issue Aug 16, 2022 · 3 comments
Labels
question Further information is requested

Comments

@geheim01

We are currently evaluating our relation extraction model.
We achieve an accuracy of 28% across the 47 relation classes we use, many of which are hard to distinguish from one another.

Now, in order to determine which relations might be better merged, we would like to output a confusion matrix. Unfortunately, I could not find a way to do this in Flair's training API.

Is there a method that can do this that I just haven't found yet?

@geheim01 geheim01 added the question Further information is requested label Aug 16, 2022
@grinay

grinay commented Aug 25, 2022

@geheim01 did you find an answer to your question?

@geheim01
Author

@grinay I managed to build my own confusion matrix from the predictions that the training/test process writes to the test.tsv file.
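
A minimal sketch of that approach, assuming the gold and predicted labels sit in the last two tab-separated columns of test.tsv (the exact column layout depends on your Flair version, so adjust the indices to your file; the relation labels below are hypothetical):

```python
# Build a confusion matrix from Flair's test.tsv predictions.
# Assumption: each line ends with "<gold_label>\t<predicted_label>".
from collections import Counter

def confusion_matrix(lines):
    """Count (gold, predicted) label pairs across all rows."""
    matrix = Counter()
    for line in lines:
        parts = line.rstrip("\n").split("\t")
        gold, pred = parts[-2], parts[-1]
        matrix[(gold, pred)] += 1
    return matrix

# Example with hypothetical relation labels; in practice, read the
# rows with open("test.tsv") instead.
rows = [
    "entity1 entity2\tborn_in\tborn_in",
    "entity3 entity4\tborn_in\tlives_in",
    "entity5 entity6\tlives_in\tlives_in",
]
cm = confusion_matrix(rows)
print(cm[("born_in", "lives_in")])  # 1: born_in misclassified as lives_in
```

Off-diagonal counts like `cm[("born_in", "lives_in")]` then point to class pairs that may be worth merging.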

@alanakbik
Collaborator

@geheim01 we have a new relation extractor on this branch that should give better accuracy, if you'd like to test it: #2748
