This repository has been archived by the owner on Sep 29, 2023. It is now read-only.

Understanding the Assembly Loss in the loss function #81

Open

vifi2021 opened this issue Apr 12, 2022 · 0 comments

@vifi2021

Dear Author,

Thank you so much for the code. I am trying to understand the following part of the loss function (the "assembly loss" in the paper):

loss = -(target_pre * torch.log(input_all)).sum() / target_num_pre

In equation (4) of the paper (https://arxiv.org/pdf/1810.11780.pdf), "L3" (the union target, target_union) appears in the assembly loss, but the code implementation uses "target_pre".

Shouldn't "target_pre" be replaced with "target_union", and "target_num_pre" with "target_num_union"?
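To make the comparison concrete, here is a minimal numpy stand-in for the quoted torch expression, showing both variants side by side. All array names, shapes, and the construction of target_union are illustrative assumptions for this sketch, not taken from the repository:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes: input_all holds predicted probabilities,
# target_pre / target_union are binary masks of the same shape.
input_all = rng.uniform(1e-6, 1.0, size=(2, 3, 4, 4))        # predicted probabilities
target_pre = (rng.uniform(size=(2, 3, 4, 4)) > 0.5).astype(float)
extra = (rng.uniform(size=(2, 3, 4, 4)) > 0.5).astype(float)
target_union = np.clip(target_pre + extra, 0.0, 1.0)          # union of the two masks

target_num_pre = target_pre.sum()
target_num_union = target_union.sum()

# Variant as implemented in the repository: masked cross-entropy
# normalized by the number of positive entries in target_pre.
loss_pre = -(target_pre * np.log(input_all)).sum() / target_num_pre

# Variant as equation (4) of the paper would suggest: same expression,
# but with the union mask and its own normalizer.
loss_union = -(target_union * np.log(input_all)).sum() / target_num_union

print(loss_pre, loss_union)
```

The two losses differ both in which pixels contribute to the sum and in the normalizing count, so they are generally not equal unless the two masks coincide.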

Looking forward to your advice on this. Thank you.
