Personalized Attention Details #1

Open
WillPowellUk opened this issue Jun 13, 2024 · 0 comments
WillPowellUk commented Jun 13, 2024

Dear @akaneUS and @yh0903 ,

Thank you for providing the code alongside your very interesting paper; it is unfortunately quite rare for a publication to come with its code, so this makes a welcome change.

I was going through your network code and cannot find the PA (personalized attention) framework, i.e. the implementation of a personalized SAB (self-attention block). Unless I have missed it, could you please provide more details on how this is implemented? To make the question concrete, I have sketched below what I imagine it might look like.
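
Purely as a guess on my part: a standard SAB with a learned per-subject embedding injected before the attention, something like the following. All class and parameter names here are hypothetical and not taken from your repository.

```python
import torch
import torch.nn as nn

class PersonalizedSAB(nn.Module):
    """Hypothetical sketch of a personalized self-attention block:
    a learned per-subject embedding is added to every token before
    standard multi-head self-attention. Names and structure are my
    assumptions, not the paper's actual implementation."""

    def __init__(self, dim, n_heads, n_subjects):
        super().__init__()
        # One learned embedding per subject, broadcast over the sequence
        self.subject_emb = nn.Embedding(n_subjects, dim)
        self.attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.ff = nn.Sequential(
            nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim)
        )

    def forward(self, x, subject_ids):
        # x: (batch, seq_len, dim); subject_ids: (batch,)
        p = self.subject_emb(subject_ids).unsqueeze(1)  # (batch, 1, dim)
        h = x + p                                       # inject subject identity
        attn_out, _ = self.attn(h, h, h)
        x = self.norm1(x + attn_out)
        return self.norm2(x + self.ff(x))

# e.g. a batch of 2 subjects, 60 tokens each, embedding dim 64:
# block = PersonalizedSAB(dim=64, n_heads=4, n_subjects=10)
# out = block(torch.randn(2, 60, 64), torch.tensor([0, 3]))
```

Is it something along these lines, or does the personalization enter the attention computation itself (e.g. per-subject query/key projections)?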

Also, could I ask how the 60 steps × 1 min segments are fed into the dataloader? My best guess is sketched below.
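
Again purely an assumption on my side (the Dataset structure, shapes, and names below are all hypothetical): I imagined each sample as one (60, n_features) block, i.e. 60 one-minute segments stacked along the time axis.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class SixtyMinuteDataset(Dataset):
    """Hypothetical sketch: each sample is a stack of 60 one-minute
    segments, shaped (60, n_features). Shapes are my guesses."""

    def __init__(self, recordings, labels):
        # recordings: list of arrays shaped (60, n_features)
        self.recordings = recordings
        self.labels = labels

    def __len__(self):
        return len(self.recordings)

    def __getitem__(self, idx):
        x = torch.as_tensor(self.recordings[idx], dtype=torch.float32)
        y = torch.as_tensor(self.labels[idx], dtype=torch.long)
        return x, y  # x: (60, n_features)

# e.g. with dummy data, batches come out shaped (batch, 60, n_features):
# import numpy as np
# recs = [np.random.randn(60, 8) for _ in range(100)]
# labs = np.random.randint(0, 2, size=100)
# loader = DataLoader(SixtyMinuteDataset(recs, labs),
#                     batch_size=32, shuffle=True)
# xb, yb = next(iter(loader))  # xb: (32, 60, 8)
```

Is that roughly how each sample is presented to the model, or are the segments fed in some other arrangement?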

Cheers,
Will
