
Let me answer this: I never understood why backward_softmax_layer(), the backward function of softmax_layer, doesn't differentiate the softmax function #32

Open
rockyzhengwu opened this issue Jul 5, 2020 · 0 comments

Comments

@rockyzhengwu

My sense is that this is the author's intent: softmax_layer is normally the activation of the final layer, and the author folded the softmax derivative into the loss. If you look at how the loss gradient is computed, you'll see the softmax derivative has effectively already been applied there, so backward_softmax_layer() has nothing left to do. This is also why a softmax layer must always be followed by some loss layer.
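To make the cancellation concrete: when softmax is paired with a cross-entropy loss (the standard pairing, and my assumption here) and the target t is one-hot, the gradient of the loss with respect to the logits z collapses to p - t, so the softmax Jacobian is never materialized. Below is a minimal C sketch of that combined backward step; it is not darknet's actual code, and the names softmax and softmax_xent_backward are mine.

```c
#include <math.h>
#include <stdio.h>

/* Forward: numerically stable softmax over n logits. */
static void softmax(const float *z, float *p, int n)
{
    float max = z[0];
    for (int i = 1; i < n; ++i) if (z[i] > max) max = z[i];
    float sum = 0.f;
    for (int i = 0; i < n; ++i) { p[i] = expf(z[i] - max); sum += p[i]; }
    for (int i = 0; i < n; ++i) p[i] /= sum;
}

/* Combined backward for softmax + cross-entropy:
 * for L = -sum_i t_i * log(p_i) with one-hot t,
 * dL/dz_i simplifies to p_i - t_i, so no explicit
 * softmax Jacobian is ever computed. */
static void softmax_xent_backward(const float *p, const float *t,
                                  float *dz, int n)
{
    for (int i = 0; i < n; ++i) dz[i] = p[i] - t[i];
}

int main(void)
{
    float z[3] = {1.f, 2.f, 3.f};   /* logits */
    float t[3] = {0.f, 0.f, 1.f};   /* one-hot target */
    float p[3], dz[3];
    softmax(z, p, 3);
    softmax_xent_backward(p, t, dz, 3);
    for (int i = 0; i < 3; ++i)
        printf("p[%d]=%.4f  dz[%d]=%+.4f\n", i, p[i], i, dz[i]);
    return 0;
}
```

Compile with `gcc softmax_xent.c -lm` and run: dz comes out as p - t directly. If backward_softmax_layer() applied the softmax derivative again on top of this, the gradient would be double-counted, which matches the explanation above.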
