Replies: 4 comments
-
You can have a look at: https://github.com/open-mmlab/mmcv/blob/466a6c829c2289651212866bbe24cb3b4541a577/mmcv/cnn/bricks/generalized_attention.py#L34-L42
-
Thank you for your reply!
-
You can try:

```python
model = dict(
    backbone=dict(plugins=[
        dict(
            cfg=dict(
                type='NonLocal2d',
                sub_sample=False,
                conv_cfg=dict(type='Conv2d')),
            stages=(False, False, True, True),
            position='after_conv2')
    ]))
```
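In case it helps: the `stages` tuple in that plugin config selects which backbone stages get the plugin, one boolean per ResNet stage. A small illustrative sketch (my own helper, not mmdetection code) of how `(False, False, True, True)` maps to stages:

```python
# Illustrative sketch only -- stages_with_plugin is a hypothetical helper,
# not part of mmdetection. It mirrors how a plugin's `stages` tuple
# (one flag per ResNet stage) selects where the plugin is inserted.
def stages_with_plugin(stages):
    """Return 1-based ResNet stage indices where the plugin is enabled."""
    return [i + 1 for i, enabled in enumerate(stages) if enabled]

print(stages_with_plugin((False, False, True, True)))  # stages 3 and 4
```

So with the config above, NonLocal2d blocks are added only in the last two stages of the backbone.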
-
@ZwwWayne By the way, could you please give me a demo of how to use context_block? Thank you!
-
Could you please tell me what attention_type means? Can I set attention_type='1101'?
mmdetection/configs/empirical_attention/faster_rcnn_r50_fpn_attention_0010_1x_coco.py
Line 9 in a616886
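Not the maintainer, but going by the GeneralizedAttention docstring linked earlier in this thread: attention_type is a 4-character binary string where each position toggles one attention term from the empirical attention paper, so '1101' should be a valid value. A hedged config sketch (the values of spatial_range, num_heads, and kv_stride are copied from the '0010' config and are assumptions here, not something you must keep):

```python
# Sketch based on configs/empirical_attention/faster_rcnn_r50_fpn_attention_0010_1x_coco.py,
# with attention_type changed. Per the mmcv GeneralizedAttention docstring,
# each character of attention_type enables one term:
#   '1000' -> query and key content
#   '0100' -> query content and relative position
#   '0010' -> key content only
#   '0001' -> relative position only
model = dict(
    backbone=dict(plugins=[
        dict(
            cfg=dict(
                type='GeneralizedAttention',
                spatial_range=-1,
                num_heads=8,
                attention_type='1101',  # enables terms 1, 2 and 4 above
                kv_stride=2),
            stages=(False, False, True, True),
            position='after_conv2')
    ]))
```

So '0010' in the config you linked means only the "key content" term is used, and '1101' would use the other three terms.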