diffusion_rs_common 0.1.0
Module diffusion_rs_common::nn::attention
Functions
scaled_dot_product_attention
Computes softmax(QK^T / sqrt(d_k) + M)V.

M is the attention mask, an additive bias applied to the scores before the softmax (0 for unmasked positions, -inf for masked positions).
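As a point of reference, the sketch below computes this formula on plain `f32` slices for a single attention head. The function name, row-major layout, and explicit shape arguments are illustrative assumptions for this example and do not mirror the crate's actual tensor-based signature.

```rust
/// Reference computation of softmax(QK^T / sqrt(d_k) + M)V for one head.
/// Shapes (row-major): q is n x d_k, k and v are m x d_k, mask is an
/// optional n x m additive bias (0.0 unmasked, f32::NEG_INFINITY masked).
fn scaled_dot_product_attention_ref(
    q: &[f32],
    k: &[f32],
    v: &[f32],
    mask: Option<&[f32]>,
    n: usize,
    m: usize,
    d_k: usize,
) -> Vec<f32> {
    let scale = 1.0 / (d_k as f32).sqrt();
    let mut out = vec![0.0f32; n * d_k];
    for i in 0..n {
        // Scaled scores for query row i, with the mask bias added before softmax.
        let mut scores: Vec<f32> = (0..m)
            .map(|j| {
                let dot: f32 = (0..d_k).map(|c| q[i * d_k + c] * k[j * d_k + c]).sum();
                dot * scale + mask.map_or(0.0, |mm| mm[i * m + j])
            })
            .collect();
        // Numerically stable softmax over the scores of row i.
        let max = scores.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
        let mut sum = 0.0;
        for s in scores.iter_mut() {
            *s = (*s - max).exp();
            sum += *s;
        }
        // Output row i is the attention-weighted sum of the value rows.
        for j in 0..m {
            let w = scores[j] / sum;
            for c in 0..d_k {
                out[i * d_k + c] += w * v[j * d_k + c];
            }
        }
    }
    out
}

fn main() {
    // Two queries, three keys/values, head dimension 2; the last key is masked out.
    let q = [1.0, 0.0, 0.0, 1.0];
    let k = [1.0, 0.0, 0.0, 1.0, 1.0, 1.0];
    let v = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0];
    let mask = [0.0, 0.0, f32::NEG_INFINITY, 0.0, 0.0, f32::NEG_INFINITY];
    let out = scaled_dot_product_attention_ref(&q, &k, &v, Some(&mask), 2, 3, 2);
    println!("{out:?}");
}
```

The crate's function operates on batched tensors rather than flat slices, but the scaling by 1/sqrt(d_k), the additive mask, and the per-row softmax follow the same formula shown here.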