Add GELU operator
Summary: This commit adds a GELU activation operator that computes its output in either approximate (tanh) or naive (exact, erf-based) mode.
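For reference, the two modes presumably follow the standard GELU definitions; the sketch below assumes that (the function names `gelu_naive` and `gelu_approximate` are illustrative, not the operator's actual API):

```python
import math

def gelu_naive(x: float) -> float:
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF,
    # evaluated via the error function.
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_approximate(x: float) -> float:
    # Tanh approximation of GELU (Hendrycks & Gimpel).
    c = math.sqrt(2.0 / math.pi)
    return 0.5 * x * (1.0 + math.tanh(c * (x + 0.044715 * x ** 3)))
```

The two modes agree closely for moderate inputs; the tanh form trades a small amount of accuracy for avoiding an erf evaluation.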
Showing changed files with 1860 additions and 329 deletions
docs/api/python/dragon/nn/silu.rst
0 → 100644
docs/api/python/dragon/optimizers/AdamW.rst
0 → 100644
docs/api/python/dragon/roll.rst
0 → 100644
docs/api/python/tensorflow/nn/gelu.rst
0 → 100644
docs/api/python/tensorflow/nn/silu.rst
0 → 100644
docs/api/python/tensorflow/roll.rst
0 → 100644
docs/api/python/torch/nn/GELU.rst
0 → 100644
docs/api/python/torch/nn/SiLU.rst
0 → 100644
docs/api/python/torch/nn/functional/silu.rst
0 → 100644
docs/api/python/torch/optim/AdamW.rst
0 → 100644
docs/api/python/torch/roll.rst
0 → 100644
dragon/kernels/activation/gelu_op_kernel.cc
0 → 100644
dragon/kernels/activation/gelu_op_kernel.cu
0 → 100644
dragon/kernels/array/roll_op_kernel.cc
0 → 100644
dragon/kernels/array/roll_op_kernel.cu
0 → 100644
dragon/operators/activation/gelu_op.cc
0 → 100644
dragon/operators/activation/gelu_op.h
0 → 100644
dragon/operators/array/roll_op.cc
0 → 100644
dragon/operators/array/roll_op.h
0 → 100644
torch/core/nn/modules/channelshuffle.py
0 → 100644