How to use LayerNorm inside nn.Sequential in PyTorch
I am trying to use LayerNorm inside nn.Sequential in PyTorch. This is what I want to do:
import torch.nn as nn

class LayerNormCnn(nn.Module):
    def __init__(self):
        super(LayerNormCnn, self).__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1),
            nn.LayerNorm(),   # no normalized_shape given -- this is the problem
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
            nn.LayerNorm(),
            nn.ReLU(),
        )

    def forward(self, x):
        x = self.net(x)
        return x
Unfortunately, this does not work, because nn.LayerNorm requires normalized_shape as an argument. The code above raises the following exception:

    nn.LayerNorm(),
TypeError: __init__() missing 1 required positional argument: 'normalized_shape'
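For context, nn.LayerNorm normalizes over the trailing dimensions listed in normalized_shape, so for an (N, C, H, W) feature map it has to be told the full (C, H, W) shape at construction time. A minimal sketch (the shapes here are illustrative only):

import torch
import torch.nn as nn

# LayerNorm must be told which trailing dimensions to normalize over.
ln = nn.LayerNorm([32, 64, 64])     # (C, H, W) of the incoming feature map
x = torch.randn(8, 32, 64, 64)      # (N, C, H, W)
out = ln(x)                         # normalized over the last three dims
print(out.shape)                    # torch.Size([8, 32, 64, 64])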
For now, this is how I have implemented it:
import torch
import torch.nn as nn
import torch.nn.functional as F

class LayerNormCnn(nn.Module):
    def __init__(self, state_shape):
        super(LayerNormCnn, self).__init__()
        self.conv1 = nn.Conv2d(state_shape[0], 32, kernel_size=3, stride=2, padding=1)
        self.conv2 = nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1)

        # compute the LayerNorm shapes by doing a dummy forward pass
        with torch.no_grad():
            fake_input = torch.randn(1, *state_shape)
            out = self.conv1(fake_input)
            bn1_size = out.size()[1:]      # (C, H, W) after conv1
            out = self.conv2(out)
            bn2_size = out.size()[1:]      # (C, H, W) after conv2

        self.bn1 = nn.LayerNorm(bn1_size)
        self.bn2 = nn.LayerNorm(bn2_size)

    def forward(self, x):
        x = F.relu(self.bn1(self.conv1(x)))
        x = F.relu(self.bn2(self.conv2(x)))
        return x
if __name__ == '__main__':
    in_shape = (3, 128, 128)
    batch_size = 32

    model = LayerNormCnn(in_shape)
    x = torch.randn((batch_size,) + in_shape)
    out = model(x)
    print(out.shape)   # torch.Size([32, 64, 32, 32]) with the strides above
Is it possible to use LayerNorm inside nn.Sequential?
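It should be, as long as the input resolution is fixed, since the workaround above already yields concrete shapes that can be passed straight into the Sequential. Below is a minimal sketch of that idea; the helper name build_layernorm_cnn is hypothetical, not a PyTorch API:

import torch
import torch.nn as nn

def build_layernorm_cnn(state_shape):
    # Hypothetical helper: create the convs first, run a dummy forward pass
    # to discover each feature-map shape, then assemble the nn.Sequential.
    conv1 = nn.Conv2d(state_shape[0], 32, kernel_size=3, stride=2, padding=1)
    conv2 = nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1)
    with torch.no_grad():
        out = conv1(torch.randn(1, *state_shape))
        shape1 = out.size()[1:]            # (C, H, W) after conv1
        shape2 = conv2(out).size()[1:]     # (C, H, W) after conv2
    return nn.Sequential(
        conv1, nn.LayerNorm(shape1), nn.ReLU(),
        conv2, nn.LayerNorm(shape2), nn.ReLU(),
    )

net = build_layernorm_cnn((3, 128, 128))
out = net(torch.randn(32, 3, 128, 128))
print(out.shape)   # torch.Size([32, 64, 32, 32])

The trade-off is the same as in the class-based version: the normalized shapes are baked in, so the resulting model only accepts inputs of the resolution it was built for.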