How to pass additional parameters alongside the Optuna trial in def __init__
class ConvolutionalNetwork(nn.Module):
    def __init__(self, in_features):
        super().__init__()
        self.in_features = in_features
        # compute the number of features output by the two conv layers
        c1 = int((self.in_features - 2) / 64)  # int() accounts for the loss from integer conversion
        c2 = int((c1 - 2) / 64)
        self.n_conv = int(c2 * 16)
        #self.n_conv = int((( ( (self.in_features - 2)/4 ) - 2 )/4 ) * 16)
        self.conv1 = nn.Conv1d(1, 16, 3, 1)
        self.conv1_bn = nn.BatchNorm1d(16)          # was nn.Batchnorm1d (wrong casing)
        self.conv2 = nn.Conv1d(16, 16, 3, 1)        # original nn.Conv1d(16,1) was missing the kernel size
        self.conv2_bn = nn.BatchNorm1d(16)
        # suggest_uniform takes (name, low, high); the original was missing low
        self.dp = nn.Dropout(trial.suggest_uniform('dropout_rate', 0.0, 1.0))
        self.fc3 = nn.Linear(self.n_conv, 2)
As you can see, def __init__ already has self and in_features as parameters. I am thinking of adding another parameter, trial (which is part of the Optuna package), to accommodate the line self.dp = nn.Dropout(trial.suggest_uniform('dropout_rate', 0.0, 1.0)) in the code above. Please advise how to do this: most resources only show def __init__(self, trial), which is straightforward, but in my case I have three variables to pass in the objective.
Solution
You can do it like this:
class ConvolutionalNetwork(nn.Module):
    def __init__(self, trial, in_features):
        # code for initialization
        ..........

def objective(trial):
    # code to create in_features
    in_features = ......
    # generate the model
    model = ConvolutionalNetwork(trial, in_features)
    ..................
    # code to run the CNN and calculate the accuracy
    return accuracy

# Create the study and optimise the objective function.
# direction='maximize' because the objective returns an accuracy
# (Optuna minimises by default); note it is n_trials, not n_trails.
study = optuna.create_study(direction='maximize')
study.optimize(objective, n_trials=100, timeout=600)
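The pattern above does not depend on anything Optuna-specific: the constructor simply takes the trial as one more positional argument, and Optuna only ever calls objective(trial), so you are free to build the other arguments inside the objective yourself. A minimal runnable sketch of that wiring, using a stand-in trial object instead of Optuna and plain Python instead of nn.Module (FakeTrial and its fixed midpoint return value are illustrative assumptions, not Optuna API):

```python
class FakeTrial:
    """Stand-in for optuna.trial.Trial: returns the midpoint of the range."""
    def suggest_uniform(self, name, low, high):
        return (low + high) / 2

class ConvolutionalNetwork:
    # Simplified: no nn.Module, just shows the extra constructor argument.
    def __init__(self, trial, in_features):
        self.in_features = in_features
        self.dropout_rate = trial.suggest_uniform('dropout_rate', 0.0, 1.0)

def objective(trial):
    in_features = 128            # computed inside the objective, not passed by Optuna
    model = ConvolutionalNetwork(trial, in_features)
    return model.dropout_rate    # placeholder for the real accuracy computation

print(objective(FakeTrial()))    # 0.5
```

With real Optuna, study.optimize(objective, ...) supplies the trial, and any number of extra constructor arguments can be created inside objective exactly as in_features is here.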