
JMH error: java: Method parameters should be @State classes - but why?


In my code I want to benchmark the training of a neural network (the model), but when I run it I get the error from the title. The guide says: "Sometimes you will want to initialize some variables that your benchmark code needs, but which you do not want to be part of the code your benchmark measures. Such variables are called state variables." I added the @State annotation above my main method, but it did not change the result. What am I doing wrong?
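
For reference, the pattern the guide describes looks roughly like this. This is a minimal, self-contained sketch; the names StateExample, MyState, and the int[] payload are made up for illustration and are not part of my code:

    import org.openjdk.jmh.annotations.Benchmark;
    import org.openjdk.jmh.annotations.Scope;
    import org.openjdk.jmh.annotations.Setup;
    import org.openjdk.jmh.annotations.State;

    public class StateExample {

        // Variables the benchmark needs, but whose setup should not be
        // measured, live in a class annotated with @State. JMH instantiates
        // this class and injects it into the benchmark method as a parameter.
        @State(Scope.Benchmark)
        public static class MyState {
            int[] data;

            @Setup
            public void setUp() {
                data = new int[1024]; // initialization runs outside the measured code
            }
        }

        // A @Benchmark method may only take @State classes (or JMH
        // infrastructure types) as parameters; the result is returned
        // so JMH can consume it.
        @Benchmark
        public long sum(MyState state) {
            long total = 0;
            for (int value : state.data) {
                total += value;
            }
            return total;
        }
    }

My actual code is below: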

   public class IrisClassifier {
    private static Logger log = LoggerFactory.getLogger(IrisClassifier.class);


public static void main(String[] args) throws Exception {

    //First: get the dataset using the record reader. CSVRecordReader handles loading/parsing
    int numLinesToSkip = 0;
    char delimiter = ',';
    RecordReader recordReader = new CSVRecordReader(numLinesToSkip,delimiter);
    recordReader.initialize(new FileSplit(new File(DownloaderUtility.IRISDATA.Download(),"iris.txt")));

    //Second: the RecordReaderDataSetIterator handles conversion to DataSet objects,ready for use in neural network
    int labelIndex = 4;     //5 values in each row of the iris.txt CSV: 4 input features followed by an integer label (class) index. Labels are the 5th value (index 4) in each row
    int numClasses = 3;     //3 classes (types of iris flowers) in the iris data set. Classes have integer values 0,1 or 2
    int batchSize = 150;    //Iris data set: 150 examples total. We are loading all of them into one DataSet (not recommended for large data sets)

    DataSetIterator iterator = new RecordReaderDataSetIterator(recordReader,batchSize,labelIndex,numClasses);
    DataSet allData = iterator.next();
    allData.shuffle();
    SplitTestAndTrain testAndTrain = allData.splitTestAndTrain(0.65);  //Use 65% of data for training

    DataSet trainingData = testAndTrain.getTrain();
    DataSet testData = testAndTrain.getTest();

    //We need to normalize our data. We'll use NormalizerStandardize (which gives us mean 0, unit variance):
    DataNormalization normalizer = new NormalizerStandardize();
    normalizer.fit(trainingData);           //Collect the statistics (mean/stdev) from the training data. This does not modify the input data
    normalizer.transform(trainingData);     //Apply normalization to the training data
    normalizer.transform(testData);         //Apply normalization to the test data. This is using statistics calculated from the *training* set


    final int numInputs = 4;
    int outputNum = 3;
    long seed = 6;


    log.info("Build model....");
    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .seed(seed)
        .activation(Activation.TANH)
        .weightInit(WeightInit.XAVIER)
        .updater(new Sgd(0.1))
        .l2(1e-4)
        .list()
        .layer(new DenseLayer.Builder().nIn(numInputs).nOut(3)
            .build())
        .layer(new DenseLayer.Builder().nIn(3).nOut(3)
            .build())
        .layer(new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
            .activation(Activation.SOFTMAX) //Override the global TANH activation with softmax for this layer
            .nIn(3).nOut(outputNum).build())
        .build();


    //run the model
    MultiLayerNetwork model = new MultiLayerNetwork(conf);
    model.init();
    //record score and performance once every 100 iterations
    //(both listeners go in a single call; a second setListeners call would replace the first listener)
    model.setListeners(new ScoreIterationListener(100), new PerformanceListener(100));

    benchmarkingTraining(model, trainingData);

    //evaluate the model on the test set
    Evaluation eval = new Evaluation(3);
    INDArray output = model.output(testData.getFeatures());
    eval.eval(testData.getLabels(),output);
    log.info(eval.stats());

}

public static class BenchmarkRunner {
    public static void main(String[] args) throws Exception {
        org.openjdk.jmh.Main.main(args);
    }
}

@Fork(value = 1, warmups = 2)
@Benchmark
@BenchmarkMode(Mode.AverageTime)
public static void benchmarkingTraining(MultiLayerNetwork model, DataSet trainingData) {
    for (int i = 0; i < 1000; i++) {
        model.fit(trainingData);
    }
}

}
