
Turning Off Warnings During Optuna Training

How can I turn off warnings during Optuna training?

I am fully aware I may end up embarrassed for having missed something obvious, but this is driving me up the wall. I am tuning an LGBM model with Optuna, and my notebook is flooded with warning messages. How can I suppress them while still keeping errors (and ideally the trial results)? The code is below.

import optuna
import sklearn
from lightgbm import LGBMRegressor
from sklearn.metrics import mean_squared_error

optuna.logging.set_verbosity(optuna.logging.ERROR)

import warnings
warnings.filterwarnings('ignore')

def objective(trial):    
    list_bins = [25,50,75,100,125,150,175,200,225,250,500,750,1000]   

    # NOTE: several argument lists below were garbled in the original post;
    # the ranges marked "assumed" are plausible placeholders, not the poster's exact values.
    param = {
        'lambda_l1': trial.suggest_loguniform('lambda_l1', 1e-8, 10.0),
        'lambda_l2': trial.suggest_loguniform('lambda_l2', 1e-8, 10.0),              # range assumed
        'colsample_bytree': trial.suggest_categorical('colsample_bytree', [0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]),
        'subsample': trial.suggest_categorical('subsample', [0.4, 0.5, 0.6, 0.7, 0.8, 1.0]),  # choices assumed
        'learning_rate': trial.suggest_categorical('learning_rate', [0.006, 0.008, 0.01, 0.014, 0.017, 0.02, 0.05]),
        'max_depth': trial.suggest_categorical('max_depth', [10, 20, 100]),
        'num_leaves': trial.suggest_int('num_leaves', 2, 1000),
        'feature_fraction': trial.suggest_uniform('feature_fraction', 0.1, 1.0),
        'bagging_fraction': trial.suggest_uniform('bagging_fraction', 0.1, 1.0),     # range assumed
        'bagging_freq': trial.suggest_int('bagging_freq', 1, 15),
        'min_child_samples': trial.suggest_int('min_child_samples', 1, 300),         # lower bound assumed
        'cat_smooth': trial.suggest_int('cat_smooth', 1, 256),                       # lower bound assumed
        'cat_l2': trial.suggest_int('cat_l2', 1, 256),                               # lower bound assumed
        'max_bin': trial.suggest_categorical('max_bin', list_bins)
    }
    

    model = LGBMRegressor(
        **param,
        objective='regression',
        metric='rmse',
        boosting_type='gbdt',
        verbose=-1,
        random_state=42,
        n_estimators=20000,
        cat_feature=[x for x in range(len(cat_features))]
    )

    model.fit(
        X_train, y_train,
        eval_set=[(X_test, y_test)],
        early_stopping_rounds=200,
        verbose=False
    )
    
    preds = model.predict(X_test)
    
    rmse = mean_squared_error(y_test,preds,squared=False)
    
    return rmse


study = optuna.create_study(direction="minimize")
study.optimize(objective,n_trials=300)

print("Number of finished trials: {}".format(len(study.trials)))

print("Best trial:")
trial = study.best_trial

print("  Value: {}".format(trial.value))

print("  Params: ")
for key,value in trial.params.items():
    print("    {}: {}".format(key,value))
    

To be concrete, this is the output I am trying to get rid of:

[LightGBM] [Warning] feature_fraction is set=0.7134336417771784,colsample_bytree=0.4 will be ignored. Current value: feature_fraction=0.7134336417771784
[LightGBM] [Warning] lambda_l1 is set=0.0001621506831365743,reg_alpha=0.0 will be ignored. Current value: lambda_l1=0.0001621506831365743
[LightGBM] [Warning] bagging_fraction is set=0.8231149550002105,subsample=0.5 will be ignored. Current value: bagging_fraction=0.8231149550002105
[LightGBM] [Warning] bagging_freq is set=4,subsample_freq=0 will be ignored. Current value: bagging_freq=4
[LightGBM] [Warning] lambda_l2 is set=0.00010964883369301453,reg_lambda=0.0 will be ignored. Current value: lambda_l2=0.00010964883369301453
[LightGBM] [Warning] feature_fraction is set=0.3726043373358532,colsample_bytree=0.3 will be ignored. Current value: feature_fraction=0.3726043373358532
[LightGBM] [Warning] lambda_l1 is set=1.4643061619613147,reg_alpha=0.0 will be ignored. Current value: lambda_l1=1.4643061619613147
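
As for what these particular messages mean: each one is LightGBM reporting that two aliases of the same parameter were set at once. The search space uses the native names (lambda_l1, feature_fraction, bagging_fraction, ...), while the LGBMRegressor wrapper also supplies its sklearn-style defaults (reg_alpha, colsample_bytree, subsample, ...), so one of each pair is ignored and a warning is printed. Under that assumption, a sketch of a duplicate-free search space that keeps only the sklearn-style names might look like this (illustrative ranges, not the original ones):

# Hypothetical rewrite of the search space inside objective(trial), using only the
# sklearn-style parameter names so the LGBMRegressor defaults (reg_alpha=0.0,
# reg_lambda=0.0, subsample=1.0, colsample_bytree=1.0, subsample_freq=0) no longer
# collide with the native aliases and trigger "will be ignored" warnings.
param = {
    'reg_alpha': trial.suggest_loguniform('reg_alpha', 1e-8, 10.0),           # was lambda_l1
    'reg_lambda': trial.suggest_loguniform('reg_lambda', 1e-8, 10.0),         # was lambda_l2
    'colsample_bytree': trial.suggest_uniform('colsample_bytree', 0.1, 1.0),  # was feature_fraction
    'subsample': trial.suggest_uniform('subsample', 0.1, 1.0),                # was bagging_fraction
    'subsample_freq': trial.suggest_int('subsample_freq', 1, 15),             # was bagging_freq
    'min_child_samples': trial.suggest_int('min_child_samples', 1, 300),
}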
