Commit d6aa3a2

skip doctest output for tuning results in DoubleML class

1 parent 4274f85

1 file changed: 2 additions, 2 deletions

doubleml/double_ml.py (2 additions, 2 deletions)

@@ -1069,7 +1069,7 @@ def ml_l_params(trial):
     ...     'sampler': optuna.samplers.TPESampler(seed=42),
     ... }
     >>> tune_res = dml_plr.tune_ml_models(ml_param_space, optuna_settings=optuna_settings, return_tune_res=True)
-    >>> print(tune_res[0]['ml_l'].best_params)
+    >>> print(tune_res[0]['ml_l'].best_params)  # doctest: +SKIP
     {'learning_rate': 0.03907122389107094}
     >>> # Fit and get results
     >>> dml_plr.fit().summary  # doctest: +SKIP
@@ -1087,7 +1087,7 @@ def ml_l_params(trial):
     ... }
     >>> tune_res = dml_plr.tune_ml_models(ml_param_space, scoring_methods=scoring_methods,
     ...     optuna_settings=optuna_settings, return_tune_res=True)
-    >>> print(tune_res[0]['ml_l'].best_params)
+    >>> print(tune_res[0]['ml_l'].best_params)  # doctest: +SKIP
     {'learning_rate': 0.04300012336462904}
     >>> dml_plr.fit().summary  # doctest: +SKIP
         coef  std err  t  P>|t|  2.5 %  97.5 %

0 commit comments