Table 4.

Parameters for XGBoost Models to Estimate Benchmark Welfare

Parameter                                           Range
Maximum number of boosting iterations               Between 50 and 200
Maximum depth of a tree                             2 or 4
Learning rate                                       0.1 or 0.3
Subsample ratio of the training instances           0.2, 0.4, or 0.6
Subsample ratio of columns to construct each tree   0.2, 0.5, or 0.7

Source: Author's calculations.

Note: This table reports the main parameters considered in the XGBoost models used to estimate benchmark welfare. The values are the options considered in the selection of the optimal hyperparameters.
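The hyperparameter options in Table 4 can be sketched as a Python search grid. This is an illustrative reconstruction, not the paper's code: the xgboost parameter names (n_estimators, max_depth, learning_rate, subsample, colsample_bytree) are the conventional ones for these settings, and the specific boosting-iteration values are assumed samples from the reported 50-200 range.

```python
from itertools import product

# Candidate values mirroring Table 4. The n_estimators values are
# illustrative points within the reported range of 50 to 200.
param_grid = {
    "n_estimators": [50, 100, 150, 200],   # maximum boosting iterations
    "max_depth": [2, 4],                   # maximum depth of a tree
    "learning_rate": [0.1, 0.3],
    "subsample": [0.2, 0.4, 0.6],          # subsample ratio of training instances
    "colsample_bytree": [0.2, 0.5, 0.7],   # column subsample ratio per tree
}

# Enumerate every candidate configuration the search would evaluate.
keys = list(param_grid)
candidates = [dict(zip(keys, values))
              for values in product(*param_grid.values())]

print(len(candidates))  # total configurations in the grid
```

Each dictionary in `candidates` could be passed as keyword arguments to an `xgboost.XGBRegressor` and scored by cross-validation to pick the optimal combination.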

