A monotonic constraint is a constraint that does not hurt prediction performance.

  • +1: when the feature increases, the prediction must be greater than or equal (monotonically non-decreasing);
  • 0: no monotonic constraint (default);
  • -1: when the feature increases, the prediction must be less than or equal (monotonically non-increasing).
  • Example Python code (scikit-learn, then CatBoost):
from sklearn.tree import DecisionTreeRegressor

# Regression tree with a monotonic-increase constraint on both features
decision_tree_with_constraints = DecisionTreeRegressor(
    max_depth=3,
    min_samples_leaf=10,
    monotonic_cst=[1, 1],  # this line of code adds monotonic constraints (requires scikit-learn >= 1.4)
)

decision_tree_with_constraints.fit(X_train, y_train)
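
To see the constraint in action, here is a minimal sanity check. It is a sketch that assumes X_train is a NumPy array with exactly two columns (matching monotonic_cst=[1, 1] above): sweep the first feature over its observed range while holding the second one fixed, and the fitted tree's predictions can never decrease.

import numpy as np

# Hold one training row fixed and vary only the first feature.
base_row = np.asarray(X_train)[0].copy()
grid = np.tile(base_row, (50, 1))
grid[:, 0] = np.linspace(np.asarray(X_train)[:, 0].min(),
                         np.asarray(X_train)[:, 0].max(), 50)

preds = decision_tree_with_constraints.predict(grid)
# A +1 constraint on the first feature means predictions are non-decreasing.
assert np.all(np.diff(preds) >= 0)
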
from catboost import CatBoostRegressor

# Unconstrained CatBoost model
catboost = CatBoostRegressor(silent=True)

# CatBoost model with monotonic-increase constraints on two features, referenced by name
catboost_with_constraints = CatBoostRegressor(
    silent=True,
    monotone_constraints={"square feet": 1, "overall condition": 1},
)
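
Because the constraints above are keyed by column name, the model has to be fitted on data that carries those names, for example a pandas DataFrame. The snippet below is a sketch with made-up housing data; the only thing carried over from above is the two column names, "square feet" and "overall condition".

import numpy as np
import pandas as pd

# Hypothetical training data: price grows with size and condition, plus noise.
rng = np.random.default_rng(42)
houses = pd.DataFrame({
    "square feet": rng.uniform(500, 3000, size=500),
    "overall condition": rng.integers(1, 11, size=500),
})
price = 100 * houses["square feet"] + 5000 * houses["overall condition"] \
        + rng.normal(0, 10_000, size=500)

catboost_with_constraints.fit(houses, price)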

  • We can use CatBoost to simulate what-if scenarios. For example: what happens if we change the “overall condition” of houses with a square footage of 500, 1,000, 2,000, or 3,000, respectively? (A sketch follows this list.)
  • This is also called sensitivity analysis, because it measures how sensitive an outcome (the selling price predicted by our model) is to a change in an input variable (the house’s overall condition).
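
Continuing the sketch above (same hypothetical data and the fitted catboost_with_constraints model), the what-if analysis boils down to building a grid of scenarios and predicting on it. Because of the monotone constraint on “overall condition”, the predicted price can only stay flat or rise as the condition improves, for every square footage.

import itertools
import pandas as pd

# Every combination of the chosen square footages and condition scores.
scenarios = pd.DataFrame(
    list(itertools.product([500, 1_000, 2_000, 3_000], range(1, 11))),
    columns=["square feet", "overall condition"],
)
scenarios["predicted price"] = catboost_with_constraints.predict(scenarios)

# One column per square footage: read each column top to bottom to see the
# sensitivity of the predicted price to the overall condition.
print(scenarios.pivot(index="overall condition",
                      columns="square feet",
                      values="predicted price"))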
