logml.models.registry.linear
Classes

- BaseLinearClassifierModel: Base class for linear classification models without inner CV.
- BaseLinearModel: Base class for linear models with/without CV.
- BaseSMLinearModel: Wrapper for statsmodels.***.
- ElasticNetModel: Wrapper for sklearn.linear_model.ElasticNet.
- LassoLarsAICModel: Wrapper for sklearn.linear_model.LassoLarsIC with AIC criterion.
- LassoLarsBICModel: Wrapper for sklearn.linear_model.LassoLarsIC with BIC criterion.
- LassoModel: Wrapper for sklearn.linear_model.Lasso.
- LogisticRegressionModel: Wrapper for sklearn.linear_model.LogisticRegression.
- RidgeModel: Wrapper for sklearn.linear_model.Ridge.
- class logml.models.registry.linear.BaseLinearModel(params: Optional[dict] = None, logger=None)
Bases: logml.models.base.BaseModel
Base class for linear models with/without CV. TASK and TAGS are set.
- TASK = 'regression'
- TAGS = ['linear', 'regularization']
- class logml.models.registry.linear.BaseLinearClassifierModel(params: Optional[dict] = None, logger=None)
Bases: logml.models.base.BaseModel
Base class for linear classification models without inner CV. TASK and TAGS are set.
- TASK = 'classification'
- TAGS = ['linear', 'regularization']
- class logml.models.registry.linear.LogisticRegressionModel(params: Optional[dict] = None, logger=None)
Bases: logml.models.registry.linear.BaseLinearClassifierModel, logml.model_search.shap.ShapExplainable
Wrapper for sklearn.linear_model.LogisticRegression.
- F_MODEL: alias of sklearn.linear_model._logistic.LogisticRegression
- DEFAULT_PARAMS = {'n_jobs': -1}
- PARAMS_SPACE = {'C': [0.1, 1, 10], 'fit_intercept': [True, False], 'l1_ratio': {'distribution': 'uniform', 'params': [0.0, 1.0]}, 'max_iter': [3000], 'multi_class': ['auto'], 'n_jobs': [2], 'penalty': ['elasticnet'], 'solver': ['saga'], 'tol': [0.0001]}
- get_shap_init_params(ctx: Optional[logml.model_search.shap.ShapExplainerContext] = None) → dict
Parameters for the SHAP explainer.
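PARAMS_SPACE mixes fixed choice lists (e.g. 'C': [0.1, 1, 10]) with distribution dicts (e.g. 'l1_ratio'). A minimal sketch of how such a space might be sampled, assuming list entries are categorical choices and a 'uniform' dict's params give [low, high] bounds; the helper name `sample_params` is illustrative, not part of logml:

```python
import random

def sample_params(space, rng=None):
    """Draw one configuration from a PARAMS_SPACE-style dict.

    Assumption: a list means categorical choices; a dict with a
    'distribution' key of 'uniform' gives [low, high] bounds.
    """
    rng = rng or random.Random(0)
    out = {}
    for name, spec in space.items():
        if isinstance(spec, dict) and spec.get("distribution") == "uniform":
            low, high = spec["params"]
            out[name] = rng.uniform(low, high)
        else:
            out[name] = rng.choice(spec)
    return out

# Subset of LogisticRegressionModel.PARAMS_SPACE
space = {
    "C": [0.1, 1, 10],
    "l1_ratio": {"distribution": "uniform", "params": [0.0, 1.0]},
    "penalty": ["elasticnet"],
}
cfg = sample_params(space)
```

Note that the fixed entries ('penalty': ['elasticnet'], 'solver': ['saga']) are consistent: in sklearn, the elasticnet penalty for LogisticRegression is only supported by the saga solver.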
- class logml.models.registry.linear.LassoModel(params: Optional[dict] = None, logger=None)
Bases: logml.models.registry.linear.BaseLinearModel, logml.model_search.shap.ShapExplainable
Wrapper for sklearn.linear_model.Lasso.
- F_MODEL: alias of sklearn.linear_model._coordinate_descent.Lasso
- DEFAULT_PARAMS = {'max_iter': 3000, 'normalize': True, 'random_state': None}
- PARAMS_SPACE = {'alpha': {'distribution': 'loguniform', 'params': [-12, 2]}, 'normalize': [True]}
- get_shap_init_params(ctx: Optional[logml.model_search.shap.ShapExplainerContext] = None) → dict
Parameters for the SHAP explainer.
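The 'alpha' entry uses a loguniform distribution with params [-12, 2]. A plausible reading, sketched below, is that these are base-10 exponents, so sampled alphas span [1e-12, 1e2]; this interpretation is an assumption, not confirmed by the logml source:

```python
import random

def sample_loguniform(low_exp, high_exp, rng=None):
    """Sample uniformly on a log10 scale.

    Assumption: PARAMS_SPACE's [-12, 2] are base-10 exponents,
    so the result lands in [1e-12, 1e2] -- an illustrative
    reading of the spec, not logml's actual sampler.
    """
    rng = rng or random.Random(0)
    return 10.0 ** rng.uniform(low_exp, high_exp)

alpha = sample_loguniform(-12, 2)
```

Sampling on a log scale is the usual choice for regularization strengths, since useful alphas differ by orders of magnitude rather than by additive steps.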
- class logml.models.registry.linear.ElasticNetModel(params: Optional[dict] = None, logger=None)
Bases: logml.models.registry.linear.BaseLinearModel, logml.model_search.shap.ShapExplainable
Wrapper for sklearn.linear_model.ElasticNet.
- F_MODEL: alias of sklearn.linear_model._coordinate_descent.ElasticNet
- DEFAULT_PARAMS = {}
- PARAMS_SPACE = {'alpha': [1.0, {'distribution': 'loguniform', 'params': [-5, 2]}], 'l1_ratio': {'distribution': 'normal', 'params': [0.5, 0.1]}, 'max_iter': [1000, 500, 1500], 'selection': ['cyclic', 'random']}
- get_shap_init_params(ctx: Optional[logml.model_search.shap.ShapExplainerContext] = None) → dict
Parameters for the SHAP explainer.
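The 'l1_ratio' entry here draws from normal(0.5, 0.1), but ElasticNet requires 0 <= l1_ratio <= 1 while a normal distribution is unbounded, so a sampler has to clip (or reject) out-of-range draws. A sketch of the clipping variant; the clip is our assumption about how logml handles this, not documented behavior:

```python
import random

def sample_l1_ratio(mean=0.5, std=0.1, rng=None):
    """Sample l1_ratio from normal(mean, std), clipped to [0, 1].

    Assumption: the clip is ours -- ElasticNet rejects l1_ratio
    outside [0, 1], while normal(0.5, 0.1) can exceed it.
    """
    rng = rng or random.Random(0)
    return min(1.0, max(0.0, rng.gauss(mean, std)))

ratio = sample_l1_ratio()
```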
- class logml.models.registry.linear.RidgeModel(params: Optional[dict] = None, logger=None)
Bases: logml.models.registry.linear.BaseLinearModel, logml.model_search.shap.ShapExplainable
Wrapper for sklearn.linear_model.Ridge.
- F_MODEL: alias of sklearn.linear_model._ridge.Ridge
- DEFAULT_PARAMS = {'alpha': 1.0, 'max_iter': 200, 'random_state': None}
- PARAMS_SPACE = {'alpha': {'distribution': 'loguniform', 'params': [-12, 2]}, 'fit_intercept': [True, False], 'max_iter': [3000], 'normalize': [False, True]}
- get_shap_init_params(ctx: Optional[logml.model_search.shap.ShapExplainerContext] = None) → dict
Parameters for the SHAP explainer.
- class logml.models.registry.linear.LassoLarsAICModel(params: Optional[dict] = None, logger=None)
Bases: logml.models.registry.linear.BaseLinearModel, logml.model_search.shap.ShapExplainable
Wrapper for sklearn.linear_model.LassoLarsIC with AIC criterion.
- F_MODEL = functools.partial(<class 'sklearn.linear_model._least_angle.LassoLarsIC'>, criterion='aic')
- DEFAULT_PARAMS = {'max_iter': 200}
- PARAMS_SPACE = {'fit_intercept': [True, False], 'max_iter': [3000], 'normalize': [True, False], 'positive': [True, False]}
- get_shap_init_params(ctx: Optional[logml.model_search.shap.ShapExplainerContext] = None) → dict
Parameters for the SHAP explainer.
- class logml.models.registry.linear.LassoLarsBICModel(params: Optional[dict] = None, logger=None)
Bases: logml.models.registry.linear.BaseLinearModel, logml.model_search.shap.ShapExplainable
Wrapper for sklearn.linear_model.LassoLarsIC with BIC criterion.
- F_MODEL = functools.partial(<class 'sklearn.linear_model._least_angle.LassoLarsIC'>, criterion='bic')
- DEFAULT_PARAMS = {'max_iter': 200}
- PARAMS_SPACE = {'fit_intercept': [True, False], 'max_iter': [300], 'normalize': [True, False], 'positive': [True, False]}
- get_shap_init_params(ctx: Optional[logml.model_search.shap.ShapExplainerContext] = None) → dict
Parameters for the SHAP explainer.
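Unlike the other wrappers, whose F_MODEL is a plain class alias, the two LassoLars wrappers share one estimator class and pre-bind the criterion with functools.partial. The pattern, demonstrated on a stand-in class (`Estimator` is a placeholder, not the sklearn class):

```python
from functools import partial

class Estimator:
    """Stand-in for sklearn.linear_model.LassoLarsIC."""
    def __init__(self, criterion="aic", max_iter=200):
        self.criterion = criterion
        self.max_iter = max_iter

# Pre-bind the criterion, as the two wrappers' F_MODEL attributes do.
F_MODEL_AIC = partial(Estimator, criterion="aic")
F_MODEL_BIC = partial(Estimator, criterion="bic")

# The partial is still callable like the class: criterion is fixed,
# every other constructor argument remains free.
model = F_MODEL_BIC(max_iter=300)
```

This lets two registry entries differ only in a single constructor argument without defining a subclass for each criterion.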
- class logml.models.registry.linear.BaseSMLinearModel(params: Optional[dict] = None, logger=None)
Bases: logml.models.base.BaseModel
Wrapper for statsmodels.***.
- TAGS = ['linear', 'cv', 'sm']
- FE_MODEL_ATTRIBUTE = 'coef_'
- fit_fold_model(model: statsmodels.base.model.Model)
Fit the internal model.
- fit(dataset: logml.data.datasets.cv_dataset.ModelingDataset, fit_params: Optional[Dict] = None, train_final_model=False)
Fits a model for each CV fold.
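The fit method above trains one model per CV fold. A minimal sketch of that loop with a hand-rolled contiguous k-fold split; the helpers `kfold_indices` and `fit_per_fold` and the `fit_one` callable are illustrative stand-ins — in logml, the ModelingDataset supplies the real folds and fit_fold_model does the fitting:

```python
def kfold_indices(n, k):
    """Yield (train_idx, test_idx) pairs for k contiguous folds."""
    fold = n // k
    idx = list(range(n))
    for i in range(k):
        test = idx[i * fold:(i + 1) * fold] if i < k - 1 else idx[i * fold:]
        test_set = set(test)
        train = [j for j in idx if j not in test_set]
        yield train, test

def fit_per_fold(data, k, fit_one):
    """Fit one model per fold; return the list of fold models."""
    models = []
    for train, _test in kfold_indices(len(data), k):
        models.append(fit_one([data[j] for j in train]))
    return models

# Toy fit_one: the "model" is just the mean of the training values.
data = list(range(10))
fold_models = fit_per_fold(data, 5, fit_one=lambda xs: sum(xs) / len(xs))
```

Keeping one fitted model per fold is what allows per-fold coefficients (via FE_MODEL_ATTRIBUTE = 'coef_') to be aggregated into cross-validated feature importances.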