Sklearn-genetic-opt
0.11.1


Hyperparameters tuning and feature selection for scikit-learn models, using evolutionary algorithms.

This is meant to be an alternative to popular methods inside scikit-learn, such as Grid Search and Randomized Grid Search for hyperparameters tuning, and to RFE (Recursive Feature Elimination) and SelectFromModel for feature selection.

Sklearn-genetic-opt uses evolutionary algorithms from the DEAP (Distributed Evolutionary Algorithms in Python) package to choose the set of hyperparameters that optimizes (maximizes or minimizes) the cross-validation score; it can be used for both regression and classification problems.

Documentation is available here.
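For readers coming from plain scikit-learn, the sketch below shows the kind of exhaustive grid search that GASearchCV is meant to replace; it uses only standard scikit-learn APIs, and the grid values are illustrative placeholders rather than recommendations from this project.

from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_digits(return_X_y=True)

# Plain scikit-learn grid search: every combination of this fixed, discrete grid
# is evaluated. GASearchCV instead evolves candidates inside continuous and
# integer ranges, so the search space does not have to be enumerated up front.
grid = GridSearchCV(estimator=RandomForestClassifier(),
                    param_grid={"max_depth": [5, 10, 20],
                                "n_estimators": [100, 200]},
                    scoring="accuracy",
                    cv=3)
grid.fit(X, y)
print(grid.best_params_)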
Among the things you can visualize and log:

Visualize the progress of your training.
Real-time metrics visualization and comparison across runs.
Sampled distribution of hyperparameters.
Artifacts logging.
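A minimal sketch of how these plots can be generated after a search has finished, assuming the plotting extras are installed; plot_fitness_evolution and plot_search_space are the helpers from sklearn_genetic.plots, and evolved_estimator stands for an already fitted GASearchCV object such as the one in the example below.

from sklearn_genetic.plots import plot_fitness_evolution, plot_search_space
import matplotlib.pyplot as plt

# evolved_estimator is assumed to be a fitted GASearchCV (see the example below)
plot_fitness_evolution(evolved_estimator)  # fitness of the best individual per generation
plt.show()

plot_search_space(evolved_estimator)       # distribution of the sampled hyperparameters
plt.show()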
Installation

It's advised to install sklearn-genetic-opt inside a virtual environment; inside the environment, run:

pip install sklearn-genetic-opt

If you want to get all the features, including plotting, TensorBoard and MLflow logging capabilities, install the extra packages as well:

pip install sklearn-genetic-opt[all]
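As a rough sketch of what the logging extras enable (assuming sklearn-genetic-opt[all] is installed): the TensorBoard callback from sklearn_genetic.callbacks can be attached to a search at fit time, and MLflow logging is configured through the MLflowConfig object described in the documentation. The log directory below is a placeholder, and evolved_estimator refers to the GASearchCV example later in this page.

from sklearn_genetic.callbacks import TensorBoard

# Hypothetical log directory; point TensorBoard at it to follow the run live.
tensorboard_cb = TensorBoard(log_dir="./logs")

# Callbacks are passed when fitting the search, e.g.:
# evolved_estimator.fit(X_train, y_train, callbacks=tensorboard_cb)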
Example: Hyperparameters Tuning

from sklearn_genetic import GASearchCV
from sklearn_genetic.space import Continuous, Categorical, Integer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split, StratifiedKFold
from sklearn.datasets import load_digits
from sklearn.metrics import accuracy_score

data = load_digits()
n_samples = len(data.images)
X = data.images.reshape((n_samples, -1))
y = data['target']
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=42)

clf = RandomForestClassifier()

# Defines the possible values to search
param_grid = {'min_weight_fraction_leaf': Continuous(0.01, 0.5, distribution='log-uniform'),
              'bootstrap': Categorical([True, False]),
              'max_depth': Integer(2, 30),
              'max_leaf_nodes': Integer(2, 35),
              'n_estimators': Integer(100, 300)}

# Seed solutions
warm_start_configs = [
    {"min_weight_fraction_leaf": 0.02, "bootstrap": True, "max_depth": None, "n_estimators": 100},
    {"min_weight_fraction_leaf": 0.4, "bootstrap": True, "max_depth": 5, "n_estimators": 200},
]

cv = StratifiedKFold(n_splits=3, shuffle=True)

evolved_estimator = GASearchCV(estimator=clf,
                               cv=cv,
                               scoring='accuracy',
                               population_size=20,
                               generations=35,
                               param_grid=param_grid,
                               n_jobs=-1,
                               verbose=True,
                               use_cache=True,
                               warm_start_configs=warm_start_configs,
                               keep_top_k=4)

# Train and optimize the estimator
evolved_estimator.fit(X_train, y_train)
# Best parameters found
print(evolved_estimator.best_params_)
# Use the model fitted with the best parameters
y_predict_ga = evolved_estimator.predict(X_test)
print(accuracy_score(y_test, y_predict_ga))
# Saved metadata for further analysis
print("Stats achieved in each generation: ", evolved_estimator.history)
print ( "Best k solutions: " , evolved_estimator . hof ) from sklearn_genetic import GAFeatureSelectionCV , ExponentialAdapter
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
import numpy as np

data = load_iris()
X, y = data["data"], data["target"]

# Add random non-important features
noise = np.random.uniform(5, 10, size=(X.shape[0], 5))
X = np.hstack((X, noise))
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=0)

clf = SVC(gamma='auto')

mutation_scheduler = ExponentialAdapter(0.8, 0.2, 0.01)
crossover_scheduler = ExponentialAdapter(0.2, 0.8, 0.01)

evolved_estimator = GAFeatureSelectionCV(
    estimator=clf,
    scoring="accuracy",
    population_size=30,
    generations=20,
    mutation_probability=mutation_scheduler,
    crossover_probability=crossover_scheduler,
    n_jobs=-1)

# Train and select the features
evolved_estimator.fit(X_train, y_train)

# Features selected by the algorithm
features = evolved_estimator.support_
print(features)

# Predict only with the subset of selected features
y_predict_ga = evolved_estimator.predict(X_test)
print(accuracy_score(y_test, y_predict_ga))

# Transform the original data to the selected features
X_reduced = evolved_estimator.transform(X_test)

Changelog

See the Changelog for notes on the changes of sklearn-genetic-opt.
You can check the latest development version with the command:

git clone https://github.com/rodrigo-arenas/sklearn-genetic-opt.git

Install the development dependencies:

pip install -r dev-requirements.txt

Check the latest development docs: https://sklearn-genetic-opt.readthedocs.io/en/latest/
Contributions are more than welcome! There are several opportunities in this ongoing project, so please get in contact if you would like to help. Make sure to check the current issues and the contribution guide.

Big thanks to the people who are helping this project!
After installation, you can launch the test suite from outside the source directory:
pytest sklearn_genetic