Ensemble learning using a neural network as the combination technique
I want to implement an ensemble learning model using three base learners: KNN, DT, and RF. I then combine their predictions with a weighted technique. In the example below, a neural network (a perceptron) is used as the combination technique: it optimizes the weights until it finds the best ones, which determines the performance of the model. I got this error while running the model:
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-36-bd170b55dfe3> in <module>()
35 p = Perceptron(random_state=42, max_iter=10)
36 #fit the model
---> 37 p.fit(pred, y_test)
38 for value in pred:
39 pr = p.predict([value])
~\Anaconda3\lib\site-packages\sklearn\linear_model\stochastic_gradient.py in fit(self, X, y, coef_init, intercept_init, sample_weight)
584 loss=self.loss, learning_rate=self.learning_rate,
585 coef_init=coef_init, intercept_init=intercept_init,
--> 586 sample_weight=sample_weight)
587
588
~\Anaconda3\lib\site-packages\sklearn\linear_model\stochastic_gradient.py in _fit(self, X, y, alpha, C, loss, learning_rate, coef_init, intercept_init, sample_weight)
416 self.classes_ = None
417
--> 418 X, y = check_X_y(X, y, 'csr', dtype=np.float64, order="C")
419 n_samples, n_features = X.shape
420
~\Anaconda3\lib\site-packages\sklearn\utils\validation.py in check_X_y(X, y, accept_sparse, dtype, order, copy, force_all_finite, ensure_2d, allow_nd, multi_output, ensure_min_samples, ensure_min_features, y_numeric, warn_on_dtype, estimator)
571 X = check_array(X, accept_sparse, dtype, order, copy, force_all_finite,
572 ensure_2d, allow_nd, ensure_min_samples,
--> 573 ensure_min_features, warn_on_dtype, estimator)
574 if multi_output:
575 y = check_array(y, 'csr', force_all_finite=True, ensure_2d=False,
~\Anaconda3\lib\site-packages\sklearn\utils\validation.py in check_array(array, accept_sparse, dtype, order, copy, force_all_finite, ensure_2d, allow_nd, ensure_min_samples, ensure_min_features, warn_on_dtype, estimator)
449 if not allow_nd and array.ndim >= 3:
450 raise ValueError("Found array with dim %d. %s expected <= 2."
--> 451 % (array.ndim, estimator_name))
452 if force_all_finite:
453 _assert_all_finite(array)
ValueError: Found array with dim 3. Estimator expected <= 2.
Here is the code of the ensemble model:
import pandas
from sklearn import model_selection
from sklearn.ensemble import RandomForestClassifier
import numpy as np
from sklearn import tree
from sklearn.neighbors import KNeighborsClassifier
from sklearn.datasets import load_iris
from sklearn.linear_model import Perceptron
iris = load_iris()
np.random.seed(1)
X=iris.data
y=iris.target
Y = (iris.target==0).astype(np.int8)
X_train, X_test, y_train, y_test = model_selection.train_test_split(
    X, Y, test_size=0.3, random_state=123)
#Build ensemble model using neural network as combination
model1 = tree.DecisionTreeClassifier(random_state=1)
model2 = KNeighborsClassifier()
model3 = RandomForestClassifier()
model1.fit(X_train,y_train)
model2.fit(X_train,y_train)
model3.fit(X_train,y_train)
pred1=model1.predict(X_test)
pred2=model2.predict(X_test)
pred3=model3.predict(X_test)
#Combination of results and determination of weights using neural network
#First trial using simple perceptron
#input layer containing the three neurons representing the results of prediction
pred=[[pred1,pred2,pred3]]
#output layer containing y_test
out=y_test
#creating a perceptron model
p = Perceptron(random_state=42, max_iter=10)
#fit the model
p.fit(pred, y_test)
for value in pred:
    pr = p.predict([value])
    print([pr])
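The `ValueError` comes from the shape of `pred`: wrapping the three prediction vectors in a nested list produces a 3-D array, while scikit-learn estimators expect a 2-D matrix of shape `(n_samples, n_features)`. A minimal demonstration (using dummy prediction vectors of length 45, matching a 30% test split of the 150 iris samples):

```python
import numpy as np

# Hypothetical prediction vectors from three base learners (45 test samples).
pred1 = np.zeros(45)
pred2 = np.ones(45)
pred3 = np.zeros(45)

# Wrapping the three vectors in a nested list yields a 3-D array,
# which scikit-learn rejects with "Found array with dim 3".
bad = np.array([[pred1, pred2, pred3]])
print(bad.shape)   # (1, 3, 45)

# Stacking them as columns gives the expected 2-D feature matrix:
# one row per test sample, one column per base learner.
good = np.array([pred1, pred2, pred3]).T
print(good.shape)  # (45, 3)
```

With the transposed matrix, each row of `good` is one sample's three base-learner predictions, which is what `Perceptron.fit` needs.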
I made the correction; my mistake was in the shape of the array passed to the split. Unfortunately, the results were terrible: the model overfits and still needs a lot of work. Below I share the corrected code:
import pandas
from sklearn import model_selection
from sklearn.ensemble import RandomForestClassifier
import numpy as np
from sklearn import tree
from sklearn.neighbors import KNeighborsClassifier
from sklearn.datasets import load_iris
from sklearn.linear_model import Perceptron
iris = load_iris()
np.random.seed(1)
X=iris.data
y=iris.target
Y = (iris.target==0).astype(np.int8)
X_train, X_test, y_train, y_test = model_selection.train_test_split(
    X, Y, test_size=0.3, random_state=123)
#Build ensemble model using neural network as combination
model1 = tree.DecisionTreeClassifier(random_state=1)
model2 = KNeighborsClassifier()
model3 = RandomForestClassifier()
model1.fit(X_train,y_train)
model2.fit(X_train,y_train)
model3.fit(X_train,y_train)
pred1=model1.predict(X_test)
pred2=model2.predict(X_test)
pred3=model3.predict(X_test)
#Combination of results and determination of weights using neural network
#First trial using simple perceptron
#input layer containing the three neurons representing the results of prediction
#stack the three prediction vectors as columns: shape (n_samples, 3)
pred = np.array([pred1, pred2, pred3]).T
#target for the combiner: the test labels (must be defined before the split)
out = y_test
#split dataset
from sklearn.model_selection import train_test_split
X_train1, X_test1, y_train1, y_test1 = train_test_split(pred, out)
#creating a perceptron model
perceptron = Perceptron(random_state=42, max_iter=10)
#fit the model
perceptron.fit(X_train1,y_train1)
predictions = perceptron.predict(X_test1)
from sklearn.metrics import classification_report,confusion_matrix
print(confusion_matrix(y_test1,predictions))
print(classification_report(y_test1,predictions))
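The overfitting here is partly structural: the perceptron is trained on predictions the base learners made for the test set, so the combiner never sees data it wasn't indirectly fit on. A common remedy is to train the combiner on out-of-fold predictions from the training set instead. A minimal sketch (hyperparameters are illustrative, not tuned):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split, cross_val_predict
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import Perceptron
from sklearn.metrics import accuracy_score

iris = load_iris()
X = iris.data
y = (iris.target == 0).astype(np.int8)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=123)

base = [DecisionTreeClassifier(random_state=1),
        KNeighborsClassifier(),
        RandomForestClassifier(random_state=1)]

# Out-of-fold predictions on the training data become the combiner's
# features, so the perceptron never trains on test-set information.
meta_train = np.column_stack(
    [cross_val_predict(m, X_train, y_train, cv=5) for m in base])

# Refit each base learner on the full training set, then predict the test set.
meta_test = np.column_stack(
    [m.fit(X_train, y_train).predict(X_test) for m in base])

combiner = Perceptron(random_state=42, max_iter=1000)
combiner.fit(meta_train, y_train)
print(accuracy_score(y_test, combiner.predict(meta_test)))
```

Recent scikit-learn versions (0.22+) also ship `sklearn.ensemble.StackingClassifier`, which implements this out-of-fold stacking pattern directly. Note the task here (setosa vs. rest) is linearly separable, so even a single base learner scores near-perfectly; a harder target would show the benefit of stacking more clearly.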