
sklearn PLSRegression - Variance of X explained by latent vectors

I performed a partial least squares regression using Python's sklearn.cross_decomposition.PLSRegression.

Is there a way to retrieve the fraction of explained variance for X, i.e. R²(X), for each PLS component? I'm looking for something similar to the explvar() function from the R pls package. However, I'd also appreciate any suggestions on how to compute it myself.

There is a similar question, and there is one answer that explains how to get the variance of Y. I guess that "variance in Y" is what was asked for in that case. That's why I opened a new question - hope that's OK.

I managed to find a solution to the problem. The following gives the fraction of variance in X explained by each latent vector after PLS regression:

import numpy as np
from sklearn import cross_decomposition

# X is a numpy ndarray with samples in rows and predictor variables in columns
# y is a one-dimensional ndarray containing the response variable

# total variance in X, summed over all predictor columns (a scalar)
total_variance_in_x = np.sum(np.var(X, axis=0))

pls1 = cross_decomposition.PLSRegression(n_components=5)
pls1.fit(X, y)

# variance of the X scores (the transformed X data) for each latent vector:
variance_in_x = np.var(pls1.x_scores_, axis=0)

# normalize by the total variance to get one fraction per latent vector:
fractions_of_explained_variance = variance_in_x / total_variance_in_x
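For completeness, here is a minimal self-contained sketch of the same computation on synthetic data; the 100 x 10 random X, the constructed y, and the choice of 5 components are made up purely for illustration:

import numpy as np
from sklearn import cross_decomposition

# hypothetical example data: 100 samples, 10 predictor variables
rng = np.random.RandomState(0)
X = rng.normal(size=(100, 10))
y = X[:, 0] + 2 * X[:, 1] + rng.normal(scale=0.1, size=100)

pls1 = cross_decomposition.PLSRegression(n_components=5)
pls1.fit(X, y)

total_variance_in_x = np.sum(np.var(X, axis=0))
variance_in_x = np.var(pls1.x_scores_, axis=0)
fractions_of_explained_variance = variance_in_x / total_variance_in_x

print(fractions_of_explained_variance)             # one fraction per latent vector
print(np.cumsum(fractions_of_explained_variance))  # running total over components

On the training data, pls1.transform(X) should return the same scores as pls1.x_scores_, so either can be used here.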

