
Generate diagonal matrix from regression coefficient

I am trying to generate a diagonal matrix from linear regression coefficients. First I generate an empty matrix, then I extract the coefficients from the fitted regression model. Here's my code:

import numpy as np
from sklearn.cross_decomposition import PLSRegression

P = np.zeros((ncol, ncol), dtype=int)
intercep = np.zeros((1, ncol), dtype=int)

my_pls = PLSRegression(n_components=ncomp, scale=False)
model = my_pls.fit(x, y)

# extract the PLS coefficients and intercept:
coef = model.coef_
intercep = model.y_mean_ - (model.x_mean_.dot(coef))

P[(i-k):(i+k), i-k] = np.diag(coef[0:ncol])

But I get a zero matrix after running the code. Can anyone help me figure out how to get the diagonal matrix from the regression coefficients?

I'm not sure why you need to declare P first.

You can build a diagonal matrix (zeros everywhere off the diagonal) directly from a 1D list/vector using numpy.diag:

import numpy

x = [3, 5, 6, 7]
numpy.diag(x)

Output:

array([[3, 0, 0, 0],
       [0, 5, 0, 0],
       [0, 0, 6, 0],
       [0, 0, 0, 7]])

For your case, try P = np.diag(coef).
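Note that PLSRegression.coef_ is a 2-D array, and np.diag applied to a 2-D array extracts its diagonal rather than building one, so you may need to flatten it first. Here is a minimal sketch with made-up data (the shapes, n_components, and random inputs below are assumptions, not taken from the original post):

import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
x = rng.normal(size=(20, 4))      # 20 samples, 4 features (assumed)
y = x @ np.array([1.0, 2.0, 3.0, 4.0]) + rng.normal(scale=0.1, size=20)

model = PLSRegression(n_components=2, scale=False).fit(x, y)

coef = np.ravel(model.coef_)   # coef_ is 2-D; its orientation depends on the
                               # scikit-learn version, so flatten it to 1-D
P = np.diag(coef)              # 4x4 float matrix with the coefficients on the diagonal
print(P)

Because np.diag keeps the coefficients' own float dtype, the values are not rounded to zero the way they would be when assigned into an integer-typed array.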
