
Python, Numpy, and OLS

The code below works as expected, but it isn't what I need. I want to change c[1] to c[1:] so that I regress on all of the x variables, not just one. When I make that change (and add the appropriate x labels), I get the following error: ValueError: matrices are not aligned. Can someone explain why this happens and suggest how to modify the code? Thanks.

from numpy import *
from ols import *

a = [[.001,.05,-.003,.014,.035,-.01,.032,-.0013,.0224,.005],[-.011,.012,.0013,.014,-.0015,.019,-.032,.013,-.04,-.05608],
 [.0021,.02,-.023,.0024,.025,-.081,.032,-.0513,.00014,-.00015],[.001,.02,-.003,.014,.035,-.001,.032,-.003,.0224,-.005],
 [.0021,-.002,-.023,.0024,.025,.01,.032,-.0513,.00014,-.00015],[-.0311,.012,.0013,.014,-.0015,.019,-.032,.013,-.014,-.008],
 [.001,.02,-.0203,.014,.035,-.001,.00032,-.0013,.0224,.05],[.0021,-.022,-.0213,.0024,.025,.081,.032,.05313,.00014,-.00015],
 [-.01331,.012,.0013,.014,.01015,.019,-.032,.013,-.014,-.012208],[.01021,-.022,-.023,.0024,.025,.081,.032,.0513,.00014,-.020015]]


c = column_stack(a)
y = c[0]
m = ols(y, c[1], y_varnm='y', x_varnm=['x1'])
print m.summary()
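The "matrices are not aligned" error can be seen from the shapes alone: column_stack of a list of rows transposes the data, so c[1:] holds the regressors in rows, shape (9, 10), while an OLS routine expects an (observations, variables) design matrix. A minimal sketch with random stand-in data (plain numpy, since only the shapes matter, not the particular ols module):

```python
import numpy as np

# Stand-in data with the same layout as `a` above: 10 observations,
# the first variable is y, the remaining nine are regressors.
rng = np.random.default_rng(0)
a = rng.normal(size=(10, 10))

c = np.column_stack(a)   # column_stack of a sequence of rows == transpose
y = c[0]                 # shape (10,)
bad_x = c[1:]            # shape (9, 10): variables in ROWS, not columns

# A design matrix must be (nobs, nvars); passing (9, 10) against a
# length-10 y is what triggers "matrices are not aligned".
good_x = c[1:].T         # shape (10, 9): one column per regressor

coef, *_ = np.linalg.lstsq(good_x, y, rcond=None)
print(bad_x.shape, good_x.shape, coef.shape)
```

With the transpose applied, any least-squares routine that takes a 2-D design matrix will accept the data.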

Edit: I have come up with a partial solution, but there is still a problem. The code below works for 8 of the 9 explanatory variables.

c = column_stack(a)
y = c[0]
x = column_stack([c[i] for i in range(1, 9)])
m = ols(y, x, y_varnm='y', x_varnm=['x1','x2','x3','x4','x5','x6','x7','x8'])
print m.summary()

However, when I try to include the 9th x variable, I get the following error: RuntimeWarning: divide by zero encountered in double_scalars. Any idea why? Here is the code (note that len(a) = 10):

c = column_stack(a)
y = c[0]
x = column_stack([c[i] for i in range(1, len(a))])
m = ols(y, x, y_varnm='y', x_varnm=['x1','x2','x3','x4','x5','x6','x7','x8','x9'])
print m.summary()
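The divide-by-zero warning has a likely explanation in the degrees of freedom: with 10 observations, 9 regressors, and (presumably, an assumption about this ols module) an intercept, there are 10 parameters and therefore zero residual degrees of freedom, so the residual-variance estimate divides by zero. A small sketch of the arithmetic:

```python
import numpy as np

nobs = 10           # len(a): number of observations
nvars = 9           # x1..x9
ncoef = nvars + 1   # assuming the ols module adds an intercept

df_resid = np.float64(nobs - ncoef)   # 10 - 10 = 0 residual df

# The residual variance is SSR / df_resid; with df_resid == 0 and any
# positive SSR, NumPy reports "divide by zero encountered in double_scalars".
ssr = np.float64(1e-4)                # hypothetical sum of squared residuals
with np.errstate(divide="ignore"):    # suppress the warning for the demo
    sigma2 = ssr / df_resid           # inf
print(df_resid, sigma2)
```

Dropping one regressor (as in the 8-variable version above) leaves one residual degree of freedom, which is why that version runs without the warning.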

I don't know anything about the ols module you are using. However, if you use scikits.statsmodels, the following should work:

import numpy as np
import scikits.statsmodels.api as sm

a = np.array([[.001,.05,-.003,.014,.035,-.01,.032,-.0013,.0224,.005],[-.011,.012,.0013,.014,-.0015,.019,-.032,.013,-.04,-.05608],
 [.0021,.02,-.023,.0024,.025,-.081,.032,-.0513,.00014,-.00015],[.001,.02,-.003,.014,.035,-.001,.032,-.003,.0224,-.005],
 [.0021,-.002,-.023,.0024,.025,.01,.032,-.0513,.00014,-.00015],[-.0311,.012,.0013,.014,-.0015,.019,-.032,.013,-.014,-.008],
 [.001,.02,-.0203,.014,.035,-.001,.00032,-.0013,.0224,.05],[.0021,-.022,-.0213,.0024,.025,.081,.032,.05313,.00014,-.00015],
 [-.01331,.012,.0013,.014,.01015,.019,-.032,.013,-.014,-.012208],[.01021,-.022,-.023,.0024,.025,.081,.032,.0513,.00014,-.020015]])

y = a[:, 0]
x = a[:, 1:]
results = sm.OLS(y, x).fit()
print results.summary()

Output:

     Summary of Regression Results
=======================================
| Dependent Variable:            ['y']|
| Model:                           OLS|
| Method:                Least Squares|
| # obs:                          10.0|
| Df residuals:                    1.0|
| Df model:                        8.0|
==============================================================================
|                   coefficient     std. error    t-statistic          prob. |
------------------------------------------------------------------------------
| x0                     0.2557         0.6622         0.3862         0.7654 |
| x1                    0.03054          1.453         0.0210         0.9866 |
| x2                     -3.392          2.444        -1.3877         0.3975 |
| x3                      1.445          1.474         0.9808         0.5062 |
| x4                    0.03559         0.2610         0.1363         0.9137 |
| x5                    -0.7412         0.8754        -0.8467         0.5527 |
| x6                    0.02289         0.2466         0.0928         0.9411 |
| x7                     0.5754          1.413         0.4074         0.7537 |
| x8                    -0.4827         0.7569        -0.6378         0.6386 |
==============================================================================
|                          Models stats                      Residual stats  |
------------------------------------------------------------------------------
| R-squared:                     0.8832   Durbin-Watson:              2.578  |
| Adjusted R-squared:          -0.05163   Omnibus:                   0.5325  |
| F-statistic:                   0.9448   Prob(Omnibus):             0.7663  |
| Prob (F-statistic):            0.6663   JB:                        0.1630  |
| Log likelihood:                 41.45   Prob(JB):                  0.9217  |
| AIC criterion:                 -64.91   Skew:                      0.4037  |
| BIC criterion:                 -62.18   Kurtosis:                   2.405  |
------------------------------------------------------------------------------


Disclaimer: the technical posts on this site follow the CC BY-SA 4.0 license; if you reprint them, please credit this site or the original source. For any questions, contact: yoyou2525@163.com.

 
粵ICP備18138465號  © 2020-2024 STACKOOM.COM