
Numpy/Pandas: Error converting ndarray to series

I have the ndarray "diffTemp":

diffTemp = np.diff([df.Temp])

where Temp contains temperature values whose successive differences I compute with the difference operator. In this case, print() gives:

print(diffTemp) = [[-0.16 -0.05]]
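A minimal reproduction (the DataFrame values here are hypothetical stand-ins): wrapping `df.Temp` in a list gives `np.diff` a 2-D input of shape (1, n), so the result is also 2-D.

```python
import numpy as np
import pandas as pd

# Hypothetical data standing in for the original df
df = pd.DataFrame({"Temp": [20.00, 19.84, 19.79]})

# Wrapping df.Temp in a list makes the input 2-D (shape (1, 3)),
# so np.diff returns a 2-D array of shape (1, 2).
diffTemp = np.diff([df.Temp])
print(diffTemp.shape)  # (1, 2)
```

Passing `df.Temp` directly (without the list) would keep everything 1-D.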

To convert it into a column vector I use:

diffTemp = diffTemp.transpose() 

and then convert it from an ndarray into a Series using:

diffTemp = pd.Series([diffTemp]) 

(This allows me later to concatenate diffTemp with its corresponding Series of dates, diffDates.)

Unfortunately this outputs that diffTemp is:

print(diffTemp) = 0    [[-0.16000000000000014], [-0.05000000000000071]]
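This happens because wrapping the array in another pair of brackets makes pandas treat the whole ndarray as a single element. A short sketch (the array values are illustrative):

```python
import numpy as np
import pandas as pd

arr = np.array([[-0.16], [-0.05]])  # the transposed 2-D array, shape (2, 1)

# pd.Series([arr]) builds a Series from a one-item list,
# so the entire array becomes element 0 of the Series.
s = pd.Series([arr])
print(len(s))  # 1, not 2
```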

If I instead omit the square brackets [ ] and write:

diffTemp = pd.Series(diffTemp)

I get the error message:

Exception: Data must be 1-dimensional

I'm totally new to Python and have tried googling for the last few days without any success. Any help is much appreciated.

The issue here is that you are trying to convert a two-dimensional array into a one-dimensional Series. Notice the two pairs of brackets around [[-0.16 -0.05]]. You can get back a Series by grabbing just the 1-D row you want:

diffTemp = pd.Series(diffTemp[0])
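Equivalently, you can flatten the 2-D array to 1-D. A sketch of the whole flow, including the concatenation with dates mentioned in the question (the DataFrame values and diffDates here are hypothetical):

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({"Temp": [20.00, 19.84, 19.79]})  # stand-in data
diffTemp = np.diff([df.Temp])           # 2-D, shape (1, 2)

# Index the single row, or flatten to 1-D -- both give a valid Series input:
temp_series = pd.Series(diffTemp.flatten())

# Hypothetical dates to pair with the differences:
diffDates = pd.Series(pd.to_datetime(["2020-01-02", "2020-01-03"]))

# Concatenate column-wise into one DataFrame.
combined = pd.concat([diffDates, temp_series], axis=1)
combined.columns = ["Date", "dTemp"]
```

Alternatively, passing `df.Temp` directly to `np.diff` (without the enclosing list) avoids the extra dimension in the first place.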
