Python linear interpolation between rows in pandas dataframe

I have a dataframe

data = pd.DataFrame({'data1': [10, 20, 30], 'data2': [15, 25, 35], 'data3': [20, 30, 40], 'data3': [25, 35, 35]})

I want to interpolate between the rows, depending on an input n; here n = 2,

so my new_data must contain n * len(data) rows.

I tried np.linspace, but the new length I get is n * len(data) - 1.

Expected output:

data1  data2  data3
 10.0   15.0   25.0
 15.0   20.0   30.0
 20.0   25.0   35.0
 25.0   30.0   35.0
 30.0   35.0   35.0
 20.0   25.0   30.0

I thought of taking the last row as the interpolation between the last row and the first row of the input data set.
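For reference, a minimal sketch of the naive attempt (an illustration added here, not the asker's original code): interpolating only over the original index 0..len(data)-1 covers just len(data)-1 segments, which is why the length comes out one short of n * len(data).

import numpy as np
import pandas as pd

data = pd.DataFrame({'data1': [10, 20, 30],
                     'data2': [15, 25, 35],
                     'data3': [25, 35, 35]})
n = 2

# Only (len(data) - 1) segments exist between the original rows,
# so this yields (len(data) - 1) * n + 1 = 5 rows instead of 6.
x = np.linspace(0, len(data) - 1, (len(data) - 1) * n + 1)
naive = data.apply(lambda c: np.interp(x, data.index, c), axis=0)
print(len(naive))  # 5 == n * len(data) - 1 for n = 2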

Using np.interp() to do the interpolation:

import numpy as np
import pandas as pd

data = pd.DataFrame({
    'data1': [10, 20, 30],
    'data2': [15, 25, 35],
    'data3': [20, 30, 40],
    'data4': [25, 35, 35]})
n = 2

# Append the first row so the last segment interpolates back to it.
# (DataFrame.append was removed in pandas 2.0; pd.concat does the same job.)
df = pd.concat([data, data.iloc[[0]]], ignore_index=True)

# x-axis positions at which to evaluate the interpolation: n points per original segment
x = np.linspace(df.index.min(), df.index.max(), (len(df) - 1) * n + 1)

# Interpolate each column with np.interp, then drop the duplicated last row
result = df.apply(lambda c: np.interp(x, df.index, c), axis=0).iloc[:-1]
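As a quick sanity check (an illustrative addition, not part of the original answer), the result has n * len(data) rows, and the trailing rows interpolate from the last original row back toward the first:

print(result)
#    data1  data2  data3  data4
# 0   10.0   15.0   20.0   25.0
# 1   15.0   20.0   25.0   30.0
# 2   20.0   25.0   30.0   35.0
# 3   25.0   30.0   35.0   35.0
# 4   30.0   35.0   40.0   35.0
# 5   20.0   25.0   30.0   30.0

assert len(result) == n * len(data)  # 6 rows for n = 2 and 3 input rows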
