
User-defined function returns an error in Python

    import pandas as pd
    d = {'a': [1,2,3],
         'b': [3,4,5],
         'c': [5,4,3]}
    df=pd.DataFrame(d)
    df

returns:

        a   b   c
    0   1   3   5
    1   2   4   4
    2   3   5   3

I create the following function to calculate m:

    def foo(x,y,z):
        m=x(y+z)
        return m

Then apply it to df:

    df['new']=df[['a', 'b', 'c']].apply(lambda x,y,z: foo(x,y,z))

but this gives following error:

    ("<lambda>() missing 2 required positional arguments: 'y' and 'z'", 'occurred at index a')

How can I solve it?

You have two problems. The first is a syntax error: you seem to have forgotten the * operator.

    m=x(y+z)

should be:

    m=x*(y+z)

The more important one is how you are passing the arguments to the foo function via the lambda: apply hands the lambda a single object, so a lambda expecting three positional arguments fails. You can fix it like this:

    df['new']=df.apply(lambda x: foo(x['a'],x['b'],x['c']),axis=1)
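With axis=1, apply passes each row to the lambda as a single Series, so the lambda only needs one parameter and can pull out the column values by label. A complete runnable sketch combining this with the corrected foo:

```python
import pandas as pd

def foo(x, y, z):
    # corrected: multiplication, not a function call
    m = x * (y + z)
    return m

df = pd.DataFrame({'a': [1, 2, 3], 'b': [3, 4, 5], 'c': [5, 4, 3]})

# axis=1 hands each row to the lambda as one Series
df['new'] = df.apply(lambda row: foo(row['a'], row['b'], row['c']), axis=1)
print(df['new'].tolist())  # [8, 16, 24]
```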

See also: Applying function with multiple arguments to create a new pandas column
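As an aside (an alternative not in the original answer): because pandas Series support elementwise arithmetic, the corrected foo can be called on whole columns directly, which avoids the per-row overhead of apply entirely:

```python
import pandas as pd

def foo(x, y, z):
    return x * (y + z)

df = pd.DataFrame({'a': [1, 2, 3], 'b': [3, 4, 5], 'c': [5, 4, 3]})

# pass whole columns; the arithmetic broadcasts elementwise
df['new'] = foo(df['a'], df['b'], df['c'])
print(df['new'].tolist())  # [8, 16, 24]
```

For numeric operations on large frames, this vectorized form is typically much faster than row-wise apply.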
