I keep getting that error message and I'm stumped; I don't know how to fix it.
This is what I wrote:
job = {'fireman': 42600, 'programmer': 48700, 'clerk': 23000}
salary = float(job * 1.05 ** years_of_service)
return salary
For this question:
def salary(job, years_of_service):
    '''(str, int) -> float

    Return the salary (in dollars) of a person holding job for
    years_of_service.

    Each year, a person receives a 5% increase in salary over his/her
    previous year. The starting salaries for the various jobs are:
        fireman     $42 600
        programmer  $48 700
        clerk       $23 000

    years_of_service will be at least 0.

    >>> salary('clerk', 2)
    25357.5
    '''
The problem is that you're trying to multiply the entire dict by 1.05 ** years_of_service. You need to look up the starting salary for the given job in the dict first. Do something along these lines:
salaries = {'fireman': 42600, 'programmer': 48700, 'clerk': 23000}
def salary(job, years_of_service):
    return salaries[job] * (1.05 ** years_of_service)
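Putting it all together, here is a complete, runnable sketch. The starting salaries and the expected result for 'clerk' come straight from the question's docstring:

```python
# Starting salaries by job title, taken from the question's docstring.
SALARIES = {'fireman': 42600, 'programmer': 48700, 'clerk': 23000}

def salary(job, years_of_service):
    '''(str, int) -> float

    Return the salary (in dollars) of a person holding job for
    years_of_service, with a 5% raise compounding each year.
    '''
    # Look up the starting salary for this job, then apply the
    # 5% compound increase once per year of service.
    return SALARIES[job] * 1.05 ** years_of_service

print(salary('clerk', 2))  # 23000 * 1.05**2, approximately 25357.5
```

Note that the dict lookup `SALARIES[job]` raises a KeyError for an unknown job title; `SALARIES.get(job)` could be used instead if you want to handle that case yourself.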