
How to find average score for each movie based on reviews - Python

I have a dataframe like this:

    UserID  Review      MovieID
0   10112   Good        MOV001
1   10112   Excellent   MOV002
2   10112   Average     MOV003
3   10113   Good        MOV001
4   10113   Bad         MOV002
5   10113   Good        MOV003
6   10113   Excellent   MOV004
7   10114   Good        MOV001
8   10114   Bad         MOV002
9   10114   Good        MOV003
10  10114   Excellent   MOV004
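
For reference, the example data can be reconstructed with a short sketch like this (the variable name movies is taken from the code below, and a pandas installation is assumed):

import pandas as pd

movies = pd.DataFrame({
    'UserID': [10112, 10112, 10112, 10113, 10113, 10113, 10113,
               10114, 10114, 10114, 10114],
    'Review': ['Good', 'Excellent', 'Average', 'Good', 'Bad', 'Good',
               'Excellent', 'Good', 'Bad', 'Good', 'Excellent'],
    'MovieID': ['MOV001', 'MOV002', 'MOV003', 'MOV001', 'MOV002', 'MOV003',
                'MOV004', 'MOV001', 'MOV002', 'MOV003', 'MOV004'],
})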

I've changed the reviews to int values:

# Replace the text reviews with numeric scores
movies.Review[movies.Review == 'Average'] = 2
movies.Review[movies.Review == 'Good'] = 3
movies.Review[movies.Review == 'Excellent'] = 5
movies.Review[movies.Review == 'Very Good'] = 4
movies.Review[movies.Review == 'Okay'] = 1
movies.Review[movies.Review == 'Bad'] = 0
movies

Now my dataframe looks like this:

UserID  Review      MovieID
0   10112   3           MOV001
1   10112   5           MOV002
2   10112   2           MOV003
3   10113   3           MOV001
4   10113   0           MOV002
5   10113   3           MOV003
6   10113   5           MOV004
7   10114   3           MOV001
8   10114   0           MOV002
9   10114   3           MOV003
10  10114   5           MOV004

Now how can I find the average score for each movie based on the Review column? Can anyone help me out?

First, you don't need those movies.Review[movies.Review == ...] = ... assignments. Instead, use np.select or map:

review_scores = {'Bad': 0, 'Okay': 1, 'Average': 2,
                 'Good': 3, 'Very Good': 4, 'Excellent': 5}
movies['Review'] = movies.Review.map(review_scores)
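
For completeness, the np.select route mentioned above would look roughly like this sketch (applied to the original text column, as an alternative to map rather than in addition to it):

import numpy as np

# Each condition picks the corresponding numeric score; unmatched rows get NaN.
labels = ['Bad', 'Okay', 'Average', 'Good', 'Very Good', 'Excellent']
conditions = [movies.Review == label for label in labels]
movies['Review'] = np.select(conditions, [0, 1, 2, 3, 4, 5], default=np.nan)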

Then you can do:

movies.groupby('MovieID')['Review'].mean()

Output:

MovieID
MOV001    3.000000
MOV002    1.666667
MOV003    2.666667
MOV004    5.000000
Name: Review, dtype: float64
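
If you'd rather get a regular DataFrame than a Series, one standard option (not part of the original answer) is to pass as_index=False to groupby:

movies.groupby('MovieID', as_index=False)['Review'].mean()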
