How to replicate a pivot table procedure from Excel in R?
I have a set of data with three columns. I would like to "group by" my second column, "location", on the left side. Imagine New York, London, Berlin and all the cities lined up in the left column. I would also like to "group by" the third column, "race", but as new columns:
Location | White | Black | Asian | Grand Total
New York |   700 |   465 |   323 |       1,488
London   | 1,000 |   600 |   200 |       1,800
I have this code:
Attempt <- table %>%
  group_by(Location) %>%
  summarise(n())
but it gets me this result:
Location | Grand Total
New York |       1,488
London   |       1,800
Doing an example like this in Excel is very easy, and I would like to do the same in R. It is just a count of how many times the values appear in the table.
Based on your description, this might be what you're looking for.

First, `group_by` both `Location` and `Race` to get the sub-total counts. Then use `pivot_wider` to get the final desired table in wide form. A final `rowSums` will compute a `Grand_Total` (where `-1` removes the `Location` column from the calculation).

I made up some data for illustration.
library(tidyverse)

df %>%
  group_by(Location, Race) %>%
  summarise(Total = n()) %>%         # count rows per Location/Race combination
  ungroup() %>%
  pivot_wider(id_cols = Location, names_from = Race, values_from = Total,
              values_fn = list(Total = sum), values_fill = list(Total = 0)) %>%
  mutate(Grand_Total = rowSums(.[, -1]))   # -1 excludes the Location column
Output
# A tibble: 3 x 5
Location Black Asian White Grand_Total
<fct> <int> <int> <int> <dbl>
1 Berlin 1 0 0 1
2 London 0 1 2 3
3 New York 1 0 1 2
Data
df <- data.frame(
  ID = 1:6,
  Location = c("New York", "London", "Berlin", "London", "New York", "London"),
  Race = c("White", "White", "Black", "Asian", "Black", "White")
)