
How do I assign a group-level value (based on row-level values) to a df using dplyr?

I have the following decision rules:

RELIABILITY LEVEL     DESCRIPTION
LEVEL I               Multiple regression
LEVEL II              Multiple regression + mechanisms specified (all interest variables)
LEVEL III             Multiple regression + mechanisms specified (all interest + control vars)
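
Read as code, the table boils down to a single case_when(); a rough sketch, where frac_interest and frac_control are hypothetical helper values for the share of interest/control variables that have a mechanism described:

library(dplyr)

# Hypothetical helper values (not columns in the real data):
#   frac_interest = share of interest variables with mechanism_described == "yes"
#   frac_control  = share of control variables with mechanism_described == "yes"
reliability_level <- function(frac_interest, frac_control) {
  case_when(
    frac_interest == 1 & frac_control == 1 ~ "LEVEL III",
    frac_interest == 1                     ~ "LEVEL II",
    TRUE                                   ~ "LEVEL I"
  )
}

reliability_level(frac_interest = 1, frac_control = 0.5)  # "LEVEL II"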

The first three columns are the data from which the fourth column (the reliability score) should be reproduced using dplyr.

The reliability level should be the same for the whole table (model), and I want to code it using dplyr.
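
In other words, every row of a model should carry one group-level value, the way a grouped mutate() broadcasts a per-group result to all of its rows; a toy illustration with made-up data:

library(tidyverse)

toy <- tibble(
  model               = c("m1", "m1", "m2", "m2"),
  mechanism_described = c("yes", "yes", "yes", "no")
)

toy %>%
  group_by(model) %>%
  # all() returns one value per group, which mutate() repeats on every row
  mutate(all_described = all(mechanism_described == "yes")) %>%
  ungroup()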

Here is my attempt so far. As you can see, I can't get the value to be the same for the whole model:

library(tidyverse)
library(readxl)
library(effectsize)

# NB: readxl::read_excel() only reads local files; the file behind this GitHub
# link has to be downloaded first.
df <- read_excel("https://github.com/timverlaan/relia/blob/59d2cbc5d7830c41542c5f65449d5f324d6013ad/relia.xlsx")

df1 <- df %>%
  # count the variables of each type (interest/control) per table
  group_by(study, table, function_var) %>%
  mutate(count_vars = n()) %>%
  ungroup() %>%
  # count how many of those variables have their mechanism described
  group_by(study, table, function_var, mechanism_described) %>%
  mutate(count_int = case_when(
    function_var == 'interest' & mechanism_described == 'yes' ~ n()
  )) %>%
  mutate(count_con = case_when(
    function_var == 'control' & mechanism_described == 'yes' ~ n()
  )) %>%
  # flag variable types where every variable has a mechanism described
  mutate(reliable_int = case_when(
    function_var == 'interest' & count_vars / count_int == 1 ~ 1)) %>%
  mutate(reliable_con = case_when(
    function_var == 'control' & count_vars / count_con == 1 ~ 1)) %>%
  # group_by(study, source) %>%
  mutate(reliable = case_when(
    reliable_int != 1 ~ 1,
    reliable_int == 1 ~ 2,
    reliable_int + reliable_con == 2 ~ 3))
  # ungroup()

The code I eventually settled on is:

library(tidyverse)
library(readxl)

df <- read_excel("C:/Users/relia.xlsx")
df <- df %>% select(-reliability_score)  # drop the column that should be reproduced

test <- df %>%
  group_by(study, model, function_var) %>%
  # per model and variable type: share of variables with a mechanism described
  summarise(count_yes = sum(mechanism_described == "yes"),
            n = n(),
            frac = count_yes / n) %>%
  # pull the control/interest fractions onto both rows of each model
  mutate(frac_control  = frac[function_var == "control"],
         frac_interest = frac[function_var == "interest"]) %>%
  # map the two fractions to a reliability score per model
  mutate(reliability = case_when(
    frac_control == 1 & frac_interest != 1 ~ -99,
    frac_control != 1 & frac_interest != 1 ~ 2,
    frac_interest == 1 & frac_control != 1 ~ 3,
    frac_interest == 1 & frac_control == 1 ~ 4)) %>%
  # collapse to one reliability value per model
  group_by(study, model) %>%
  summarise(reliability = mean(reliability))

df_reliability <- left_join(df, test)  # joins on the common columns (study, model)
View(df_reliability)

However, I would prefer to do all of this within one dplyr pipe. If anyone has a solution, I would love to hear it.
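
For reference, one way the same logic might be folded into a single pipe is a grouped mutate() in place of the summarise() + left_join(); this is an untested sketch that assumes the column names used above:

df_reliability <- df %>%
  group_by(study, model, function_var) %>%
  mutate(frac = sum(mechanism_described == "yes") / n()) %>%
  group_by(study, model) %>%
  mutate(
    frac_control  = first(frac[function_var == "control"]),   # first() guards against
    frac_interest = first(frac[function_var == "interest"]),  # missing variable types
    reliability = case_when(
      frac_control == 1  & frac_interest != 1 ~ -99,
      frac_control != 1  & frac_interest != 1 ~ 2,
      frac_interest == 1 & frac_control != 1  ~ 3,
      frac_interest == 1 & frac_control == 1  ~ 4
    )
  ) %>%
  ungroup() %>%
  select(-frac, -frac_control, -frac_interest)

Here first() collapses the identical per-group values to a single scalar, which mutate() then repeats on every row of the model, doing in one step what summarise() plus the join did in two.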
