I have a pandas DataFrame as shown below. All rows without a ["sente"] value contain additional information, but they have not yet been linked to a ["sente"]:

id    pos      value       sente
1     a         I           21
2     b         have        21
3     b         a           21
4     a         cat         21
5     d         !           21
6     cat       N           NaN
7     a         My          22
8     a         cat         22
9     b         is          22
10    a         cute        22
11    d         .           22
12    cat       N           NaN
13    cute      M           NaN


Now I want every row without a value in ["sente"] to take that value from the previous row. Then I want to group everything by ["sente"] and create a new column whose content comes from the rows that had no value in ["sente"]:

  sente      pos          value            content
   21     a,b,b,a,d   I have a cat !     'cat,N'
   22     a,a,b,a,d   My cat is cute .   'cat,N','cute,M'


This would be my first step:

df.loc[(df['sente'] != df["sente"].shift(-1)) & (df["sente"].isnull()), "sente"] = df["sente"].shift(+1)

But it only fills a single additional row, not gaps of 2 or more rows.
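For gaps of any length, pandas' ffill is the usual tool; as a minimal sketch (assuming numpy and pandas are imported and df is the frame above):

#ffill propagates the last valid sente value down through any number of
#consecutive NaN rows, unlike shift(+1), which only reaches one row
df['sente'] = df['sente'].ffill()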

This groups a single column the way I want:

df.groupby(["sente"])['value'].apply(lambda x: " ".join()

But for more than one column it does not work the way I want:

df.groupby(["sente"]).agr(lambda x: ",".join()

Is there a way to do this without using the stack function?

Best answer

Use:

import numpy as np

#boolean mask of the rows where sente is NaN
m = df['sente'].isnull()
#new column joining pos and value, but only for the masked rows
df['content'] = np.where(m, df['pos'] + ',' + df['value'], np.nan)
#blank out pos and value in the masked rows
df[['pos', 'value']] = df[['pos', 'value']].mask(m)
print (df)
    id  pos value  sente content
0    1    a     I   21.0     NaN
1    2    b  have   21.0     NaN
2    3    b     a   21.0     NaN
3    4    a   cat   21.0     NaN
4    5    d     !   21.0     NaN
5    6  NaN   NaN    NaN   cat,N
6    7    a    My   22.0     NaN
7    8    a   cat   22.0     NaN
8    9    b    is   22.0     NaN
9   10    a  cute   22.0     NaN
10  11    d     .   22.0     NaN
11  12  NaN   NaN    NaN   cat,N
12  13  NaN   NaN    NaN  cute,M
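For clarity, mask(m) turns a value into NaN wherever the boolean mask is True; a small standalone sketch of just that behaviour:

import pandas as pd

s = pd.Series(['a', 'b', 'c'])
m = pd.Series([False, True, False])
print (s.mask(m))
#0      a
#1    NaN
#2      c
#dtype: object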


Finally, group by the forward-filled sente (ffill replaces the NaNs with the previous value) and drop the NaNs inside each join with dropna:

df1 = df.groupby(df["sente"].ffill()).agg(lambda x: " ".join(x.dropna()))
print (df1)
             pos             value       content
sente
21.0   a b b a d    I have a cat !         cat,N
22.0   a a b a d  My cat is cute .  cat,N cute,M
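If the content column should keep the quoted, comma-separated form from the desired output ('cat,N','cute,M'), a hedged variant of the last step could pass a per-column aggregation dict instead (a sketch, assuming the df produced by the code above):

#join pos and value with spaces, but wrap each content pair in quotes
#and separate the pairs with commas
df1 = df.groupby(df["sente"].ffill()).agg(
    {"pos": lambda x: " ".join(x.dropna()),
     "value": lambda x: " ".join(x.dropna()),
     "content": lambda x: ",".join("'" + s + "'" for s in x.dropna())})
print (df1)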
