Problem Description
Is there an equivalent of the Pandas melt function in Apache Spark, in PySpark or at least in Scala?
So far I have been running a sample dataset in Python, and now I want to use Spark for the entire dataset.
Thanks.
Recommended Answer
There is no built-in function (if you work with SQL and Hive support enabled you can use the stack function, but it is not exposed in the Spark DataFrame API and has no native implementation), but it is trivial to roll your own. Required imports:
from pyspark.sql.functions import array, col, explode, lit, struct
from pyspark.sql import DataFrame
from typing import Iterable
Example implementation:
def melt(
        df: DataFrame,
        id_vars: Iterable[str], value_vars: Iterable[str],
        var_name: str = "variable", value_name: str = "value") -> DataFrame:
    """Convert :class:`DataFrame` from wide to long format."""

    # Create array<struct<variable: str, value: ...>>
    _vars_and_vals = array(*(
        struct(lit(c).alias(var_name), col(c).alias(value_name))
        for c in value_vars))

    # Add to the DataFrame and explode
    _tmp = df.withColumn("_vars_and_vals", explode(_vars_and_vals))

    # list() so that any Iterable works for id_vars, not only lists
    cols = list(id_vars) + [
        col("_vars_and_vals")[x].alias(x) for x in [var_name, value_name]]
    return _tmp.select(*cols)
And some tests (based on Pandas doctests):
import pandas as pd

pdf = pd.DataFrame({'A': {0: 'a', 1: 'b', 2: 'c'},
                    'B': {0: 1, 1: 3, 2: 5},
                    'C': {0: 2, 1: 4, 2: 6}})

pd.melt(pdf, id_vars=['A'], value_vars=['B', 'C'])
   A variable  value
0  a        B      1
1  b        B      3
2  c        B      5
3  a        C      2
4  b        C      4
5  c        C      6
sdf = spark.createDataFrame(pdf)
melt(sdf, id_vars=['A'], value_vars=['B', 'C']).show()
+---+--------+-----+
| A|variable|value|
+---+--------+-----+
| a| B| 1|
| a| C| 2|
| b| B| 3|
| b| C| 4|
| c| B| 5|
| c| C| 6|
+---+--------+-----+
Note: For use with legacy Python versions, remove the type annotations.
Related:
- r sparkR - equivalent to melt function
- Gather in sparklyr