Problem Description
Is there any way to use a Python user-defined function within a Java Flink job, or any way to communicate, for example, the result of a transformation done by Flink in Java with a Python user-defined function in order to apply some machine learning?
I know that from PyFlink you can do something like this:
table_env.register_java_function("hash_code", "my.java.function.HashCode")
But I need to do something like that in the other direction: add the Python function from Java. Or how can I pass the result of a Java transformation to a Python UDF Flink job directly?
I hope these questions are not too crazy, but I need to know whether there is some way to make the Flink DataStream API communicate with the Python Table API, with Java as the main language. This means that from Java I need to do: Source -> Transformations -> Sink, but some of these transformations could trigger a Python function, or a Python function would be waiting for some Java transformation to finish in order to do something with the stream result.
I hope someone understands what I'm trying to do here.
Kind regards!
Recommended Answer
Example of this integration: this dependency is needed in your pom.xml, assuming that Flink 1.11 is the current version.
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-planner-blink_2.11</artifactId>
    <version>1.11.2</version>
    <scope>provided</scope>
</dependency>
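Depending on how the job is packaged, the Python bridge usually also has to be on the classpath so that Flink can actually run Python UDFs; assuming the same Flink 1.11.2 version, that dependency would look like this (treat it as an assumption about your setup):

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-python_2.11</artifactId>
    <version>1.11.2</version>
    <scope>provided</scope>
</dependency>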
Create the environments:
private StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
private StreamTableEnvironment tableEnv = getTableAPIEnv(env);

/* this SingleOutputStreamOperator will contain the result of the consumption from the defined source */
private SingleOutputStreamOperator<Event> stream;

public static StreamTableEnvironment getTableAPIEnv(StreamExecutionEnvironment env) {
    final StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);
    tableEnv.getConfig().getConfiguration().setString("python.files", "path/function.py");
    tableEnv.getConfig().getConfiguration().setString("python.client.executable", "path/python");
    tableEnv.getConfig().getConfiguration().setString("python.executable", "path/python");
    tableEnv.getConfig().getConfiguration().setString("taskmanager.memory.task.off-heap.size", "79mb");
    /* register here the function.py script and the name of the function inside that Python script */
    tableEnv.executeSql("CREATE TEMPORARY SYSTEM FUNCTION FunctionName AS 'function.FunctionName' LANGUAGE PYTHON");
    return tableEnv;
}
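The snippets above and below assume two POJOs, Event and EventProfile, that the original answer never shows. A minimal hypothetical sketch of what they could look like, given the columns used later (id, noOfHits, timestamp), might be:

/* Hypothetical POJOs; the original answer does not define them, so adjust the fields to your data model. */
public class Event {
    public String id;                     // used as the key in stream.keyBy(k -> k.id)

    public Event() {}                     // Flink POJOs need a public no-argument constructor
}

public class EventProfile {
    public String id;                     // exposed as the `id` column of the Table
    public Integer noOfHits;              // exposed as the `noOfHits` column
    public java.sql.Timestamp timestamp;  // exposed as the `timestamp` column (TIMESTAMP(3))

    public EventProfile() {}
}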
Start with the transformations that you want to do, for example:
SingleOutputStreamOperator<EventProfile> profiles = createUserProfile(stream.keyBy(k -> k.id));

/* The result of the ProcessFunction `createUserProfile()` is sent to the Python function to update
   some values of the profile, and comes back into a function defined in Flink with Java
   (a map function, for example). */
profiles = turnIntoTable(profiles).map((MapFunction<Row, EventProfile>) x -> {
    /* your custom code here to do the mapping (a hypothetical sketch is shown after the Python function below) */
});
profiles.addSink(new YourCustomSinkFunction());

/* this function processes the Event and creates the EventProfile class for this example,
   but you can also use other operators (map, flatMap, etc.) */
private SingleOutputStreamOperator<EventProfile> createUserProfile(KeyedStream<Event, String> stream) {
    return stream.process(new UserProfileProcessFunction());
}
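The UserProfileProcessFunction used above is not shown in the original answer either; a minimal hypothetical sketch, assuming it simply turns each incoming Event into an EventProfile, could be:

import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
import org.apache.flink.util.Collector;

/* Hypothetical sketch: the real function would hold whatever keyed state and logic your use case needs. */
public class UserProfileProcessFunction extends KeyedProcessFunction<String, Event, EventProfile> {

    @Override
    public void processElement(Event event, Context ctx, Collector<EventProfile> out) {
        EventProfile profile = new EventProfile();
        profile.id = event.id;
        profile.noOfHits = 1;  // placeholder: in practice this would come from keyed state
        profile.timestamp = new java.sql.Timestamp(System.currentTimeMillis());
        out.collect(profile);
    }
}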
/* This function receives a SingleOutputStreamOperator, sends each record to the Python function
   through the Table API, and returns a Row of String (you can change the Row type) that will be
   mapped back into the EventProfile class. */
@FunctionHint(output = @DataTypeHint("ROW<a STRING>"))
private DataStream<Row> turnIntoTable(SingleOutputStreamOperator<EventProfile> rowInput) {
    Table events = tableEnv.fromDataStream(rowInput,
            $("id"), $("noOfHits"), $("timestamp"))
        .select("FunctionName(id, noOfHits, timestamp)");
    return tableEnv.toAppendStream(events, Row.class);
}
And finally:
env.execute("Job Name");
An example of the Python function called FunctionName in the function.py script (note that the module name `function` in the CREATE FUNCTION statement above has to match the script name function.py):
from pyflink.table import DataTypes
from pyflink.table.udf import udf


@udf(
    input_types=[
        DataTypes.STRING(), DataTypes.INT(), DataTypes.TIMESTAMP(precision=3)
    ],
    result_type=DataTypes.STRING()
)
def FunctionName(id, noOfHits, timestamp):
    # your function code here
    return f"{id}|{noOfHits}|{timestamp}"