Problem description

I'm trying to create a Dataflow job to insert rows into BigTable, but when I test the Dataflow job locally I get the following error:
Exception in thread "main" com.google.cloud.dataflow.sdk.Pipeline$PipelineExecutionException: java.lang.IllegalStateException: Neither Jetty ALPN nor OpenSSL via netty-tcnative were properly configured.
at com.google.cloud.dataflow.sdk.Pipeline.run(Pipeline.java:186)
Below you can find my main code:
CloudBigtableOptions options =
PipelineOptionsFactory.fromArgs(args).withValidation().create().as(CloudBigtableOptions.class);
options.setProject("xxxxxxxxx");
options.setBigtableProjectId("xxxxxxxxx");
options.setBigtableInstanceId("xxxxxxxxx");
options.setBigtableTableId("xxxxxxxxx");
options.setZone("europe-west1-b");
options.setRunner(DirectPipelineRunner.class);
CloudBigtableTableConfiguration config =
CloudBigtableTableConfiguration.fromCBTOptions(options);
Pipeline p = Pipeline.create(options);
CloudBigtableIO.initializeForWrite(p);
FixedWindows window = FixedWindows.of(Duration.standardMinutes(1));
p
.apply(Create.of("Hello"))
.apply(Window.into(window))
.apply(ParDo.of(MUTATION_TRANSFORM))
.apply(CloudBigtableIO.writeToTable(config));
p.run();
Another attempt used the following code:
CloudBigtableTableConfiguration config =
new CloudBigtableTableConfiguration.Builder()
.withProjectId("xxxxxxxxx")
.withInstanceId("xxxxxxxxx")
.withTableId("xxxxxxxxx")
.build();
Pipeline p = Pipeline.create(options);
CloudBigtableIO.initializeForWrite(p);
FixedWindows window = FixedWindows.of(Duration.standardMinutes(1));
p
.apply(Create.of("Hello"))
.apply(Window.into(window))
.apply(ParDo.of(MUTATION_TRANSFORM))
.apply(CloudBigtableIO.writeToTable(config));
p.run();
But I got the same error.

Am I doing something wrong?
Full error:

Exception in thread "main" com.google.cloud.dataflow.sdk.Pipeline$PipelineExecutionException: java.lang.IllegalStateException: Neither Jetty ALPN nor OpenSSL via netty-tcnative were properly configured.
at com.google.cloud.dataflow.sdk.Pipeline.run(Pipeline.java:186)
at HubCache.main(HubCache.java:75)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.IllegalStateException: Neither Jetty ALPN nor OpenSSL via netty-tcnative were properly configured.
at com.google.bigtable.repackaged.com.google.cloud.grpc.BigtableSession.<init>(BigtableSession.java:236)
at org.apache.hadoop.hbase.client.AbstractBigtableConnection.<init>(AbstractBigtableConnection.java:123)
at org.apache.hadoop.hbase.client.AbstractBigtableConnection.<init>(AbstractBigtableConnection.java:91)
at com.google.cloud.bigtable.hbase1_0.BigtableConnection.<init>(BigtableConnection.java:33)
at com.google.cloud.bigtable.dataflow.CloudBigtableConnectionPool$1.<init>(CloudBigtableConnectionPool.java:72)
at com.google.cloud.bigtable.dataflow.CloudBigtableConnectionPool.createConnection(CloudBigtableConnectionPool.java:72)
at com.google.cloud.bigtable.dataflow.CloudBigtableConnectionPool.getConnection(CloudBigtableConnectionPool.java:64)
at com.google.cloud.bigtable.dataflow.CloudBigtableConnectionPool.getConnection(CloudBigtableConnectionPool.java:57)
at com.google.cloud.bigtable.dataflow.AbstractCloudBigtableTableDoFn.getConnection(AbstractCloudBigtableTableDoFn.java:96)
at com.google.cloud.bigtable.dataflow.CloudBigtableIO$CloudBigtableSingleTableBufferedWriteFn.getBufferedMutator(CloudBigtableIO.java:941)
at com.google.cloud.bigtable.dataflow.CloudBigtableIO$CloudBigtableSingleTableBufferedWriteFn.processElement(CloudBigtableIO.java:966)
pom.xml:
<dependencies>
<dependency>
<groupId>com.google.cloud.dataflow</groupId>
<artifactId>google-cloud-dataflow-java-sdk-all</artifactId>
<version>LATEST</version>
</dependency>
<dependency>
<groupId>com.google.cloud.bigtable</groupId>
<artifactId>bigtable-hbase-dataflow</artifactId>
<version>LATEST</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.slf4j/slf4j-api -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.21</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.slf4j/slf4j-simple -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>LATEST</version>
</dependency>
<!-- https://mvnrepository.com/artifact/io.netty/netty-tcnative-boringssl-static -->
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-tcnative-boringssl-static</artifactId>
<version>1.1.33.Fork13</version>
<classifier>${os.detected.classifier}</classifier>
</dependency>
</dependencies>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.5.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</pluginManagement>
<extensions>
<!-- Use os-maven-plugin to initialize the "os.detected" properties -->
<extension>
<groupId>kr.motd.maven</groupId>
<artifactId>os-maven-plugin</artifactId>
<version>1.4.0.Final</version>
</extension>
</extensions>
</build>
Accepted answer
The problem was that the os-maven-plugin extension, which is supposed to initialize the "os.detected" properties, did not set the property correctly. After setting the property correctly, I ran a test and it executed without problems.
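As a concrete workaround (a sketch, not the poster's exact fix), you can stop relying on `${os.detected.classifier}` and hard-code the classifier for your build platform in the netty-tcnative dependency. The value `linux-x86_64` below is an assumption; substitute your own platform (e.g. `osx-x86_64`, `windows-x86_64`):

```xml
<!-- Hypothetical workaround: hard-code the classifier instead of
     ${os.detected.classifier} when os-maven-plugin fails to set it.
     "linux-x86_64" is an assumed platform; adjust to your OS/arch. -->
<dependency>
  <groupId>io.netty</groupId>
  <artifactId>netty-tcnative-boringssl-static</artifactId>
  <version>1.1.33.Fork13</version>
  <classifier>linux-x86_64</classifier>
</dependency>
```

Alternatively, the property can be supplied on the command line (again assuming a Linux x86-64 machine), which keeps the pom unchanged: `mvn -Dos.detected.classifier=linux-x86_64 package`. Either way, the goal is the same: make sure the platform-specific netty-tcnative-boringssl-static jar actually ends up on the classpath so gRPC can find a working TLS provider.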