Problem description
I run an HDInsight cluster with HBase and Phoenix enabled. I can run Phoenix queries on that machine by connecting to it over an SSH terminal and running sqlline locally:
sqlline.py zk1-storm.<blahBlah>.ax.internal.cloudapp.net:2181:/hbase-unsecure
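For context, the same ZooKeeper quorum string can also be used as a Phoenix "thick"-driver JDBC URL from a plain Java client. The following is only a sketch to illustrate the URL shape (it assumes the phoenix-client jar is on the classpath; it is not code from my topology):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class PhoenixThickClientCheck {
    public static void main(String[] args) throws Exception {
        // Thick-driver URL shape: jdbc:phoenix:<zk-quorum>:<port>:<znode>
        String url = "jdbc:phoenix:zk1-storm.<blahBlah>.ax.internal.cloudapp.net:2181:/hbase-unsecure";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             // SYSTEM.CATALOG exists on every Phoenix installation, so it is a safe smoke test
             ResultSet rs = stmt.executeQuery("SELECT TABLE_NAME FROM SYSTEM.CATALOG LIMIT 1")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}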
On another HDInsight cluster Storm is running. From within a topology deployed there, I would like to write to my remote HDInsight instance running Phoenix/HBase. What is the best way to do that? What are the required parameters for a JDBC connection? What are the (public?) endpoint URLs?
So far I have learned:
- I have to write the data through Phoenix (not directly to HBase), because data written directly to HBase is not easily readable by Phoenix.
- I should probably use the JDBC connection offered by the Phoenix Query Server (a sketch of that URL format follows after this list).
- On the Storm side, I can use Apache's JdbcInsertBolt from storm-jdbc.
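The thin-client URL for the Phoenix Query Server has a different shape than the ZooKeeper quorum string. The snippet below is only a sketch of that format; the host is a placeholder, and 8765 is the Query Server's default port, which I have not verified for HDInsight:

// Thin-driver URL format for the Phoenix Query Server (requires the
// phoenix-queryserver thin-client jar; driver class:
// org.apache.phoenix.queryserver.client.Driver)
String thinUrl = "jdbc:phoenix:thin:"
        + "url=http://<queryserver-host>:8765;"   // default PQS port, assumed here
        + "serialization=PROTOBUF";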
Unfortunately, I cannot get the JdbcInsertBolt set up correctly. So far I have:
Map<String, Object> hikariConfigMap = new HashMap<>();
// my attempt: thin-driver prefix combined with the ZooKeeper quorum string
hikariConfigMap.put("jdbcUrl",
        "jdbc:phoenix:thin:url="
        + "zk1-storm.<blahBlah>.ax.internal.cloudapp.net:"
        + "2181:"
        + "/hbase-unsecure"
);
this.connectionProvider = new HikariCPConnectionProvider(hikariConfigMap);
this.simpleJdbcMapper = new SimpleJdbcMapper(this.tablename, connectionProvider);
new JdbcInsertBolt(this.connectionProvider, this.simpleJdbcMapper)
        .withTableName(this.tablename)
        .withQueryTimeoutSecs(30);
However, this fails to establish a connection, both in my local dev environment and when deployed to the Storm HDInsight cluster.
Is the JDBC approach OK in general? If so: where am I going wrong, and where can I find the correct endpoint URLs / ports?
If not: what is the recommended way to write data from Storm to Phoenix?
Recommended answer
I was not able to find information specific to your request, but I did find a couple of tutorials that might be useful:
Write to HDFS from Apache Storm on HDInsight
Get started with Apache Storm on HDInsight using the storm-starter examples
Please let us know if you have additional specific questions and we can try to get you some answers.
Regards,
Mike
Additional documentation that may or may not be useful:
Blog: HDInsight – How to perform Bulk Load with Phoenix?
GitHub repo: Apache Spark - Apache HBase Connector
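To separate connection-string problems from Storm wiring problems, it can also help to first try a standalone UPSERT over plain JDBC. The following is only a sketch: the table EVENTS, its columns, and the quorum string are placeholders, and it assumes the Phoenix thick driver is on the classpath:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class PhoenixUpsertSketch {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:phoenix:zk1-storm.<blahBlah>.ax.internal.cloudapp.net:2181:/hbase-unsecure";
        try (Connection conn = DriverManager.getConnection(url);
             // EVENTS and its columns are hypothetical names for this sketch
             PreparedStatement ps = conn.prepareStatement(
                     "UPSERT INTO EVENTS (ID, PAYLOAD) VALUES (?, ?)")) {
            ps.setLong(1, 1L);
            ps.setString(2, "hello from a standalone client");
            ps.executeUpdate();
            // Phoenix does not autocommit by default, so commit explicitly
            conn.commit();
        }
    }
}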