I have installed Apache Knox using Apache Ambari, and followed the documentation at the link below to configure WebHDFS with Knox. Knox is installed on address1.
https://knox.apache.org/books/knox-0-13-0/user-guide.html#WebHDFS

When I call WebHDFS through the gateway with the following curl command:

curl -i -k -u guest:guest-password -X GET 'https://address1:8443/gateway/Andromd/webhdfs/v1/?op=LISTSTATUS'

it returns the following error:

<title>404 Not Found</title>
<p>Problem accessing /gateway//Andromd/webhdfs/v1/build-version. Reason:
<pre>Not Found</pre></p>
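As an aside, note that the 404 body above shows a doubled slash ("/gateway//Andromd/"). A small sketch (hypothetical helper, not part of Knox) of assembling the gateway URL from its parts with exactly one slash between segments:

```python
# Sketch (hypothetical helper): build the Knox WebHDFS URL from its parts,
# stripping stray slashes from each segment so the path never contains "//".
def knox_webhdfs_url(host, topology, path="/", op="LISTSTATUS", port=8443):
    # Join the fixed gateway segments and the topology name with single slashes.
    segments = ["gateway", topology.strip("/"), "webhdfs", "v1"]
    base = "https://{}:{}/{}".format(host, port, "/".join(segments))
    if not path.startswith("/"):
        path = "/" + path
    return "{}{}?op={}".format(base, path, op)

print(knox_webhdfs_url("address1", "Andromd"))
# -> https://address1:8443/gateway/Andromd/webhdfs/v1/?op=LISTSTATUS
```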

Andromd is the cluster name, and it has been added under {GATEWAY_HOME}/conf/topologies/Andromd.xml. The configuration is as follows.
<service>
    <role>NAMENODE</role>
    <url>hdfs://address2:8020</url>
</service>
<service>
    <role>WEBHDFS</role>
    <url>http://address2:50070/webhdfs</url>
</service
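For reference, a minimal well-formed topology file has a single <topology> root, an optional <gateway> provider section, and fully closed <service> elements. This is only a structural sketch; the provider configuration (e.g. the ShiroProvider demo-LDAP setup from the Knox user guide) is elided, and the hostnames are the placeholders used above:

```xml
<topology>
    <gateway>
        <!-- authentication / identity-assertion providers go here,
             e.g. the ShiroProvider setup from the Knox user guide -->
    </gateway>
    <service>
        <role>NAMENODE</role>
        <url>hdfs://address2:8020</url>
    </service>
    <service>
        <role>WEBHDFS</role>
        <url>http://address2:50070/webhdfs</url>
    </service>
</topology>
```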
The contents of /etc/hadoop/conf/hdfs-site.xml are as follows.
<property>
    <name>dfs.webhdfs.enabled</name>
    <value>true</value>
</property>
<property>
    <name>dfs.namenode.rpc-address</name>
    <value>address2:8020</value>
</property>
<property>
    <name>dfs.namenode.http-address</name>
    <value>address2:50070</value>
</property>
<property>
    <name>dfs.https.namenode.https-address</name>
    <value>address2:50470</value>
</property>
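The WEBHDFS service URL in the topology file is expected to match dfs.namenode.http-address from hdfs-site.xml. A small sketch (assuming the property names shown above) that derives one from the other, so the two files cannot drift apart:

```python
# Sketch: read dfs.namenode.http-address out of hdfs-site.xml and build the
# WEBHDFS <url> value for the Knox topology file from it.
import xml.etree.ElementTree as ET

def webhdfs_url_from_hdfs_site(hdfs_site_xml):
    root = ET.fromstring(hdfs_site_xml)
    for prop in root.iter("property"):
        if prop.findtext("name") == "dfs.namenode.http-address":
            return "http://{}/webhdfs".format(prop.findtext("value"))
    raise KeyError("dfs.namenode.http-address not found")

conf = """<configuration>
  <property>
    <name>dfs.namenode.http-address</name>
    <value>address2:50070</value>
  </property>
</configuration>"""
print(webhdfs_url_from_hdfs_site(conf))  # -> http://address2:50070/webhdfs
```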

Please let me know what, if anything, is still missing from my configuration.

Gateway.log contents:
2017-10-13 18:24:24,586 ERROR digester3.Digester (Digester.java:parse(1652)) - An error occurred while parsing XML from '(already loaded from stream)', see nested exceptions
org.xml.sax.SAXParseException; lineNumber: 88; columnNumber: 18; The content of elements must consist of well-formed character data or markup.
        at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1239)
        at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:643)
        at org.apache.commons.digester3.Digester.parse(Digester.java:1642)
        at org.apache.commons.digester3.Digester.parse(Digester.java:1701)
        at org.apache.hadoop.gateway.services.topology.impl.DefaultTopologyService.loadTopologyAttempt(DefaultTopologyService.java:124)
        at org.apache.hadoop.gateway.services.topology.impl.DefaultTopologyService.loadTopology(DefaultTopologyService.java:100)
        at org.apache.hadoop.gateway.services.topology.impl.DefaultTopologyService.loadTopologies(DefaultTopologyService.java:233)
        at org.apache.hadoop.gateway.services.topology.impl.DefaultTopologyService.reloadTopologies(DefaultTopologyService.java:318)
        at org.apache.hadoop.gateway.GatewayServer.start(GatewayServer.java:312)
        at org.apache.hadoop.gateway.GatewayServer.startGateway(GatewayServer.java:231)
        at org.apache.hadoop.gateway.GatewayServer.main(GatewayServer.java:114)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.gateway.launcher.Invoker.invokeMainMethod(Invoker.java:70)
        at org.apache.hadoop.gateway.launcher.Invoker.invoke(Invoker.java:39)
        at org.apache.hadoop.gateway.launcher.Command.run(Command.java:101)
        at org.apache.hadoop.gateway.launcher.Launcher.run(Launcher.java:69)
        at org.apache.hadoop.gateway.launcher.Launcher.main(Launcher.java:46)
2017-10-13 18:24:24,587 ERROR hadoop.gateway (DefaultTopologyService.java:loadTopologies(250)) - Failed to load topology /usr/hdp/2.4.2.0-258/knox/bin/../conf/topologies/Andromeda.xml: org.xml.sax.SAXParseException; lineNumber: 88; columnNumber: 18; The content of elements must consist of well-formed character data or markup.
2017-10-13 18:24:24,588 INFO  hadoop.gateway (GatewayServer.java:handleCreateDeployment(450)) - Loading topology admin from /usr/hdp/2.4.2.0-258/knox/bin/../data/deployments/admin.war.15f15c171c0
2017-10-13 18:24:24,793 INFO  hadoop.gateway (GatewayServer.java:start(315)) - Monitoring topologies in directory: /usr/hdp/2.4.2.0-258/knox/bin/../conf/topologies
2017-10-13 18:24:24,795 INFO  hadoop.gateway (GatewayServer.java:startGateway(232)) - Started gateway on port 8,443.

In addition to the question above: is it necessary to include the following tags in Andromd.xml, replacing the sandbox.hortonworks.com values with our cluster-specific information?
<property>
    <name>dfs.webhdfs.enabled</name>
    <value>true</value>
</property>
<property>
    <name>dfs.namenode.rpc-address</name>
    <value>sandbox.hortonworks.com:8020</value>
</property>
<property>
    <name>dfs.namenode.http-address</name>
    <value>sandbox.hortonworks.com:50070</value>
</property>
<property>
    <name>dfs.https.namenode.https-address</name>
    <value>sandbox.hortonworks.com:50470</value>
</property>

Best answer

2017-10-13 18:24:24,587 ERROR hadoop.gateway (DefaultTopologyService.java:loadTopologies(250)) - Failed to load topology /usr/hdp/2.4.2.0-258/knox/bin/../conf/topologies/Andromeda.xml: org.xml.sax.SAXParseException; lineNumber: 88; columnNumber: 18; The content of elements must consist of well-formed character data or markup.

Looking at this line, your topology file Andromeda.xml appears not to be well-formed (a missing closing tag or similar at lineNumber: 88).

That is why the topology is not deployed and you get a 404. After fixing the topology file, check the logs again and make sure there are no startup errors.
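You can reproduce the gateway's parse check locally before redeploying. A minimal sketch using Python's standard-library XML parser, which reports the same line/column position that the SAXParseException in the log points at:

```python
# Sketch: check a topology file for well-formedness and report where parsing
# fails, mirroring the lineNumber/columnNumber in the gateway's stack trace.
import xml.etree.ElementTree as ET

def check_topology(xml_text):
    try:
        ET.fromstring(xml_text)
        return "well-formed"
    except ET.ParseError as e:
        line, col = e.position
        return "malformed at line {}, column {}: {}".format(line, col, e)

# A deliberately broken snippet (unclosed </service tag) for illustration:
broken = "<topology><service><role>WEBHDFS</role></service</topology>"
print(check_topology(broken))
```

Run this against {GATEWAY_HOME}/conf/topologies/Andromeda.xml (read the file contents first); once it prints "well-formed", Knox should be able to deploy the topology.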

About "hadoop - Issue while configuring the Apache Knox gateway with WebHDFS": a similar question was found on Stack Overflow: https://stackoverflow.com/questions/46747708/
