Problem Description
I would like to be able to start a Hadoop cluster in Docker, distributing the Hadoop nodes to the different physical nodes, using Swarm.
I have found the sequenceiq image that lets me run Hadoop in a Docker container, but this doesn't allow me to use multiple nodes. I have also looked at the Cloudbreak project, but it seems to need an OpenStack installation, which seems like overkill: Swarm alone should be enough to do what we need.
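For reference, this is how the sequenceiq single-node image is usually started (the exact version tag may differ); everything runs inside one container, which is the limitation I mean:

```bash
# start the sequenceiq single-node Hadoop container; NameNode,
# DataNode and YARN all run inside this single container, so it
# cannot span multiple physical hosts
docker run -it sequenceiq/hadoop-docker:2.7.0 /etc/bootstrap.sh -bash
```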
I also found this Stack Overflow question and answer, which relies on Weave; that in turn needs sudo rights, which our admin won't give to everyone.
Is there a solution such that starting the Hadoop cluster comes down to starting a few containers via Swarm?
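To make the goal concrete, I imagine the deployment ending up roughly like the sketch below, written in swarm-mode stack syntax. The image names, port, and environment variable are placeholders I made up, not a tested setup:

```yaml
version: "3"

networks:
  hadoop-net:
    driver: overlay                  # spans all physical nodes in the swarm

services:
  namenode:
    image: example/hadoop-namenode   # placeholder image name
    networks:
      - hadoop-net
    ports:
      - "50070:50070"                # NameNode web UI (Hadoop 2.x)
    deploy:
      replicas: 1

  datanode:
    image: example/hadoop-datanode   # placeholder image name
    networks:
      - hadoop-net
    environment:
      - NAMENODE_HOST=namenode       # resolved by swarm's built-in service DNS
    deploy:
      replicas: 3                    # the scheduler spreads these across nodes
```

Something like `docker stack deploy -c hadoop-stack.yml hadoop` would then bring the whole cluster up, with the overlay network replacing Weave. Note that this swarm-mode syntax postdates the standalone Swarm tool, so I don't know whether it applies to our setup.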
I cannot give a definitive answer, but if you are looking to set this up without administrator privileges and all answers to this question fail, I fear you might be out of luck.
Consider asking the admin why he does not want to give out sudo access; chances are that either you can allay his doubts, or it turns out that what you want to do is undesirable.
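One concrete point for that conversation: Docker itself does not need sudo on every command if the admin adds users to the docker group, but membership in that group is effectively root-equivalent on the host, which may well be the admin's actual concern:

```bash
# run once by the admin; <username> is a placeholder.
# members of the docker group can talk to the Docker daemon without
# sudo, but this is root-equivalent access to the host
sudo usermod -aG docker <username>
# the user must log out and back in for the group change to take effect
```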