
Install DataNode by Ambari

I have:

OS Red Hat Enterprise Linux Server release 7.4 (Maipo)
Ambari Version 2.5.1.0
HDP 2.6

After the components finished deploying, 2 DataNodes could not start. Attempting to start them returned this error:

  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ;  /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start datanode'' returned 127. -bash: /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh: No such file or directory

I tried deleting the component and reinstalling it through Ambari.

The installation completed without errors:

2018-02-27 20:47:31,550 - Execute['ambari-sudo.sh /usr/bin/hdp-select set all `ambari-python-wrap /usr/bin/hdp-select versions | grep ^2.6 | tail -1`'] {'only_if': 'ls -d /usr/hdp/2.6*'}
2018-02-27 20:47:31,554 - Skipping Execute['ambari-sudo.sh /usr/bin/hdp-select set all `ambari-python-wrap /usr/bin/hdp-select versions | grep ^2.6 | tail -1`'] due to only_if
2018-02-27 20:47:31,554 - FS Type: 
2018-02-27 20:47:31,554 - XmlConfig['core-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'configuration_attributes': {u'final': {u'fs.defaultFS': u'true'}}, 'owner': 'hdfs', 'only_if': 'ls /usr/hdp/current/hadoop-client/conf', 'configurations': ...}
2018-02-27 20:47:31,568 - Generating config: /usr/hdp/current/hadoop-client/conf/core-site.xml
2018-02-27 20:47:31,569 - File['/usr/hdp/current/hadoop-client/conf/core-site.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2018-02-27 20:47:31,583 - Could not load 'version' from /var/lib/ambari-agent/data/structured-out-3374.json

Command completed successfully!

BUT a new start attempt showed the same error again. I checked the folder /usr/hdp/current/hadoop-client/; the new files, for example sbin/hadoop-daemon.sh, did not appear there.
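To narrow this down, it can help to check whether the script exists anywhere under /usr/hdp and where the `current` symlink actually resolves. A minimal diagnostic sketch, assuming the standard HDP layout under /usr/hdp:

```shell
# Where does the hadoop-client symlink resolve to?
readlink -f /usr/hdp/current/hadoop-client

# Is hadoop-daemon.sh present in any installed HDP version at all?
find /usr/hdp -name hadoop-daemon.sh 2>/dev/null

# Which stack versions does hdp-select know about?
hdp-select versions
```

If `find` locates the script under a versioned directory such as /usr/hdp/2.6.x.y-z/hadoop/sbin/ but not under /usr/hdp/current/, the symlinks are the problem; if it finds nothing, the hadoop package itself never landed on the node.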

How can I redeploy the DataNode component through Ambari?

I'd guess the issue is caused by wrong symlinks at /usr/hdp. You may even try to fix them manually; the structure is trivial enough. Though the issue does not sound like a common one after a plain stack deployment.
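A manual repair could look roughly like the sketch below. The version string `2.6.1.0-129` is only an example; substitute whatever `hdp-select versions` reports on your host:

```shell
# Example version string -- replace with the output of 'hdp-select versions'.
HDP_VERSION=2.6.1.0-129

# Re-point all 'current' symlinks at that version in one go
# (this is the same command Ambari itself runs, visible in the log above):
sudo hdp-select set all "$HDP_VERSION"

# Or fix a single broken link by hand:
sudo ln -sfn "/usr/hdp/$HDP_VERSION/hadoop" /usr/hdp/current/hadoop-client

# Verify the link now points at the versioned directory:
ls -l /usr/hdp/current/hadoop-client
```

Note that `ln -sfn` replaces the existing symlink atomically instead of creating a link inside the target directory.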

Are you running Ambari with a non-root/custom user? Maybe Ambari does not have sufficient permissions? See https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.0.0/bk_ambari-security/content/how_to_configure_ambari_server_for_non-root.html

Ambari version 2.5.1.0 is considerably outdated, so it would make sense to update Ambari and see whether that helps. Also, if you want to wipe out everything, see https://github.com/hortonworks/HDP-Public-Utilities/blob/master/Installation/cleanup_script.sh

Also, it may be more productive to ask Ambari-related questions at https://community.hortonworks.com/
