
Do we need to install all of HDP's service clients on all nodes?

We want to deploy HDP 3.1.5 in a production environment. We have 3 servers for master nodes and 6 servers for worker nodes, and we have planned the component layout across these 9 nodes, but we want to confirm where we need to place the service clients below:

  1. yarn clients

First, we plan to install this on all 9 nodes. Is that okay, or should we install it only on the 3 master nodes? As far as we know, YARN is needed on all nodes, including the ResourceManagers and NodeManagers.

Or is it only needed to launch YARN apps, or something else?

  2. mapreduce2 clients

Same as above: we plan to install it on all 9 nodes because it is required for MapReduce jobs. Do we really need to install it across all 9 nodes?

  3. hive clients

We plan to install it on the 3 master nodes, or do we only need it on one master node? Is it only needed to submit Hive queries from beeline (CLI)?

  4. infra solr clients

We plan to install it on all 9 nodes, but we don't know enough about how this client works.

  5. kerberos clients

Do all nodes need the Kerberos client? It was automatically installed across all nodes when we deployed in the development environment.

  6. oozie clients

Same as the infra solr clients point: all 9 nodes (planned).

  7. Pig clients

We plan to install it on only the 3 master nodes. Is it related to running Pig via the CLI, or to submitting Pig applications?

  8. spark2 clients

We plan to install it on a single master node, because we want to limit Spark app submission to only one server.

But in the development environment it was installed on all nodes. How do we uninstall the spark2 client from the worker nodes?

  9. sqoop clients

Same as the spark2 clients point: only on a master node.

  10. Tez client

We plan to install it on all 9 nodes, but we don't have any information about how this client works.

The client of any service is nothing but a set of libraries/binaries that allow you to connect to and access that service from the nodes where the client is installed.
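
For example, on a node where the HDFS and YARN clients are installed, the client packages just place the CLI binaries and configuration files locally; the commands below then talk to the remote services over the network (a minimal illustration, assuming those two clients are present):

    hdfs dfs -ls /user        # contacts the NameNode defined in the client configs
    yarn application -list    # queries the ResourceManager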

You can certainly restrict which nodes you install the clients on.

Some clients have to be installed on all nodes, e.g. the Kerberos client.

Clients do not use much disk space; however, the more clients you have, the longer that service takes to start.

Whenever you start or restart a service, by default Ambari will check whether the clients are installed. (There is no way to bypass this.)

Now, all that being said, let's take a look at your scenario:

  1. yarn clients: Not strictly necessary to have it installed on the master hosts, but good to have it on all nodes.
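
For example, any node with the YARN client can inspect and manage applications on the cluster (a sketch; the application id below is a placeholder):

    yarn application -list -appStates RUNNING                 # list running applications
    yarn logs -applicationId application_1234567890123_0001   # fetch aggregated logs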

  2. mapreduce2 clients: Not strictly necessary to have it installed on the master hosts, but good to have it on all nodes.
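
For instance, the bundled example job can only be submitted from a node that has the mapreduce2 client; the jar path below assumes the standard HDP /usr/hdp/current layout:

    # submit the sample "pi" estimator with 2 maps and 10 samples per map
    yarn jar /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar pi 2 10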

  3. hive clients: Yes, it is only needed to run beeline and execute Hive queries from the command line. You can choose which hosts to install it on.
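
A minimal beeline sketch (HIVESERVER2_HOST and myuser are placeholders; 10000 is the default HiveServer2 port):

    # beeline ships with the hive client and connects to HiveServer2 over JDBC
    beeline -u "jdbc:hive2://HIVESERVER2_HOST:10000/default" -n myuser -e "SHOW DATABASES;"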

  4. infra solr clients: Installing it on 2 or 3 nodes would be sufficient, as the clients are only needed to access the service, unless you use infra-solr extensively.

  5. kerberos clients: Needs to be on all nodes; if not, you will have Kerberos issues.
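
The Kerberos client provides kinit/klist, which every user and service on the node needs in order to authenticate (the principal and keytab path below are placeholders):

    kinit myuser@EXAMPLE.COM                                          # obtain a ticket interactively
    kinit -kt /etc/security/keytabs/myuser.keytab myuser@EXAMPLE.COM  # or non-interactively with a keytab
    klist                                                             # inspect the current ticket cache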

  6. oozie clients: Installing it on 2 or 3 nodes would be sufficient.
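
The oozie client is the `oozie` CLI used to submit and manage workflows, so it is only needed where you launch them from (OOZIE_HOST and job.properties are placeholders; 11000 is the default Oozie port):

    oozie job -oozie http://OOZIE_HOST:11000/oozie -config job.properties -run   # submit a workflow
    oozie jobs -oozie http://OOZIE_HOST:11000/oozie                              # list recent workflow jobs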

  7. Pig clients: It is related to both.
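
Both uses go through the same client binary, for example (a sketch; myscript.pig is a placeholder):

    pig                      # interactive grunt shell
    pig -x tez myscript.pig  # submit a script, running on the Tez engine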

  8. spark2 clients: To uninstall the client from a host, delete the corresponding host component through the Ambari REST API:

    curl -u admin:admin -H "X-Requested-By: ambari" -X DELETE http://AMBARI_SERVER_HOST:8080/api/v1/clusters/CLUSTERNAME/hosts/HOSTNAME/host_components/CLIENT_NAME
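
For example, to remove the spark2 client from one worker node (assuming the component is registered as SPARK2_CLIENT, which is the usual name in HDP stacks; the host and cluster names are placeholders):

    # list the components on the host first to confirm the exact client component name
    curl -u admin:admin -H "X-Requested-By: ambari" -X GET http://AMBARI_SERVER_HOST:8080/api/v1/clusters/CLUSTERNAME/hosts/WORKER_HOST/host_components
    # then delete the client component from that host
    curl -u admin:admin -H "X-Requested-By: ambari" -X DELETE http://AMBARI_SERVER_HOST:8080/api/v1/clusters/CLUSTERNAME/hosts/WORKER_HOST/host_components/SPARK2_CLIENT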

  9. sqoop clients: Installing it on 2 or 3 nodes would be sufficient.
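
The sqoop client is the `sqoop` CLI, so it is only needed on the nodes you launch imports/exports from (the JDBC URL and credentials below are placeholders):

    # list-tables just queries the database over JDBC; imports/exports launch MapReduce jobs
    sqoop list-tables --connect jdbc:mysql://DB_HOST/mydb --username myuser -P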

  10. Tez client: Installing it on 2 or 3 nodes would be sufficient.
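
The Tez client is mainly the libraries and configs that Hive and Pig pick up when they run on the Tez engine. If you want to smoke-test it directly, something like the following should work (the examples jar path assumes the standard HDP layout; IN_DIR/OUT_DIR are placeholder HDFS paths):

    # orderedwordcount reads input from IN_DIR in HDFS and writes results to OUT_DIR
    yarn jar /usr/hdp/current/tez-client/tez-examples-*.jar orderedwordcount IN_DIR OUT_DIR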

Please keep in mind that you can install the clients in any way you want.

I would suggest choosing 3-4 nodes and installing all the required clients on those hosts.

