How many crawlers can run simultaneously using StormCrawler?
I am working with StormCrawler 1.10, archetype 1.11, and Elasticsearch 6.4.x. How many crawlers can run simultaneously?
Crawlers == topologies? As many as you want, as long as they have different names. Why do you need multiple topologies? You can have a single one dealing with multiple sites.
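For context, the StormCrawler archetype generates a Flux definition for the topology, and the `name` field in it is what must be unique per Storm cluster. A minimal sketch (the file name `crawler-a.flux` and the topology name `crawler-a` are illustrative assumptions, not values from the question):

```yaml
# crawler-a.flux (fragment)
# Each topology submitted to the same Storm cluster needs a
# unique "name"; duplicate names are rejected on submission.
name: "crawler-a"
```

To run several crawlers side by side you would copy the Flux file, change the `name`, and submit each one separately. Conversely, a single topology can crawl many sites at once simply by injecting multiple seed URLs, which is usually the simpler setup.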