
How to assign IP ranges to Google Dataflow instances?

I need to move data from Google BigQuery to Elasticsearch instances. For that I have created a Python Dataflow job to copy a BigQuery table to Elasticsearch. The problem is that IP-based restrictions were recently added to the Elasticsearch instances, so they allow access only from specific IP ranges.

So how can I identify or assign the IP ranges of my Dataflow workers when I use the "DataflowRunner" option?

In the pipeline options you can set the network and the subnetwork you want to use. Each VPC network contains subnets, each with a defined IP range. By creating a subnet with the IP range you need and setting that subnet in the pipeline options, you can assign an IP range to your workers.
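A minimal sketch of how this could look when launching the Python pipeline, assuming a project `my-project`, region `us-central1`, staging bucket `my-bucket`, and an existing subnetwork `my-subnet` whose IP range is the one allowed by the Elasticsearch restriction (all of these names are placeholders):

```python
from apache_beam import Pipeline
from apache_beam.options.pipeline_options import PipelineOptions

# Placeholder project, region, bucket and subnetwork names -- replace with your own.
# The subnetwork must already exist and cover the IP range that the
# Elasticsearch firewall rule allows.
options = PipelineOptions(
    runner='DataflowRunner',
    project='my-project',
    region='us-central1',
    temp_location='gs://my-bucket/temp',
    # Workers get their internal IPs from this subnetwork's range.
    subnetwork='regions/us-central1/subnetworks/my-subnet',
)

with Pipeline(options=options) as p:
    ...  # BigQuery -> Elasticsearch transforms go here
```

Note that this controls the workers' internal IPs; if the Elasticsearch instances are reached over the public internet, the workers' external addresses are ephemeral, so you may instead want to keep traffic inside the VPC (or a peered network) and restrict Elasticsearch to the subnetwork's range.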
