
HDInsight 3.6: Data Factory V2 - Running Script Actions to Install Dependencies for an On-Demand Cluster

I am currently trying to use script actions with an on-demand HDInsight (Spark) cluster in Data Factory V2 to install the dependencies my Python scripts need. However, script actions seem to be missing from V2.

How do I install packages on the cluster nodes now? Is there an alternative?

As far as I know, script actions are still supported in HDInsight for installing external dependencies:

From the Azure portal Startboard, click the tile for your Spark cluster (if you pinned it there). You can also navigate to your cluster under Browse All > HDInsight Clusters.

From the Spark cluster blade, click Script Actions in the left pane, then run the custom action that installs TensorFlow on the head nodes and the worker nodes. The bash script can be referenced from https://hdiconfigactions.blob.core.windows.net/linuxtensorflow/tensorflowinstall.sh. See the documentation on how to use custom script actions.
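The steps above run a bash script on each selected node. As a minimal sketch of what such a script action might contain for installing Python packages (the Anaconda pip path is the one HDInsight Spark cluster nodes typically use; the PATH fallback and the package-list-as-parameters convention are illustrative assumptions, not part of any official script):

```shell
#!/usr/bin/env bash
set -u

# On HDInsight Spark cluster nodes, Anaconda's pip typically lives here;
# fall back to whatever pip is on PATH so the script can be dry-run locally.
PIP=/usr/bin/anaconda/bin/pip
[ -x "$PIP" ] || PIP="$(command -v pip || command -v pip3 || echo pip)"

echo "Using pip at: $PIP"

# Treat each script-action parameter as a package name, e.g. invoking the
# script with "numpy requests" installs those two packages on the node.
for pkg in "$@"; do
    sudo -H "$PIP" install --upgrade "$pkg"
done
```

Because script actions run independently on every node you target (head, worker, edge), keeping the script idempotent, as `pip install --upgrade` is, makes it safe to re-run.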

Full documentation is available here
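As for the Data Factory V2 part of the question: to my knowledge, the on-demand HDInsight linked service definition accepts a `scriptActions` array, so dependencies can be installed automatically whenever ADF provisions the cluster. A sketch with placeholder names and values; the property layout follows the `HDInsightOnDemand` schema as I recall it, so verify it against the current ADF reference before relying on it:

```json
{
    "name": "OnDemandHDInsightLinkedService",
    "properties": {
        "type": "HDInsightOnDemand",
        "typeProperties": {
            "clusterType": "spark",
            "clusterSize": 4,
            "timeToLive": "00:15:00",
            "linkedServiceName": {
                "referenceName": "MyStorageLinkedService",
                "type": "LinkedServiceReference"
            },
            "scriptActions": [
                {
                    "name": "installPythonDeps",
                    "uri": "https://mystorage.blob.core.windows.net/scripts/install-deps.sh",
                    "roles": "headnode,workernode",
                    "parameters": "numpy requests"
                }
            ]
        }
    }
}
```

Here `install-deps.sh` is a hypothetical script of your own (such as the pip-install sketch above) hosted in blob storage that the cluster can reach.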
