
Authentication for Spark standalone cluster

I have a standalone Spark cluster running on a remote server and I'm new to Spark. It appears that there is no authentication scheme protecting the cluster master's port (7077) by default, so anyone can simply submit their own code to the cluster without any restrictions.

The Spark documentation states that authentication is possible in standalone deploy mode using the spark.authenticate.secret parameter, but doesn't really elaborate on how it should be used.

Is it possible to use some sort of shared secret that would prevent any potential attacker from submitting tasks to the cluster? Can anyone explain how exactly that can be configured?

There are two parts to enabling authentication:

  1. setting the secret on the master and all the slaves
  2. using the same secret when submitting jobs to the cluster

master and slaves

On each server in your cluster, add the following config to conf/spark-defaults.conf:

spark.authenticate.secret      SomeSecretKey
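
Note that the secret on its own may not be enough: spark.authenticate defaults to false, so authentication likely also has to be switched on in the same file. A minimal sketch of the relevant conf/spark-defaults.conf lines:

spark.authenticate             true
spark.authenticate.secret      SomeSecretKey

After changing the config, restart the master and all the slaves so they pick up the new values.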

submitting jobs

When you initialize the SparkContext, you should add the same config to it as well, i.e.:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
      .set("spark.authenticate", "true") // authentication is off by default
      .set("spark.authenticate.secret", "SomeSecretKey") // must match the cluster's secret
val sc = new SparkContext(conf)
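
If you submit a packaged application with spark-submit instead of setting the secret in code, the same properties can be passed on the command line. A sketch, assuming a master at your-master-host and an application jar named your-app.jar (both placeholders):

spark-submit \
  --master spark://your-master-host:7077 \
  --conf spark.authenticate=true \
  --conf spark.authenticate.secret=SomeSecretKey \
  your-app.jar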

If you are using SparkSession rather than a raw SparkContext, the same settings go on the builder:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
    .config("spark.authenticate", "true")
    .config("spark.authenticate.secret", "SomeSecretKey")
    .getOrCreate()
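
Putting it together, here is a minimal self-contained sketch of a driver application for a remote standalone cluster (the master host, app name, and object name are illustrative placeholders):

import org.apache.spark.sql.SparkSession

object AuthenticatedApp {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("spark://your-master-host:7077") // placeholder master URL
      .appName("authenticated-app")
      .config("spark.authenticate", "true")
      .config("spark.authenticate.secret", "SomeSecretKey") // must match the cluster's secret
      .getOrCreate()

    // run a trivial job to confirm the cluster accepted the connection
    println(spark.range(1000).count())
    spark.stop()
  }
}

With this in place, a driver that connects with a missing or wrong secret should fail during the initial handshake rather than being allowed to run jobs.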
