
Data Mechanics Spark Docker image - example of how to use the connectors built into the image

I came across the Docker image for Spark linked below. The image also ships with connectors for some popular cloud services. An example of how to use one of the built-in connectors (say, Azure Data Lake Storage Gen2) in a PySpark application would be a great help.
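For what it's worth, here is a minimal sketch of how such a built-in connector is typically used, assuming the image bundles the hadoop-azure (ABFS) jars on Spark's classpath and that you authenticate with a storage-account access key. The account, container, key, and file names below are placeholders, not values from the image's documentation.

```python
def abfss_path(container: str, account: str, path: str) -> str:
    """Build an ADLS Gen2 URI in the form the hadoop-azure (ABFS) connector expects."""
    return f"abfss://{container}@{account}.dfs.core.windows.net/{path}"


def build_session(account: str, access_key: str):
    """Create a SparkSession configured for account-key auth against ADLS Gen2.

    No extra --packages flag should be needed if the connector jars really do
    ship inside the image, as the Docker Hub page suggests.
    """
    # Import kept local so the URI helper above works without pyspark installed.
    from pyspark.sql import SparkSession

    return (
        SparkSession.builder.appName("adls-gen2-example")
        # The spark.hadoop. prefix copies the key into the Hadoop configuration.
        # Service-principal/OAuth auth works the same way, just with the
        # fs.azure.account.oauth* keys instead of the account key.
        .config(
            f"spark.hadoop.fs.azure.account.key.{account}.dfs.core.windows.net",
            access_key,
        )
        .getOrCreate()
    )


if __name__ == "__main__":
    # Hypothetical placeholder values -- replace with your own.
    spark = build_session("mystorageaccount", "<storage-account-access-key>")
    df = spark.read.csv(
        abfss_path("mycontainer", "mystorageaccount", "input/data.csv"),
        header=True,
    )
    df.show()
```

The key point is that with the connector pre-installed, nothing image-specific is required in the code: you only set the `fs.azure.*` credentials and read from an `abfss://` URI.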

Link to the Docker Hub image: https://hub.docker.com/r/datamechanics/spark

I looked into the example below, but it didn't help much in understanding how to use the connector that ships with the default image: https://github.com/datamechanics/examples/blob/main/pyspark-example/main.py
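For running a local script against the image, something like the following is the usual pattern. This is a sketch under assumptions: that `spark-submit` lives at the standard `/opt/spark/bin/` path inside the image, and that a suitable tag is picked from the Docker Hub page (the `:latest` tag here is a placeholder).

```shell
# Mount the current directory into the container and submit main.py with the
# image's own Spark installation; the bundled connector jars are already on
# the classpath, so no --packages flag is passed.
docker run --rm \
  -v "$(pwd)":/opt/app \
  datamechanics/spark:latest \
  /opt/spark/bin/spark-submit /opt/app/main.py
```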

There is some more documentation at https://docs.datamechanics.co/docs/docker-images, but it is not very helpful for understanding how to actually use the images. The fact that there is no Dockerfile, and no response to reported issues, makes this very difficult.

It looks like https://g1thubhub.github.io/docker.html is helpful, although the image versions it uses are older.

