
Spark UI - understanding the "Memory Used" metric

Could you please help me understand this metric on the Spark UI: Memory: 10 MB Used (552.6 GB Total)?

[screenshot of the Spark UI Executors page]

PartitionNumber.nbExecutors = conf.getInt("spark.executor.instances", 20)  // default to 20 executors
PartitionNumber.nbPartitions = PartitionNumber.nbExecutors * conf.getInt("spark.executor.cores", 2) * 3  // 3 partitions per core
conf.set("spark.sql.shuffle.partitions", PartitionNumber.nbPartitions.toString())

Is it correct that the memory used is 10 MB and the available memory 552.6 GB?

Any help or suggestions you could provide would be greatly appreciated.

Thanks

Yes: 552.6 GB is the total memory available across all executors, and 10 MB is the total memory currently used by all executors.

You can see it in the "Storage Memory" column (the memory available to a Spark executor for storing and caching RDDs/DataFrames). Each executor uses 169.1 KB out of 9.1 GB, and there are 61 executors:

61 * 169.1 KB ≈ 10 MB
61 * 9.1 GB ≈ 555 GB
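The arithmetic above can be sketched as a quick Scala sanity check. The per-executor figures (169.1 KB used, 9.1 GB available, 61 executors) are assumptions read off the screenshot, not values the Spark API returns:

```scala
// Sanity-check the Executors-tab arithmetic.
// Values below are read from the UI screenshot (assumptions, not API calls).
object StorageMemoryCheck extends App {
  val numExecutors = 61
  val usedPerExecutorKB = 169.1    // "Storage Memory" used per executor
  val availPerExecutorGB = 9.1     // "Storage Memory" available per executor

  val totalUsedMB = numExecutors * usedPerExecutorKB / 1024  // KB -> MB
  val totalAvailGB = numExecutors * availPerExecutorGB

  println(f"Total used:      $totalUsedMB%.1f MB")   // roughly 10 MB
  println(f"Total available: $totalAvailGB%.1f GB")  // roughly 555 GB
}
```

The small gap between 555 GB here and the 552.6 GB shown in the UI comes from rounding: the UI sums the exact per-executor byte counts, while 169.1 KB and 9.1 GB are already rounded display values.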

