
How much memory can be allocated to Cassandra in DSE with Spark enabled?

Currently my DSE Cassandra node uses up all of the available memory, so after some time, as the amount of data grows, the whole system crashes. But Spark, OpsCenter, the DataStax agent, etc. also need several GB of memory. I am now trying to allocate only half of the memory to Cassandra, but I am not sure whether that will work.
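What I have in mind is capping the Cassandra heap explicitly in cassandra-env.sh instead of letting it be auto-calculated, roughly along these lines (the 8G / 2G values below are just placeholders for about half of this node's RAM, not settings I have verified):

 # conf/cassandra-env.sh -- override the automatic heap sizing
 MAX_HEAP_SIZE="8G"    # placeholder: cap the Cassandra heap at ~half of system RAM
 HEAP_NEWSIZE="2G"     # placeholder: young-gen size; must be set whenever MAX_HEAP_SIZE is set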

This is my error message:

 kernel: Out of memory: Kill process 31290 (java) score 293 or sacrifice child

By default DSE sets the executor memory to (Total RAM) * 0.7 - (RAM used by C*). This should be fine for most systems. With this setup, Spark shouldn't be able to OOM C* or vice versa. If you want to change that multiplier (0.7), it's set in the dse.yaml file as

initial_spark_worker_resources: 0.7
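As a rough illustration of that formula (the 64 GB and 8 GB figures below are made up for the example, not taken from your setup):

 Total RAM = 64 GB, RAM used by C* = 8 GB
 Spark memory ≈ 0.7 * 64 GB - 8 GB ≈ 37 GB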

If I were going for the minimum amount of memory for the system, it would be 16 GB, but I would recommend at least 32 GB if you are serious. This should be increased even further if you are doing a lot of in-memory caching.
