
Limit Dask CPU and Memory Usage (Single Node)

I am running Dask on a single computer, where calling .compute() to perform a computation on a huge Parquet file causes Dask to use all of the CPU cores on the system.

import dask.dataframe as dd

df = dd.read_parquet(parquet_file)  # very large file
print(df.names.unique().compute())

Is it possible to configure Dask to use a specific number of CPU cores and limit its memory usage to, say, 32 GB? Using Python 3.7.2 and Dask 2.9.2.

Dask.distributed.Client creates a LocalCluster for which you can explicitly set the memory use and the number of cores.

import numpy as np
import pandas as pd
from dask.distributed import Client
from dask import dataframe as dd

def names_unique(x):
    return x['Names'].unique()

# One worker with two threads, capped at 2 GB of memory in total.
client = Client(memory_limit='2GB', processes=False,
                n_workers=1, threads_per_worker=2)

# Data generation
df = pd.DataFrame({'Names': np.random.choice(['A', 'B', 'C', 'D'], size=1000000),
                   'sales': np.arange(1000000)})
df.to_parquet('parq_df')
ddf = dd.read_parquet('parq_df')  # read_parquet takes no npartitions argument

# Custom computation: submit returns a future; .result() yields the lazy
# unique() Series, which .compute() then evaluates on the limited cluster.
sent = client.submit(names_unique, ddf)
names_unique = sent.result().compute()
client.close()

Output:

names_unique
Out[89]: 
0    D
1    B
2    C
3    A
Name: Names, dtype: object

