
Why is my laptop struggling with large data sets? 16 GB of RAM defeated by a 1.5 GB CSV

I just started a new job and was given a Dell XPS 13 7390 laptop, and it is really struggling with large data files and processing.

I'm currently working with a 1.5 GB CSV, and I get a memory error when I try to open it with Python in a Jupyter Notebook:

Error tokenizing data. C error: out of memory

I was sure I'd opened files like this with ease on my personal laptop, a 10-year-old MacBook, so I tested the same file there and it opened fine.

Why is my Dell laptop struggling despite having plenty of RAM available? Could settings be adjusted to allocate more memory to Jupyter Notebooks? And what tests could I run to investigate further?
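
One quick test worth running: check whether the Jupyter kernel is a 32-bit or a 64-bit Python build, since a 32-bit process can only address a few gigabytes no matter how much RAM the machine has. A minimal check using only the standard library:

import platform
import sys

# A 64-bit build reports '64bit' and a maxsize of 2**63 - 1;
# a 32-bit build reports '32bit' and 2**31 - 1.
print(platform.architecture()[0])
print(sys.maxsize > 2**32)  # True only on a 64-bit build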

Hardware details are below. The obvious difference is processor speed; does that explain it?

Dell laptop:

RAM: 16 GB

Processor: Intel Core i7-10510U CPU @ 1.80GHz

MacBook:

RAM: 4 GB

Processor: 2.7 GHz Intel Core i7

Code used to open the file:

import pandas as pd
data = pd.read_csv('data.csv')
data.shape

The resulting shape is (2250493, 218).
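
For a file this size it can also help to shrink the load itself rather than reading everything at once. A sketch using standard pandas options (the column names and chunk size here are hypothetical, just to illustrate):

import pandas as pd

# Read the file in chunks of up to 100,000 rows instead of all at once;
# each chunk is an ordinary DataFrame you can process and discard.
chunks = pd.read_csv('data.csv', chunksize=100_000)
total_rows = sum(len(chunk) for chunk in chunks)

# Or load only the columns you need, with narrower dtypes:
subset = pd.read_csv('data.csv', usecols=['col_a', 'col_b'],
                     dtype={'col_a': 'float32'})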

It turned out I had 32-bit Python installed... I've switched to 64-bit and it works fine.
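
That explains the numbers: a back-of-the-envelope estimate of the DataFrame's size (assuming, as a rough lower bound, that every column parses as an 8-byte float64) already exceeds what a 32-bit process can address:

rows, cols = 2250493, 218
bytes_needed = rows * cols * 8      # float64 = 8 bytes per cell
print(bytes_needed / 1024**3)       # ~3.66 GiB, before any parser overhead

A 32-bit Windows process typically gets about 2 GB of usable address space (4 GB at most), so the C parser runs out of memory long before the 16 GB of physical RAM matters; a 64-bit process has no such ceiling.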
