
Neo4j Importing CSV File (neo.TransientError.General.OutOfMemoryError)

Good day. I am new to neo4j and taking time to explore things; however, I got stuck loading data from a CSV.

I am trying to load 200k rows using the LOAD CSV function with the periodic commit option, but it takes a long time to load and then fails with neo.TransientError.General.OutOfMemoryError. Please find the code I am using to load it:

USING PERIODIC COMMIT 500
LOAD CSV WITH HEADERS FROM "file:///C:/tree.csv" AS Real
MERGE (P:person {name: Real.ParentNode})
MERGE (C1:person {name: Real.ChildNode})
MERGE (P)-[REL:relation_of]->(C1)
RETURN P, C1, REL

As per a community blog post, I have changed the heap size of my database to 16G, but there is still no change in the outcome (it is taking too much time to run). I would appreciate your inputs on how to overcome this error.
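For reference, the heap change was made in neo4j.conf along these lines (a sketch assuming the Neo4j 3.x/4.x setting names; the 16G value is the one mentioned above):

# neo4j.conf -- heap settings referenced above (3.x/4.x key names)
dbms.memory.heap.initial_size=16G
dbms.memory.heap.max_size=16G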

Thanks much!

The first thing to check would be whether you have set indexes on your 'unique' properties. In this example, those would be on person.name.

Information on indexes is here.
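A minimal sketch, assuming Neo4j 3.x/4.0 syntax (newer releases use CREATE CONSTRAINT ... FOR ... REQUIRE instead of ON ... ASSERT):

// A uniqueness constraint on :person(name) creates a backing index,
// which the MERGE (:person {name: ...}) lookups can then use.
CREATE CONSTRAINT ON (p:person) ASSERT p.name IS UNIQUE;

// Or, if uniqueness should not be enforced, a plain index is enough:
// CREATE INDEX ON :person(name);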

The next thing to be aware of is eager queries. Mark Needham has done an excellent post on this. He describes splitting out parts of your LOAD CSV query to avoid out-of-memory challenges; applied to the query above, that could look like the sketch below.
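A minimal sketch of that split, assuming the same file and headers as in the question (each statement is run separately, and the final RETURN is dropped so 200k rows are not streamed back to the client):

// Pass 1: create the parent nodes only
USING PERIODIC COMMIT 500
LOAD CSV WITH HEADERS FROM "file:///C:/tree.csv" AS Real
MERGE (:person {name: Real.ParentNode});

// Pass 2: create the child nodes only
USING PERIODIC COMMIT 500
LOAD CSV WITH HEADERS FROM "file:///C:/tree.csv" AS Real
MERGE (:person {name: Real.ChildNode});

// Pass 3: with all nodes in place, match them and merge the relationships
USING PERIODIC COMMIT 500
LOAD CSV WITH HEADERS FROM "file:///C:/tree.csv" AS Real
MATCH (p:person {name: Real.ParentNode})
MATCH (c:person {name: Real.ChildNode})
MERGE (p)-[:relation_of]->(c);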
