Problem Description
In my application I'm using CSVReader and Hibernate to import a large number of entities (1,500,000 or more) from a CSV file into the database. The code looks like this:
Session session = headerdao.getSessionFactory().openSession();
Transaction tx = session.beginTransaction();
int count = 0;
String[] nextLine;
while ((nextLine = reader.readNext()) != null) {
    try {
        if (nextLine.length == 23
                && Integer.parseInt(nextLine[0]) > lastIdInDB) {
            JournalHeader current = parseJournalHeader(nextLine);
            current.setChain(chain);
            session.save(current);
            count++;
            // flush, commit and clear every 100 rows to keep the first-level cache small
            if (count % 100 == 0) {
                session.flush();
                tx.commit();
                session.clear();
                tx.begin();
            }
            if (count % 10000 == 0) {
                LOG.info(count);
            }
        }
    } catch (NumberFormatException e) {
        e.printStackTrace();
    } catch (ParseException e) {
        e.printStackTrace();
    }
}
tx.commit();
session.close();
With large enough files (somewhere around 700,000 lines) I get an out-of-memory exception (heap space).
The problem seems to be Hibernate related, because if I comment out just the line session.save(current); it runs fine. With it uncommented, the task manager shows the memory usage of javaw growing steadily, then at some point the parsing gets very slow and the program crashes.
parseJournalHeader() does nothing special; it just parses an entity from the String[] that the CSV reader provides.
Recommended Answer
The Session keeps the objects it persists in its first-level cache. You are doing the right things to deal with the first-level cache, but there are more things that prevent garbage collection from happening.
Try to use a StatelessSession instead.
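For example, here is a minimal sketch of the same import loop rewritten around a StatelessSession (reusing headerdao, reader, chain, lastIdInDB, JournalHeader and parseJournalHeader from the question). A StatelessSession has no first-level cache, no dirty checking and no cascading, so inserted entities are not retained in memory:

StatelessSession session = headerdao.getSessionFactory().openStatelessSession();
Transaction tx = session.beginTransaction();
int count = 0;
String[] nextLine;
while ((nextLine = reader.readNext()) != null) {
    try {
        if (nextLine.length == 23
                && Integer.parseInt(nextLine[0]) > lastIdInDB) {
            JournalHeader current = parseJournalHeader(nextLine);
            current.setChain(chain);
            // insert() writes straight through to JDBC; the session keeps no reference to the entity
            session.insert(current);
            count++;
            if (count % 10000 == 0) {
                LOG.info(count);
            }
        }
    } catch (NumberFormatException e) {
        e.printStackTrace();
    } catch (ParseException e) {
        e.printStackTrace();
    }
}
tx.commit();
session.close();

Because nothing is cached, the periodic flush/clear bookkeeping is no longer needed (you can still commit in chunks if you want intermediate durability). Keep in mind that StatelessSession.insert() bypasses cascades, interceptors and the second-level cache, so any related entities have to be inserted explicitly.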