MySQL批量插入问题



Problem description



I'm having trouble getting Hibernate to perform a bulk insert on MySQL.

I'm using Hibernate 3.3 and MySQL 5.1.

At a high level, this is what's happening:

@Transactional
public Set<Long> doUpdate(Project project, IRepository externalSource) {
    List<IEntity> entities = externalSource.loadEntites();
    buildEntities(entities, project);
    persistEntities(project);
}

public void persistEntities(Project project) {
    projectDAO.update(project);
}

This results in n log entries, one separate INSERT statement for every row.

I'd like to see these inserts batched so the update is more performant. This routine could generate tens of thousands of rows, and a DB round trip per row is a killer.

Why isn't this getting batched? (It's my understanding that Hibernate is supposed to batch inserts by default where appropriate.)

Solution

As documented in Chapter 13, Batch processing, of the Hibernate reference guide, JDBC batching is not active unless you enable it explicitly by setting hibernate.jdbc.batch_size.
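As a sketch, the relevant settings could look like this (the Hibernate property names are from the 3.x reference guide; rewriteBatchedStatements is a MySQL Connector/J option that rewrites a JDBC batch into a single multi-row INSERT, and belongs on the JDBC URL rather than in Hibernate configuration):

```properties
# Enable JDBC batching in Hibernate; it is off when this is unset
hibernate.jdbc.batch_size=50
# Order inserts/updates by entity so batches are not broken up
hibernate.order_inserts=true
hibernate.order_updates=true
```

On the connection URL, something like jdbc:mysql://localhost/mydb?rewriteBatchedStatements=true (the host and database name here are placeholders). One caveat worth checking: Hibernate silently disables insert batching for entities that use the identity id generator, because it needs the generated key back after each insert.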

Don't forget to flush() and then clear() the session at regular intervals, or you'll get an OutOfMemoryException, as documented in section 13.1, Batch inserts.
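The pattern from that section looks roughly like this (a sketch, not runnable as-is: it assumes an open SessionFactory and a hypothetical Customer entity, and the flush interval should match hibernate.jdbc.batch_size):

```java
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
for (int i = 0; i < 100000; i++) {
    Customer customer = new Customer("name-" + i); // hypothetical entity
    session.save(customer);
    if (i % 50 == 0) { // 50 = same as the JDBC batch size
        // flush the current batch of inserts and
        // release the memory held by the session cache
        session.flush();
        session.clear();
    }
}
tx.commit();
session.close();
```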

But IMO, for tens of thousands of rows, you should consider using the StatelessSession interface.
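A StatelessSession has no first-level cache and performs no dirty checking, so nothing accumulates in memory and no flush/clear dance is needed; the trade-off is that cascades, interceptors, and automatic dirty checking don't apply. A sketch, again assuming a hypothetical Customer entity:

```java
StatelessSession session = sessionFactory.openStatelessSession();
Transaction tx = session.beginTransaction();
for (Customer c : customers) {
    // insert() issues the INSERT immediately;
    // no persistence context is maintained
    session.insert(c);
}
tx.commit();
session.close();
```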
