This article looks at how to optimize MySQL for inserting millions of rows.

Question

I need to insert millions of rows into a MySQL database (InnoDB engine). Time becomes a problem once the tables get large; almost all of it is spent on the insert queries. Does anyone know how to optimize this?

Answer

To import a large bulk of data into InnoDB:

  1. Set in the MySQL configuration:

  • innodb_doublewrite = 0
  • innodb_buffer_pool_size = 50% or more of system memory
  • innodb_log_file_size = 512M
  • log-bin = 0
  • innodb_support_xa = 0
  • innodb_flush_log_at_trx_commit = 0
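
For reference, the settings above belong in the `[mysqld]` section of `my.cnf`. This is a sketch of how that might look (the concrete buffer pool value is an assumption for a machine with 16 GB of RAM; `log-bin = 0` disables binary logging and `innodb_doublewrite = 0` trades crash safety for speed, so these suit a one-off bulk load, not a production replica):

```ini
[mysqld]
# Skip the doublewrite buffer during the bulk load (crash-safety trade-off)
innodb_doublewrite = 0
# 50% or more of system memory; 8G assumed here for a 16 GB machine
innodb_buffer_pool_size = 8G
innodb_log_file_size = 512M
# Disable binary logging and XA support for the duration of the import
log-bin = 0
innodb_support_xa = 0
# Flush the log roughly once per second instead of on every commit
innodb_flush_log_at_trx_commit = 0
```

Note that these require a server restart to take effect, so they are best applied on a dedicated import run and reverted afterwards.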

Add right after the transaction starts:

SET FOREIGN_KEY_CHECKS = 0;

SET UNIQUE_CHECKS = 0;

SET AUTOCOMMIT = 0;

Set right before the transaction ends:

SET UNIQUE_CHECKS = 1;

SET FOREIGN_KEY_CHECKS = 1;
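
The session-level steps above can be sketched as a small Python helper that wraps batched multi-row INSERTs in those statements. The function name, table, and columns are hypothetical, and it only builds the statement strings rather than executing them; values are inlined with `repr()` purely for illustration, so real code should send parameterized queries through a MySQL driver instead:

```python
def bulk_insert_statements(table, columns, rows, batch_size=1000):
    """Build the statement sequence from the answer around batched INSERTs.

    NOTE: table/columns/rows are hypothetical examples; values are inlined
    with repr() only for illustration -- real code must use parameterized
    queries through a MySQL driver to avoid SQL injection.
    """
    # Statements the answer says to issue right after the transaction starts
    stmts = [
        "START TRANSACTION;",
        "SET FOREIGN_KEY_CHECKS = 0;",
        "SET UNIQUE_CHECKS = 0;",
        "SET AUTOCOMMIT = 0;",
    ]
    # Group rows into multi-row INSERTs to cut per-statement overhead
    for i in range(0, len(rows), batch_size):
        batch = rows[i:i + batch_size]
        values = ", ".join(
            "(" + ", ".join(repr(v) for v in row) + ")" for row in batch
        )
        stmts.append(
            f"INSERT INTO {table} ({', '.join(columns)}) VALUES {values};"
        )
    # Statements the answer says to issue right before the transaction ends
    stmts += [
        "SET UNIQUE_CHECKS = 1;",
        "SET FOREIGN_KEY_CHECKS = 1;",
        "COMMIT;",
    ]
    return stmts
```

Batching many rows per INSERT is an extra optimization on top of the answer's settings: one large statement amortizes parsing and network round trips across the whole batch.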
