Problem Description
I am trying to insert records into an Azure SQL Data Warehouse using Oracle ODI, but I am getting an error after some of the records have been inserted.
NOTE: I am trying to insert 1,000 records, but the error appears after 800.
Error Message: Caused by: java.sql.BatchUpdateException: 112007; Exceeded the memory limit of 20 MB per session for prepared statements. Reduce the number or size of the prepared statements.
Recommended Answer
While Abhijith's answer is technically correct, I'd like to suggest an alternative that will give you far better performance.
The root of your problem is that you've chosen the worst possible way to load a large volume of data into Azure SQL Data Warehouse. A long list of INSERT statements will perform very badly no matter how many DWUs you throw at it, because it is always a single-node operation.
My recommendation is to adapt your ODI process in the following way, assuming that your Oracle database is on-premises:
- Write your extract to a file.
- Call AZCOPY to move the file to Azure blob storage.
- CREATE EXTERNAL TABLE to map a view over the file in storage.
- CREATE TABLE AS or INSERT INTO to read from that view into your target table (a T-SQL sketch of these last two steps follows this list).
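Below is a minimal T-SQL sketch of steps 3 and 4, plus the one-time setup they depend on, assuming the extract landed as a comma-delimited file in a blob container. Every name in it (the credential, data source, file format, external table, container, and columns) is a hypothetical placeholder rather than anything from the original post.

```sql
-- One-time setup: a database master key is required before a scoped
-- credential can be created. Uncomment on first use.
-- CREATE MASTER KEY;

-- Credential for the storage account; for blob storage the IDENTITY string
-- is not used for authentication, only the SECRET (the storage account key).
CREATE DATABASE SCOPED CREDENTIAL BlobCredential
WITH IDENTITY = 'odi_load',
     SECRET   = '<storage-account-key>';

-- External data source pointing at the container AZCOPY uploaded the file to.
CREATE EXTERNAL DATA SOURCE OdiExtractBlob
WITH (
    TYPE       = HADOOP,
    LOCATION   = 'wasbs://extracts@mystorageaccount.blob.core.windows.net',
    CREDENTIAL = BlobCredential
);

-- File format describing the delimited extract file.
CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',', USE_TYPE_DEFAULT = TRUE)
);

-- Step 3: the external table is only metadata mapped over the file(s) in
-- storage; no data is moved yet.
CREATE EXTERNAL TABLE dbo.Records_Ext (
    Id        INT,
    Name      NVARCHAR(100),
    CreatedAt DATETIME2
)
WITH (
    LOCATION    = '/odi/',          -- folder that holds the extract file
    DATA_SOURCE = OdiExtractBlob,
    FILE_FORMAT = CsvFormat
);

-- Step 4: CTAS reads the file in parallel across all compute nodes.
CREATE TABLE dbo.Records
WITH (DISTRIBUTION = ROUND_ROBIN)
AS
SELECT Id, Name, CreatedAt
FROM dbo.Records_Ext;

-- Alternatively, append into an existing table:
-- INSERT INTO dbo.Records SELECT Id, Name, CreatedAt FROM dbo.Records_Ext;
```

Because PolyBase reads the external file in parallel across the compute nodes, this load scales with the DWUs you allocate instead of being bottlenecked on a single node, which is what makes it so much faster than row-by-row INSERTs.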
This will be orders of magnitude faster than your current approach.