Problem Description
I want to read a huge amount of data from a CSV file containing around 500,000 rows. I am using the OpenCSV library for it. My code looks like this:
CsvToBean<User> csvConvertor = new CsvToBean<User>();
List<User> list = null;
try {
    // parse() reads the whole file and materializes every row as a User bean in one list
    list = csvConvertor.parse(strategy, new BufferedReader(new FileReader(filepath)));
} catch (FileNotFoundException e) {
    e.printStackTrace();
}
Up to about 200,000 records the data is read into the list of User beans, but for anything larger than that I get
java.lang.OutOfMemoryError: Java heap space
I have these memory settings in the "eclipse.ini" file:
-Xms256m
-Xmx1024m
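Note that eclipse.ini sizes the heap of the Eclipse IDE process itself; a program launched from Eclipse runs in its own JVM, whose heap is set through the VM arguments of its Run Configuration. A minimal sketch (plain Java, class name is illustrative) for checking how much heap the launched application actually gets:

public class HeapCheck {
    public static void main(String[] args) {
        // Prints the maximum heap this JVM will try to use; if it stays near the default
        // despite -Xmx1024m, the flag is not reaching the application's JVM
        System.out.println("Max heap: " + Runtime.getRuntime().maxMemory() / (1024 * 1024) + " MB");
    }
}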
I am considering splitting the huge file into separate files and reading those one by one, but that seems like a lengthy workaround.
Is there any other way to avoid the OutOfMemoryError exception?
Recommended Answer
Read the file line by line instead of loading everything into memory at once. Something like this:
// Stream the file one record at a time instead of building the full list in memory
try (CSVReader reader = new CSVReader(new FileReader("yourfile.csv"))) {
    String[] nextLine;
    while ((nextLine = reader.readNext()) != null) {
        // nextLine[] is an array of values from the current line
        System.out.println(nextLine[0] + nextLine[1] + "etc...");
    }
}
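Since the question maps rows to a User bean, the same line-by-line approach can include the bean mapping. Below is a minimal sketch, assuming a hypothetical User bean with a name column and an email column in that order; the class layout, column indices, and the process() handler are illustrative, not taken from the original post:

import com.opencsv.CSVReader;   // in OpenCSV 2.x the package is au.com.bytecode.opencsv
import java.io.FileReader;

public class UserCsvStreaming {

    // Minimal stand-in for the question's User bean; the fields are assumptions
    static class User {
        String name;
        String email;
    }

    public static void main(String[] args) throws Exception {
        try (CSVReader reader = new CSVReader(new FileReader("yourfile.csv"))) {
            String[] nextLine;
            while ((nextLine = reader.readNext()) != null) {
                // Map one record to a bean and hand it off immediately,
                // so only a single User is held in memory at any time
                User user = new User();
                user.name = nextLine[0];   // column order is an assumption
                user.email = nextLine[1];  // column order is an assumption
                process(user);
            }
        }
    }

    // Hypothetical per-record handler: write to a database, aggregate a statistic, etc.
    private static void process(User user) {
        // placeholder
    }
}

On OpenCSV 4.x and later, CsvToBean also exposes an iterator, so rows can be converted to beans lazily instead of collecting them all with parse().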