Problem Description
I am running a code that takes .hdf5 files (produced by a simulation) as input, analyzes them, and produces some statistics and plots. I launch it from the command line with python3 Collector.py in a Konsole shell on Fedora 21 Linux. I have many .py routines in two folders, gizmo and utilities, in the working directory. The snapshot_index.hdf5 files are transferred separately (using the Globus software) from the machine that runs the simulations into a local directory called output inside the working directory on my laptop. (There are many files, with "index" running from 0 to 600, but I only need two of them, e.g. snapshot_396.hdf5 and snapshot_600.hdf5.) The simulations are run in two different modes: low resolution and high resolution.
When the kilobyte-sized low-resolution .hdf5 files are the input to the part expression (inside the main Python code mentioned above), I can run the code and produce the results, but when I use the megabyte-sized high-resolution .hdf5 files as the input to the part expression, I receive the following error message:
# in utilities.simulation.Snapshot():
read snapshot_times.txt
reading snapshot index = 600, redshift = 0.000
# in gizmo.gizmo_io.Read():
reading header from: ./output/snapshot_600.hdf5
Traceback (most recent call last):
  File "Collector.py", line 12, in <module>
    part=gizmo.io.Read.read_snapshots('all', 'index', 600, element_indices=None)
  File "/home/username/Desktop/Projects/PaperMaterials/DM_Dominated_Objects/NewFolder2/covering_fractions/Simulations/gizmo/gizmo_io.py", line 314, in read_snapshots
    'index', snapshot_index, simulation_directory, snapshot_directory, simulation_name)
  File "/home/username/Desktop/Projects/PaperMaterials/DM_Dominated_Objects/NewFolder2/covering_fractions/Simulations/gizmo/gizmo_io.py", line 513, in read_header
    file_in = h5py.File(file_name, 'r')  # open hdf5 snapshot file
  File "/usr/lib/python3.4/site-packages/h5py/_hl/files.py", line 222, in __init__
    fid = make_fid(name, mode, userblock_size, fapl)
  File "/usr/lib/python3.4/site-packages/h5py/_hl/files.py", line 79, in make_fid
    fid = h5f.open(name, h5f.ACC_RDONLY, fapl=fapl)
  File "h5f.pyx", line 71, in h5py.h5f.open (h5py/h5f.c:1809)
OSError: Unable to open file (Truncated file: eof = 933756928, sblock->base_addr = 0, stored_eoa = 1765865624)
I do not understand what the error means. When I searched the topic, I noticed that this is not a new issue with .hdf5 files (Corrupt files when creating HDF5 files without closing them (h5py)), except that the reason the file cannot be opened is different in my case. From what I understand (not sure if correctly), the files are too big and hence get truncated. If that is the case, what is the solution? And if I am wrong, then what is the issue? Your help is greatly appreciated.
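
For reference, a minimal check I can run on the file (a sketch using the path from the traceback; if I read the error correctly, the "eof" value should be the number of bytes actually on disk):

import os
import h5py  # the same library the traceback fails in

snapshot = './output/snapshot_600.hdf5'  # path from the traceback
print(os.path.getsize(snapshot))  # should match eof = 933756928 from the error

try:
    file_in = h5py.File(snapshot, 'r')
except OSError as err:
    print(err)  # Unable to open file (Truncated file: ...)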
Recommended Answer
It appears your input file is corrupt. What happens if you run the command-line utilities h5dump, h5stat, or h5ls on your input file?
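
If it helps, here is a quick way to script that check from Python (a sketch, assuming the HDF5 command-line tools are installed, e.g. from the hdf5 package on Fedora):

import subprocess

snapshot = './output/snapshot_600.hdf5'
for cmd in (['h5ls', snapshot],
            ['h5stat', snapshot],
            ['h5dump', '-H', snapshot]):  # -H: print the header/metadata only
    try:
        subprocess.check_output(cmd, stderr=subprocess.STDOUT)
        print(cmd[0], 'can read the file')
    except subprocess.CalledProcessError as err:
        print(cmd[0], 'failed:', err.output.decode().strip())

If all three tools fail with a similar truncation error, the file itself is damaged, not your analysis code.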
I don't think your problem has anything to do with the designers of HDF5, or the other question you linked (which was about a program that crashed while writing an HDF5 file). Most likely your writing program has a bug. You can verify this by seeing if other valid HDF5 programs work with your files.
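
Since the snapshots were copied over with Globus, it is also worth ruling out an incomplete transfer: compare the file size and a checksum on both machines. A minimal sketch:

import hashlib

def md5sum(path, chunk_size=1 << 20):
    # Stream the file in 1 MiB chunks so large snapshots fit in memory.
    digest = hashlib.md5()
    with open(path, 'rb') as f:
        for block in iter(lambda: f.read(chunk_size), b''):
            digest.update(block)
    return digest.hexdigest()

# Run this on both the laptop and the simulation machine and compare the output.
print(md5sum('./output/snapshot_600.hdf5'))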
Your file appears to be less than 1 GB in size, which is not that huge for HDF5.
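
The two numbers in the error message say as much: roughly, eof is the size of the file actually on disk and stored_eoa is the size the HDF5 superblock expects the file to have. Interpreting them:

eof = 933756928          # bytes actually present on disk (~0.87 GiB)
stored_eoa = 1765865624  # bytes the superblock says should be there (~1.64 GiB)
print(stored_eoa - eof)  # 832108696 bytes (~832 MB) are missing

So the complete file would be about 1.6 GB, which is still nothing unusual for HDF5; the copy on disk simply ends early.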