NetCDF-Java library error when reading HDF5 -


I am new to the NetCDF library, and I am using it to read metadata from an HDF5 file in Java. After some initial reading, I thought NetCDF was a decent enough library, so I went with it.

However, in the very first step, trying to open the file throws an error:

logger.debug("inside try");
//InputStream fileStream = new FileInputStream(h5File);
//parser.parse(fileStream, handler, metadata);
logger.debug("path :" + h5File.getPath());
NetcdfFile hf5File = NetcdfFile.open(h5File.getPath());
logger.debug("got netcdfile");

I assume the problem occurs when it tries to open the file; the log says:

inside try
13:42:04.393 [main] DEBUG e.k.n.c.m.e.Hdf5MetadataExtractor - path :/var/www/webdav/admin/1151/data/xxxx.h5
13:42:04.495 [main] DEBUG ucar.nc2.NetcdfFile - using IOSP ucar.nc2.iosp.hdf5.H5iosp
13:42:04.544 [main] ERROR ucar.nc2.iosp.hdf5.H5header - shape[0]=0 must > 0

My HDF5 file is a two-dimensional array of integers. I am not interested in the array as such, only in the metadata group associated with the file.

It is true that netCDF-4 creates HDF5 files. However, while the HDF5 library can read the HDF5-formatted files produced by netCDF-4, netCDF-4 cannot read arbitrary HDF5 files.

Either you have found a bug in NetCDF-Java, or you have an odd HDF5 file. Have you confirmed that the file is not corrupted in some way? Things to try:
- use the netCDF C utility 'ncdump -h' to inspect the header
- use the HDF5 C utility 'h5dump -H' to inspect the file via HDF5

If both of those commands give sensible output, the issue might rest with NetCDF-Java.
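As a quick sanity check before reaching for the C tools, you can verify that the file at least starts with the HDF5 superblock signature (the 8 bytes `\x89 H D F \r \n \x1a \n`). This is a minimal sketch in plain Java with no external dependencies; note that HDF5 also permits the superblock at 512-byte-multiple offsets when a user block is present, which this sketch does not handle:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Arrays;

public class Hdf5SignatureCheck {

    // The 8-byte signature that begins an HDF5 superblock at offset 0.
    private static final byte[] HDF5_SIGNATURE = {
        (byte) 0x89, 'H', 'D', 'F', '\r', '\n', 0x1a, '\n'
    };

    /** Returns true if the file begins with the HDF5 superblock signature. */
    public static boolean hasHdf5Signature(Path file) throws IOException {
        byte[] header = new byte[8];
        try (InputStream in = Files.newInputStream(file)) {
            if (in.readNBytes(header, 0, 8) < 8) {
                return false; // too short to contain the signature
            }
        }
        return Arrays.equals(header, HDF5_SIGNATURE);
    }

    public static void main(String[] args) throws IOException {
        Path p = Paths.get(args[0]);
        System.out.println(p + (hasHdf5Signature(p)
                ? " looks like an HDF5 file"
                : " does not start with the HDF5 signature"));
    }
}
```

If the signature is missing, the file was never valid HDF5 (or was truncated at the start), and no reader will help; if it is present, the `shape[0]=0` error points at the file's internal structure rather than outright corruption.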

