Opened 20 years ago
Closed 16 years ago
#35 closed defect (fixed)
Introduce large file support for NetCDF to allow files larger than 2 GByte
| Reported by: | ole | Owned by: | rwilson |
|---|---|---|---|
| Priority: | normal | Milestone: | AnuGA ready for release |
| Component: | Efficiency and optimisation | Version: | 1.0 |
| Severity: | normal | Keywords: | NetCDF |
| Cc: | | | |
Description
The 2 GB barrier can be removed by moving to NetCDF v3.6 and its 64-bit offset storage format (even on 32-bit platforms). See
http://www.unidata.ucar.edu/software/netcdf/docs/faq.html#Large%20File%20Support4
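For illustration, a minimal sketch of what this looks like from Python, assuming the netCDF4 package is available; the file name, dimensions and variable are placeholders, not AnuGA's actual .sww schema:

```python
# Sketch only: create a classic-format file with 64-bit offsets so it can
# grow past 2 GB. Names below are illustrative, not AnuGA's .sww layout.
from netCDF4 import Dataset

nc = Dataset('large_output.sww', 'w', format='NETCDF3_64BIT_OFFSET')
nc.createDimension('number_of_points', 1000)
nc.createDimension('number_of_timesteps', None)   # record (unlimited) dimension
nc.createVariable('stage', 'f4', ('number_of_timesteps', 'number_of_points'))
nc.close()
```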
Change History (8)
comment:1 Changed 19 years ago by
Priority: normal → high
comment:2 Changed 19 years ago by
Owner: changed from someone to duncan
comment:3 Changed 19 years ago by
Currently, there is some functionality in data_manager.py for splitting NetCDF files exceeding 2 GB, but it does not have a unit test. In fact, the max_size argument to the class Data_format_sww, which is supposed to control the split, isn't even passed down from get_dataobject or store_connectivity, where one would want to control this, especially if one were to write a small test with a small value for max_size.
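For reference, here is a self-contained sketch of the kind of size-triggered rollover described above. This is not AnuGA's actual data_manager.py code; the function names, file layout and the default max_size are assumptions, and it uses the netCDF4 Python package purely for illustration.

```python
# Sketch only: write records to numbered part files, starting a new part
# whenever the current file grows beyond max_size bytes.
import os
from netCDF4 import Dataset

def open_part(basename, part, n_points):
    """Create part file '<basename>_<part>.sww' with an illustrative layout."""
    fname = '%s_%d.sww' % (basename, part)
    nc = Dataset(fname, 'w', format='NETCDF3_64BIT_OFFSET')
    nc.createDimension('number_of_points', n_points)
    nc.createDimension('number_of_timesteps', None)   # record dimension
    nc.createVariable('stage', 'f4', ('number_of_timesteps', 'number_of_points'))
    return fname, nc

def store_timesteps(basename, timesteps, n_points, max_size=2**20):
    """Write one 'stage' array per timestep, rolling over to a new part file
    once the current one exceeds max_size bytes."""
    part, record = 0, 0
    fname, nc = open_part(basename, part, n_points)
    for stage in timesteps:
        nc.variables['stage'][record, :] = stage
        record += 1
        nc.sync()                                      # flush so the size check is accurate
        if os.path.getsize(fname) > max_size:          # roll over to the next part
            nc.close()
            part, record = part + 1, 0
            fname, nc = open_part(basename, part, n_points)
    nc.close()
```

A small max_size (as here, 1 MB) is what makes such behaviour cheap to exercise in a unit test.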
comment:4 Changed 19 years ago by
Priority: high → low
comment:5 Changed 17 years ago by
Owner: changed from duncan to rwilson
Priority: low → normal
comment:6 Changed 17 years ago by
It appears that recent changes in NetCDF (beyond the 64-bit offset format changes) allow large dataset file sizes:
"With the netCDF-4/HDF5 format size limitations are further relaxed, and files can be as large as the underlying file system supports."
http://www.unidata.ucar.edu/software/netcdf/docs/netcdf.html#index-limitations-of-netCDF-61
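A minimal sketch of selecting that format from Python, assuming the netCDF4 package; the file and variable names are placeholders:

```python
# Sketch only: the netCDF-4/HDF5 format is limited by the file system,
# not by the classic format's 32-bit offsets.
from netCDF4 import Dataset

nc = Dataset('results.nc', 'w', format='NETCDF4')
nc.createDimension('number_of_timesteps', None)
nc.createVariable('stage', 'f8', ('number_of_timesteps',), zlib=True)  # optional compression
nc.close()
```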
comment:7 Changed 16 years ago by
Status: new → assigned
comment:8 Changed 16 years ago by
Resolution: → fixed
Status: assigned → closed
Duncan, would you mind putting this on *your* list of things to do?