authorEven Rouault <even.rouault@spatialys.com>2020-01-28 16:16:22 +0100
committerEven Rouault <even.rouault@spatialys.com>2020-01-28 17:07:49 +0100
commit6de85e3690dab9377205adb2678aa4555a342b3c (patch)
treeb0efc83d23c58e9f58cd64a1e2735ce6fa4581b5 /grid_tools
parent324311619cf94ed024526b160282d9ae0d5a9e37 (diff)
downloadPROJ-data-6de85e3690dab9377205adb2678aa4555a342b3c.tar.gz
PROJ-data-6de85e3690dab9377205adb2678aa4555a342b3c.zip
Add grid_tools/
Diffstat (limited to 'grid_tools')
-rw-r--r--grid_tools/README.md185
-rwxr-xr-xgrid_tools/check_gtiff_grid.py726
-rwxr-xr-xgrid_tools/cloud_optimize_gtiff.py390
-rwxr-xr-xgrid_tools/convert_gr3df97a.py146
-rwxr-xr-xgrid_tools/convert_gr3dnc01b.py121
-rwxr-xr-xgrid_tools/convert_gr3dnc03a.py121
-rwxr-xr-xgrid_tools/ntv2_to_gtiff.py517
-rwxr-xr-xgrid_tools/vertoffset_grid_to_gtiff.py389
8 files changed, 2595 insertions, 0 deletions
diff --git a/grid_tools/README.md b/grid_tools/README.md
new file mode 100644
index 0000000..f8cab3c
--- /dev/null
+++ b/grid_tools/README.md
@@ -0,0 +1,185 @@
+# Grid tools
+
+This directory contains a set of Python scripts to convert grids to
+the [Geodetic TIFF grids (GTG)](https://github.com/OSGeo/PROJ/blob/master/docs/source/specifications/geodetictiffgrids.rst)
+format, and to check them.
+
+## Requirements
+
+GDAL >= 3.1 (currently master) is required to write geoid models with GeoTIFF 1.1
+encoding of Geographic 3D CRS. Otherwise, GDAL 2.4/3.0 can be used to produce
+horizontal shift grids.
+
+The "osgeo/gdal:alpine-normal-latest" docker image documented at
+https://github.com/OSGeo/gdal/tree/master/gdal/docker can be used for that purpose.
+The default "osgeo/gdal" image can also be used; it is larger, using an Ubuntu base,
+whereas "osgeo/gdal:alpine-normal-latest" uses a smaller Alpine Linux base.
+
+For Windows, the development version of [GisInternals builds](http://gisinternals.com/development.php)
+can also be used, or the "gdal-dev-python" package of [OSGeo4W](https://trac.osgeo.org/osgeo4w/).
+
+Python 3 is suggested, but the scripts should still be compatible with Python 2.7.
+
+## Converting from NTv2 to GTG
+
+With the [ntv2_to_gtiff.py](ntv2_to_gtiff.py) script.
+
+Usage:
+```
+usage: ntv2_to_gtiff.py [-h] --source-crs SOURCE_CRS --target-crs TARGET_CRS
+ --copyright COPYRIGHT [--description DESCRIPTION]
+ [--do-not-write-accuracy-samples]
+ [--positive-longitude-shift-value {east,west}]
+ [--uint16-encoding] [--datetime DATETIME]
+ [--accuracy-unit {arc-second,metre,unknown}]
+ [--area-of-use AREA_OF_USE]
+ source dest
+```
+
+Options:
+
+* --source-crs SOURCE_CRS: the source CRS, as an "EPSG:XXXX" value or a CRS WKT string. Mandatory. Must be a Geographic 2D CRS.
+* --target-crs TARGET_CRS: the target CRS, as an "EPSG:XXXX" value or a CRS WKT string. Mandatory. Must be a Geographic 2D CRS.
+* --copyright COPYRIGHT: Attribution and license to be stored in the TIFF Copyright tag. Mandatory
+* --area-of-use AREA_OF_USE: To specify a textual area of use for the grid. For example "France", "Germany - Saxony". Strongly recommended.
+* --description DESCRIPTION: Text describing the grid. If not provided, a default value will be automatically set.
+ For example "NAD27 (EPSG:4267) to NAD83 (EPSG:4269). Converted from ntv2_0.gsb (last updated on 1995-07-05)"
+* --do-not-write-accuracy-samples: To prevent accuracy samples from the NTv2 grid from being written in the output. Note: if those samples are detected to be set to dummy values (negative values), they will automatically be discarded
+* --positive-longitude-shift-value {east,west}: To force the convention for the longitude shift value. NTv2 uses a positive-is-west convention, which is confusing. By default, the script will negate the sign of the longitude shift values to output a positive-is-east convention. Setting this option to "west" will preserve the original convention. Not recommended
+* --uint16-encoding: Whether values should be encoded as 16-bit unsigned integers, using floating point offset and scale values. The default behaviour is to use Float32 encoding, which preserves the binary values of the original NTv2 file
+* --datetime DATETIME: to specify the value of the TIFF DateTime tag. Must be formatted as "YYYY:MM:DD HH:MM:SS" (note the use of colon as the separator for the Y-M-D part, as mandated by the TIFF specification). If not specified, the script will try to use the value from the corresponding NTv2 header
+* --accuracy-unit {arc-second,metre,unknown}: to specify the unit of the accuracy samples. The NTv2 specification has been [historically interpreted in different ways regarding that](https://github.com/OSGeo/PROJ/wiki/Units-of-NTv2-accuracy-samples-%3F). Mandatory if accuracy samples are written (the script contains a few hardcoded rules for known datasets)
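The DateTime rule above is easy to get wrong, since TIFF mandates colons rather than dashes in the date part. A minimal standard-library sketch of the validation (the function name is illustrative, not part of the scripts):

```python
import datetime
import re


def is_valid_tiff_datetime(value):
    """Return True if value matches the TIFF DateTime format 'YYYY:MM:DD HH:MM:SS'."""
    m = re.match(r'^(\d{4}):(\d{2}):(\d{2}) (\d{2}):(\d{2}):(\d{2})$', value)
    if not m:
        return False
    year, month, day, hour, minute, sec = (int(v) for v in m.groups())
    try:
        # Rejects impossible dates such as month 13 or hour 25
        datetime.datetime(year, month, day, hour, minute, sec)
    except ValueError:
        return False
    return True


print(is_valid_tiff_datetime('1995:07:05 00:00:00'))  # True
print(is_valid_tiff_datetime('1995-07-05 00:00:00'))  # False: dashes in the date part
```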
+
+Example:
+
+```
+python3 ntv2_to_gtiff.py \
+ --area-of-use "Canada" \
+ --copyright "Derived from work by Natural Resources Canada. Open Government Licence - Canada: http://open.canada.ca/en/open-government-licence-canada" \
+ --source-crs EPSG:4267 \
+ --target-crs EPSG:4269 \
+ ntv2_0.gsb ca_nrc_ntv2_0.tif
+```
+
+Using the Docker image:
+
+```
+docker run --rm -v /home:/home osgeo/gdal:alpine-normal-latest python3 $PWD/ntv2_to_gtiff.py \
+ --area-of-use "Canada" \
+ --copyright "Derived from work by Natural Resources Canada. Open Government Licence - Canada: http://open.canada.ca/en/open-government-licence-canada" \
+ --source-crs EPSG:4267 \
+ --target-crs EPSG:4269 \
+ $PWD/ntv2_0.gsb $PWD/ca_nrc_ntv2_0.tif
+```
+
+## Converting from GTX to GTG
+
+With the [vertoffset_grid_to_gtiff.py](vertoffset_grid_to_gtiff.py) script.
+
+Usage:
+```
+usage: vertoffset_grid_to_gtiff.py [-h] --type
+ {GEOGRAPHIC_TO_VERTICAL,VERTICAL_TO_VERTICAL}
+ --source-crs SOURCE_CRS
+ [--interpolation-crs INTERPOLATION_CRS]
+ --target-crs TARGET_CRS --copyright
+ COPYRIGHT [--description DESCRIPTION]
+ [--encoding {float32,uint16,int32-scale-1-1000}]
+ [--ignore-nodata] [--datetime DATETIME]
+ [--area-of-use AREA_OF_USE]
+ source dest
+```
+
+
+Options:
+* --type {GEOGRAPHIC_TO_VERTICAL,VERTICAL_TO_VERTICAL}: specify the type of the grid. GEOGRAPHIC_TO_VERTICAL is for geoid-like grids transforming between a vertical CRS and ellipsoid heights. The value in the grid must be the offset to add to the height in the vertical CRS to get an ellipsoidal height. VERTICAL_TO_VERTICAL is for transformations between 2 vertical CRS: the value in the grid must be the offset to add to heights in the source CRS to get heights in the target CRS.
+* --source-crs SOURCE_CRS: the source CRS, as an "EPSG:XXXX" value or a CRS WKT string. Mandatory. For type=GEOGRAPHIC_TO_VERTICAL, this must be a Geographic 3D CRS. For type=VERTICAL_TO_VERTICAL, this must be a Vertical CRS.
+* --interpolation-crs INTERPOLATION_CRS: the geographic CRS to which the grid is referenced. This is ignored for type=GEOGRAPHIC_TO_VERTICAL (the interpolation CRS is the source CRS), but mandatory for type=VERTICAL_TO_VERTICAL
+* --target-crs TARGET_CRS: the target CRS, as an "EPSG:XXXX" value or a CRS WKT string. Mandatory. Must be a vertical CRS.
+* --area-of-use AREA_OF_USE: To specify a textual area of use for the grid. For example "France", "Germany - Saxony". Strongly recommended.
+* --description DESCRIPTION: Text describing the grid. If not provided, a default value will be automatically set.
+  For example "WGS 84 (EPSG:4326) to EGM96 height (EPSG:5773). Converted from egm96_15.gtx"
+* --encoding {float32,uint16,int32-scale-1-1000}: How values should be encoded. Defaults to float32. If specifying uint16, values will be encoded as 16-bit unsigned integers, using floating point offset and scale values. If specifying int32-scale-1-1000, values will be encoded as 32-bit signed integers with a scaling factor of 1000 (this is the encoding naturally used by the .byn format of Canadian vertical shift grids)
+* --ignore-nodata: whether the nodata value declared in the source should be ignored and treated as a valid value. This might be useful because there is an ambiguity in the GTX format: in some circumstances, the -88.8888 value must be interpreted as a nodata value, in others as a valid value. If this option is not specified, the script will print a warning when encountering a grid value that matches the nodata value, so as to call for manual verification.
+* --datetime DATETIME: to specify the value of the TIFF DateTime tag. Must be formatted as "YYYY:MM:DD HH:MM:SS" (note the use of colon as the separator for the Y-M-D part, as mandated by the TIFF specification). If not specified, the script will use the modification date of the source file.
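The uint16 and int32-scale-1-1000 encodings both store integers alongside floating point offset/scale metadata. This is a sketch of the general offset/scale idea only; the exact rounding and edge-case handling in the scripts may differ:

```python
def uint16_encode(values):
    """Map floating point values onto integers in [0, 65535] via offset and scale."""
    vmin, vmax = min(values), max(values)
    offset = vmin
    scale = (vmax - vmin) / 65535.0 if vmax != vmin else 1.0
    return [int(round((v - offset) / scale)) for v in values], offset, scale


def uint16_decode(encoded, offset, scale):
    """Recover approximate floating point values; the error is at most scale / 2."""
    return [offset + scale * e for e in encoded]


enc, offset, scale = uint16_encode([-2.5, 0.0, 13.75])
print(uint16_decode(enc, offset, scale))  # values close to the originals
```

The float32 encoding avoids this quantization entirely, at the cost of a larger file before compression.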
+
+Example:
+
+```
+python3 vertoffset_grid_to_gtiff.py \
+ --type GEOGRAPHIC_TO_VERTICAL \
+ --area-of-use "World" \
+ --source-crs EPSG:4326 \
+ --target-crs EPSG:5773 \
+ --copyright "Public Domain. Derived from work at http://earth-info.nga.mil/GandG/wgs84/gravitymod/egm96/egm96.html" \
+ egm96_15.gtx us_nga_egm96_15.tif
+```
+
+Using the Docker image:
+
+```
+docker run --rm -v /home:/home osgeo/gdal:alpine-normal-latest python3 $PWD/vertoffset_grid_to_gtiff.py \
+ --type GEOGRAPHIC_TO_VERTICAL \
+ --area-of-use "World" \
+ --source-crs EPSG:4326 \
+ --target-crs EPSG:5773 \
+ --copyright "Public Domain. Derived from work at http://earth-info.nga.mil/GandG/wgs84/gravitymod/egm96/egm96.html" \
+ $PWD/egm96_15.gtx $PWD/us_nga_egm96_15.tif
+```
+
+## Checking compliance of GeoTIFF files with the GTG specification
+
+With the [check_gtiff_grid.py](check_gtiff_grid.py) script.
+
+Usage:
+```
+usage: check_gtiff_grid.py [-h] filename
+```
+
+If the script outputs no messages and returns the success exit code, then everything
+is perfect.
+
+Otherwise, it will emit messages in one or more of the following three categories:
+
+* information: optional metadata missing, or extra metadata elements found. No corrective action is required
+* warning: the file will probably be read correctly by PROJ, but corrective action may be required. Addressing those warnings is recommended
+* error: the file will not be read correctly by PROJ. Corrective action is required. The return code of the script will be an error code.
+
+Example of output on non-conformant file:
+```
+/home/even/gdal/git/gdal/gdal/byte.tif is NOT a valid PROJ GeoTIFF file.
+The following errors were found (corrective action is required):
+ - CRS found, but not a Geographic CRS
+ - Datatype Byte of band 1 is not supported
+
+The following warnings were found (the file will probably be read correctly by PROJ, but corrective action may be required):
+ - This file uses a RasterTypeGeoKey = PixelIsArea convention. While this will work properly with PROJ, a georeferencing using RasterTypeGeoKey = PixelIsPoint is more common.
+ - Missing TYPE metadata item
+ - GDAL area_of_use metadata item is missing. Typically used to capture plain text information about where the grid applies
+ - TIFF tag ImageDescription is missing. Typically used to capture plain text information about what the grid does
+ - TIFF tag Copyright is missing. Typically used to capture information on the source and licensing terms
+ - TIFF tag DateTime is missing. Typically used to capture when the grid has been generated/converted
+ - Image is uncompressed. Could potentially benefit from compression
+
+The following informative messages are issued (no corrective action required):
+ - Metadata LAYER_TYPE=athematic on band 1 ignored
+```
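The three categories map to a simple rule: only errors make the return code non-zero. A simplified sketch of that reporting logic (illustrative only, not the script's actual code):

```python
def report(infos, warnings, errors):
    """Print the three message categories and return the implied exit code:
    informative messages and warnings do not fail the check, errors do."""
    for msg in errors:
        print(' - error: %s' % msg)
    for msg in warnings:
        print(' - warning: %s' % msg)
    for msg in infos:
        print(' - info: %s' % msg)
    return 1 if errors else 0


# A file with only warnings is still considered readable by PROJ
exit_code = report([], ['Missing TYPE metadata item'], [])
print(exit_code)  # 0
```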
+
+## Optimize an existing GeoTIFF file
+
+With the [cloud_optimize_gtiff.py](cloud_optimize_gtiff.py) script.
+
+Usage:
+```
+usage: cloud_optimize_gtiff.py [-h] source dest
+```
+
+This script takes an existing GeoTIFF file and optimizes its layout, so that
+headers are put at the beginning of the file. It is used internally by the
+ntv2_to_gtiff.py and vertoffset_grid_to_gtiff.py scripts.
+
+This can be useful as the final step when preparing a file with the GDAL API / tools
+such as [gdal_translate](https://gdal.org/programs/gdal_translate.html) or
+[gdal_edit.py](https://gdal.org/programs/gdal_edit.html).
+
+Note: this script will not compress data. This must be done in a prior step.
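A cloud-friendly layout essentially means that the TIFF directories sit near the start of the file, so a remote reader can fetch the metadata with a single small range request. For classic (non-BigTIFF) files, the offset of the first image file directory (IFD) can be read from the 8-byte TIFF header with the standard library alone (the helper name here is ours, not part of the scripts):

```python
import struct


def first_ifd_offset(tiff_bytes):
    """Return the offset of the first IFD of a classic TIFF file.

    Bytes 0-1 hold the byte order ('II' little endian, 'MM' big endian),
    bytes 2-3 the magic number 42, and bytes 4-7 the first IFD offset.
    """
    order = tiff_bytes[0:2]
    if order == b'II':
        endian = '<'
    elif order == b'MM':
        endian = '>'
    else:
        raise ValueError('Not a TIFF file')
    magic, offset = struct.unpack(endian + 'HI', tiff_bytes[2:8])
    if magic != 42:
        raise ValueError('Not a classic TIFF file')
    return offset


# A minimal little endian header whose first IFD starts at byte 8
header = b'II' + struct.pack('<HI', 42, 8)
print(first_ifd_offset(header))  # 8
```

A large offset indicates that the IFDs were written at the end of the file, which is what this optimization script rearranges.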
diff --git a/grid_tools/check_gtiff_grid.py b/grid_tools/check_gtiff_grid.py
new file mode 100755
index 0000000..f5b086c
--- /dev/null
+++ b/grid_tools/check_gtiff_grid.py
@@ -0,0 +1,726 @@
+#!/usr/bin/env python
+###############################################################################
+# $Id$
+#
+# Project: PROJ
+# Purpose: Check that a GeoTIFF grid meets the requirements/recommendation of RFC4
+# Author: Even Rouault <even.rouault at spatialys.com>
+#
+###############################################################################
+# Copyright (c) 2019, Even Rouault <even.rouault at spatialys.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining a
+# copy of this software and associated documentation files (the "Software"),
+# to deal in the Software without restriction, including without limitation
+# the rights to use, copy, modify, merge, publish, distribute, sublicense,
+# and/or sell copies of the Software, and to permit persons to whom the
+# Software is furnished to do so, subject to the following conditions:
+#
+# The above copyright notice and this permission notice shall be included
+# in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
+# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+# DEALINGS IN THE SOFTWARE.
+###############################################################################
+
+from osgeo import gdal
+from osgeo import osr
+import argparse
+import datetime
+import re
+import sys
+
+
+def get_args():
+ parser = argparse.ArgumentParser(
+ description='Check a PROJ GeoTIFF.')
+ parser.add_argument('filename',
+ help='GeoTIFF file')
+
+ return parser.parse_args()
+
+
+def get_srs(ds, epsg_code_key, wkt_key, is_first_subds, infos, warnings, errors):
+
+ epsg_code = ds.GetMetadataItem(epsg_code_key)
+ wkt = None
+ if epsg_code:
+ try:
+ epsg_code = int(epsg_code)
+ except ValueError:
+            # Emit the warning before clearing the value, so the message
+            # shows the invalid code rather than None
+            warnings.append('%s=%s is not a valid EPSG code' %
+                            (epsg_code_key, epsg_code))
+            epsg_code = None
+ else:
+ wkt = ds.GetMetadataItem(wkt_key)
+ if not wkt and is_first_subds:
+ warnings.append('Missing %s / %s' % (epsg_code_key, wkt_key))
+
+ srs = None
+ if epsg_code:
+ srs = osr.SpatialReference()
+ if srs.ImportFromEPSG(epsg_code) != 0:
+ errors.append('%s=%s is not a valid EPSG code' %
+ (epsg_code_key, str(epsg_code)))
+ elif wkt:
+ srs = osr.SpatialReference()
+ if srs.ImportFromWkt(wkt) != 0:
+ errors.append('%s=%s is invalid' % (wkt_key, wkt))
+
+ return srs
+
+
+def validate_horizontal_offset(ds, is_first_subds):
+
+ infos = []
+ warnings = []
+ errors = []
+
+ target_crs = get_srs(ds, 'target_crs_epsg_code', 'target_crs_wkt',
+ is_first_subds, infos, warnings, errors)
+ if target_crs:
+ if not target_crs.IsGeographic():
+ warnings.append("target_crs found, but not a Geographic CRS")
+
+ if ds.RasterCount < 2:
+ return infos, warnings, ["TYPE=HORIZONTAL_OFFSET should have at least 2 bands"]
+
+ lat_offset_idx = 0
+ lon_offset_idx = 0
+ lat_accuracy_idx = 0
+ lon_accuracy_idx = 0
+ for i in range(ds.RasterCount):
+ b = ds.GetRasterBand(i+1)
+ desc = b.GetDescription()
+ if desc == 'latitude_offset':
+ if lat_offset_idx > 0:
+                return infos, warnings, ["At least 2 bands are tagged with Description = latitude_offset"]
+ lat_offset_idx = i+1
+ elif desc == 'longitude_offset':
+ if lon_offset_idx > 0:
+                return infos, warnings, ["At least 2 bands are tagged with Description = longitude_offset"]
+ lon_offset_idx = i+1
+ elif desc == 'latitude_offset_accuracy':
+ lat_accuracy_idx = i+1
+ elif desc == 'longitude_offset_accuracy':
+ lon_accuracy_idx = i+1
+ elif desc:
+ infos.append('Band of type %s not recognized by PROJ' % desc)
+
+ if lat_offset_idx > 0 and lon_offset_idx > 0:
+ if lat_offset_idx != 1 or lon_offset_idx != 2:
+ infos.append(
+ 'Usually the first band should be latitude_offset and the second longitude_offset.')
+ elif lat_offset_idx > 0 or lon_offset_idx > 0:
+        return infos, warnings, ["One of the bands is tagged with Description = latitude_offset/longitude_offset but not the other one"]
+ else:
+ if is_first_subds:
+ warnings.append(
+ 'No explicit bands tagged with Description = latitude_offset and longitude_offset. Assuming first one is latitude_offset and second one longitude_offset')
+ lat_offset_idx = 1
+ lon_offset_idx = 2
+
+ for idx in (lat_offset_idx, lon_offset_idx):
+ band = ds.GetRasterBand(idx)
+ if band.GetNoDataValue():
+ warnings.append(
+                "One of the latitude_offset/longitude_offset bands has a nodata setting. Nodata for horizontal shift grids is ignored by PROJ")
+ units = band.GetUnitType()
+ if not units:
+ if is_first_subds:
+ warnings.append(
+                    "One of the latitude_offset/longitude_offset bands is missing units description. arc-second will be assumed")
+ elif units not in ('arc-second', 'degree', 'radians'):
+ errors.append(
+                "One of the latitude_offset/longitude_offset bands is using a unit not supported by PROJ")
+
+ positive_value = ds.GetRasterBand(
+ lon_offset_idx).GetMetadataItem('positive_value')
+ if not positive_value:
+ if is_first_subds:
+ warnings.append(
+                "The longitude_offset band should include a positive_value=west/east metadata item, to avoid any ambiguity w.r.t. the NTv2 original convention. 'east' will be assumed")
+ elif positive_value not in ('west', 'east'):
+ errors.append("positive_value=%s not supported by PROJ" %
+ positive_value)
+
+ if lat_accuracy_idx > 0 and lon_accuracy_idx > 0:
+ if lat_accuracy_idx != 3 or lon_accuracy_idx != 4:
+ infos.append(
+ 'Usually the third band should be latitude_offset_accuracy and the fourth longitude_offset_accuracy.')
+ elif lat_accuracy_idx > 0 or lon_accuracy_idx > 0:
+ warnings.append(
+            "One of the bands is tagged with Description = latitude_offset_accuracy/longitude_offset_accuracy but not the other one")
+ elif ds.RasterCount >= 4:
+ lat_accuracy_idx = 3
+ lon_accuracy_idx = 4
+
+ if lat_accuracy_idx > 0 and lon_accuracy_idx > 0:
+ for idx in (lat_accuracy_idx, lon_accuracy_idx):
+ if idx == 0:
+ continue
+ units = ds.GetRasterBand(idx).GetUnitType()
+ if not units:
+ if is_first_subds:
+ warnings.append(
+                        "One of the latitude_offset_accuracy/longitude_offset_accuracy bands is missing units description.")
+ elif units not in ('arc-second', 'degree', 'radians', 'metre', 'unknown'):
+ errors.append(
+                "One of the latitude_offset_accuracy/longitude_offset_accuracy bands is using a unit not supported by PROJ")
+
+ return infos, warnings, errors
+
+
+def validate_vertical_offset_geographic_to_vertical(ds, is_first_subds):
+
+ infos = []
+ warnings = []
+ errors = []
+
+ target_crs = get_srs(ds, 'target_crs_epsg_code', 'target_crs_wkt',
+ is_first_subds, infos, warnings, errors)
+ if target_crs:
+ if not target_crs.IsVertical():
+ errors.append("target_crs found, but not a Vertical CRS")
+
+ offset_idx = 0
+ for i in range(ds.RasterCount):
+ b = ds.GetRasterBand(i+1)
+ desc = b.GetDescription()
+ if desc == 'geoid_undulation':
+ if offset_idx > 0:
+                return infos, warnings, ["At least 2 bands are tagged with Description = geoid_undulation"]
+ offset_idx = i+1
+ elif desc:
+ infos.append('Band of type %s not recognized by PROJ' % desc)
+
+ if offset_idx == 0:
+ if is_first_subds:
+ warnings.append(
+ 'No explicit band tagged with Description = geoid_undulation. Assuming first one')
+ offset_idx = 1
+
+ units = ds.GetRasterBand(offset_idx).GetUnitType()
+ if not units:
+ if is_first_subds:
+ warnings.append(
+ "geoid_undulation band is missing units description. Metre will be assumed")
+ elif units not in ('metre', ):
+ errors.append(
+ "geoid_undulation band is using a unit not supported by PROJ")
+
+ return infos, warnings, errors
+
+
+def validate_vertical_offset_vertical_to_vertical(ds, is_first_subds):
+
+ infos = []
+ warnings = []
+ errors = []
+
+ source_crs = get_srs(ds, 'source_crs_epsg_code', 'source_crs_wkt',
+ is_first_subds, infos, warnings, errors)
+ if source_crs:
+ if not source_crs.IsVertical():
+ errors.append("source_crs found, but not a Vertical CRS")
+
+ target_crs = get_srs(ds, 'target_crs_epsg_code', 'target_crs_wkt',
+ is_first_subds, infos, warnings, errors)
+ if target_crs:
+ if not target_crs.IsVertical():
+ errors.append("target_crs found, but not a Vertical CRS")
+
+ offset_idx = 0
+ for i in range(ds.RasterCount):
+ b = ds.GetRasterBand(i+1)
+ desc = b.GetDescription()
+ if desc == 'vertical_offset':
+ if offset_idx > 0:
+                return infos, warnings, ["At least 2 bands are tagged with Description = vertical_offset"]
+ offset_idx = i+1
+ elif desc:
+ infos.append('Band of type %s not recognized by PROJ' % desc)
+
+ if offset_idx == 0:
+ if is_first_subds:
+ warnings.append(
+ 'No explicit band tagged with Description = vertical_offset. Assuming first one')
+ offset_idx = 1
+
+ units = ds.GetRasterBand(offset_idx).GetUnitType()
+ if not units:
+ if is_first_subds:
+ warnings.append(
+ "vertical_offset band is missing units description. Metre will be assumed")
+ elif units not in ('metre', ):
+ errors.append(
+ "vertical_offset band is using a unit not supported by PROJ")
+
+ return infos, warnings, errors
+
+
+def validate_geocentric_translation(ds, is_first_subds):
+
+ infos = []
+ warnings = []
+ errors = []
+
+ source_crs = get_srs(ds, 'source_crs_epsg_code', 'source_crs_wkt',
+ is_first_subds, infos, warnings, errors)
+ if source_crs:
+ if not source_crs.IsGeocentric():
+ errors.append("source_crs found, but not a geocentric CRS")
+
+ target_crs = get_srs(ds, 'target_crs_epsg_code', 'target_crs_wkt',
+ is_first_subds, infos, warnings, errors)
+ if target_crs:
+ if not target_crs.IsGeocentric():
+ errors.append("target_crs found, but not a geocentric CRS")
+
+ if ds.RasterCount < 3:
+ return infos, warnings, ["TYPE=GEOCENTRIC_TRANSLATION should have at least 3 bands"]
+
+ x_idx = 0
+ y_idx = 0
+ z_idx = 0
+ for i in range(ds.RasterCount):
+ b = ds.GetRasterBand(i+1)
+ desc = b.GetDescription()
+ if desc == 'x_translation':
+ if x_idx > 0:
+                return infos, warnings, ["At least 2 bands are tagged with Description = x_translation"]
+ x_idx = i+1
+ elif desc == 'y_translation':
+ if y_idx > 0:
+                return infos, warnings, ["At least 2 bands are tagged with Description = y_translation"]
+ y_idx = i+1
+ elif desc == 'z_translation':
+ if z_idx > 0:
+                return infos, warnings, ["At least 2 bands are tagged with Description = z_translation"]
+ z_idx = i+1
+ elif desc:
+ infos.append('Band of type %s not recognized by PROJ' % desc)
+
+ if x_idx > 0 and y_idx > 0 and z_idx > 0:
+ if x_idx != 1 or y_idx != 2 or z_idx != 3:
+ infos.append(
+ 'Usually the first, second and third band should be respectively x_translation, y_translation, z_translation')
+ elif x_idx > 0 or y_idx > 0 or z_idx > 0:
+        return infos, warnings, ["Some but not all of the bands are tagged with Description = x_translation/y_translation/z_translation"]
+ else:
+ if is_first_subds:
+ warnings.append('No explicit bands tagged with Description = x_translation/y_translation/z_translation. Assuming the first, second and third band are respectively x_translation, y_translation, z_translation')
+ x_idx = 1
+ y_idx = 2
+ z_idx = 3
+
+ for idx in (x_idx, y_idx, z_idx):
+ if idx == 0:
+ continue
+ units = ds.GetRasterBand(idx).GetUnitType()
+ if not units:
+ if is_first_subds:
+ warnings.append(
+                    "One of the x_translation/y_translation/z_translation bands is missing units description. Metre will be assumed")
+ elif units not in ('metre',):
+ errors.append(
+                "One of the x_translation/y_translation/z_translation bands is using a unit not supported by PROJ")
+
+ return infos, warnings, errors
+
+
+def validate_velocity(ds, is_first_subds):
+
+ infos = []
+ warnings = []
+ errors = []
+
+ if ds.RasterCount < 3:
+ return infos, warnings, ["TYPE=VELOCITY should have at least 3 bands"]
+
+ x_idx = 0
+ y_idx = 0
+ z_idx = 0
+ for i in range(ds.RasterCount):
+ b = ds.GetRasterBand(i+1)
+ desc = b.GetDescription()
+ if desc == 'east_velocity':
+ if x_idx > 0:
+                return infos, warnings, ["At least 2 bands are tagged with Description = east_velocity"]
+ x_idx = i+1
+ elif desc == 'north_velocity':
+ if y_idx > 0:
+                return infos, warnings, ["At least 2 bands are tagged with Description = north_velocity"]
+ y_idx = i+1
+ elif desc == 'up_velocity':
+ if z_idx > 0:
+                return infos, warnings, ["At least 2 bands are tagged with Description = up_velocity"]
+ z_idx = i+1
+ elif desc and desc not in ('east_velocity_accuracy', 'north_velocity_accuracy', 'up_velocity_accuracy'):
+ infos.append('Band of type %s not recognized by PROJ' % desc)
+
+ if x_idx > 0 and y_idx > 0 and z_idx > 0:
+ if x_idx != 1 or y_idx != 2 or z_idx != 3:
+ infos.append(
+ 'Usually the first, second and third band should be respectively east_velocity, north_velocity, up_velocity')
+ elif x_idx > 0 or y_idx > 0 or z_idx > 0:
+        return infos, warnings, ["Some but not all of the bands are tagged with Description = east_velocity/north_velocity/up_velocity"]
+ else:
+ if is_first_subds:
+ warnings.append('No explicit bands tagged with Description = east_velocity/north_velocity/up_velocity. Assuming the first, second and third band are respectively east_velocity, north_velocity, up_velocity')
+ x_idx = 1
+ y_idx = 2
+ z_idx = 3
+
+ for idx in (x_idx, y_idx, z_idx):
+ if idx == 0:
+ continue
+ units = ds.GetRasterBand(idx).GetUnitType()
+ if not units:
+ if is_first_subds:
+ warnings.append(
+                    "One of the east_velocity/north_velocity/up_velocity bands is missing units description. 'millimetres per year' will be assumed")
+ elif units not in ('millimetres per year',):
+ errors.append(
+                "One of the east_velocity/north_velocity/up_velocity bands is using a unit not supported by PROJ")
+
+ return infos, warnings, errors
+
+
+class GlobalInfo(object):
+ def __init__(self):
+ self.map_grid_name_to_grid = {}
+ self.map_grid_name_to_children = {}
+
+
+def get_extent(ds):
+ gt = ds.GetGeoTransform()
+ xmin = gt[0] + gt[1] / 2
+ ymax = gt[3] + gt[5] / 2
+ xmax = gt[0] + gt[1] * ds.RasterXSize - gt[1] / 2
+ ymin = gt[3] + gt[5] * ds.RasterYSize - gt[5] / 2
+ return xmin, ymin, xmax, ymax
+
+
+def validate_ifd(global_info, ds, is_first_subds, first_subds):
+
+ infos = []
+ warnings = []
+ errors = []
+
+ wkt = ds.GetProjectionRef()
+ srs = None
+ if wkt and not wkt.startswith('LOCAL_CS['):
+ srs = osr.SpatialReference()
+ if srs.ImportFromWkt(wkt) != 0:
+ srs = None
+
+ if not srs:
+ if is_first_subds:
+ errors.append("No CRS found in the GeoTIFF encoding")
+ else:
+ if not srs.IsGeographic():
+ errors.append("CRS found, but not a Geographic CRS")
+
+ if ds.GetMetadataItem('AREA_OR_POINT') != 'Point':
+ warnings.append("This file uses a RasterTypeGeoKey = PixelIsArea convention. While this will work properly with PROJ, a georeferencing using RasterTypeGeoKey = PixelIsPoint is more common.")
+
+ gt = ds.GetGeoTransform(can_return_null=True)
+ if not gt:
+ errors.append("No geotransform matrix found")
+ else:
+ if gt[2] != 0 or gt[4] != 0:
+ errors.append("Geotransform with rotation terms not supported")
+ if gt[1] < 0:
+ errors.append(
+ "Geotransform with a negative pixel width not supported")
+ if gt[1] == 0:
+ errors.append("Geotransform with a zero pixel width")
+ if gt[5] > 0:
+ warnings.append(
+                "Geotransform with a positive pixel height, that is a south-up image, is supported, but an unusual formulation")
+ if gt[5] == 0:
+ errors.append("Geotransform with a zero pixel height")
+
+ type = ds.GetMetadataItem('TYPE')
+ if not type:
+ if is_first_subds:
+ warnings.append("Missing TYPE metadata item")
+ else:
+ type = first_subds.GetMetadataItem('TYPE')
+ elif type not in ('HORIZONTAL_OFFSET',
+ 'VERTICAL_OFFSET_GEOGRAPHIC_TO_VERTICAL',
+ 'VERTICAL_OFFSET_VERTICAL_TO_VERTICAL',
+ 'GEOCENTRIC_TRANSLATION',
+ 'VELOCITY'):
+        warnings.append("TYPE=%s is not recognized by PROJ" % type)
+
+ if is_first_subds:
+ if not ds.GetMetadataItem('area_of_use'):
+ warnings.append(
+ "GDAL area_of_use metadata item is missing. Typically used to capture plain text information about where the grid applies")
+
+ if not ds.GetMetadataItem('TIFFTAG_IMAGEDESCRIPTION'):
+ warnings.append(
+ "TIFF tag ImageDescription is missing. Typically used to capture plain text information about what the grid does")
+
+ if not ds.GetMetadataItem('TIFFTAG_COPYRIGHT'):
+ warnings.append(
+ "TIFF tag Copyright is missing. Typically used to capture information on the source and licensing terms")
+
+ dt = ds.GetMetadataItem('TIFFTAG_DATETIME')
+ if not dt:
+ warnings.append(
+ "TIFF tag DateTime is missing. Typically used to capture when the grid has been generated/converted")
+ else:
+ m = re.match(
+ r'^(\d\d\d\d):(\d\d):(\d\d) (\d\d):(\d\d):(\d\d)$', dt)
+ wrong_dt_format = False
+ if not m:
+ wrong_dt_format = True
+ else:
+ year, month, day, hour, min, sec = (int(v) for v in m.groups())
+ if not (year >= 1980 and year <= datetime.datetime.now().year):
+ wrong_dt_format = True
+ else:
+ try:
+ datetime.datetime(year, month, day, hour, min, sec)
+                    except ValueError:
+ wrong_dt_format = True
+
+ if wrong_dt_format:
+ warnings.append(
+ "TIFF tag DateTime malformed. Should be YYYY:MM:DD HH:MM:SS")
+
+ compression = ds.GetMetadataItem('COMPRESSION', 'IMAGE_STRUCTURE')
+ if not compression:
+ warnings.append(
+ 'Image is uncompressed. Could potentially benefit from compression')
+ elif compression not in ('LZW', 'DEFLATE'):
+ warnings.append(
+ 'Image uses %s compression. Might cause compatibility problems' % compression)
+
+ if type == 'HORIZONTAL_OFFSET':
+ i, w, e = validate_horizontal_offset(ds, is_first_subds)
+ infos += i
+ warnings += w
+ errors += e
+ elif type == 'VERTICAL_OFFSET_GEOGRAPHIC_TO_VERTICAL':
+ i, w, e = validate_vertical_offset_geographic_to_vertical(
+ ds, is_first_subds)
+ infos += i
+ warnings += w
+ errors += e
+ elif type == 'VERTICAL_OFFSET_VERTICAL_TO_VERTICAL':
+ i, w, e = validate_vertical_offset_vertical_to_vertical(
+ ds, is_first_subds)
+ infos += i
+ warnings += w
+ errors += e
+ elif type == 'GEOCENTRIC_TRANSLATION':
+ i, w, e = validate_geocentric_translation(ds, is_first_subds)
+ infos += i
+ warnings += w
+ errors += e
+ elif type == 'VELOCITY':
+ i, w, e = validate_velocity(ds, is_first_subds)
+ infos += i
+ warnings += w
+ errors += e
+
+ grid_name = ds.GetMetadataItem('grid_name')
+ if grid_name:
+ if grid_name in global_info.map_grid_name_to_grid:
+ errors.append(
+ 'Several subgrids with grid_name=%s found' % grid_name)
+ global_info.map_grid_name_to_grid[grid_name] = ds
+
+ parent_grid_name = ds.GetMetadataItem('parent_grid_name')
+ if parent_grid_name:
+ if not grid_name:
+ errors.append('Grid has parent_grid_name, but not grid_name')
+
+ if parent_grid_name not in global_info.map_grid_name_to_children:
+ global_info.map_grid_name_to_children[parent_grid_name] = [ds]
+ else:
+ global_info.map_grid_name_to_children[parent_grid_name].append(ds)
+
+ if parent_grid_name not in global_info.map_grid_name_to_grid:
+ errors.append(
+ 'Parent grid named %s not already encountered' % parent_grid_name)
+ else:
+ parent_ds = global_info.map_grid_name_to_grid[parent_grid_name]
+ parent_xmin, parent_ymin, parent_xmax, parent_ymax = get_extent(
+ parent_ds)
+ xmin, ymin, xmax, ymax = get_extent(ds)
+ if not (xmin+1e-10 >= parent_xmin and ymin+1e-10 >= parent_ymin and xmax-1e-10 <= parent_xmax and ymax-1e-10 <= parent_ymax):
+ errors.append('Extent (%f,%f,%f,%f) of grid %s is not inside its parent %s extent (%f,%f,%f,%f)' % (
+ xmin, ymin, xmax, ymax, grid_name if grid_name else 'unknown', parent_grid_name, parent_xmin, parent_ymin, parent_xmax, parent_ymax))
+
+    # Check for well known metadata item names
+ md = ds.GetMetadata_Dict()
+ md_keys = md.keys()
+ for key in md_keys:
+ if key not in ('AREA_OR_POINT', 'TYPE',
+ 'area_of_use',
+ 'grid_name',
+ 'parent_grid_name',
+ 'source_crs_epsg_code', 'source_crs_wkt',
+ 'target_crs_epsg_code', 'target_crs_wkt',
+ 'interpolation_crs_wkt',
+ 'recommended_interpolation_method',
+ 'TIFFTAG_COPYRIGHT',
+ 'TIFFTAG_DATETIME',
+ 'TIFFTAG_IMAGEDESCRIPTION',
+ 'number_of_nested_grids'):
+ infos.append('Metadata %s=%s ignored' % (key, md[key]))
+
+ for i in range(ds.RasterCount):
+ b = ds.GetRasterBand(i+1)
+ md = b.GetMetadata_Dict()
+ md_keys = md.keys()
+ for key in md_keys:
+            if key not in ('positive_value',):
+ infos.append('Metadata %s=%s on band %d ignored' %
+ (key, md[key], i+1))
+
+ if b.DataType not in (gdal.GDT_Int16, gdal.GDT_UInt16,
+ gdal.GDT_Int32, gdal.GDT_UInt32,
+ gdal.GDT_Float32, gdal.GDT_Float64):
+            errors.append('Datatype %s of band %d is not supported' %
+ (gdal.GetDataTypeName(b.DataType), i+1))
+
+ if b.GetMetadataItem('NBITS'):
+ errors.append('NBITS != native bit depth not supported')
+
+ gdal.ErrorReset()
+ b.Checksum()
+ if gdal.GetLastErrorMsg() != '':
+ errors.append('Cannot read values of band %d' % (i+1))
+
+ block_width, block_height = ds.GetRasterBand(1).GetBlockSize()
+ if ds.RasterYSize > 512:
+ if block_width > 512 or block_height > 512:
+ warnings.append(
+                'Given the image dimensions, tiling would be appropriate')
+
+ return infos, warnings, errors
+
+
+def validate(filename):
+
+ infos = []
+ warnings = []
+ errors = []
+ ds = gdal.Open(filename)
+ global_info = GlobalInfo()
+
+ if not ds:
+ return infos, warnings, ["Cannot open file"]
+
+ if ds.GetDriver().ShortName != 'GTiff':
+ return infos, warnings, ["Not a TIFF file"]
+
+ subds_list = ds.GetSubDatasets()
+ if not subds_list:
+ subds_list = [(filename, None)]
+ first_subds = None
+
+ ifd_offset = int(ds.GetRasterBand(1).GetMetadataItem('IFD_OFFSET', 'TIFF'))
+ if ifd_offset > 100:
+ warnings.append(
+            'Offset of first IFD is %d > 100. Not a cloud friendly layout' % ifd_offset)
+ last_ifd_offset = ifd_offset
+
+ block_offsets = []
+
+ # Iterate through IFDs
+ for idx, subds_tuple in enumerate(subds_list):
+
+ subds_name = subds_tuple[0]
+
+ sub_ds = gdal.Open(subds_name)
+        if not sub_ds:
+ return infos, warnings, ["Cannot open subdataset of IFD %d" % idx]
+
+ sub_ds_infos, sub_ds_warnings, sub_ds_errors = validate_ifd(
+ global_info, sub_ds, idx == 0, first_subds)
+
+ ifd_offset = int(sub_ds.GetRasterBand(
+ 1).GetMetadataItem('IFD_OFFSET', 'TIFF'))
+ if idx > 0 and ifd_offset < last_ifd_offset:
+            warnings.append('Offset of IFD %d is %d, before offset %d of previous IFD' % (
+                idx, ifd_offset, last_ifd_offset))
+ last_ifd_offset = ifd_offset
+ block_offsets.append(int(sub_ds.GetRasterBand(
+ 1).GetMetadataItem('BLOCK_OFFSET_0_0', 'TIFF')))
+
+ if len(subds_list) == 1:
+ return sub_ds_infos, sub_ds_warnings, sub_ds_errors
+
+ if sub_ds_infos:
+ for i in sub_ds_infos:
+ infos.append('IFD %d: %s' % (idx, i))
+
+ if sub_ds_warnings:
+ for w in sub_ds_warnings:
+ warnings.append('IFD %d: %s' % (idx, w))
+
+ if sub_ds_errors:
+ for e in sub_ds_errors:
+ errors.append('IFD %d: %s' % (idx, e))
+
+ if first_subds is None:
+ first_subds = gdal.Open(subds_name)
+
+ for idx, offset in enumerate(block_offsets):
+ if offset < last_ifd_offset:
+ warnings.append(
+ 'Data for IFD %d is not located after the last IFD' % idx)
+
+ # Check value of number_of_nested_grids metadata item
+ for grid_name, children in global_info.map_grid_name_to_children.items():
+ ds = global_info.map_grid_name_to_grid[grid_name]
+ number_of_nested_grids = ds.GetMetadataItem('number_of_nested_grids')
+ if not number_of_nested_grids:
+            warnings.append('Grid %s is missing number_of_nested_grids=%d' % (
+ grid_name, len(children)))
+ elif number_of_nested_grids != str(len(children)):
+ warnings.append('Grid %s contains number_of_nested_grids=%s, but should be %d' % (
+ grid_name, number_of_nested_grids, len(children)))
+
+ return infos, warnings, errors
+
+
+if __name__ == '__main__':
+
+ args = get_args()
+
+ infos, warnings, errors = validate(args.filename)
+
+ if errors:
+ print('%s is NOT a valid PROJ GeoTIFF file.' % args.filename)
+ print('The following errors were found (corrective action is required):')
+ for error in errors:
+ print(' - ' + error)
+ print('')
+
+ if warnings:
+ print('The following warnings were found (the file will probably be read correctly by PROJ, but corrective action may be required):')
+ for warning in warnings:
+ print(' - ' + warning)
+ print('')
+
+ if infos:
+ print(
+            'The following informative messages are issued (no corrective action required):')
+ for info in infos:
+ print(' - ' + info)
+ print('')
+
+ sys.exit(1 if errors else 0)
diff --git a/grid_tools/cloud_optimize_gtiff.py b/grid_tools/cloud_optimize_gtiff.py
new file mode 100755
index 0000000..b8ef92c
--- /dev/null
+++ b/grid_tools/cloud_optimize_gtiff.py
@@ -0,0 +1,390 @@
+#!/usr/bin/env python
+###############################################################################
+# $Id$
+#
+# Project: PROJ
+# Purpose: Cloud optimize a GeoTIFF file
+# Author: Even Rouault <even.rouault at spatialys.com>
+#
+###############################################################################
+# Copyright (c) 2019, Even Rouault <even.rouault at spatialys.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining a
+# copy of this software and associated documentation files (the "Software"),
+# to deal in the Software without restriction, including without limitation
+# the rights to use, copy, modify, merge, publish, distribute, sublicense,
+# and/or sell copies of the Software, and to permit persons to whom the
+# Software is furnished to do so, subject to the following conditions:
+#
+# The above copyright notice and this permission notice shall be included
+# in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
+# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+# DEALINGS IN THE SOFTWARE.
+###############################################################################
+
+from osgeo import gdal
+
+import argparse
+import os
+import struct
+
+
+def get_args():
+ parser = argparse.ArgumentParser(
+ description='Convert a GeoTIFF file into a cloud optimized one.')
+ parser.add_argument('source',
+ help='Source GeoTIFF file')
+ parser.add_argument('dest',
+ help='Destination GeoTIFF file')
+
+ return parser.parse_args()
+
+
+def generate_optimized_file(srcfilename, destfilename):
+
+ TIFF_BYTE = 1 # 8-bit unsigned integer
+ TIFF_ASCII = 2 # 8-bit bytes w/ last byte null
+ TIFF_SHORT = 3 # 16-bit unsigned integer
+ TIFF_LONG = 4 # 32-bit unsigned integer
+ TIFF_RATIONAL = 5 # 64-bit unsigned fraction
+ TIFF_SBYTE = 6 # !8-bit signed integer
+ TIFF_UNDEFINED = 7 # !8-bit untyped data
+ TIFF_SSHORT = 8 # !16-bit signed integer
+ TIFF_SLONG = 9 # !32-bit signed integer
+ TIFF_SRATIONAL = 10 # !64-bit signed fraction
+ TIFF_FLOAT = 11 # !32-bit IEEE floating point
+ TIFF_DOUBLE = 12 # !64-bit IEEE floating point
+ TIFF_IFD = 13 # %32-bit unsigned integer (offset)
+ TIFF_LONG8 = 16 # BigTIFF 64-bit unsigned integer
+ TIFF_SLONG8 = 17 # BigTIFF 64-bit signed integer
+ TIFF_IFD8 = 18 # BigTIFF 64-bit unsigned integer (offset)
+
+ TIFFTAG_STRIPOFFSETS = 273
+ TIFFTAG_SAMPLESPERPIXEL = 277
+ TIFFTAG_STRIPBYTECOUNTS = 279
+ TIFFTAG_PLANARCONFIG = 284
+ TIFFTAG_TILEOFFSETS = 324
+ TIFFTAG_TILEBYTECOUNTS = 325
+
+ PLANARCONFIG_CONTIG = 1
+ PLANARCONFIG_SEPARATE = 2
+
+ TIFFTAG_GDAL_METADATA = 42112
+
+ typesize = {}
+ typesize[TIFF_BYTE] = 1
+ typesize[TIFF_ASCII] = 1
+ typesize[TIFF_SHORT] = 2
+ typesize[TIFF_LONG] = 4
+ typesize[TIFF_RATIONAL] = 8
+ typesize[TIFF_SBYTE] = 1
+ typesize[TIFF_UNDEFINED] = 1
+ typesize[TIFF_SSHORT] = 2
+ typesize[TIFF_SLONG] = 4
+ typesize[TIFF_SRATIONAL] = 8
+ typesize[TIFF_FLOAT] = 4
+ typesize[TIFF_DOUBLE] = 8
+ typesize[TIFF_IFD] = 4
+ typesize[TIFF_LONG8] = 8
+ typesize[TIFF_SLONG8] = 8
+ typesize[TIFF_IFD8] = 8
+
+ class OfflineTag:
+ def __init__(self, tagtype, nvalues, data, fileoffset_in_out_ifd):
+ self.tagtype = tagtype
+ self.nvalues = nvalues
+ self.data = data
+ self.fileoffset_in_out_ifd = fileoffset_in_out_ifd
+
+ def unpack_array(self):
+            if isinstance(self.data, tuple):
+ return self.data
+ elif self.tagtype == TIFF_SHORT:
+ return struct.unpack('<' + ('H' * self.nvalues), self.data)
+ elif self.tagtype == TIFF_LONG:
+ return struct.unpack('<' + ('I' * self.nvalues), self.data)
+ else:
+ assert False
+
+    class IFD:
+        def __init__(self, tagdict):
+            self.tagdict = tagdict
+            # TIFF defaults, used when the PlanarConfiguration and
+            # SamplesPerPixel tags are absent from the IFD
+            self.planarconfig_contig = True
+            self.nbands = 1
+
+ ds = gdal.Open(srcfilename)
+ assert ds
+ first_band_to_put_at_end = None
+ if ds.GetMetadataItem('TYPE') == 'HORIZONTAL_OFFSET' and \
+ ds.RasterCount >= 4 and \
+ ds.GetRasterBand(1).GetDescription() == 'latitude_offset' and \
+ ds.GetRasterBand(2).GetDescription() == 'longitude_offset' and \
+ ds.GetRasterBand(3).GetDescription() == 'latitude_offset_accuracy' and \
+ ds.GetRasterBand(4).GetDescription() == 'longitude_offset_accuracy':
+ first_band_to_put_at_end = 3
+ elif ds.GetMetadataItem('TYPE') == 'VELOCITY' and \
+ ds.RasterCount >= 6 and \
+ ds.GetRasterBand(1).GetDescription() == 'east_velocity' and \
+ ds.GetRasterBand(2).GetDescription() == 'north_velocity' and \
+ ds.GetRasterBand(3).GetDescription() == 'up_velocity' and \
+ ds.GetRasterBand(4).GetDescription() == 'east_velocity_accuracy' and \
+ ds.GetRasterBand(5).GetDescription() == 'north_velocity_accuracy' and \
+ ds.GetRasterBand(6).GetDescription() == 'up_velocity_accuracy':
+ first_band_to_put_at_end = 4
+ del ds
+
+ in_f = open(srcfilename, 'rb')
+ signature = in_f.read(4)
+ assert signature == b'\x49\x49\x2A\x00'
+ next_ifd_offset = struct.unpack('<I', in_f.read(4))[0]
+
+ out_f = open(destfilename, 'wb')
+ out_f.write(signature)
+ next_ifd_offset_out_offset = 4
+ # placeholder for pointer to next IFD
+ out_f.write(struct.pack('<I', 0xDEADBEEF))
+
+ out_f.write(b'\n-- Generated by cloud_optimize_gtiff.py v1.0 --\n')
+ essential_metadata_size_hint_offset_to_patch = out_f.tell()
+ dummy_metadata_hint = b'-- Metadata size: XXXXXX --\n'
+ out_f.write(dummy_metadata_hint)
+
+ ifds = []
+
+ offlinedata_to_offset = {}
+
+ reuse_offlinedata = True
+
+ while next_ifd_offset != 0:
+
+ in_f.seek(next_ifd_offset)
+
+ cur_pos = out_f.tell()
+ if (cur_pos % 2) == 1:
+ out_f.write(b'\x00')
+ cur_pos += 1
+ out_f.seek(next_ifd_offset_out_offset)
+ out_f.write(struct.pack('<I', cur_pos))
+ out_f.seek(cur_pos)
+
+ numtags = struct.unpack('<H', in_f.read(2))[0]
+ out_f.write(struct.pack('<H', numtags))
+
+ tagdict = {}
+ ifd = IFD(tagdict)
+
+ # Write IFD
+ for i in range(numtags):
+ tagid = struct.unpack('<H', in_f.read(2))[0]
+ tagtype = struct.unpack('<H', in_f.read(2))[0]
+ tagnvalues = struct.unpack('<I', in_f.read(4))[0]
+ tagvalueoroffset_raw = in_f.read(4)
+ tagvalueoroffset = struct.unpack('<I', tagvalueoroffset_raw)[0]
+ #print(tagid, tagtype, tagnvalues, tagvalueoroffset)
+ tagvalsize = typesize[tagtype] * tagnvalues
+
+ if tagid == TIFFTAG_PLANARCONFIG:
+ assert tagvalueoroffset in (
+ PLANARCONFIG_CONTIG, PLANARCONFIG_SEPARATE)
+                ifd.planarconfig_contig = (tagvalueoroffset == PLANARCONFIG_CONTIG)
+ elif tagid == TIFFTAG_SAMPLESPERPIXEL:
+ ifd.nbands = tagvalueoroffset
+
+ out_f.write(struct.pack('<H', tagid))
+ out_f.write(struct.pack('<H', tagtype))
+ out_f.write(struct.pack('<I', tagnvalues))
+ if tagvalsize <= 4:
+ if tagid in (TIFFTAG_STRIPOFFSETS, TIFFTAG_TILEOFFSETS):
+ ifd.offset_out_offsets = out_f.tell()
+ if tagtype == TIFF_SHORT:
+ tagdict[tagid] = OfflineTag(
+ tagtype, tagnvalues, struct.unpack('<HH', tagvalueoroffset_raw), -ifd.offset_out_offsets)
+ elif tagtype == TIFF_LONG:
+ tagdict[tagid] = OfflineTag(
+ tagtype, tagnvalues, struct.unpack('<I', tagvalueoroffset_raw), -ifd.offset_out_offsets)
+ out_f.write(struct.pack('<I', 0xDEADBEEF)) # placeholder
+ else:
+ if tagtype == TIFF_SHORT:
+ tagdict[tagid] = OfflineTag(
+ tagtype, tagnvalues, struct.unpack('<HH', tagvalueoroffset_raw), -1)
+ elif tagtype == TIFF_LONG:
+ tagdict[tagid] = OfflineTag(
+ tagtype, tagnvalues, struct.unpack('<I', tagvalueoroffset_raw), -1)
+ out_f.write(struct.pack('<I', tagvalueoroffset))
+ else:
+ curinoff = in_f.tell()
+ in_f.seek(tagvalueoroffset)
+ tagdata = in_f.read(tagvalsize)
+ in_f.seek(curinoff)
+
+ if reuse_offlinedata and tagdata in offlinedata_to_offset:
+ tagdict[tagid] = OfflineTag(
+ tagtype, tagnvalues, tagdata, -1)
+ out_f.write(struct.pack(
+ '<I', offlinedata_to_offset[tagdata]))
+ else:
+ tagdict[tagid] = OfflineTag(
+ tagtype, tagnvalues, tagdata, out_f.tell())
+ out_f.write(struct.pack('<I', 0xDEADBEEF)) # placeholder
+
+ next_ifd_offset = struct.unpack('<I', in_f.read(4))[0]
+ next_ifd_offset_out_offset = out_f.tell()
+ # placeholder for pointer to next IFD
+ out_f.write(struct.pack('<I', 0xDEADBEEF))
+
+ # Write data for all out-of-line tags,
+ # except the offset and byte count ones, and the GDAL metadata for the
+ # IFDs after the first one, and patch IFD entries
+ for id in tagdict:
+ if tagdict[id].fileoffset_in_out_ifd < 0:
+ continue
+ if id in (TIFFTAG_STRIPOFFSETS, TIFFTAG_STRIPBYTECOUNTS,
+ TIFFTAG_TILEOFFSETS, TIFFTAG_TILEBYTECOUNTS):
+ continue
+ if id == TIFFTAG_GDAL_METADATA:
+ if len(ifds) != 0:
+ continue
+ cur_pos = out_f.tell()
+ out_f.seek(tagdict[id].fileoffset_in_out_ifd)
+ out_f.write(struct.pack('<I', cur_pos))
+ out_f.seek(cur_pos)
+ out_f.write(tagdict[id].data)
+ if reuse_offlinedata:
+ offlinedata_to_offset[tagdict[id].data] = cur_pos
+
+ ifds.append(ifd)
+
+ # Write GDAL_METADATA of ifds other than the first one
+ for ifd in ifds[1:]:
+ tagdict = ifd.tagdict
+
+ if TIFFTAG_GDAL_METADATA in tagdict:
+ id = TIFFTAG_GDAL_METADATA
+ cur_pos = out_f.tell()
+ out_f.seek(tagdict[id].fileoffset_in_out_ifd)
+ out_f.write(struct.pack('<I', cur_pos))
+ out_f.seek(cur_pos)
+ out_f.write(tagdict[id].data)
+
+ metadata_hint = ('-- Metadata size: %06d --\n' %
+ out_f.tell()).encode('ASCII')
+ assert len(metadata_hint) == len(dummy_metadata_hint)
+ out_f.seek(essential_metadata_size_hint_offset_to_patch)
+ out_f.write(metadata_hint)
+ out_f.seek(0, os.SEEK_END)
+
+ # Write strile bytecounts and dummy offsets
+ for idx_ifd, ifd in enumerate(ifds):
+ tagdict = ifd.tagdict
+
+ for id in tagdict:
+ if tagdict[id].fileoffset_in_out_ifd < 0:
+ continue
+ if id not in (TIFFTAG_STRIPOFFSETS, TIFFTAG_STRIPBYTECOUNTS,
+ TIFFTAG_TILEOFFSETS, TIFFTAG_TILEBYTECOUNTS):
+ continue
+
+ cur_pos = out_f.tell()
+ out_f.seek(tagdict[id].fileoffset_in_out_ifd)
+ out_f.write(struct.pack('<I', cur_pos))
+ out_f.seek(cur_pos)
+ if id in (TIFFTAG_STRIPOFFSETS, TIFFTAG_TILEOFFSETS):
+ ifd.offset_out_offsets = out_f.tell()
+ # dummy. to be rewritten
+ out_f.write(b'\00' * len(tagdict[id].data))
+ else:
+ out_f.write(tagdict[id].data) # bytecounts don't change
+
+ # Write blocks in a band-interleaved way
+ for ifd in ifds:
+ tagdict = ifd.tagdict
+
+ if TIFFTAG_STRIPOFFSETS in tagdict:
+ ifd.num_striles = tagdict[TIFFTAG_STRIPOFFSETS].nvalues
+ assert ifd.num_striles == tagdict[TIFFTAG_STRIPBYTECOUNTS].nvalues
+ ifd.strile_offset_in = tagdict[TIFFTAG_STRIPOFFSETS].unpack_array()
+ ifd.strile_length_in = tagdict[TIFFTAG_STRIPBYTECOUNTS].unpack_array(
+ )
+ else:
+ ifd.num_striles = tagdict[TIFFTAG_TILEOFFSETS].nvalues
+ assert ifd.num_striles == tagdict[TIFFTAG_TILEBYTECOUNTS].nvalues
+ ifd.strile_offset_in = tagdict[TIFFTAG_TILEOFFSETS].unpack_array()
+ ifd.strile_length_in = \
+ tagdict[TIFFTAG_TILEBYTECOUNTS].unpack_array()
+
+ ifd.strile_offset_out = [0] * ifd.num_striles
+ if ifd.planarconfig_contig:
+ ifd.num_striles_per_band = ifd.num_striles
+ else:
+ assert (ifd.num_striles % ifd.nbands) == 0
+ ifd.num_striles_per_band = ifd.num_striles // ifd.nbands
+
+ if ifd.planarconfig_contig:
+ list_bands = (0,)
+ elif first_band_to_put_at_end:
+ list_bands = range(first_band_to_put_at_end - 1)
+ else:
+ list_bands = range(ifd.nbands)
+
+ for i in range(ifd.num_striles_per_band):
+ for iband in list_bands:
+ idx_strile = ifd.num_striles_per_band * iband + i
+ in_f.seek(ifd.strile_offset_in[idx_strile])
+ data = in_f.read(ifd.strile_length_in[idx_strile])
+ ifd.strile_offset_out[idx_strile] = out_f.tell()
+ out_f.write(data)
+
+    # And then write the error/accuracy bands at the end, for NTv2-like products
+ if not ifd.planarconfig_contig and first_band_to_put_at_end and \
+ ifd.nbands >= first_band_to_put_at_end:
+ for ifd in ifds:
+ if ifd.nbands < first_band_to_put_at_end:
+ continue
+
+ for i in range(ifd.num_striles_per_band):
+ for iband in range(first_band_to_put_at_end-1, ifd.nbands):
+ idx_strile = ifd.num_striles_per_band * iband + i
+ in_f.seek(ifd.strile_offset_in[idx_strile])
+ data = in_f.read(ifd.strile_length_in[idx_strile])
+ ifd.strile_offset_out[idx_strile] = out_f.tell()
+ out_f.write(data)
+
+ # Write strile offset arrays
+ for ifd in ifds:
+ tagdict = ifd.tagdict
+
+ if TIFFTAG_STRIPOFFSETS in tagdict:
+ tagtype = tagdict[TIFFTAG_STRIPOFFSETS].tagtype
+ else:
+ tagtype = tagdict[TIFFTAG_TILEOFFSETS].tagtype
+
+ if ifd.offset_out_offsets < 0:
+ assert ifd.offset_out_offsets != -1
+ out_f.seek(-ifd.offset_out_offsets)
+ else:
+ out_f.seek(ifd.offset_out_offsets)
+ if tagtype == TIFF_SHORT:
+ for v in ifd.strile_offset_out:
+ assert v < 65536
+ out_f.write(struct.pack('<H', v))
+ else:
+ for v in ifd.strile_offset_out:
+ out_f.write(struct.pack('<I', v))
+
+ # Patch pointer to last IFD
+ out_f.seek(next_ifd_offset_out_offset)
+ out_f.write(struct.pack('<I', 0))
+
+ in_f.close()
+ out_f.close()
+
+
+if __name__ == '__main__':
+
+ args = get_args()
+
+ generate_optimized_file(args.source, args.dest)
diff --git a/grid_tools/convert_gr3df97a.py b/grid_tools/convert_gr3df97a.py
new file mode 100755
index 0000000..f70927d
--- /dev/null
+++ b/grid_tools/convert_gr3df97a.py
@@ -0,0 +1,146 @@
+#!/usr/bin/env python
+###############################################################################
+# $Id$
+#
+# Project: PROJ
+# Purpose: Convert IGN France geocentric gr3df97a.txt grid
+# Author: Even Rouault <even.rouault at spatialys.com>
+#
+###############################################################################
+# Copyright (c) 2019, Even Rouault <even.rouault at spatialys.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining a
+# copy of this software and associated documentation files (the "Software"),
+# to deal in the Software without restriction, including without limitation
+# the rights to use, copy, modify, merge, publish, distribute, sublicense,
+# and/or sell copies of the Software, and to permit persons to whom the
+# Software is furnished to do so, subject to the following conditions:
+#
+# The above copyright notice and this permission notice shall be included
+# in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
+# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+# DEALINGS IN THE SOFTWARE.
+###############################################################################
+
+from osgeo import gdal
+from osgeo import osr
+import datetime
+import struct
+
+# https://geodesie.ign.fr/contenu/fichiers/gr3df97a.txt
+
+with open('gr3df97a.txt', 'rt') as f:
+ count_val_rows = 0
+ for line_no, l in enumerate(f.readlines()):
+ if l[-1] == '\n':
+ l = l[0:-1]
+ if l[-1] == '\r':
+ l = l[0:-1]
+ if line_no == 0:
+ assert l.startswith(' GR3D ')
+ elif line_no == 1:
+ assert l.startswith(' GR3D1 ')
+ tokens = []
+ for t in l.split(' '):
+ if t:
+ tokens.append(t)
+ assert len(tokens) == 7
+ minx = float(tokens[1])
+ maxx = float(tokens[2])
+ miny = float(tokens[3])
+ maxy = float(tokens[4])
+ resx = float(tokens[5])
+ resy = float(tokens[6])
+ assert minx < maxx
+ assert miny < maxy
+ assert resx > 0
+ assert resy > 0
+ cols = 1 + (maxx - minx) / resx
+            assert abs(cols - int(cols + 0.5)) < 1e-10
+ cols = int(cols + 0.5)
+ rows = 1 + (maxy - miny) / resy
+            assert abs(rows - int(rows + 0.5)) < 1e-10
+ rows = int(rows + 0.5)
+ ds = gdal.GetDriverByName('GTiff').Create(
+ '/vsimem/gr3df97a.tif', cols, rows, 3, gdal.GDT_Float32)
+ ds.SetMetadataItem(
+ 'TIFFTAG_COPYRIGHT', 'Derived from work by IGN France. Open License https://www.etalab.gouv.fr/wp-content/uploads/2014/05/Open_Licence.pdf')
+ ds.SetMetadataItem('TIFFTAG_IMAGEDESCRIPTION',
+ 'Geocentric translation from NTF (IGNF:NTF) to RGF93 (EPSG:4964). Converted from gr3df97a.txt')
+            dt = datetime.datetime.now().strftime("%Y:%m:%d %H:%M:%S")
+            ds.SetMetadataItem('TIFFTAG_DATETIME', dt)
+ ds.SetMetadataItem('AREA_OR_POINT', 'Point')
+ ds.SetMetadataItem('TYPE', 'GEOCENTRIC_TRANSLATION')
+ ds.SetMetadataItem('area_of_use', 'France')
+ ds.SetMetadataItem('source_crs_wkt', """GEODCRS["NTF cartesiennes",
+ DATUM["Nouvelle Triangulation Francaise",
+ ELLIPSOID["Clarke 1880 (IGN)",6378249.2,293.466021293627,
+ LENGTHUNIT["metre",1]]],
+ PRIMEM["Greenwich",0,
+ ANGLEUNIT["degree",0.0174532925199433]],
+ CS[Cartesian,3],
+ AXIS["(X)",geocentricX,
+ ORDER[1],
+ LENGTHUNIT["metre",1]],
+ AXIS["(Y)",geocentricY,
+ ORDER[2],
+ LENGTHUNIT["metre",1]],
+ AXIS["(Z)",geocentricZ,
+ ORDER[3],
+ LENGTHUNIT["metre",1]],
+ USAGE[
+ SCOPE["unknown"],
+ AREA["FRANCE METROPOLITAINE (CORSE COMPRISE)"],
+ BBOX[41,-5.5,52,10]],
+ ID["IGNF","NTF"]]""")
+ ds.SetMetadataItem('target_crs_epsg_code', '4964')
+ ds.SetGeoTransform(
+ [minx - resx/2, resx, 0, maxy + resy/2, 0, -resy])
+ ds.GetRasterBand(1).SetUnitType('metre')
+ ds.GetRasterBand(1).SetDescription('x_translation')
+ ds.GetRasterBand(2).SetUnitType('metre')
+ ds.GetRasterBand(2).SetDescription('y_translation')
+ ds.GetRasterBand(3).SetUnitType('metre')
+ ds.GetRasterBand(3).SetDescription('z_translation')
+ sr = osr.SpatialReference()
+ sr.ImportFromEPSG(4171) # RGF93
+ ds.SetProjection(sr.ExportToWkt())
+ elif line_no == 2:
+ assert l.startswith(' GR3D2 ')
+ elif line_no == 3:
+ assert l.startswith(' GR3D3 ')
+ else:
+ assert l.startswith('00002 ')
+ tokens = []
+ for t in l.split(' '):
+ if t:
+ tokens.append(t)
+ assert len(tokens) == 8
+ lon = float(tokens[1])
+ lat = float(tokens[2])
+ dx = float(tokens[3])
+ dy = float(tokens[4])
+ dz = float(tokens[5])
+ iy = (line_no - 4) % rows
+ ix = (line_no - 4) // rows
+ assert abs(lon - (minx + ix * resx)
+ ) < 1e-10, (line_no, lon, minx + ix * resx)
+ assert abs(lat - (miny + iy * resy)
+ ) < 1e-10, (line_no, lat, miny + iy * resy)
+ ds.GetRasterBand(1).WriteRaster(
+ ix, rows - iy - 1, 1, 1, struct.pack('f', dx))
+ ds.GetRasterBand(2).WriteRaster(
+ ix, rows - iy - 1, 1, 1, struct.pack('f', dy))
+ ds.GetRasterBand(3).WriteRaster(
+ ix, rows - iy - 1, 1, 1, struct.pack('f', dz))
+ count_val_rows += 1
+
+ assert count_val_rows == rows * cols
+ gdal.GetDriverByName('GTiff').CreateCopy('gr3df97a.tif', ds, options=[
+ 'COMPRESS=DEFLATE', 'PREDICTOR=3', 'INTERLEAVE=BAND'])
diff --git a/grid_tools/convert_gr3dnc01b.py b/grid_tools/convert_gr3dnc01b.py
new file mode 100755
index 0000000..b3ed17e
--- /dev/null
+++ b/grid_tools/convert_gr3dnc01b.py
@@ -0,0 +1,121 @@
+#!/usr/bin/env python
+###############################################################################
+# $Id$
+#
+# Project: PROJ
+# Purpose: Convert New Caledonia geocentric gr3dnc01b.mnt grid
+# Author: Even Rouault <even.rouault at spatialys.com>
+#
+###############################################################################
+# Copyright (c) 2019, Even Rouault <even.rouault at spatialys.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining a
+# copy of this software and associated documentation files (the "Software"),
+# to deal in the Software without restriction, including without limitation
+# the rights to use, copy, modify, merge, publish, distribute, sublicense,
+# and/or sell copies of the Software, and to permit persons to whom the
+# Software is furnished to do so, subject to the following conditions:
+#
+# The above copyright notice and this permission notice shall be included
+# in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
+# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+# DEALINGS IN THE SOFTWARE.
+###############################################################################
+
+from osgeo import gdal
+from osgeo import osr
+import datetime
+import struct
+
+with open('gr3dnc01b.mnt', 'rt') as f:
+ count_val_rows = 0
+ line_no = 0
+ for l in f.readlines():
+ if l[-1] == '\n':
+ l = l[0:-1]
+ if l[-1] == '\r':
+ l = l[0:-1]
+ if not l:
+ continue
+ if line_no == 0:
+ tokens = []
+ for t in l.split(' '):
+ if t:
+ tokens.append(t)
+ assert len(tokens) >= 13
+ minx = float(tokens[0])
+ maxx = float(tokens[1])
+ miny = float(tokens[2])
+ maxy = float(tokens[3])
+ resx = float(tokens[4])
+ resy = float(tokens[5])
+ assert minx < maxx
+ assert miny < maxy
+ assert resx > 0
+ assert resy > 0
+ cols = 1 + (maxx - minx) / resx
+            assert abs(cols - int(cols + 0.5)) < 1e-10
+ cols = int(cols + 0.5)
+ rows = 1 + (maxy - miny) / resy
+            assert abs(rows - int(rows + 0.5)) < 1e-10
+ rows = int(rows + 0.5)
+ ds = gdal.GetDriverByName('GTiff').Create(
+ '/vsimem/gr3dnc01b.tif', cols, rows, 3, gdal.GDT_Float32)
+ ds.SetMetadataItem(
+ 'TIFFTAG_COPYRIGHT', 'Derived from work by Service Topographique, DITTT, GNC. License unclear')
+ ds.SetMetadataItem('TIFFTAG_IMAGEDESCRIPTION',
+ 'Geocentric translation from IGN72 GRANDE TERRE (EPSG:4662) to RGNC91-93 (EPSG:4749). Converted from gr3dnc01b.mnt')
+            dt = datetime.datetime.now().strftime("%Y:%m:%d %H:%M:%S")
+            ds.SetMetadataItem('TIFFTAG_DATETIME', dt)
+ ds.SetMetadataItem('AREA_OR_POINT', 'Point')
+ ds.SetMetadataItem('TYPE', 'GEOCENTRIC_TRANSLATION')
+            ds.SetMetadataItem('area_of_use', 'New Caledonia')
+            ds.SetMetadataItem('source_crs_epsg_code', '4662')
+ ds.SetMetadataItem('target_crs_epsg_code', '4749')
+ ds.SetGeoTransform(
+ [minx - resx/2, resx, 0, maxy + resy/2, 0, -resy])
+ ds.GetRasterBand(1).SetUnitType('metre')
+ ds.GetRasterBand(1).SetDescription('x_translation')
+ ds.GetRasterBand(2).SetUnitType('metre')
+ ds.GetRasterBand(2).SetDescription('y_translation')
+ ds.GetRasterBand(3).SetUnitType('metre')
+ ds.GetRasterBand(3).SetDescription('z_translation')
+ sr = osr.SpatialReference()
+ sr.ImportFromEPSG(4749)
+ ds.SetProjection(sr.ExportToWkt())
+ else:
+ tokens = []
+ for t in l.split(' '):
+ if t:
+ tokens.append(t)
+ assert len(tokens) == 6
+ lon = float(tokens[0])
+ lat = float(tokens[1])
+ dx = float(tokens[2])
+ dy = float(tokens[3])
+ dz = float(tokens[4])
+ iy = (line_no - 1) % rows
+ ix = (line_no - 1) // rows
+ assert abs(lon - (minx + ix * resx)
+ ) < 1e-10, (line_no, lon, minx + ix * resx)
+ assert abs(lat - (miny + iy * resy)
+ ) < 1e-10, (line_no, lat, miny + iy * resy)
+ ds.GetRasterBand(1).WriteRaster(
+ ix, rows - iy - 1, 1, 1, struct.pack('f', dx))
+ ds.GetRasterBand(2).WriteRaster(
+ ix, rows - iy - 1, 1, 1, struct.pack('f', dy))
+ ds.GetRasterBand(3).WriteRaster(
+ ix, rows - iy - 1, 1, 1, struct.pack('f', dz))
+ count_val_rows += 1
+
+ line_no += 1
+
+ assert count_val_rows == rows * cols
+ gdal.GetDriverByName('GTiff').CreateCopy('gr3dnc01b.tif', ds, options=[
+ 'COMPRESS=DEFLATE', 'PREDICTOR=3', 'INTERLEAVE=BAND'])
diff --git a/grid_tools/convert_gr3dnc03a.py b/grid_tools/convert_gr3dnc03a.py
new file mode 100755
index 0000000..85c35a3
--- /dev/null
+++ b/grid_tools/convert_gr3dnc03a.py
@@ -0,0 +1,121 @@
+#!/usr/bin/env python
+###############################################################################
+# $Id$
+#
+# Project: PROJ
+# Purpose: Convert New Caledonia geocentric gr3dnc03a.mnt grid
+# Author: Even Rouault <even.rouault at spatialys.com>
+#
+###############################################################################
+# Copyright (c) 2019, Even Rouault <even.rouault at spatialys.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining a
+# copy of this software and associated documentation files (the "Software"),
+# to deal in the Software without restriction, including without limitation
+# the rights to use, copy, modify, merge, publish, distribute, sublicense,
+# and/or sell copies of the Software, and to permit persons to whom the
+# Software is furnished to do so, subject to the following conditions:
+#
+# The above copyright notice and this permission notice shall be included
+# in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
+# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+# DEALINGS IN THE SOFTWARE.
+###############################################################################
+
+from osgeo import gdal
+from osgeo import osr
+import datetime
+import struct
+
+with open('gr3dnc03a.mnt', 'rt') as f:
+ count_val_rows = 0
+ line_no = 0
+ for l in f.readlines():
+ if l[-1] == '\n':
+ l = l[0:-1]
+ if l[-1] == '\r':
+ l = l[0:-1]
+ if not l:
+ continue
+ if line_no == 0:
+ tokens = []
+ for t in l.split(' '):
+ if t:
+ tokens.append(t)
+ assert len(tokens) >= 13
+ minx = float(tokens[0])
+ maxx = float(tokens[1])
+ miny = float(tokens[2])
+ maxy = float(tokens[3])
+ resx = float(tokens[4])
+ resy = float(tokens[5])
+ assert minx < maxx
+ assert miny < maxy
+ assert resx > 0
+ assert resy > 0
+ cols = 1 + (maxx - minx) / resx
+            assert abs(cols - int(cols + 0.5)) < 1e-10
+ cols = int(cols + 0.5)
+ rows = 1 + (maxy - miny) / resy
+            assert abs(rows - int(rows + 0.5)) < 1e-10
+ rows = int(rows + 0.5)
+ ds = gdal.GetDriverByName('GTiff').Create(
+ '/vsimem/gr3dnc03a.tif', cols, rows, 3, gdal.GDT_Float32)
+ ds.SetMetadataItem(
+ 'TIFFTAG_COPYRIGHT', 'Derived from work by Service Topographique, DITTT, GNC. License unclear')
+ ds.SetMetadataItem('TIFFTAG_IMAGEDESCRIPTION',
+ 'Geocentric translation from NEA74 Noumea (EPSG:4644) to RGNC91-93 (EPSG:4749). Converted from gr3dnc03a.mnt')
+            dt = datetime.datetime.now().strftime("%Y:%m:%d %H:%M:%S")
+            ds.SetMetadataItem('TIFFTAG_DATETIME', dt)
+ ds.SetMetadataItem('AREA_OR_POINT', 'Point')
+ ds.SetMetadataItem('TYPE', 'GEOCENTRIC_TRANSLATION')
+            ds.SetMetadataItem('area_of_use', 'New Caledonia')
+            ds.SetMetadataItem('source_crs_epsg_code', '4644')
+ ds.SetMetadataItem('target_crs_epsg_code', '4749')
+ ds.SetGeoTransform(
+ [minx - resx/2, resx, 0, maxy + resy/2, 0, -resy])
+ ds.GetRasterBand(1).SetUnitType('metre')
+ ds.GetRasterBand(1).SetDescription('x_translation')
+ ds.GetRasterBand(2).SetUnitType('metre')
+ ds.GetRasterBand(2).SetDescription('y_translation')
+ ds.GetRasterBand(3).SetUnitType('metre')
+ ds.GetRasterBand(3).SetDescription('z_translation')
+ sr = osr.SpatialReference()
+ sr.ImportFromEPSG(4749)
+ ds.SetProjection(sr.ExportToWkt())
+ else:
+ tokens = []
+ for t in l.split(' '):
+ if t:
+ tokens.append(t)
+ assert len(tokens) == 6
+ lon = float(tokens[0])
+ lat = float(tokens[1])
+ dx = float(tokens[2])
+ dy = float(tokens[3])
+ dz = float(tokens[4])
+ iy = (line_no - 1) % rows
+ ix = (line_no - 1) // rows
+ assert abs(lon - (minx + ix * resx)
+ ) < 1e-10, (line_no, lon, minx + ix * resx)
+ assert abs(lat - (miny + iy * resy)
+ ) < 1e-10, (line_no, lat, miny + iy * resy)
+ ds.GetRasterBand(1).WriteRaster(
+ ix, rows - iy - 1, 1, 1, struct.pack('f', dx))
+ ds.GetRasterBand(2).WriteRaster(
+ ix, rows - iy - 1, 1, 1, struct.pack('f', dy))
+ ds.GetRasterBand(3).WriteRaster(
+ ix, rows - iy - 1, 1, 1, struct.pack('f', dz))
+ count_val_rows += 1
+
+ line_no += 1
+
+ assert count_val_rows == rows * cols
+ gdal.GetDriverByName('GTiff').CreateCopy('gr3dnc03a.tif', ds, options=[
+ 'COMPRESS=DEFLATE', 'PREDICTOR=3', 'INTERLEAVE=BAND'])
diff --git a/grid_tools/ntv2_to_gtiff.py b/grid_tools/ntv2_to_gtiff.py
new file mode 100755
index 0000000..5c2449d
--- /dev/null
+++ b/grid_tools/ntv2_to_gtiff.py
@@ -0,0 +1,517 @@
+#!/usr/bin/env python
+###############################################################################
+# $Id$
+#
+# Project: PROJ
+# Purpose: Convert a NTv2 file into an optimized GTiff
+# Author: Even Rouault <even.rouault at spatialys.com>
+#
+###############################################################################
+# Copyright (c) 2019, Even Rouault <even.rouault at spatialys.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining a
+# copy of this software and associated documentation files (the "Software"),
+# to deal in the Software without restriction, including without limitation
+# the rights to use, copy, modify, merge, publish, distribute, sublicense,
+# and/or sell copies of the Software, and to permit persons to whom the
+# Software is furnished to do so, subject to the following conditions:
+#
+# The above copyright notice and this permission notice shall be included
+# in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
+# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+# DEALINGS IN THE SOFTWARE.
+###############################################################################
+
+# python ./ntv2_to_gtiff.py --copyright "Derived from work by IGN France. Open License https://www.etalab.gouv.fr/wp-content/uploads/2014/05/Open_Licence.pdf" --source-crs EPSG:4275 --target-crs EPSG:4171 /home/even/proj/proj-datumgrid/ntf_r93.gsb ntf_r93.tif
+# python ./ntv2_to_gtiff.py --copyright "Derived from work by LGL-BW.DE. Data licence Germany - attribution - version 2.0: https://www.govdata.de/dl-de/by-2-0" --source-crs EPSG:4314 --target-crs EPSG:4258 /home/even/proj/proj-datumgrid/europe/BWTA2017.gsb BWTA2017.tif --do-not-write-accuracy-samples
+# python ./ntv2_to_gtiff.py --copyright "Derived from work by Natural Resources Canada. Open Government Licence - Canada: http://open.canada.ca/en/open-government-licence-canada" --source-crs EPSG:4267 --target-crs EPSG:4269 /home/even/proj/proj-datumgrid/north-america/ntv2_0.gsb ntv2_0.tif
+# python ./ntv2_to_gtiff.py --copyright "Derived from work by ICSM. Creative Commons Attribution 4.0: https://creativecommons.org/licenses/by/4.0/" --source-crs EPSG:4283 --target-crs EPSG:7844 /home/even/proj/proj-datumgrid/oceania/GDA94_GDA2020_conformal.gsb GDA94_GDA2020_conformal.tif
+
+
+from osgeo import gdal
+from osgeo import osr
+from cloud_optimize_gtiff import generate_optimized_file
+import argparse
+import datetime
+import math
+import os
+import struct
+
+
+def get_args():
+ parser = argparse.ArgumentParser(
+ description='Convert NTv2 grid into PROJ GeoTIFF.')
+ parser.add_argument('source',
+ help='Source NTv2 file')
+ parser.add_argument('dest',
+ help='Destination GeoTIFF file')
+
+ parser.add_argument('--source-crs', dest='source_crs', required=True,
+ help='Source CRS as EPSG:XXXX or WKT')
+
+ parser.add_argument('--target-crs', dest='target_crs', required=True,
+ help='Target CRS as EPSG:XXXX or WKT')
+
+ parser.add_argument('--copyright', dest='copyright', required=True,
+ help='Copyright info')
+
+ parser.add_argument('--description', dest='description',
+ help='Description')
+
+ parser.add_argument('--do-not-write-accuracy-samples', dest='do_not_write_accuracy_samples',
+ action='store_true')
+
+ parser.add_argument('--positive-longitude-shift-value',
+ dest='positive_longitude_shift_value',
+ choices=['east', 'west'],
+ default='east',
+ help='Whether positive values in the longitude_offset channel should be shift to the east or the west')
+
+ parser.add_argument('--uint16-encoding', dest='uint16_encoding',
+ action='store_true',
+ help='Use uint16 storage with linear scaling/offseting')
+
+ parser.add_argument('--datetime', dest='datetime',
+ help='Value for TIFF DateTime tag as YYYY:MM:DD HH:MM:SS, or "NONE" to not write it. If not specified, current date time is used')
+
+ parser.add_argument('--accuracy-unit', dest='accuracy_unit',
+ choices=['arc-second', 'metre', 'unknown'],
+ help='Unit of accuracy channels')
+
+ parser.add_argument('--area-of-use', dest='area_of_use',
+ help='Area of use')
+
+ return parser.parse_args()
+
+
+def get_year_month_day(src_date, src_basename):
+ if len(src_date) == 4 and src_basename == 'GS7783.GSB':
+ # CREATED=1991
+ return int(src_date), 1, 1
+
+ if len(src_date) == 7 and src_basename == 'NB2783v2.GSB':
+ # CREATED=06/2011
+ month = int(src_date[0:2])
+ year = int(src_date[3:7])
+ return year, month, 1
+
+ assert len(src_date) == 8
+ if (src_date[2] == '-' and src_date[5] == '-') or \
+ (src_date[2] == '/' and src_date[5] == '/'):
+ if src_basename.startswith('rdtrans') or \
+ src_basename.startswith('rdcorr') or \
+ src_basename.startswith('ntf_r93') or \
+ src_basename.startswith('BWTA2017') or \
+ src_basename.startswith('BETA2007') or \
+ src_basename.startswith('D73_ETRS89_geo') or \
+ src_basename.startswith('DLx_ETRS89_geo'):
+ # rdtrans2018.gsb has 22-11-18 &
+ # ntf_r93.gsb has 31/10/07, hence D-M-Y
+ day = int(src_date[0:2])
+ month = int(src_date[3:5])
+ year = int(src_date[6:8])
+ else:
+ # CHENyx06a.gsb has 09-07-22 & ntv2_0.gsb and
+ # (other NRCan datasets) has 95-06-30, hence Y-M-D
+ year = int(src_date[0:2])
+ month = int(src_date[3:5])
+ day = int(src_date[6:8])
+ if year >= 90:
+ year += 1900
+ else:
+ assert year <= 50
+ year += 2000
+ else:
+ if src_basename in ('nzgd2kgrid0005.gsb',
+ 'A66_National_13_09_01.gsb',
+ 'National_84_02_07_01.gsb',
+ 'AT_GIS_GRID.gsb',
+ '100800401.gsb') or \
+ src_basename.startswith('GDA94_GDA2020'):
+ # nzgd2kgrid0005 has 20111999, hence D-M-Y
+ day = int(src_date[0:2])
+ month = int(src_date[2:4])
+ year = int(src_date[4:8])
+ elif src_basename == 'bd72lb72_etrs89lb08.gsb':
+ # bd72lb72_etrs89lb08 has 20142308, hence Y-D-M
+ year = int(src_date[0:4])
+ day = int(src_date[4:6])
+ month = int(src_date[6:8])
+ else:
+ year = int(src_date[0:4])
+ month = int(src_date[4:6])
+ day = int(src_date[6:8])
+ return year, month, day
+
+
+def create_unoptimized_file(sourcefilename, tmpfilename, args):
+ src_ds = gdal.Open(sourcefilename)
+ assert src_ds.GetDriver().ShortName in ('NTv2', 'NTv1', 'CTable2')
+ subdatsets = [(sourcefilename, None)]
+ subdatsets += src_ds.GetSubDatasets()
+
+ if src_ds.GetDriver().ShortName in ('NTv1', 'CTable2'):
+ args.do_not_write_accuracy_samples = True
+
+ # Build a subgrids dict whose key is a parent grid name and the
+ # value the list of subgrids
+ subgrids = {}
+ for subds in subdatsets:
+ src_ds = gdal.Open(subds[0])
+ parent_grid_name = src_ds.GetMetadataItem('PARENT')
+ if parent_grid_name is None or parent_grid_name == 'NONE':
+ continue
+ grid_name = src_ds.GetMetadataItem('SUB_NAME')
+ if parent_grid_name in subgrids:
+ subgrids[parent_grid_name].append(grid_name)
+ else:
+ subgrids[parent_grid_name] = [grid_name]
+
+ src_basename = os.path.basename(args.source)
+
+ if args.do_not_write_accuracy_samples:
+ nbands = 2
+ else:
+ _, max = src_ds.GetRasterBand(3).ComputeRasterMinMax()
+ if max <= 0:
+ print('Omitting accuracy bands which contain only <= 0 values')
+ nbands = 2
+ args.do_not_write_accuracy_samples = True
+ else:
+ if not args.accuracy_unit:
+ if src_basename in ('rdtrans2008.gsb',
+ 'rdtrans2018.gsb',
+ 'bd72lb72_etrs89lb08.gsb',
+ 'ntv2_0.gsb',
+ 'MAY76V20.gsb',
+ 'ABCSRSV4.GSB',
+ 'BC_27_05.GSB',
+ 'BC_93_05.GSB',
+ 'CQ77SCRS.GSB',
+ 'CRD27_00.GSB',
+ 'CRD93_00.GSB',
+ 'GS7783.GSB',
+ 'NA27SCRS.GSB',
+ 'NA83SCRS.GSB',
+ 'NB2783v2.GSB',
+ 'NB7783v2.GSB',
+ 'NS778302.GSB',
+ 'NVI93_05.GSB',
+ 'ON27CSv1.GSB',
+ 'ON76CSv1.GSB',
+ 'ON83CSv1.GSB',
+ 'PE7783V2.GSB',
+ 'SK27-98.GSB',
+ 'SK83-98.GSB',
+ 'TO27CSv1.GSB',
+ 'rdcorr2018.gsb',):
+ args.accuracy_unit = 'metre'
+
+ elif src_basename in ('ntf_r93.gsb',
+ 'nzgd2kgrid0005.gsb',
+ 'OSTN15_NTv2_OSGBtoETRS.gsb',
+ 'CHENyx06a.gsb',
+ 'CHENyx06_ETRS.gsb',
+ 'A66_National_13_09_01.gsb',
+ 'National_84_02_07_01.gsb',
+ 'GDA94_GDA2020_conformal_and_distortion.gsb',
+ 'DLx_ETRS89_geo.gsb',
+ 'D73_ETRS89_geo.gsb'):
+ args.accuracy_unit = 'arc-second'
+
+ else:
+ raise Exception(
+ '--accuracy-unit=arc-second/metre should be specified')
+ nbands = 4
+
+ compact_md = len(subdatsets) > 50
+
+ for idx_ifd, subds in enumerate(subdatsets):
+ src_ds = gdal.Open(subds[0])
+ if src_ds.GetDriver().ShortName == 'NTv2':
+ assert src_ds.GetMetadataItem('GS_TYPE') == 'SECONDS'
+
+ tmp_ds = gdal.GetDriverByName('GTiff').Create('/vsimem/tmp',
+ src_ds.RasterXSize,
+ src_ds.RasterYSize,
+ nbands,
+ gdal.GDT_Float32 if not args.uint16_encoding else gdal.GDT_UInt16)
+ src_crs = osr.SpatialReference()
+ src_crs.SetFromUserInput(args.source_crs)
+ tmp_ds.SetSpatialRef(src_crs)
+ tmp_ds.SetGeoTransform(src_ds.GetGeoTransform())
+ tmp_ds.SetMetadataItem('AREA_OR_POINT', 'Point')
+
+ grid_name = src_ds.GetMetadataItem('SUB_NAME')
+ if grid_name:
+ if src_basename == 'NVI93_05.GSB' and grid_name == 'NVIsib':
+ grid_name = grid_name + str(idx_ifd+1)
+ print('Fixing wrong SUB_NAME of NVI93_05.GSB to ' + grid_name)
+ tmp_ds.SetMetadataItem('grid_name', grid_name)
+ parent_grid_name = src_ds.GetMetadataItem('PARENT')
+ if parent_grid_name is None or parent_grid_name == 'NONE':
+ tmp_ds.SetMetadataItem('TYPE', 'HORIZONTAL_OFFSET')
+ else:
+ tmp_ds.SetMetadataItem('parent_grid_name', parent_grid_name)
+ if grid_name in subgrids:
+ tmp_ds.SetMetadataItem(
+ 'number_of_nested_grids', str(len(subgrids[grid_name])))
+
+ if idx_ifd == 0 or not compact_md:
+ # Records whether positive longitude_offset values are shifts
+ # to the east or to the west
+ tmp_ds.GetRasterBand(2).SetMetadataItem('positive_value',
+ args.positive_longitude_shift_value)
+
+ if args.uint16_encoding:
+ assert src_ds.GetDriver().ShortName == 'NTv2'
+ for i in (1, 2):
+ data = src_ds.GetRasterBand(i).ReadAsArray()
+ if i == 2 and args.positive_longitude_shift_value == 'east':
+ # Negate before computing min/max so that decoding
+ # (value = stored * scale + offset) yields the values
+ # actually written, not the original west-positive ones
+ data = -data
+ min_v = float(data.min())
+ max_v = float(data.max())
+ scale = (max_v - min_v) / 65535
+ data = (data - min_v) / scale if scale != 0 else 0 * data
+ tmp_ds.GetRasterBand(i).WriteArray(data)
+ tmp_ds.GetRasterBand(i).SetOffset(min_v)
+ tmp_ds.GetRasterBand(i).SetScale(scale)
+ if idx_ifd == 0 or not compact_md:
+ tmp_ds.GetRasterBand(i).SetDescription(
+ 'latitude_offset' if i == 1 else 'longitude_offset')
+ tmp_ds.GetRasterBand(i).SetUnitType('arc-second')
+
+ if nbands == 4:
+ for i in (3, 4):
+ min, max = src_ds.GetRasterBand(i).ComputeRasterMinMax()
+ data = src_ds.GetRasterBand(i).ReadAsArray()
+ scale = (max - min) / 65535
+ if scale == 0:
+ data = 0 * data
+ else:
+ data = (data - min) / scale
+ tmp_ds.GetRasterBand(i).WriteArray(data)
+ tmp_ds.GetRasterBand(i).SetOffset(min)
+ tmp_ds.GetRasterBand(i).SetScale(scale)
+ if idx_ifd == 0 or not compact_md:
+ tmp_ds.GetRasterBand(i).SetDescription(
+ 'latitude_offset_accuracy' if i == 3 else 'longitude_offset_accuracy')
+ tmp_ds.GetRasterBand(i).SetUnitType(args.accuracy_unit)
+
+ else:
+ for i in (1, 2):
+ data = src_ds.GetRasterBand(i).ReadRaster(
+ buf_type=gdal.GDT_Float32)
+
+ if src_ds.GetDriver().ShortName == 'CTable2':
+ nvalues = src_ds.RasterXSize * src_ds.RasterYSize
+ out_data = b''
+ # From radian to arc-seconds
+ for v in struct.unpack('f' * nvalues, data):
+ out_data += struct.pack('f',
+ v / math.pi * 180.0 * 3600)
+ data = out_data
+
+ if i == 2 and args.positive_longitude_shift_value == 'east':
+ nvalues = src_ds.RasterXSize * src_ds.RasterYSize
+ out_data = b''
+ for v in struct.unpack('f' * nvalues, data):
+ out_data += struct.pack('f', -v)
+ data = out_data
+ tmp_ds.GetRasterBand(i).WriteRaster(0, 0, src_ds.RasterXSize, src_ds.RasterYSize,
+ data)
+ if idx_ifd == 0 or not compact_md:
+ tmp_ds.GetRasterBand(i).SetDescription(
+ 'latitude_offset' if i == 1 else 'longitude_offset')
+ tmp_ds.GetRasterBand(i).SetUnitType('arc-second')
+
+ if nbands == 4:
+ for i in (3, 4):
+ data = src_ds.GetRasterBand(i).ReadRaster()
+ tmp_ds.GetRasterBand(i).WriteRaster(0, 0, src_ds.RasterXSize, src_ds.RasterYSize,
+ data)
+ if idx_ifd == 0 or not compact_md:
+ tmp_ds.GetRasterBand(i).SetDescription(
+ 'latitude_offset_accuracy' if i == 3 else 'longitude_offset_accuracy')
+ tmp_ds.GetRasterBand(i).SetUnitType(args.accuracy_unit)
+
+ dst_crs = osr.SpatialReference()
+ dst_crs.SetFromUserInput(args.target_crs)
+ dst_auth_name = dst_crs.GetAuthorityName(None)
+ dst_auth_code = dst_crs.GetAuthorityCode(None)
+ if idx_ifd == 0 or not compact_md:
+ if dst_auth_name == 'EPSG' and dst_auth_code:
+ tmp_ds.SetMetadataItem('target_crs_epsg_code', dst_auth_code)
+ else:
+ tmp_ds.SetMetadataItem(
+ 'target_crs_wkt', dst_crs.ExportToWkt(['FORMAT=WKT2_2018']))
+
+ if idx_ifd == 0:
+ desc = args.description
+ if not desc:
+ src_auth_name = src_crs.GetAuthorityName(None)
+ src_auth_code = src_crs.GetAuthorityCode(None)
+
+ desc = src_crs.GetName()
+ if src_auth_name and src_auth_code:
+ desc += ' (' + src_auth_name + ':' + src_auth_code + ')'
+ desc += ' to '
+ desc += dst_crs.GetName()
+ if dst_auth_name and dst_auth_code:
+ desc += ' (' + dst_auth_name + ':' + dst_auth_code + ')'
+ desc += '. Converted from '
+ desc += src_basename
+
+ extra_info = []
+ version = src_ds.GetMetadataItem('VERSION')
+ if version:
+ version = version.strip()
+ if version not in ('NTv2.0',):
+ extra_info.append('version ' + version)
+
+ src_date = src_ds.GetMetadataItem('UPDATED')
+ created_date = src_ds.GetMetadataItem('CREATED')
+ if src_date:
+ src_date = src_date.strip()
+ if created_date:
+ created_date = created_date.strip()
+ if not src_date:
+ src_date = created_date
+
+ if created_date and src_date and len(src_date) < len(created_date):
+ # SK27-98.GSB
+ # CREATED=00-02-04
+ # UPDATED=0-15-00
+ # SK83-98.GSB
+ # CREATED=98-12-18
+ # UPDATED=0-15-06
+ src_date = created_date
+
+ if src_date:
+ year, month, day = get_year_month_day(
+ src_date, src_basename)
+
+ # Various sanity checks
+ assert day >= 1 and day <= 31
+ assert month >= 1 and month <= 12
+ assert year >= 1980
+ assert year <= datetime.datetime.now().year
+ # assume agencies only work monday to friday...
+ # except in Belgium where they work on sundays
+ # and in NZ on saturdays
+ if src_basename not in ('nzgd2kgrid0005.gsb',
+ 'bd72lb72_etrs89lb08.gsb',
+ 'GS7783.GSB',
+ 'NB2783v2.GSB',
+ 'ON27CSv1.GSB',
+ 'ON76CSv1.GSB',
+ ):
+ assert datetime.datetime(
+ year, month, day).weekday() <= 4
+
+ # Sanity check that creation_date <= last_updated_date
+ if created_date:
+ year_created, month_created, day_created = get_year_month_day(
+ created_date, src_basename)
+ assert year_created * 10000 + month_created * 100 + \
+ day_created <= year * 10000 + month * 100 + day
+
+ extra_info.append(
+ 'last updated on %04d-%02d-%02d' % (year, month, day))
+
+ if extra_info:
+ desc += ' (' + ', '.join(extra_info) + ')'
+
+ tmp_ds.SetMetadataItem('TIFFTAG_IMAGEDESCRIPTION', desc)
+ if args.copyright:
+ tmp_ds.SetMetadataItem('TIFFTAG_COPYRIGHT', args.copyright)
+ if args.datetime and args.datetime != 'NONE':
+ tmp_ds.SetMetadataItem('TIFFTAG_DATETIME', args.datetime)
+ if args.area_of_use:
+ tmp_ds.SetMetadataItem('area_of_use', args.area_of_use)
+
+ options = ['PHOTOMETRIC=MINISBLACK',
+ 'COMPRESS=DEFLATE',
+ 'PREDICTOR=3' if not args.uint16_encoding else 'PREDICTOR=2',
+ 'INTERLEAVE=BAND',
+ 'GEOTIFF_VERSION=1.1']
+ if tmp_ds.RasterXSize > 256 and tmp_ds.RasterYSize > 256:
+ options.append('TILED=YES')
+ else:
+ options.append('BLOCKYSIZE=' + str(tmp_ds.RasterYSize))
+ if gdal.VSIStatL(tmpfilename) is not None:
+ options.append('APPEND_SUBDATASET=YES')
+
+ assert gdal.GetDriverByName('GTiff').CreateCopy(tmpfilename, tmp_ds,
+ options=options)
+
+
+def check(sourcefilename, destfilename, args):
+ src_ds = gdal.Open(sourcefilename)
+ assert src_ds.GetDriver().ShortName in ('NTv2', 'NTv1', 'CTable2')
+ src_subdatsets = [(sourcefilename, None)]
+ src_subdatsets += src_ds.GetSubDatasets()
+
+ dst_ds = gdal.Open(destfilename)
+ dst_subdatsets = dst_ds.GetSubDatasets()
+ if not dst_subdatsets:
+ dst_subdatsets = [(destfilename, None)]
+
+ assert len(src_subdatsets) == len(dst_subdatsets)
+ for src_subds, dst_subds in zip(src_subdatsets, dst_subdatsets):
+ src_ds = gdal.Open(src_subds[0])
+ dst_ds = gdal.Open(dst_subds[0])
+ if not args.uint16_encoding:
+ for i in range(min(src_ds.RasterCount, dst_ds.RasterCount)):
+ data = src_ds.GetRasterBand(
+ i+1).ReadRaster(buf_type=gdal.GDT_Float32)
+
+ if src_ds.GetDriver().ShortName == 'CTable2':
+ nvalues = src_ds.RasterXSize * src_ds.RasterYSize
+ out_data = b''
+ # From radian to arc-seconds
+ for v in struct.unpack('f' * nvalues, data):
+ out_data += struct.pack('f',
+ v / math.pi * 180.0 * 3600)
+ data = out_data
+
+ if i+1 == 2 and args.positive_longitude_shift_value == 'east':
+ nvalues = src_ds.RasterXSize * src_ds.RasterYSize
+ out_data = b''
+ for v in struct.unpack('f' * nvalues, data):
+ out_data += struct.pack('f', -v)
+ data = out_data
+ assert dst_ds.GetRasterBand(i+1).ReadRaster() == data
+ else:
+ import numpy as np
+ for i in range(min(src_ds.RasterCount, dst_ds.RasterCount)):
+ src_data = src_ds.GetRasterBand(i+1).ReadAsArray()
+ dst_data = dst_ds.GetRasterBand(i+1).ReadAsArray()
+ offset = dst_ds.GetRasterBand(i+1).GetOffset()
+ scale = dst_ds.GetRasterBand(i+1).GetScale()
+ dst_data = dst_data * scale + offset
+ if i+1 == 2 and args.positive_longitude_shift_value == 'east':
+ dst_data = -dst_data
+ max_error = np.max(abs(dst_data - src_data))
+ assert max_error <= 1.01 * scale / 2, (max_error, scale / 2)
+
+
+if __name__ == '__main__':
+
+ args = get_args()
+
+ tmpfilename = args.dest + '.tmp'
+ gdal.Unlink(tmpfilename)
+
+ if not args.datetime:
+ args.datetime = datetime.date.today().strftime("%Y:%m:%d %H:%M:%S")
+
+ create_unoptimized_file(args.source, tmpfilename, args)
+ generate_optimized_file(tmpfilename, args.dest)
+ check(args.source, args.dest, args)
+
+ gdal.Unlink(tmpfilename)
diff --git a/grid_tools/vertoffset_grid_to_gtiff.py b/grid_tools/vertoffset_grid_to_gtiff.py
new file mode 100755
index 0000000..db06746
--- /dev/null
+++ b/grid_tools/vertoffset_grid_to_gtiff.py
@@ -0,0 +1,389 @@
+#!/usr/bin/env python
+###############################################################################
+# $Id$
+#
+# Project: PROJ
+# Purpose: Convert a GTX or similar file into an optimized GTiff
+# Author: Even Rouault <even.rouault at spatialys.com>
+#
+###############################################################################
+# Copyright (c) 2019, Even Rouault <even.rouault at spatialys.com>
+#
+# Permission is hereby granted, free of charge, to any person obtaining a
+# copy of this software and associated documentation files (the "Software"),
+# to deal in the Software without restriction, including without limitation
+# the rights to use, copy, modify, merge, publish, distribute, sublicense,
+# and/or sell copies of the Software, and to permit persons to whom the
+# Software is furnished to do so, subject to the following conditions:
+#
+# The above copyright notice and this permission notice shall be included
+# in all copies or substantial portions of the Software.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
+# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+# DEALINGS IN THE SOFTWARE.
+###############################################################################
+
+# python vertoffset_grid_to_gtiff.py $HOME/proj/proj-datumgrid/egm96_15.gtx egm96_15.tif --type GEOGRAPHIC_TO_VERTICAL --source-crs EPSG:4326 --target-crs EPSG:5773 --copyright "Public Domain. Derived from work at http://earth-info.nga.mil/GandG/wgs84/gravitymod/egm96/egm96.html"
+# python vertoffset_grid_to_gtiff.py $HOME/proj/proj-datumgrid/world/egm08_25.gtx egm08_25.tif --type GEOGRAPHIC_TO_VERTICAL --source-crs EPSG:4326 --target-crs EPSG:3855 --copyright "Public Domain. Derived from work at http://earth-info.nga.mil/GandG/wgs84/gravitymod/egm2008/egm08_wgs84.html"
+# python vertoffset_grid_to_gtiff.py $HOME/proj/proj-datumgrid/north-america/vertcone.gtx vertcone.tif --type VERTICAL_TO_VERTICAL --source-crs EPSG:7968 --target-crs EPSG:5703 --interpolation-crs EPSG:4267 --copyright "Public Domain. Derived from work at https://www.ngs.noaa.gov/PC_PROD/VERTCON/"
+
+from osgeo import gdal
+from osgeo import osr
+from cloud_optimize_gtiff import generate_optimized_file
+import argparse
+import datetime
+import os
+import struct
+
+
+def get_args():
+ parser = argparse.ArgumentParser(
+ description='Convert a vertical offset grid into PROJ GeoTIFF.')
+ parser.add_argument('source',
+ help='Source file')
+ parser.add_argument('dest',
+ help='Destination GeoTIFF file')
+
+ parser.add_argument('--type', dest='type', required=True,
+ choices=['GEOGRAPHIC_TO_VERTICAL',
+ 'VERTICAL_TO_VERTICAL'],
+ help='Type of grid')
+
+ parser.add_argument('--source-crs', dest='source_crs', required=True,
+ help='Source CRS as EPSG:XXXX or WKT')
+
+ parser.add_argument('--interpolation-crs', dest='interpolation_crs',
+ help='Interpolation CRS as EPSG:XXXX or WKT. Required for type=VERTICAL_TO_VERTICAL')
+
+ parser.add_argument('--target-crs', dest='target_crs', required=True,
+ help='Target CRS as EPSG:XXXX or WKT')
+
+ parser.add_argument('--copyright', dest='copyright', required=True,
+ help='Copyright info')
+
+ parser.add_argument('--description', dest='description',
+ help='Description')
+
+ parser.add_argument('--encoding',
+ dest='encoding',
+ choices=['float32', 'uint16', 'int32-scale-1-1000'],
+ default='float32',
+ help='Binary encoding. int32-scale-1-1000 is for Canadian .byn files')
+
+ parser.add_argument('--ignore-nodata', dest='ignore_nodata',
+ action='store_true',
+ help='Whether to ignore nodata')
+
+ parser.add_argument('--datetime', dest='datetime',
+ help='Value for TIFF DateTime tag as YYYY:MM:DD HH:MM:SS, or "NONE" to not write it. If not specified, current date time is used')
+
+ parser.add_argument('--area-of-use', dest='area_of_use',
+ help='Area of use')
+
+ return parser.parse_args()
+
+
+def create_unoptimized_file(sourcefilename, tmpfilename, args):
+ src_ds = gdal.Open(sourcefilename)
+ subdatsets = src_ds.GetSubDatasets()
+ if not subdatsets:
+ subdatsets = [(sourcefilename, None)]
+
+ src_basename = os.path.basename(args.source)
+ nbands = 1
+
+ compact_md = len(subdatsets) > 50
+
+ is_vertcon = src_basename in (
+ 'vertcone.gtx', 'vertconw.gtx', 'vertconc.gtx')
+
+ for idx_ifd, subds in enumerate(subdatsets):
+ src_ds = gdal.Open(subds[0])
+
+ if args.encoding == 'int32-scale-1-1000':
+ target_dt = gdal.GDT_Int32
+ elif args.encoding == 'uint16':
+ target_dt = gdal.GDT_UInt16
+ else:
+ target_dt = gdal.GDT_Float32
+ tmp_ds = gdal.GetDriverByName('GTiff').Create('/vsimem/tmp',
+ src_ds.RasterXSize,
+ src_ds.RasterYSize,
+ nbands,
+ target_dt)
+ interpolation_crs = osr.SpatialReference()
+ if args.type == 'GEOGRAPHIC_TO_VERTICAL':
+ interpolation_crs.SetFromUserInput(args.source_crs)
+ else:
+ interpolation_crs.SetFromUserInput(args.interpolation_crs)
+ assert interpolation_crs.IsGeographic()
+ tmp_ds.SetSpatialRef(interpolation_crs)
+ tmp_ds.SetGeoTransform(src_ds.GetGeoTransform())
+ tmp_ds.SetMetadataItem('AREA_OR_POINT', 'Point')
+
+ if args.type == 'GEOGRAPHIC_TO_VERTICAL':
+ tmp_ds.SetMetadataItem(
+ 'TYPE', 'VERTICAL_OFFSET_GEOGRAPHIC_TO_VERTICAL')
+ else:
+ tmp_ds.SetMetadataItem(
+ 'TYPE', 'VERTICAL_OFFSET_VERTICAL_TO_VERTICAL')
+
+ src_crs = osr.SpatialReference()
+ src_crs.SetFromUserInput(args.source_crs)
+ if args.type == 'VERTICAL_TO_VERTICAL':
+ assert src_crs.IsVertical()
+ src_auth_name = src_crs.GetAuthorityName(None)
+ src_auth_code = src_crs.GetAuthorityCode(None)
+ if idx_ifd == 0 or not compact_md:
+ if src_auth_name == 'EPSG' and src_auth_code:
+ tmp_ds.SetMetadataItem(
+ 'source_crs_epsg_code', src_auth_code)
+ else:
+ tmp_ds.SetMetadataItem(
+ 'source_crs_wkt', src_crs.ExportToWkt(['FORMAT=WKT2_2018']))
+
+ dst_crs = osr.SpatialReference()
+ dst_crs.SetFromUserInput(args.target_crs)
+ assert dst_crs.IsVertical()
+ dst_auth_name = dst_crs.GetAuthorityName(None)
+ dst_auth_code = dst_crs.GetAuthorityCode(None)
+ if idx_ifd == 0 or not compact_md:
+ if dst_auth_name == 'EPSG' and dst_auth_code:
+ tmp_ds.SetMetadataItem('target_crs_epsg_code', dst_auth_code)
+ else:
+ tmp_ds.SetMetadataItem(
+ 'target_crs_wkt', dst_crs.ExportToWkt(['FORMAT=WKT2_2018']))
+
+ def is_false_positive_nodata(y, x):
+ return src_basename == 'egm08_25.gtx' and y == 2277 and x == 6283
+
+ if args.encoding == 'int32-scale-1-1000':
+ assert src_ds.RasterCount == 1
+ scale = 0.001
+ nodata = None if args.ignore_nodata else src_ds.GetRasterBand(
+ 1).GetNoDataValue()
+ data = src_ds.GetRasterBand(1).ReadAsArray()
+ if nodata is None:
+ data = data / scale
+ else:
+ dst_nodata = 9999000
+ nodata_as_f = struct.pack('f', nodata)
+ has_warned_nodata = False
+ for y in range(src_ds.RasterYSize):
+ for x in range(src_ds.RasterXSize):
+ is_nodata = False
+ if struct.pack('f', data[y][x]) == nodata_as_f:
+ if is_false_positive_nodata(y, x):
+ pass
+ elif not has_warned_nodata:
+ print(
+ 'At least one value matches nodata (at %d,%d). Setting it' % (y, x))
+ tmp_ds.GetRasterBand(
+ 1).SetNoDataValue(dst_nodata)
+ has_warned_nodata = True
+ is_nodata = True
+ else:
+ is_nodata = True
+
+ if is_nodata:
+ data[y][x] = dst_nodata
+ else:
+ data[y][x] = data[y][x] / scale
+
+ tmp_ds.GetRasterBand(1).WriteArray(data)
+ tmp_ds.GetRasterBand(1).SetOffset(0)
+ tmp_ds.GetRasterBand(1).SetScale(scale)
+ if idx_ifd == 0 or not compact_md:
+ tmp_ds.GetRasterBand(1).SetDescription(
+ 'geoid_undulation' if args.type == 'GEOGRAPHIC_TO_VERTICAL' else 'vertical_offset')
+ tmp_ds.GetRasterBand(1).SetUnitType('metre')
+
+ elif args.encoding == 'uint16':
+ for i in (1, ):
+ min, max = src_ds.GetRasterBand(i).ComputeRasterMinMax()
+ data = src_ds.GetRasterBand(i).ReadAsArray()
+ if is_vertcon: # in millimetres originally !
+ assert min < -100 or max > 100
+ min = min * 0.001
+ max = max * 0.001
+
+ dst_nodata = 65535
+ scale = (max - min) / 65534 # we use 65535 for nodata
+
+ nodata = None if args.ignore_nodata else src_ds.GetRasterBand(
+ i).GetNoDataValue()
+ if nodata is None:
+ if is_vertcon: # in millimetres originally !
+ data = data * 0.001
+ data = (data - min) / scale
+ else:
+ nodata_as_f = struct.pack('f', nodata)
+ has_warned_nodata = False
+ for y in range(src_ds.RasterYSize):
+ for x in range(src_ds.RasterXSize):
+ is_nodata = False
+ if struct.pack('f', data[y][x]) == nodata_as_f:
+ if is_false_positive_nodata(y, x):
+ pass
+ elif not has_warned_nodata:
+ print(
+ 'At least one value matches nodata (at %d,%d). Setting it' % (y, x))
+ tmp_ds.GetRasterBand(
+ i).SetNoDataValue(dst_nodata)
+ has_warned_nodata = True
+ is_nodata = True
+ else:
+ is_nodata = True
+
+ if is_nodata:
+ data[y][x] = dst_nodata
+ else:
+ if is_vertcon: # in millimetres originally !
+ data[y][x] = (
+ data[y][x] * 0.001 - min) / scale
+ else:
+ data[y][x] = (data[y][x] - min) / scale
+
+ tmp_ds.GetRasterBand(i).WriteArray(data)
+ tmp_ds.GetRasterBand(i).SetOffset(min)
+ tmp_ds.GetRasterBand(i).SetScale(scale)
+ if idx_ifd == 0 or not compact_md:
+ tmp_ds.GetRasterBand(i).SetDescription(
+ 'geoid_undulation' if args.type == 'GEOGRAPHIC_TO_VERTICAL' else 'vertical_offset')
+ tmp_ds.GetRasterBand(i).SetUnitType('metre')
+
+ else:
+ for i in (1,):
+ data = src_ds.GetRasterBand(i).ReadRaster()
+
+ nodata = None if args.ignore_nodata else src_ds.GetRasterBand(
+ i).GetNoDataValue()
+ nvalues = src_ds.RasterXSize * src_ds.RasterYSize
+ dst_nodata = -32768
+ if nodata:
+ packed_src_nodata = struct.pack('f', nodata)
+ packed_dst_nodata = struct.pack('f', dst_nodata)
+ assert src_ds.GetRasterBand(1).DataType == gdal.GDT_Float32
+ has_warned_nodata = False
+ out_data = bytearray(data)
+ for idx in range(len(data)//4):
+ is_nodata = False
+ if data[idx*4:idx*4+4] == packed_src_nodata:
+ y = idx // src_ds.RasterXSize
+ x = idx % src_ds.RasterXSize
+ if is_false_positive_nodata(y, x):
+ pass
+ elif not has_warned_nodata:
+ print(
+ 'At least one value matches nodata (at idx %d,%d). Setting it' % (y, x))
+ tmp_ds.GetRasterBand(
+ i).SetNoDataValue(dst_nodata)
+ has_warned_nodata = True
+ is_nodata = True
+ else:
+ is_nodata = True
+
+ if is_nodata:
+ out_data[idx*4:idx*4+4] = packed_dst_nodata
+
+ data = bytes(out_data)
+
+ if is_vertcon: # in millimetres originally !
+ min, max = src_ds.GetRasterBand(i).ComputeRasterMinMax()
+ assert min < -100 or max > 100
+ assert src_ds.GetRasterBand(1).DataType == gdal.GDT_Float32
+ out_data = b''
+ for v in struct.unpack('f' * nvalues, data):
+ out_data += struct.pack('f', v * 0.001 if v !=
+ dst_nodata else dst_nodata)
+ data = out_data
+
+ tmp_ds.GetRasterBand(i).WriteRaster(0, 0, src_ds.RasterXSize, src_ds.RasterYSize,
+ data)
+ if idx_ifd == 0 or not compact_md:
+ tmp_ds.GetRasterBand(i).SetDescription(
+ 'geoid_undulation' if args.type == 'GEOGRAPHIC_TO_VERTICAL' else 'vertical_offset')
+ tmp_ds.GetRasterBand(i).SetUnitType('metre')
+
+ if idx_ifd == 0:
+ desc = args.description
+ if not desc:
+ src_auth_name = src_crs.GetAuthorityName(None)
+ src_auth_code = src_crs.GetAuthorityCode(None)
+
+ desc = src_crs.GetName()
+ if src_auth_name and src_auth_code:
+ desc += ' (' + src_auth_name + ':' + src_auth_code + ')'
+ desc += ' to '
+ desc += dst_crs.GetName()
+ if dst_auth_name and dst_auth_code:
+ desc += ' (' + dst_auth_name + ':' + dst_auth_code + ')'
+ desc += '. Converted from '
+ desc += src_basename
+ desc += ' (last modified at '
+ desc += datetime.datetime.utcfromtimestamp(
+ os.stat(sourcefilename).st_mtime).strftime("%Y/%m/%d")
+ desc += ')'
+
+ tmp_ds.SetMetadataItem('TIFFTAG_IMAGEDESCRIPTION', desc)
+ if args.copyright:
+ tmp_ds.SetMetadataItem('TIFFTAG_COPYRIGHT', args.copyright)
+ if args.datetime and args.datetime != 'NONE':
+ tmp_ds.SetMetadataItem('TIFFTAG_DATETIME', args.datetime)
+ if args.area_of_use:
+ tmp_ds.SetMetadataItem('area_of_use', args.area_of_use)
+
+ options = ['PHOTOMETRIC=MINISBLACK',
+ 'COMPRESS=DEFLATE',
+ 'PREDICTOR=3' if target_dt == gdal.GDT_Float32 else 'PREDICTOR=2',
+ 'INTERLEAVE=BAND',
+ 'GEOTIFF_VERSION=1.1']
+ if tmp_ds.RasterXSize > 256 and tmp_ds.RasterYSize > 256:
+ options.append('TILED=YES')
+ else:
+ options.append('BLOCKYSIZE=' + str(tmp_ds.RasterYSize))
+ if gdal.VSIStatL(tmpfilename) is not None:
+ options.append('APPEND_SUBDATASET=YES')
+
+ assert gdal.GetDriverByName('GTiff').CreateCopy(tmpfilename, tmp_ds,
+ options=options)
+
+
+def check(sourcefilename, destfilename, args):
+ src_ds = gdal.Open(sourcefilename)
+ src_subdatsets = src_ds.GetSubDatasets()
+ if not src_subdatsets:
+ src_subdatsets = [(sourcefilename, None)]
+
+ dst_ds = gdal.Open(destfilename)
+ dst_subdatsets = dst_ds.GetSubDatasets()
+ if not dst_subdatsets:
+ dst_subdatsets = [(destfilename, None)]
+
+ assert len(src_subdatsets) == len(dst_subdatsets)
+ for dst_subds in dst_subdatsets:
+ dst_ds = gdal.Open(dst_subds[0])
+ dst_ds.GetRasterBand(1).Checksum()
+ assert gdal.GetLastErrorMsg() == ''
+
+
+if __name__ == '__main__':
+
+ args = get_args()
+
+ tmpfilename = args.dest + '.tmp'
+ gdal.Unlink(tmpfilename)
+
+ if not args.datetime:
+ args.datetime = datetime.date.today().strftime("%Y:%m:%d %H:%M:%S")
+
+ create_unoptimized_file(args.source, tmpfilename, args)
+ generate_optimized_file(tmpfilename, args.dest)
+ check(args.source, args.dest, args)
+
+ gdal.Unlink(tmpfilename)