author    Even Rouault <even.rouault@spatialys.com>  2020-04-10 22:14:55 +0200
committer GitHub <noreply@github.com>  2020-04-10 22:14:55 +0200
commit    01cb6d2b669729acc891ee6309ed97d6a99c1751 (patch)
tree      86f11c2a54e642375b13fc8c4e6cc4d49209f317
parent    cf29b837994ab52a576f6c3556f6b0657f353ab8 (diff)
parent    e197ad7af11fc2f072fedfb17b88e7f5a1f5a4fe (diff)
Merge pull request #18 from jjimenezshaw/typos
Typos
 CONTRIBUTING.md      | 4 ++--
 grid_tools/README.md | 2 +-
 2 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index 246fa88..3a616b1 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -21,7 +21,7 @@ If a new agency has to be created:
2. Go in that directory
3. Create a {agency_id}_README.txt file whose content is inspired from similar existing files
4. Create a .github subdirectory
-5. Create a symbolic link from the file created at 3. as README.md: ln -s ../{agency_id}_README.txt README.md
+5. Create a symbolic link from the file created at 3. as README.md: ln -s ../{agency_id}_README.txt .github/README.md
6. Edit CMakeLists.txt at the root of the repository to package the new directory
The steps for adding a GeoTIFF grid to an existing subdirectory are:
@@ -49,7 +49,7 @@ the PROJ database (typically they have a EPSG code), a transformation for them u
the grid must be referenced in the PROJ database. Generally, the EPSG database will
already have an entry for the grid, sometimes with a slightly different name.
The relevant file to look into is [grid_transformation.sql](https://github.com/OSGeo/PROJ/blob/master/data/sql/grid_transformation.sql).
-If the gris is not yet registered in the EPSG database, you are *strongly* encouraged to
+If the grid is not yet registered in the EPSG database, you are *strongly* encouraged to
engage with EPSG to register it. This will make its addition to PROJ and its later maintenance
much easier. http://www.epsg.org/EPSGDataset/Makechangerequest.aspx explains the procedure
to follow to submit a change request to EPSG.
diff --git a/grid_tools/README.md b/grid_tools/README.md
index f8cab3c..b717373 100644
--- a/grid_tools/README.md
+++ b/grid_tools/README.md
@@ -47,7 +47,7 @@ Options:
* --do-not-write-accuracy-samples: To prevent accuracy samples from the NTv2 grid to be written in the output. Note: if it is detected that those samples are set to dummy values (negative values), they will automatically be discarded
* --positive-longitude-shift-value {east,west}: To force the convention for the longitude shift value. NTv2 uses a positive-is-west convention, that is confusing. By default, the script will negate the sign of the longitude shift values to output a positive-is-east convention. Setting this option to "west" will preserve the original convention. Not recommended
* --uint16-encoding: Whether values should be encoded on a 16-bit unsigned value, using a offset and scale floating point values. Default behaviour is to use Float32 encoding, which will preserve the binary values of the original NTv2 file
-* --datetime DATETIME: to specify the value of the TIFF DateTime tag. Must be formatted as "YYYY:MM:DD HH:MM:SS" (note the use of column as the separator for the Y-M-D part, as mandated by the TIFF specification). If not specified, the script will try to use the value from the corresponding NTv2 header
+* --datetime DATETIME: to specify the value of the TIFF DateTime tag. Must be formatted as "YYYY:MM:DD HH:MM:SS" (note the use of colon as the separator for the Y-M-D part, as mandated by the TIFF specification). If not specified, the script will try to use the value from the corresponding NTv2 header
* --accuracy-unit {arc-second,metre,unknown}: to specify the unit of the accuracy samples. The NTv2 specification has been [historically interpreted in different ways regarding that](https://github.com/OSGeo/PROJ/wiki/Units-of-NTv2-accuracy-samples-%3F). Mandatory if accuracy samples are written (the script contains a few hardcoded rules for known datasets)
Example:
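
The CONTRIBUTING.md steps patched in the first hunk above can be sketched as a small shell script. This is only an illustration: the agency id `xx_example` and the README content are made up, and step 6 (editing the root CMakeLists.txt) is omitted.

```shell
#!/bin/sh
# Sketch of the CONTRIBUTING.md "new agency" steps; xx_example is a
# hypothetical agency id used purely for illustration.
set -e
agency_id=xx_example

# Steps 1 and 4: create the agency directory and its .github subdirectory
mkdir -p "${agency_id}/.github"

# Step 3: create the {agency_id}_README.txt file (content is placeholder)
printf 'Example agency grids\n' > "${agency_id}/${agency_id}_README.txt"

# Step 5: symlink it as .github/README.md -- the corrected target in this PR;
# the link target is relative to the .github/ directory that contains it
ln -s "../${agency_id}_README.txt" "${agency_id}/.github/README.md"

# Show that the symlink resolves to the README content
cat "${agency_id}/.github/README.md"
```

Placing the symlink in `.github/` (rather than the agency directory itself) is what the first hunk fixes: GitHub picks up `.github/README.md` for display, while the canonical `{agency_id}_README.txt` stays at the directory root.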