author     PROJ deploybot <proj.bot@proj.bot>  2022-03-22 20:00:06 +0000
committer  PROJ deploybot <proj.bot@proj.bot>  2022-03-22 20:00:06 +0000
commit     a3f43744feec86272fe532124679d3a013ef9a8c (patch)
tree       27e4198db6011e3097eb7bcfe7197684aba7583a /_sources/community
update with results of commit https://github.com/OSGeo/PROJ/commit/53c07a8bd211b7aee4bc07a9c6726005504b7181 (branch: gh-pages)
Diffstat (limited to '_sources/community')
-rw-r--r--  _sources/community/channels.rst.txt             46
-rw-r--r--  _sources/community/code_contributions.rst.txt  177
-rw-r--r--  _sources/community/code_of_conduct.rst.txt      92
-rw-r--r--  _sources/community/contributing.rst.txt        175
-rw-r--r--  _sources/community/index.rst.txt                32
-rw-r--r--  _sources/community/rfc/index.rst.txt            20
-rw-r--r--  _sources/community/rfc/rfc-1.rst.txt           172
-rw-r--r--  _sources/community/rfc/rfc-2.rst.txt           933
-rw-r--r--  _sources/community/rfc/rfc-3.rst.txt           151
-rw-r--r--  _sources/community/rfc/rfc-4.rst.txt           804
-rw-r--r--  _sources/community/rfc/rfc-5.rst.txt           135
-rw-r--r--  _sources/community/rfc/rfc-6.rst.txt           368
-rw-r--r--  _sources/community/rfc/rfc-7.rst.txt           135
13 files changed, 3240 insertions, 0 deletions
diff --git a/_sources/community/channels.rst.txt b/_sources/community/channels.rst.txt
new file mode 100644
index 00000000..1d67ef52
--- /dev/null
+++ b/_sources/community/channels.rst.txt
@@ -0,0 +1,46 @@
+.. _channels:
+
+===========================
+Communication channels
+===========================
+
+Mailing list
+-------------------------------------------------------------------------------
+
+Users and developers of the library use the mailing list to discuss all
+things related to PROJ. The mailing list is the primary forum for asking for
+help with the use of PROJ. It is also used for announcements and discussions
+about the development of the library, and from time to time interesting
+discussions on geodesy appear as well. You are more than welcome to join in!
+
+
+The PROJ mailing list can be found at https://lists.osgeo.org/mailman/listinfo/proj
+
+
+GitHub
+-------------------------------------------------------------------------------
+
+GitHub is the development platform we use for collaborating on the PROJ code.
+We use GitHub to keep track of the changes in the code and to index bug reports
+and feature requests. We are happy to take contributions in any form, either
+as code, bug reports, documentation or feature requests. See :ref:`contributing`
+for more info on how you can help improve PROJ.
+
+The PROJ GitHub page can be found at https://github.com/OSGeo/PROJ
+
+.. note::
+
+ The issue tracker on GitHub is only meant to keep track of bugs, feature
+ requests and other things related to the development of PROJ. Please ask
+ your questions about the use of PROJ on the mailing list instead.
+
+
+Gitter
+-------------------------------------------------------------------------------
+
+Gitter is the instant messaging alternative to the mailing list. PROJ has a
+room under the OSGeo organization. Most of the core developers stop by from
+time to time for an informal chat. You are more than welcome to join the
+discussion.
+
+The Gitter room can be found at https://gitter.im/OSGeo/proj.4
diff --git a/_sources/community/code_contributions.rst.txt b/_sources/community/code_contributions.rst.txt
new file mode 100644
index 00000000..a21b590b
--- /dev/null
+++ b/_sources/community/code_contributions.rst.txt
@@ -0,0 +1,177 @@
+.. _code_contributions:
+
+================================================================================
+Guidelines for PROJ code contributors
+================================================================================
+
+This is a guide for casual and regular PROJ code contributors.
+
+Code contributions
+###############################################################################
+
+Code contributions can be either bug fixes or new features. The process
+is the same for both, so they will be discussed together in this
+section.
+
+Making Changes
+~~~~~~~~~~~~~~
+
+- Create a topic branch from where you want to base your work.
+- You should usually base your topic branch off the master branch.
+- To quickly create a topic branch: ``git checkout -b my-topic-branch``
+- Make commits of logical units.
+- Check for unnecessary whitespace with ``git diff --check`` before
+ committing.
+- Make sure your commit messages are in the `proper
+ format <http://tbaggery.com/2008/04/19/a-note-about-git-commit-messages.html>`__.
+- Make sure you have added the necessary tests for your changes.
+- Make sure that all tests pass.
+
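The list above can be sketched as a shell session. This is a hypothetical walk-through in a throwaway repository (file names, branch names and commit messages are placeholders, not part of the PROJ workflow):

```shell
# Hypothetical sketch of the topic-branch workflow, run in a throwaway repo.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
git config user.email "you@example.com"
git config user.name "Example Contributor"
printf 'initial\n' > file.txt
git add file.txt
git commit -q -m "Initial commit"
# Create a topic branch off the default branch for one logical unit of work
git checkout -q -b my-topic-branch
printf 'a change\n' >> file.txt
# Check for unnecessary whitespace before committing
git diff --check
git add file.txt
git commit -q -m "Describe the change in a short summary line"
git branch --show-current   # prints: my-topic-branch
```

``git diff --check`` exits non-zero when it finds whitespace problems, so it can also serve as a quick pre-commit guard.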
+Submitting Changes
+~~~~~~~~~~~~~~~~~~
+
+- Push your changes to a topic branch in your fork of the repository.
+- Submit a pull request to the PROJ repository in the OSGeo
+ organization.
+- If your pull request fixes/references an issue, include that issue
+ number in the pull request. For example:
+
+::
+
+ Wiz the bang
+
+ Fixes #123.
+
+- PROJ developers will look at your patch and take an appropriate
+ action.
+
+Coding conventions
+~~~~~~~~~~~~~~~~~~
+
+Programming language
+^^^^^^^^^^^^^^^^^^^^
+
+PROJ was originally developed in ANSI C. Today PROJ is mostly developed in C++11,
+with a few parts of the code base still being C. Most of the older parts of the
+code base are effectively C with a few modifications so that they compile better
+as C++.
+
+Coding style
+^^^^^^^^^^^^
+
+The parts of the code base that started their life as C++ are formatted with
+``clang-format`` using the script ``scripts/reformat_cpp.sh``. This is mostly
+confined to the code in `src/iso19111/`, but a few other `.cpp` files are
+covered as well.
+
+For the rest of the code base, which has its origin in C, we don't enforce any
+particular coding style, but please try to keep it as simple as possible. If
+improving existing code, please try to conform with the style of the locally
+surrounding code.
+
+Whitespace
+^^^^^^^^^^
+
+Throughout the PROJ code base you will see differing whitespace use.
+The general rule is to keep whitespace in whatever form it is in the
+file you are currently editing. If the file has a mix of tabs and spaces,
+please convert the tabs to spaces in a separate commit before making any
+other changes. This makes it a lot easier to see the changes in diffs
+when evaluating the changed code. New files should use spaces as
+whitespace.
+
+
+Tools
+###############################################################################
+
+Reformatting C++ code
+~~~~~~~~~~~~~~~~~~~~~~~~
+
+The script ``scripts/reformat_cpp.sh`` will reformat C++ code in accordance
+with the project's preferences.
+
+If you are writing a new ``.cpp``-file it should be added to the list in the
+reformatting script.
+
+
+cppcheck static analyzer
+~~~~~~~~~~~~~~~~~~~~~~~~
+
+You can locally run ``scripts/cppcheck.sh``, a wrapper script around the
+cppcheck utility. This tool is used as part of the quality control of the code.
+
+cppcheck can have false positives. In general, it is preferable to rework the
+code a bit to make it more 'obvious' and avoid those false positives. When not
+possible, you can add a comment in the code like
+
+::
+
+ /* cppcheck-suppress duplicateBreak */
+
+in the line preceding the reported one. Replace ``duplicateBreak`` with the
+actual name of the violated rule emitted by cppcheck.
+
+Clang Static Analyzer (CSA)
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+CSA is run by a GitHub Actions workflow. You may also run it locally.
+
+Preliminary step: install clang. For example:
+
+::
+
+ wget https://releases.llvm.org/9.0.0/clang+llvm-9.0.0-x86_64-linux-gnu-ubuntu-18.04.tar.xz
+ tar xJf clang+llvm-9.0.0-x86_64-linux-gnu-ubuntu-18.04.tar.xz
+ mv clang+llvm-9.0.0-x86_64-linux-gnu-ubuntu-18.04 clang+llvm-9
+ export PATH=$PWD/clang+llvm-9/bin:$PATH
+
+Configure PROJ with the :program:`scan-build` utility of clang:
+
+::
+ mkdir csa_build
+ cd csa_build
+ scan-build cmake ..
+
+Build using :program:`scan-build`:
+
+::
+
+ scan-build make [-j8]
+
+If CSA finds errors, they will be emitted during the build. In that case, at
+the end of the build process, :program:`scan-build` will emit a warning message
+indicating that errors have been found and how to display the error report,
+with something like
+
+::
+
+ scan-view /tmp/scan-build-2021-03-15-121416-17476-1
+
+
+This will open a web browser with the interactive report.
+
+CSA may also have false positives. In general, this happens when the code is
+non-trivial or makes assumptions that are hard to check at first sight. You will
+need to add extra checks or rework it a bit to make it more "obvious" for CSA.
+This will also help humans reading your code!
+
+Typo detection and fixes
+~~~~~~~~~~~~~~~~~~~~~~~~
+
+Run ``scripts/fix_typos.sh`` to detect and fix common typos in the code base.
+
+Include What You Use (IWYU)
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Managing C includes is a pain. IWYU makes updating headers a bit
+easier. IWYU scans the code for functions that are called and makes
+sure that the headers for all those functions are present and in
+sorted order. However, you cannot blindly apply IWYU to PROJ. It
+does not understand ifdefs, other platforms, or the order requirements
+of PROJ internal headers. So the way to use it is to run it on a copy
+of the source and merge in only the changes that make sense.
+Additions of standard headers should always be safe to merge. The
+rest require careful evaluation. See the IWYU documentation for
+motivation and details.
+
+`IWYU docs <https://github.com/include-what-you-use/include-what-you-use/tree/master/docs>`_
diff --git a/_sources/community/code_of_conduct.rst.txt b/_sources/community/code_of_conduct.rst.txt
new file mode 100644
index 00000000..175b184a
--- /dev/null
+++ b/_sources/community/code_of_conduct.rst.txt
@@ -0,0 +1,92 @@
+.. _code_of_conduct:
+
+===========================
+Code of Conduct
+===========================
+
+The PROJ project has adopted the
+`Contributor Covenant Code of Conduct <https://www.contributor-covenant.org/>`_.
+Everyone who participates in the PROJ community is expected to follow the
+code of conduct as written below.
+
+
+Our Pledge
+----------
+
+In the interest of fostering an open and welcoming environment, we as
+contributors and maintainers pledge to make participation in our project and
+our community a harassment-free experience for everyone, regardless of age, body
+size, disability, ethnicity, sex characteristics, gender identity and expression,
+level of experience, education, socio-economic status, nationality, personal
+appearance, race, religion, or sexual identity and orientation.
+
+Our Standards
+-------------
+
+Examples of behavior that contributes to creating a positive environment
+include:
+
+* Using welcoming and inclusive language
+* Being respectful of differing viewpoints and experiences
+* Gracefully accepting constructive criticism
+* Focusing on what is best for the community
+* Showing empathy towards other community members
+
+Examples of unacceptable behavior by participants include:
+
+* The use of sexualized language or imagery and unwelcome sexual attention or
+ advances
+* Trolling, insulting/derogatory comments, and personal or political attacks
+* Public or private harassment
+* Publishing others' private information, such as a physical or electronic
+ address, without explicit permission
+* Other conduct which could reasonably be considered inappropriate in a
+ professional setting
+
+Our Responsibilities
+--------------------
+
+Project maintainers are responsible for clarifying the standards of acceptable
+behavior and are expected to take appropriate and fair corrective action in
+response to any instances of unacceptable behavior.
+
+Project maintainers have the right and responsibility to remove, edit, or
+reject comments, commits, code, wiki edits, issues, and other contributions
+that are not aligned to this Code of Conduct, or to ban temporarily or
+permanently any contributor for other behaviors that they deem inappropriate,
+threatening, offensive, or harmful.
+
+Scope
+-----
+
+This Code of Conduct applies within all project spaces, and it also applies when
+an individual is representing the project or its community in public spaces.
+Examples of representing a project or community include using an official
+project e-mail address, posting via an official social media account, or acting
+as an appointed representative at an online or offline event. Representation of
+a project may be further defined and clarified by project maintainers.
+
+Enforcement
+-----------
+
+Instances of abusive, harassing, or otherwise unacceptable behavior may be
+reported by contacting the project team at `kristianevers@gmail.com <mailto:kristianevers@gmail.com>`_.
+All complaints will be reviewed and investigated and will result in a response that
+is deemed necessary and appropriate to the circumstances. The project team is
+obligated to maintain confidentiality with regard to the reporter of an incident.
+Further details of specific enforcement policies may be posted separately.
+
+Project maintainers who do not follow or enforce the Code of Conduct in good
+faith may face temporary or permanent repercussions as determined by other
+members of the project's leadership.
+
+Attribution
+-----------
+
+This Code of Conduct is adapted from the
+`Contributor Covenant <https://www.contributor-covenant.org>`_, version 1.4,
+available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html
+
+For answers to common questions about this code of conduct, see
+https://www.contributor-covenant.org/faq
+
diff --git a/_sources/community/contributing.rst.txt b/_sources/community/contributing.rst.txt
new file mode 100644
index 00000000..5d24e111
--- /dev/null
+++ b/_sources/community/contributing.rst.txt
@@ -0,0 +1,175 @@
+.. _contributing:
+
+===========================
+Contributing
+===========================
+
+PROJ has a wide and varied user base. Some are highly skilled
+geodesists with a deep knowledge of map projections and reference
+systems, some are GIS software developers and others are GIS users. All
+users, regardless of profession or skill level, have the ability to
+contribute to PROJ. Here are a few suggestions on how:
+
+- Help PROJ users who are less experienced than yourself
+- Write a bug report
+- Request a new feature
+- Write documentation for your favorite map projection
+- Fix a bug
+- Implement a new feature
+
+In the following sections you can find some guidelines on how to
+contribute. As PROJ is managed on GitHub most contributions require
+that you have a GitHub account. Familiarity with
+`issues <https://guides.github.com/features/issues/>`__ and the `GitHub
+Flow <https://guides.github.com/introduction/flow/>`__ is an advantage.
+
+Help a fellow PROJ user
+-------------------------
+
+The main forum for support for PROJ is the mailing list. You can
+subscribe to the mailing list
+`here <http://lists.maptools.org/mailman/listinfo/proj>`__ and read the
+archive `here <http://lists.maptools.org/pipermail/proj/>`__.
+
+If you have questions about the usage of PROJ the mailing list is also
+the place to go. Please *do not* use the GitHub issue tracker as a
+support forum. Your question is much more likely to be answered on the
+mailing list, as many more people follow that than the issue tracker.
+
+.. _add_bug_report:
+
+Adding bug reports
+------------------
+
+Bug reports are handled in the `issue
+tracker <https://github.com/OSGeo/PROJ/issues>`__ on PROJ's home on
+GitHub. Writing a good bug report is not easy. But fixing a poorly
+documented bug is not easy either, so please put in the effort it takes
+to create a thorough bug report.
+
+A good bug report includes at least:
+
+- A title that quickly explains the problem
+- A description of the problem and how it can be reproduced
+- Version of PROJ being used
+- Version numbers of any other relevant software being used, e.g.
+ operating system
+- A description of what already has been done to solve the problem
+
+The more information that is given up front, the more likely it is that
+a developer will find interest in solving the problem. You will probably
+get follow-up questions after submitting a bug report. Please answer
+them in a timely manner if you have an interest in getting the issue
+solved.
+
+Finally, please only submit bug reports that are actually related to
+PROJ. If the issue materializes in software that uses PROJ it is
+likely a problem with that particular software. Make sure that it
+actually is a PROJ problem before you submit an issue. If you can
+reproduce the problem only by using tools from PROJ it is definitely a
+problem with PROJ.
+
+Feature requests
+----------------
+
+Got an idea for a new feature in PROJ? Submit a thorough description
+of the new feature in the `issue
+tracker <https://github.com/OSGeo/PROJ/issues>`__. Please include any
+technical documents that can help the developer make the new feature a
+reality. An example of this could be a publicly available academic paper
+that describes a new projection. Also, including a numerical test case
+will make it much easier to verify that an implementation of your
+requested feature actually works as you expect.
+
+Note that not all feature requests are accepted.
+
+Write documentation
+-------------------
+
+PROJ is in dire need of better documentation. Any contributions of
+documentation are greatly appreciated. The PROJ documentation is
+available on `proj.org <https://proj.org>`__. The website is generated
+with `Sphinx <http://www.sphinx-doc.org/en/stable/>`__. Contributions to
+the documentation should be made as `Pull
+Requests <https://github.com/OSGeo/PROJ/pulls>`__ on GitHub.
+
+If you intend to document one of PROJ's supported projections please
+use the :doc:`Mercator projection <../operations/projections/merc>`
+as a template.
+
+Code contributions
+------------------
+
+See :doc:`Code contributions <code_contributions>`
+
+Legalese
+~~~~~~~~
+
+Committers are the front line gatekeepers to keep the code base clear of
+improperly contributed code. It is important to the PROJ users,
+developers and the OSGeo foundation to avoid contributing any code to
+the project without it being clearly licensed under the project license.
+
+Generally speaking the key issues are that those providing code to be
+included in the repository understand that the code will be released
+under the MIT/X license, and that the person providing the code has the
+right to contribute the code. For the committer themselves understanding
+about the license is hopefully clear. For other contributors, the
+committer should verify the understanding unless the committer is very
+comfortable that the contributor understands the license (for instance
+frequent contributors).
+
+If the contribution was developed on behalf of an employer (on work
+time, as part of a work project, etc) then it is important that an
+appropriate representative of the employer understand that the code will
+be contributed under the MIT/X license. The arrangement should be
+cleared with an authorized supervisor/manager, etc.
+
+The code should be developed by the contributor, or the code should be
+from a source which can be rightfully contributed such as from the
+public domain, or from an open source project under a compatible
+license.
+
+All unusual situations need to be discussed and/or documented.
+
+Committers should adhere to the following guidelines, and may be
+personally legally liable for improperly contributing code to the source
+repository:
+
+- Make sure the contributor (and possibly employer) is aware of the
+ contribution terms.
+- Code coming from a source other than the contributor (such as adapted
+ from another project) should be clearly marked as to the original
+ source, copyright holders, license terms and so forth. This
+ information can be in the file headers, but should also be added to
+ the project licensing file if not exactly matching normal project
+ licensing (COPYING).
+- Existing copyright headers and license text should never be stripped
+ from a file. If a copyright holder wishes to give up copyright they
+ must do so in writing to the foundation before copyright messages are
+ removed. If license terms are changed it has to be by agreement
+ (written in email is ok) of the copyright holders.
+- Code with licenses requiring credit, or disclosure to users should be
+ added to COPYING.
+- When substantial contributions are added to a file (such as
+ substantial patches) the author/contributor should be added to the
+ list of copyright holders for the file.
+- If there is uncertainty about whether a change is proper to
+ contribute to the code base, please seek more information from the
+ project steering committee, or the foundation legal counsel.
+
+Additional Resources
+--------------------
+
+- `General GitHub documentation <https://help.github.com/>`__
+- `GitHub pull request
+ documentation <https://help.github.com/articles/about-pull-requests/>`__
+
+Acknowledgements
+----------------
+
+The *code contribution* section of this CONTRIBUTING file is inspired by
+`PDAL's <https://github.com/PDAL/PDAL/blob/master/CONTRIBUTING.md>`__
+and the *legalese* section is modified from `GDAL committer
+guidelines <https://trac.osgeo.org/gdal/wiki/rfc3_commiters>`__
+
diff --git a/_sources/community/index.rst.txt b/_sources/community/index.rst.txt
new file mode 100644
index 00000000..e5e94d7e
--- /dev/null
+++ b/_sources/community/index.rst.txt
@@ -0,0 +1,32 @@
+.. _community:
+
+Community
+===============================================================================
+
+The PROJ community is what makes the software stand out from its competitors.
+PROJ is used and developed by a group of very enthusiastic, knowledgeable and
+friendly people. Whether you are a first-time user of PROJ or a long-time
+contributor, the community is always very welcoming.
+
+.. toctree::
+ :maxdepth: 1
+
+ channels
+ contributing
+ code_contributions
+ code_of_conduct
+ rfc/index
+
+Conference
+----------
+
+.. image:: ../../images/foss4g2021.jpg
+ :alt: FOSS4G 2021
+ :target: https://2021.foss4g.org/
+
+`FOSS4G 2021 <https://2021.foss4g.org/>`_ is the leading annual conference for
+free and open source geospatial software. It will include presentations related
+to PROJ, and some of the PROJ development community will be attending. It is the
+event for those interested in PROJ, other FOSS geospatial technologies and the
+community around them. Due to COVID-19, the conference will be held in a virtual
+setting, September 27th - October 2nd, 2021.
diff --git a/_sources/community/rfc/index.rst.txt b/_sources/community/rfc/index.rst.txt
new file mode 100644
index 00000000..53087129
--- /dev/null
+++ b/_sources/community/rfc/index.rst.txt
@@ -0,0 +1,20 @@
+.. _rfcs:
+
+*****************************************************************************
+ Request for Comments
+*****************************************************************************
+
+A PROJ RFC describes a major change in the technological underpinnings of
+PROJ, major additions to functionality, or changes in the direction of
+the project.
+
+.. toctree::
+ :maxdepth: 1
+
+ rfc-1
+ rfc-2
+ rfc-3
+ rfc-4
+ rfc-5
+ rfc-6
+ rfc-7
diff --git a/_sources/community/rfc/rfc-1.rst.txt b/_sources/community/rfc/rfc-1.rst.txt
new file mode 100644
index 00000000..f3228e0c
--- /dev/null
+++ b/_sources/community/rfc/rfc-1.rst.txt
@@ -0,0 +1,172 @@
+.. _rfc1:
+
+====================================================================
+PROJ RFC 1: Project Committee Guidelines
+====================================================================
+
+:Author: Frank Warmerdam, Howard Butler
+:Contact: howard@hobu.co
+:Status: Passed
+:Last Updated: 2018-06-08
+
+Summary
+-----------
+
+This document describes how the PROJ Project Steering Committee (PSC)
+determines membership, and makes decisions on all aspects of the
+PROJ project - both technical and non-technical.
+
+Examples of PSC management responsibilities:
+
+* setting the overall development road map
+* developing technical standards and policies (e.g. coding standards,
+ file naming conventions, etc...)
+* ensuring regular releases (major and maintenance) of PROJ software
+* reviewing RFCs for technical enhancements to the software
+* project infrastructure (e.g. GitHub, continuous integration hosting options, etc...)
+* formalization of affiliation with external entities such as OSGeo
+* setting project priorities, especially with respect to project sponsorship
+* creation and oversight of specialized sub-committees (e.g. project
+ infrastructure, training)
+
+In brief, the project team votes on proposals on the `proj mailing list`_.
+Proposals are available for review for at least two days, and a single veto is
+sufficient to delay progress, though ultimately a majority of members can pass
+a proposal.
+
+.. _`proj mailing list`: http://lists.maptools.org/mailman/listinfo/proj
+
+List of PSC Members
+-------------------
+(up-to-date as of 2018-06)
+
+* Kristian Evers `@kbevers <https://github.com/kbevers>`_ (DK) **Chair**
+* Howard Butler `@hobu <https://github.com/hobu>`_ (USA)
+* Charles Karney `@cffk <https://github.com/cffk>`_ (USA)
+* Thomas Knudsen `@busstoptaktik <https://github.com/busstoptaktik>`_ (DK)
+* Even Rouault `@rouault <https://github.com/rouault>`_ (FR)
+* Kurt Schwehr `@schwehr <https://github.com/schwehr>`_ (USA)
+* Frank Warmerdam `@warmerdam <https://github.com/warmerdam>`_ (USA) **Emeritus**
+
+Detailed Process
+-----------------------
+
+* Proposals are written up and submitted on the `proj mailing list`_
+ for discussion and voting, by any interested party, not just
+ committee members.
+* Proposals need to be available for review for at least two business
+ days before a final decision can be made.
+* Respondents may vote "+1" to indicate support for the proposal and a
+ willingness to support implementation.
+* Respondents may vote "-1" to veto a proposal, but must provide clear
+ reasoning and alternate approaches to resolving the problem within
+ the two days.
+* A vote of -0 indicates mild disagreement, but has no effect. A 0
+  indicates no opinion. A +0 indicates mild support, but has no
+  effect.
+* Anyone may comment on proposals on the list, but only votes from members
+  of the Project Steering Committee will be counted.
+* A proposal will be accepted if it receives +2 (including the
+ author) and no vetoes (-1).
+* If a proposal is vetoed, and it cannot be revised to satisfy all
+ parties, then it can be resubmitted for an override vote in which a
+ majority of all eligible voters indicating +1 is sufficient to pass it.
+ Note that this is a majority of all committee members, not just those who
+ actively vote.
+* Upon completion of discussion and voting the author should announce
+ whether they are proceeding (proposal accepted) or are withdrawing
+ their proposal (vetoed).
+* The Chair gets a vote.
+* The Chair is responsible for keeping track of who is a member of the
+ Project Steering Committee (perhaps as part of a PSC file in CVS).
+* Addition and removal of members from the committee, as well as selection
+ of a Chair should be handled as a proposal to the committee.
+* The Chair adjudicates in cases of disputes about voting.
+
+RFC Origin
+.............
+
+PROJ RFC and Project Steering Committee is derived from similar governance
+bodies in both the `GDAL <https://trac.osgeo.org/gdal/wiki/rfc1_pmc>`__ and
+`MapServer <http://mapserver.org/development/rfc/ms-rfc-23.html>`__ software
+projects.
+
+When is a Vote Required?
+------------------------
+
+* Any change to committee membership (new members, removing inactive members)
+* Changes to project infrastructure (e.g. tool, location or substantive
+ configuration)
+* Anything that could cause backward compatibility issues.
+* Adding substantial amounts of new code.
+* Changing inter-subsystem APIs, or objects.
+* Issues of procedure.
+* When releases should take place.
+* Anything dealing with relationships with external entities such as OSGeo
+* Anything that might be controversial.
+
+Observations
+----------------
+
+* The Chair is the ultimate adjudicator if things break down.
+* The absolute majority rule can be used to override an obstructionist
+ veto, but it is intended that in normal circumstances vetoers need to be
+ convinced to withdraw their veto. We are trying to reach consensus.
+
+Committee Membership
+---------------------
+
+The PSC is made up of individuals consisting of technical contributors
+(e.g. developers) and prominent members of the PROJ user community.
+There is no set number of members for the PSC although the initial desire
+is to set the membership at 6.
+
+Adding Members
+..............
+
+Any member of the `proj mailing list`_ may nominate someone for
+committee membership at any time. Only existing PSC committee members may
+vote on new members. Nominees must receive a majority vote from existing
+members to be added to the PSC.
+
+Stepping Down
+.............
+
+If for any reason a PSC member is not able to fully participate then they
+certainly are free to step down. If a member is not active (e.g. no
+voting, no IRC or email participation) for a period of two months then
+the committee reserves the right to seek nominations to fill that position.
+Should that person become active again (hey, it happens) then they would
+certainly be welcome, but would require a nomination.
+
+Membership Responsibilities
+-----------------------------
+
+Guiding Development
+...............................
+
+Members should take an active role in guiding the development of new features
+they feel passionate about. Once a change request has been accepted and given
+a green light to proceed, the members are not free of their obligations. PSC
+members voting "+1" for a change request are expected to stay engaged and
+ensure the change is implemented and documented in a way that is most
+beneficial to users. Note that this applies not only to change requests that
+affect code, but also those that affect the web site, technical
+infrastructure, policies and standards.
+
+Mailing List Participation
+...............................
+
+PSC members are expected to be active on the
+`proj mailing list`_, subject to Open Source mailing list
+etiquette. Non-developer members of the PSC are not expected to respond
+to coding level questions on the developer mailing list, however they
+are expected to provide their thoughts and opinions on user level
+requirements and compatibility issues when RFC discussions take place.
+
+
+Updates
+---------
+
+**June 2018**
+
+RFC 1 was ratified by the following members
diff --git a/_sources/community/rfc/rfc-2.rst.txt b/_sources/community/rfc/rfc-2.rst.txt
new file mode 100644
index 00000000..7da1bc4f
--- /dev/null
+++ b/_sources/community/rfc/rfc-2.rst.txt
@@ -0,0 +1,933 @@
+.. _rfc2:
+
+====================================================================
+PROJ RFC 2: Initial integration of "GDAL SRS barn" work
+====================================================================
+
+:Author: Even Rouault
+:Contact: even.rouault at spatialys.com
+:Status: Adopted, implemented in PROJ 6.0
+:Initial version: 2018-10-09
+:Last Updated: 2018-10-31
+
+Summary
+-------
+
+This RFC is the result of a first phase of the `GDAL Coordinate System Barn Raising`_
+efforts. In its current state, this work mostly consists of:
+
+ - a C++ implementation of the ISO-19111:2018 / OGC Topic 2 "Referencing by
+ coordinates" classes to represent Datums, Coordinate systems, CRSs
+ (Coordinate Reference Systems) and Coordinate Operations.
+ - methods to convert between this C++ modeling and WKT1, WKT2 and PROJ string representations of those objects
+ - management and query of a SQLite3 database of CRS and Coordinate Operation definitions
+ - a C API binding part of those capabilities
+
+.. _`GDAL Coordinate System Barn Raising`: https://gdalbarn.com/
+
+
+Related standards
+-----------------
+
+Consult `Applicable standards`_
+
+.. _`Applicable standards`: http://even.rouault.free.fr/proj_cpp_api/html/general_doc.html#standards
+
+(They will be linked from the PROJ documentation)
+
+
+Details
+-------
+
+Structure in packages / namespaces
+**********************************
+
+The C++ implementation of the (upcoming) ISO-19111:2018 / OGC Topic 2 "Referencing by
+coordinates" classes follows this abstract modeling as much as possible, using
+package names as C++ namespaces, abstract classes and method names. A new
+BoundCRS class has been added to cover the modeling of the WKT2 BoundCRS
+construct, which is a generalization of the WKT1 TOWGS84 concept. It is
+strongly recommended to keep the ISO-19111 standard at hand as an introduction
+to the concepts when reading the code. A few classes have also been
+inspired by GeoAPI.
+
+The classes are organized into several namespaces:
+
+ - osgeo::proj::util
+ A set of base types from ISO 19103, GeoAPI and other PROJ "technical"
+ specific classes
+
+ Template optional<T>, classes BaseObject, IComparable, BoxedValue,
+ ArrayOfBaseObject, PropertyMap, LocalName, NameSpace, GenericName,
+ NameFactory, CodeList, Exception, InvalidValueTypeException,
+ UnsupportedOperationException
+
+ - osgeo::proj::metadata:
+ Common classes from ISO 19115 (Metadata) standard
+
+ Classes Citation, GeographicExtent, GeographicBoundingBox,
+ TemporalExtent, VerticalExtent, Extent, Identifier, PositionalAccuracy,
+
+ - osgeo::proj::common:
+ Common classes: UnitOfMeasure, Measure, Scale, Angle, Length, DateTime,
+ DateEpoch, IdentifiedObject, ObjectDomain, ObjectUsage
+
+ - osgeo::proj::cs:
+ Coordinate systems and their axis
+
+ Classes AxisDirection, Meridian, CoordinateSystemAxis, CoordinateSystem,
+ SphericalCS, EllipsoidalCS, VerticalCS, CartesianCS, OrdinalCS,
+ ParametricCS, TemporalCS, DateTimeTemporalCS, TemporalCountCS,
+ TemporalMeasureCS
+
+ - osgeo::proj::datum:
+ Datum (the relationship of a coordinate system to the body)
+
+ Classes Ellipsoid, PrimeMeridian, Datum, DatumEnsemble,
+ GeodeticReferenceFrame, DynamicGeodeticReferenceFrame,
+ VerticalReferenceFrame, DynamicVerticalReferenceFrame, TemporalDatum,
+ EngineeringDatum, ParametricDatum
+
+ - osgeo::proj::crs:
+ CRS = coordinate reference system = coordinate system with a datum
+
+ Classes CRS, GeodeticCRS, GeographicCRS, DerivedCRS, ProjectedCRS,
+ VerticalCRS, CompoundCRS, BoundCRS, TemporalCRS, EngineeringCRS,
+ ParametricCRS, DerivedGeodeticCRS, DerivedGeographicCRS,
+ DerivedProjectedCRS, DerivedVerticalCRS
+
+ - osgeo::proj::operation:
+ Coordinate operations (relationship between any two coordinate
+ reference systems)
+
+ Classes CoordinateOperation, GeneralOperationParameter,
+ OperationParameter, GeneralParameterValue, ParameterValue,
+ OperationParameterValue, OperationMethod, InvalidOperation,
+ SingleOperation, Conversion, Transformation, PointMotionOperation,
+ ConcatenatedOperation
+
+ - osgeo::proj::io:
+ I/O classes: WKTFormatter, PROJStringFormatter, FormattingException,
+ ParsingException, IWKTExportable, IPROJStringExportable, WKTNode,
+ WKTParser, PROJStringParser, DatabaseContext, AuthorityFactory,
+ FactoryException, NoSuchAuthorityCodeException
+
+What does what?
+***************
+
+The code to parse WKT and PROJ strings and build ISO-19111 objects is
+contained in `io.cpp`_.
+
+The code to format WKT and PROJ strings from ISO-19111 objects is mostly
+contained in the related exportToWKT() and exportToPROJString() methods
+overridden in the applicable classes. `io.cpp`_ contains the general mechanics
+to build such strings.
+
+Regarding WKT strings, three variants are handled in import and export:
+
+ - WKT2_2018: variant corresponding to the upcoming ISO-19162:2018 standard
+
+ - WKT2_2015: variant corresponding to the current ISO-19162:2015 standard
+
+ - WKT1_GDAL: variant corresponding to the way GDAL understands the OGC
+ 01-099 and OGC 99-049 standards
+
+Regarding PROJ strings, two variants are handled in import and export:
+
+ - PROJ5: variant used by PROJ >= 5, possibly using pipeline constructs,
+ and avoiding +towgs84 / +nadgrids legacy constructs. This variant honours
+    axis order and input/output units. That is, the pipeline for the conversion
+    of EPSG:4326 to EPSG:32631 will assume that the input coordinates are in
+    latitude, longitude order, in degrees.
+
+ - PROJ4: variant used by PROJ 4.x
+
+The raw queries of the proj.db database and the upper-level construction of
+ISO-19111 objects from the database contents are done in `factory.cpp`_.
+
+A few design principles
+***********************
+
+Methods generally take and return xxxNNPtr objects, that is, non-null shared
+pointers (pointers with internal reference counting). The advantage of this
+approach is that the user does not have to care about the life-cycle of the
+instances (and this makes the code leak-free by design). The only point of
+attention is to make sure no reference cycles are created. This holds for
+all classes, except that CoordinateOperation points to CRS objects through its
+sourceCRS and targetCRS members, whereas DerivedCRS points to a Conversion
+instance (which derives from CoordinateOperation), creating a potential cycle.
+This issue is inherent to the ISO-19111 modeling. The solution adopted here is
+to use std::weak_ptr in the CoordinateOperation class to break the cycle. This
+design artefact is transparent to users.
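The cycle-breaking approach can be illustrated outside of C++. The following Python sketch (class names are toy stand-ins, not the actual PROJ API) uses ``weakref`` the way the C++ code uses std::weak_ptr:

```python
import weakref

class CRS:
    """Toy stand-in for an ISO-19111 CRS class."""
    def __init__(self, name):
        self.name = name

class CoordinateOperation:
    """Holds only weak references back to its source and target CRS,
    so a CRS -> operation -> CRS reference cycle cannot form."""
    def __init__(self, source_crs, target_crs):
        self._source = weakref.ref(source_crs)
        self._target = weakref.ref(target_crs)

    def source_crs(self):
        # Returns None once the referenced CRS has been destroyed.
        return self._source()

src = CRS("EPSG:4326")
dst = CRS("EPSG:32631")
op = CoordinateOperation(src, dst)
assert op.source_crs() is src

del src  # no cycle: the CRS object is reclaimed immediately (CPython)
assert op.source_crs() is None
```

As in the C++ design, the weak back-reference means ownership flows in one direction only, so reference counting alone frees every object.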
+
+Another important design point is that all ISO-19111 objects are immutable after
+creation, that is, they only have getters that do not modify their state.
+Consequently they can be used in a thread-safe way. There are however
+classes, such as PROJStringFormatter, WKTFormatter, DatabaseContext,
+AuthorityFactory and CoordinateOperationContext, whose instances are mutable
+and thus cannot be used by multiple threads at once.
+
+Example how to build the EPSG:4326 / WGS84 Geographic2D definition from scratch:
+
+::
+
+ auto greenwich = PrimeMeridian::create(
+ util::PropertyMap()
+ .set(metadata::Identifier::CODESPACE_KEY,
+ metadata::Identifier::EPSG)
+ .set(metadata::Identifier::CODE_KEY, 8901)
+ .set(common::IdentifiedObject::NAME_KEY, "Greenwich"),
+ common::Angle(0));
+ // actually predefined as PrimeMeridian::GREENWICH constant
+
+ auto ellipsoid = Ellipsoid::createFlattenedSphere(
+ util::PropertyMap()
+ .set(metadata::Identifier::CODESPACE_KEY, metadata::Identifier::EPSG)
+ .set(metadata::Identifier::CODE_KEY, 7030)
+ .set(common::IdentifiedObject::NAME_KEY, "WGS 84"),
+ common::Length(6378137),
+ common::Scale(298.257223563));
+ // actually predefined as Ellipsoid::WGS84 constant
+
+ auto datum = GeodeticReferenceFrame::create(
+ util::PropertyMap()
+ .set(metadata::Identifier::CODESPACE_KEY, metadata::Identifier::EPSG)
+ .set(metadata::Identifier::CODE_KEY, 6326)
+            .set(common::IdentifiedObject::NAME_KEY, "World Geodetic System 1984"),
+        ellipsoid,
+ util::optional<std::string>(), // anchor
+ greenwich);
+ // actually predefined as GeodeticReferenceFrame::EPSG_6326 constant
+
+ auto geogCRS = GeographicCRS::create(
+ util::PropertyMap()
+ .set(metadata::Identifier::CODESPACE_KEY, metadata::Identifier::EPSG)
+ .set(metadata::Identifier::CODE_KEY, 4326)
+ .set(common::IdentifiedObject::NAME_KEY, "WGS 84"),
+ datum,
+        cs::EllipsoidalCS::createLatitudeLongitude(common::UnitOfMeasure::DEGREE));
+ // actually predefined as GeographicCRS::EPSG_4326 constant
+
+Algorithmic focus
+*****************
+
+On the algorithmic side, a somewhat involved logic is the
+CoordinateOperationFactory::createOperations() in `coordinateoperation.cpp`_
+that takes a pair of source and target CRS and returns a set of possible
+`coordinate operations`_ (either single operations like a Conversion or a
+Transformation, or concatenated operations). It uses the intrinsic structure
+of those objects to create the coordinate operation pipeline. For instance,
+when going from one ProjectedCRS to another, it first applies the inverse
+conversion from the source ProjectedCRS to its base GeographicCRS, then finds
+the appropriate transformation(s) from this base GeographicCRS to the base
+GeographicCRS of the target CRS, and finally applies the conversion from that
+base GeographicCRS to the target ProjectedCRS. At each step, it queries the
+database to find if one or several transformations are available. The
+resulting coordinate operations are filtered and sorted according to
+user-provided hints:
+
+ - desired accuracy
+ - area of use, defined as a bounding box in longitude, latitude space (its
+ actual CRS does not matter for the intended use)
+ - if no area of use is defined, if and how the area of use of the source
+ and target CRS should be used. By default, the smallest area of use is
+    used. The rationale is, for example, when transforming between a national
+    ProjectedCRS and a world-scope GeographicCRS, to use the area of use of
+    this ProjectedCRS to select the appropriate datum shifts.
+ - how the area of use of the candidate transformations and the desired area of
+ use (either explicitly or implicitly defined, as explained above) are
+ compared. By default, only transformations whose area of use is fully
+ contained in the desired area of use are selected. It is also possible
+    to relax this test by specifying that a mere intersection test should be used.
+  - whether `PROJ transformation grid`_ names should be substituted for the
+    official names, when a match is found in the `grid_alternatives` table
+    of the database. Defaults to true.
+ - whether the availability of those grids should be used to filter and sort
+ the results. By default, the transformations using grids available in the
+ system will be presented first.
+
+The results are sorted, with the most relevant ones appearing first in the
+result vector. The criteria used are, in that order:
+
+  - actual grid availability: operations referencing grids that are not
+    available will be listed after those with available grids
+  - potential grid availability: operations referencing grids not known at
+    all to proj.db will be listed after operations whose grids are known,
+    even if not available.
+ - known accuracy: operations with unknown accuracies will be listed
+ after operations with known accuracy
+  - area of use: operations with a smaller area of use (the intersection of the
+    operation's area of use with the desired area of use) will be listed after
+    the ones with a larger area of use
+  - accuracy: operations with lower accuracy will be listed after operations
+    with higher accuracy (caution: lower accuracy actually means a higher numeric
+    value of the accuracy property, since it is expressed as a precision in metres)
+
+All those settings can be specified in the CoordinateOperationContext instance
+passed to createOperations().
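Schematically, the ordering above boils down to a composite sort key. Below is a minimal Python sketch with illustrative attribute names, not the actual PROJ internals:

```python
def sort_key(op):
    """Composite key mirroring the ordering criteria listed above
    (dictionary fields are illustrative, not the PROJ internals)."""
    return (
        not op["grids_available"],        # available grids first
        not op["grids_known"],            # grids known to proj.db next
        op["accuracy"] is None,           # known accuracy before unknown
        -op["area_of_use"],               # larger intersected area first
        op["accuracy"] if op["accuracy"] is not None else float("inf"),
    )                                     # finally, most precise first

candidates = [
    {"name": "B", "grids_available": True, "grids_known": True,
     "accuracy": 2.0, "area_of_use": 50.0},
    {"name": "A", "grids_available": True, "grids_known": True,
     "accuracy": 1.2, "area_of_use": 50.0},
    {"name": "C", "grids_available": False, "grids_known": True,
     "accuracy": 0.5, "area_of_use": 80.0},
]
ranked = sorted(candidates, key=sort_key)
assert [op["name"] for op in ranked] == ["A", "B", "C"]
```

Note how C, despite its better accuracy figure, sorts last because its grid is not available: the criteria are strictly lexicographic.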
+
+An interesting example to understand how those parameters play together is
+to use `projinfo -s EPSG:4267 -t EPSG:4326` (NAD27 to WGS84 conversions),
+and see how specifying desired area of use, spatial criterion, grid availability,
+etc. affects the results.
+
+The following command currently returns 78 results:
+
+::
+
+ projinfo -s EPSG:4267 -t EPSG:4326 --summary --spatial-test intersects
+
+The createOperations() algorithm also does a kind of "CRS routing".
+A typical case is wanting to transform between CRS A and
+CRS B when no direct transformation is referenced in proj.db between them.
+But if there are transformations between A <--> C and B <--> C, then it
+is possible to build a concatenated operation A --> C --> B. The most common
+pivot C is WGS84, but the implementation is generic and just finds
+a common pivot from the database. An example of finding a non-WGS84 pivot is
+when searching for a transformation between EPSG:4326 and EPSG:6668 (JGD2011 -
+Japanese Geodetic Datum 2011), which has no direct transformation registered
+in the EPSG database. However, there are transformations between both of those
+CRS and JGD2000 (and also the Tokyo datum, but that one involves less accurate
+transformations).
+
+::
+
+ projinfo -s EPSG:4326 -t EPSG:6668 --grid-check none --bbox 135.42,34.84,142.14,41.58 --summary
+
+ Candidate operations found: 7
+ unknown id, Inverse of JGD2000 to WGS 84 (1) + JGD2000 to JGD2011 (1), 1.2 m, Japan - northern Honshu
+ unknown id, Inverse of JGD2000 to WGS 84 (1) + JGD2000 to JGD2011 (2), 2 m, Japan excluding northern main province
+ unknown id, Inverse of Tokyo to WGS 84 (108) + Tokyo to JGD2011 (2), 9.2 m, Japan onshore excluding northern main province
+ unknown id, Inverse of Tokyo to WGS 84 (108) + Tokyo to JGD2000 (2) + JGD2000 to JGD2011 (1), 9.4 m, Japan - northern Honshu
+ unknown id, Inverse of Tokyo to WGS 84 (2) + Tokyo to JGD2011 (2), 13.2 m, Japan - onshore mainland and adjacent islands
+ unknown id, Inverse of Tokyo to WGS 84 (2) + Tokyo to JGD2000 (2) + JGD2000 to JGD2011 (1), 13.4 m, Japan - northern Honshu
+ unknown id, Inverse of Tokyo to WGS 84 (1) + Tokyo to JGD2011 (2), 29.2 m, Asia - Japan and South Korea
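Conceptually, the pivot search is a small graph problem: nodes are CRS, edges are registered transformations, and a pivot is any node adjacent to both source and target. A toy Python sketch (the transformation list is illustrative, not the proj.db content):

```python
# Registered direct transformations, as unordered CRS name pairs
# (toy data; the real lookup queries proj.db).
direct = {("WGS 84", "JGD2000"), ("JGD2000", "JGD2011"),
          ("Tokyo", "JGD2011"), ("WGS 84", "Tokyo")}

def neighbours(crs):
    """All CRS reachable from `crs` through one registered transformation."""
    return ({b for a, b in direct if a == crs}
            | {a for a, b in direct if b == crs})

def find_pivots(source, target):
    """CRS usable as C in a concatenated operation source --> C --> target."""
    return sorted(neighbours(source) & neighbours(target))

# No direct WGS 84 <-> JGD2011 entry, but two candidate pivots exist:
assert find_pivots("WGS 84", "JGD2011") == ["JGD2000", "Tokyo"]
```

Each pivot then yields a concatenated operation whose candidates are filtered and sorted with the criteria described earlier.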
+
+
+.. _`coordinate operations`: https://proj.org/operations/index.html
+
+.. _`PROJ transformation grid`: https://proj.org/resource_files.html#transformation-grids
+
+
+Code repository
+---------------
+
+The current state of the work can be found in the
+`iso19111 branch of rouault/proj.4 repository`_, and is also available as
+a GitHub pull request at https://github.com/OSGeo/proj.4/pull/1040
+
+Here is a not-so-usable `comparison with a fixed snapshot of master branch`_
+
+.. _`iso19111 branch of rouault/proj.4 repository`: https://github.com/rouault/proj.4/tree/iso19111
+
+.. _`comparison with a fixed snapshot of master branch`: https://github.com/OSGeo/proj.4/compare/iso19111_dev...rouault:iso19111
+
+
+Database
+--------
+
+Content
+*******
+
+The database contains CRS and coordinate operation definitions from the `EPSG`_
+database (IOGP’s EPSG Geodetic Parameter Dataset) v9.5.3,
+the `IGNF registry`_ (French National Geographic Institute), the ESRI database,
+as well as a few customizations.
+
+.. _`EPSG`: http://www.epsg.org/
+.. _`IGNF registry`: https://geodesie.ign.fr/index.php?page=documentation#titre3
+
+Building (for PROJ developers creating the database)
+****************************************************
+
+Building the database is a multi-stage process:
+
+Construct SQL scripts for EPSG
+++++++++++++++++++++++++++++++
+
+The first stage consists of constructing .sql scripts, mostly made of
+CREATE TABLE and INSERT statements, to create the database structure and
+populate it. There is one .sql file for each database table, populated
+with the content of the EPSG database and automatically
+generated with the `build_db.py`_ script, which processes the PostgreSQL
+dumps issued by IOGP. A number of other scripts are dedicated to manual
+editing, for example the `grid_alternatives.sql`_ file, which binds official
+grid names to PROJ grid names.
+
+Convert UTF8 SQL to sqlite3 db
+++++++++++++++++++++++++++++++
+
+The second stage is done automatically by the make process. It pipes the
+.sql scripts, in the right order, to the sqlite3 binary to generate a
+first version of the proj.db SQLite3 database.
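This stage is essentially executing the .sql scripts in order against a fresh database. An equivalent Python sketch using the standard sqlite3 module (the inlined statements stand in for the actual script files):

```python
import sqlite3

# Stand-ins for the contents of proj_db_table_defs.sql and unit_of_measure.sql.
scripts = [
    """CREATE TABLE unit_of_measure (
           auth_name TEXT NOT NULL,
           code      TEXT NOT NULL,
           name      TEXT NOT NULL,
           PRIMARY KEY (auth_name, code));""",
    "INSERT INTO unit_of_measure VALUES ('EPSG', '9001', 'metre');",
]
db = sqlite3.connect(":memory:")  # the real build writes proj.db on disk
for sql in scripts:               # order matters: schema before data
    db.executescript(sql)
assert db.execute("SELECT name FROM unit_of_measure").fetchone() == ("metre",)
```

The ordering requirement is why the make rules pipe the files in a fixed sequence rather than in directory order.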
+
+Add extra registries
+++++++++++++++++++++
+
+The third stage consists of creating additional .sql files from the
+content of other registries. For that process, we need to bind some
+definitions of those registries to those of the EPSG database, to be
+able to link to existing objects and detect trivial duplicates.
+The `ignf.sql`_ file has been generated using
+the `build_db_create_ignf.py`_ script from the current data/IGNF file
+that contains CRS definitions (and implicit transformations to WGS84)
+as PROJ.4 strings.
+The `esri.sql`_ file has been generated using the `build_db_from_esri.py`_
+script, from the .csv files in
+https://github.com/Esri/projection-engine-db-doc/tree/master/csv
+
+Finalize proj.db
+++++++++++++++++
+
+The last stage runs make again to incorporate the new .sql files generated
+in the previous stage (so the process of building the database involves
+a kind of bootstrapping...)
+
+Building (for PROJ users)
+*************************
+
+The make process just runs the second stage mentioned above from the .sql
+files. The resulting proj.db is currently 5.3 MB in size.
+
+Structure
+*********
+
+The database is structured into the following tables and views. Each generally
+matches an ISO-19111 concept, and the layout is generally close to the
+structure of the EPSG database. Regarding identification of objects, where the
+EPSG database only contains a numeric 'code' column, the PROJ database
+identifies objects with an (auth_name, code) tuple of string values, allowing
+several registries to be combined together.
+
+- Technical:
+ - `authority_list`: view enumerating the authorities present in the database. Currently: EPSG, IGNF, PROJ
+ - `metadata`: a few key/value pairs, for example to indicate the version of the registries imported in the database
+ - `object_view`: synthetic view listing objects (ellipsoids, datums, CRS, coordinate operations...) code and name, and the table name where they are further described
+  - `alias_names`: lists possible aliases for the `name` field of object tables
+  - `link_from_deprecated_to_non_deprecated`: handles the link between old ESRI and new ESRI/EPSG codes
+
+- Common:
+ - `unit_of_measure`: table with UnitOfMeasure definitions.
+ - `area`: table with area-of-use (bounding boxes) applicable to CRS and coordinate operations.
+
+- Coordinate systems:
+ - `axis`: table with CoordinateSystemAxis definitions.
+ - `coordinate_system`: table with CoordinateSystem definitions.
+
+- Ellipsoid and datums:
+ - `ellipsoid`: table with ellipsoid definitions.
+ - `prime_meridian`: table with PrimeMeridian definitions.
+ - `geodetic_datum`: table with GeodeticReferenceFrame definitions.
+ - `vertical_datum`: table with VerticalReferenceFrame definitions.
+
+- CRS:
+ - `geodetic_crs`: table with GeodeticCRS and GeographicCRS definitions.
+ - `projected_crs`: table with ProjectedCRS definitions.
+ - `vertical_crs`: table with VerticalCRS definitions.
+ - `compound_crs`: table with CompoundCRS definitions.
+
+- Coordinate operations:
+ - `coordinate_operation_view`: view giving a number of common attributes shared by the concrete tables implementing CoordinateOperation
+ - `conversion`: table with definitions of Conversion (mostly parameter and values of Projection)
+ - `concatenated_operation`: table with definitions of ConcatenatedOperation.
+ - `grid_transformation`: table with all grid-based transformations.
+  - `grid_packages`: table listing packages in which grids can be found, e.g. "proj-datumgrid", "proj-datumgrid-europe", ...
+  - `grid_alternatives`: table binding official grid names to PROJ grid names, e.g. "Und_min2.5x2.5_egm2008_isw=82_WGS84_TideFree.gz" --> "egm08_25.gtx"
+ - `helmert_transformation`: table with all Helmert-based transformations.
+  - `other_transformation`: table with other types of transformations.
+
+The main departure from the structure of the EPSG database is the split of
+the various coordinate operations over several tables. This was done mostly
+for human readability, as the EPSG organization of the coordoperation,
+coordoperationmethod, coordoperationparam, coordoperationparamusage and
+coordoperationparamvalue tables makes it hard to grasp at once all the
+parameters and values of a given operation.
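The coordinate_operation_view mentioned above can be pictured as a UNION ALL over the per-type tables. A simplified sqlite3 sketch follows (the columns and rows are illustrative, not the actual proj.db schema):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE helmert_transformation (auth_name TEXT, code TEXT, name TEXT);
CREATE TABLE grid_transformation    (auth_name TEXT, code TEXT, name TEXT);
-- One row per operation, whichever concrete table it lives in:
CREATE VIEW coordinate_operation_view AS
    SELECT auth_name, code, name, 'helmert_transformation' AS table_name
        FROM helmert_transformation
    UNION ALL
    SELECT auth_name, code, name, 'grid_transformation'
        FROM grid_transformation;
INSERT INTO helmert_transformation VALUES ('XXX', '1', 'Helmert example');
INSERT INTO grid_transformation    VALUES ('XXX', '2', 'Grid example');
""")
rows = db.execute(
    "SELECT code, table_name FROM coordinate_operation_view ORDER BY code"
).fetchall()
assert rows == [('1', 'helmert_transformation'), ('2', 'grid_transformation')]
```

Consumers can thus enumerate all coordinate operations uniformly, while each concrete table keeps only the columns relevant to its operation type.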
+
+
+Utilities
+---------
+
+A new `projinfo` utility has been added. It enables the user to enter a CRS or
+coordinate operation as an AUTHORITY:CODE, PROJ string or WKT string, and see
+it translated into the different flavors of PROJ and WKT strings.
+It also enables building coordinate operations between two CRSs.
+
+Usage
+*****
+
+::
+
+ usage: projinfo [-o formats] [-k crs|operation] [--summary] [-q]
+ [--bbox min_long,min_lat,max_long,max_lat]
+ [--spatial-test contains|intersects]
+ [--crs-extent-use none|both|intersection|smallest]
+ [--grid-check none|discard_missing|sort]
+ [--boundcrs-to-wgs84]
+ {object_definition} | (-s {srs_def} -t {srs_def})
+
+ -o: formats is a comma separated combination of: all,default,PROJ4,PROJ,WKT_ALL,WKT2_2015,WKT2_2018,WKT1_GDAL
+ Except 'all' and 'default', other format can be preceded by '-' to disable them
+
+Examples
+********
+
+Specify CRS by AUTHORITY:CODE
++++++++++++++++++++++++++++++
+
+::
+
+ $ projinfo EPSG:4326
+
+ PROJ string:
+ +proj=pipeline +step +proj=longlat +ellps=WGS84 +step +proj=unitconvert +xy_in=rad +xy_out=deg +step +proj=axisswap +order=2,1
+
+ WKT2_2015 string:
+ GEODCRS["WGS 84",
+ DATUM["World Geodetic System 1984",
+ ELLIPSOID["WGS 84",6378137,298.257223563,
+ LENGTHUNIT["metre",1]]],
+ PRIMEM["Greenwich",0,
+ ANGLEUNIT["degree",0.0174532925199433]],
+ CS[ellipsoidal,2],
+ AXIS["geodetic latitude (Lat)",north,
+ ORDER[1],
+ ANGLEUNIT["degree",0.0174532925199433]],
+ AXIS["geodetic longitude (Lon)",east,
+ ORDER[2],
+ ANGLEUNIT["degree",0.0174532925199433]],
+ AREA["World"],
+ BBOX[-90,-180,90,180],
+ ID["EPSG",4326]]
+
+
+Specify CRS by PROJ string and specify output formats
+++++++++++++++++++++++++++++++++++++++++++++++++++++++
+
+::
+
+ $ projinfo -o PROJ4,PROJ,WKT1_GDAL,WKT2_2018 "+title=IGN 1972 Nuku Hiva - UTM fuseau 7 Sud +proj=tmerc +towgs84=165.7320,216.7200,180.5050,-0.6434,-0.4512,-0.0791,7.420400 +a=6378388.0000 +rf=297.0000000000000 +lat_0=0.000000000 +lon_0=-141.000000000 +k_0=0.99960000 +x_0=500000.000 +y_0=10000000.000 +units=m +no_defs"
+
+ PROJ string:
+ Error when exporting to PROJ string: BoundCRS cannot be exported as a PROJ.5 string, but its baseCRS might
+
+ PROJ.4 string:
+ +proj=utm +zone=7 +south +ellps=intl +towgs84=165.732,216.72,180.505,-0.6434,-0.4512,-0.0791,7.4204
+
+ WKT2_2018 string:
+ BOUNDCRS[
+ SOURCECRS[
+ PROJCRS["IGN 1972 Nuku Hiva - UTM fuseau 7 Sud",
+ BASEGEOGCRS["unknown",
+ DATUM["unknown",
+ ELLIPSOID["International 1909 (Hayford)",6378388,297,
+ LENGTHUNIT["metre",1,
+ ID["EPSG",9001]]]],
+ PRIMEM["Greenwich",0,
+ ANGLEUNIT["degree",0.0174532925199433],
+ ID["EPSG",8901]]],
+ CONVERSION["unknown",
+ METHOD["Transverse Mercator",
+ ID["EPSG",9807]],
+ PARAMETER["Latitude of natural origin",0,
+ ANGLEUNIT["degree",0.0174532925199433],
+ ID["EPSG",8801]],
+ PARAMETER["Longitude of natural origin",-141,
+ ANGLEUNIT["degree",0.0174532925199433],
+ ID["EPSG",8802]],
+ PARAMETER["Scale factor at natural origin",0.9996,
+ SCALEUNIT["unity",1],
+ ID["EPSG",8805]],
+ PARAMETER["False easting",500000,
+ LENGTHUNIT["metre",1],
+ ID["EPSG",8806]],
+ PARAMETER["False northing",10000000,
+ LENGTHUNIT["metre",1],
+ ID["EPSG",8807]]],
+ CS[Cartesian,2],
+ AXIS["(E)",east,
+ ORDER[1],
+ LENGTHUNIT["metre",1,
+ ID["EPSG",9001]]],
+ AXIS["(N)",north,
+ ORDER[2],
+ LENGTHUNIT["metre",1,
+ ID["EPSG",9001]]]]],
+ TARGETCRS[
+ GEOGCRS["WGS 84",
+ DATUM["World Geodetic System 1984",
+ ELLIPSOID["WGS 84",6378137,298.257223563,
+ LENGTHUNIT["metre",1]]],
+ PRIMEM["Greenwich",0,
+ ANGLEUNIT["degree",0.0174532925199433]],
+ CS[ellipsoidal,2],
+ AXIS["latitude",north,
+ ORDER[1],
+ ANGLEUNIT["degree",0.0174532925199433]],
+ AXIS["longitude",east,
+ ORDER[2],
+ ANGLEUNIT["degree",0.0174532925199433]],
+ ID["EPSG",4326]]],
+ ABRIDGEDTRANSFORMATION["Transformation from unknown to WGS84",
+ METHOD["Position Vector transformation (geog2D domain)",
+ ID["EPSG",9606]],
+ PARAMETER["X-axis translation",165.732,
+ ID["EPSG",8605]],
+ PARAMETER["Y-axis translation",216.72,
+ ID["EPSG",8606]],
+ PARAMETER["Z-axis translation",180.505,
+ ID["EPSG",8607]],
+ PARAMETER["X-axis rotation",-0.6434,
+ ID["EPSG",8608]],
+ PARAMETER["Y-axis rotation",-0.4512,
+ ID["EPSG",8609]],
+ PARAMETER["Z-axis rotation",-0.0791,
+ ID["EPSG",8610]],
+ PARAMETER["Scale difference",1.0000074204,
+ ID["EPSG",8611]]]]
+
+ WKT1_GDAL:
+ PROJCS["IGN 1972 Nuku Hiva - UTM fuseau 7 Sud",
+ GEOGCS["unknown",
+ DATUM["unknown",
+ SPHEROID["International 1909 (Hayford)",6378388,297],
+ TOWGS84[165.732,216.72,180.505,-0.6434,-0.4512,-0.0791,7.4204]],
+ PRIMEM["Greenwich",0,
+ AUTHORITY["EPSG","8901"]],
+ UNIT["degree",0.0174532925199433,
+ AUTHORITY["EPSG","9122"]],
+ AXIS["Longitude",EAST],
+ AXIS["Latitude",NORTH]],
+ PROJECTION["Transverse_Mercator"],
+ PARAMETER["latitude_of_origin",0],
+ PARAMETER["central_meridian",-141],
+ PARAMETER["scale_factor",0.9996],
+ PARAMETER["false_easting",500000],
+ PARAMETER["false_northing",10000000],
+ UNIT["metre",1,
+ AUTHORITY["EPSG","9001"]],
+ AXIS["Easting",EAST],
+ AXIS["Northing",NORTH]]
+
+
+Find transformations between 2 CRS
+++++++++++++++++++++++++++++++++++
+
+Between "Poland zone I" (based on Pulkovo 42 datum) and "UTM WGS84 zone 34N"
+
+Summary view:
+
+::
+
+ $ projinfo -s EPSG:2171 -t EPSG:32634 --summary
+
+ Candidate operations found: 1
+ unknown id, Inverse of Poland zone I + Pulkovo 1942(58) to WGS 84 (1) + UTM zone 34N, 1 m, Poland - onshore
+
+Display of pipelines:
+
+::
+
+ $ PROJ_LIB=data src/projinfo -s EPSG:2171 -t EPSG:32634 -o PROJ
+
+ PROJ string:
+ +proj=pipeline +step +proj=axisswap +order=2,1 +step +inv +proj=sterea +lat_0=50.625 +lon_0=21.0833333333333 +k=0.9998 +x_0=4637000 +y_0=5647000 +ellps=krass +step +proj=cart +ellps=krass +step +proj=helmert +x=33.4 +y=-146.6 +z=-76.3 +rx=-0.359 +ry=-0.053 +rz=0.844 +s=-0.84 +convention=position_vector +step +inv +proj=cart +ellps=WGS84 +step +proj=utm +zone=34 +ellps=WGS84
+
+
+Impacted files
+--------------
+
+New files (excluding makefile.am, CMakeLists.txt and other build infrastructure
+artefacts):
+
+ * include/proj/: Public installed C++ headers
+ - `common.hpp`_: declarations of osgeo::proj::common namespace.
+ - `coordinateoperation.hpp`_: declarations of osgeo::proj::operation namespace.
+ - `coordinatesystem.hpp`_: declarations of osgeo::proj::cs namespace.
+ - `crs.hpp`_: declarations of osgeo::proj::crs namespace.
+ - `datum.hpp`_: declarations of osgeo::proj::datum namespace.
+ - `io.hpp`_: declarations of osgeo::proj::io namespace.
+ - `metadata.hpp`_: declarations of osgeo::proj::metadata namespace.
+ - `util.hpp`_: declarations of osgeo::proj::util namespace.
+ - `nn.hpp`_: Code from https://github.com/dropbox/nn to manage Non-nullable pointers for C++
+
+ .. _`common.hpp`: https://github.com/rouault/proj.4/blob/iso19111/include/proj/common.hpp
+ .. _`coordinateoperation.hpp`: https://github.com/rouault/proj.4/blob/iso19111/include/proj/coordinateoperation.hpp
+ .. _`coordinatesystem.hpp`: https://github.com/rouault/proj.4/blob/iso19111/include/proj/coordinatesystem.hpp
+ .. _`crs.hpp`: https://github.com/rouault/proj.4/blob/iso19111/include/proj/crs.hpp
+ .. _`datum.hpp`: https://github.com/rouault/proj.4/blob/iso19111/include/proj/datum.hpp
+ .. _`io.hpp`: https://github.com/rouault/proj.4/blob/iso19111/include/proj/io.hpp
+ .. _`metadata.hpp`: https://github.com/rouault/proj.4/blob/iso19111/include/proj/metadata.hpp
+ .. _`nn.hpp`: https://github.com/rouault/proj.4/blob/iso19111/include/proj/nn.hpp
+ .. _`util.hpp`: https://github.com/rouault/proj.4/blob/iso19111/include/proj/util.hpp
+
+ * include/proj/internal: Private non-installed C++ headers
+ - `coordinateoperation_internal.hpp`_: classes InverseCoordinateOperation, InverseConversion, InverseTransformation, PROJBasedOperation, and functions to get conversion mappings between WKT and PROJ syntax
+ - `coordinateoperation_constants.hpp`_: Select subset of conversion/transformation EPSG names and codes for the purpose of translating them to PROJ strings
+ - `coordinatesystem_internal.hpp`_: classes AxisDirectionWKT1, AxisName and AxisAbbreviation
+ - `internal.hpp`_: a few helper functions, mostly to do string-based operations
+ - `io_internal.hpp`_: class WKTConstants
+ - `helmert_constants.hpp`_: Helmert-based transformation & parameters names and codes.
+ - `lru_cache.hpp`_: code from https://github.com/mohaps/lrucache11 to have a generic Least-Recently-Used cache of objects
+
+ .. _`coordinateoperation_internal.hpp`: https://github.com/rouault/proj.4/blob/iso19111/include/proj/internal/coordinateoperation_internal.hpp
+ .. _`coordinatesystem_internal.hpp`: https://github.com/rouault/proj.4/blob/iso19111/include/proj/internal/coordinatesystem_internal.hpp
+ .. _`internal.hpp`: https://github.com/rouault/proj.4/blob/iso19111/include/proj/internal/internal.hpp
+ .. _`io_internal.hpp`: https://github.com/rouault/proj.4/blob/iso19111/include/proj/internal/io_internal.hpp
+ .. _`coordinateoperation_constants.hpp`: https://github.com/rouault/proj.4/blob/iso19111/include/proj/internal/coordinateoperation_constants.hpp
+ .. _`helmert_constants.hpp`: https://github.com/rouault/proj.4/blob/iso19111/include/proj/internal/helmert_constants.hpp
+ .. _`lru_cache.hpp`: https://github.com/rouault/proj.4/blob/iso19111/include/proj/internal/lru_cache.hpp
+
+ * src/:
+ - `c_api.cpp`_: C++ API mapped to C functions
+ - `common.cpp`_: implementation of `common.hpp`_
+ - `coordinateoperation.cpp`_: implementation of `coordinateoperation.hpp`_
+ - `coordinatesystem.cpp`_: implementation of `coordinatesystem.hpp`_
+ - `crs.cpp`_: implementation of `crs.hpp`_
+ - `datum.cpp`_: implementation of `datum.hpp`_
+ - `factory.cpp`_: implementation of AuthorityFactory class (from `io.hpp`_)
+ - `internal.cpp`_: implementation of `internal.hpp`_
+ - `io.cpp`_: implementation of `io.hpp`_
+ - `metadata.cpp`_: implementation of `metadata.hpp`_
+ - `static.cpp`_: a number of static constants (like pre-defined well-known ellipsoid, datum and CRS), put in the right order for correct static initializations
+ - `util.cpp`_: implementation of `util.hpp`_
+ - `projinfo.cpp`_: new 'projinfo' binary
+ - `general.dox`_: generic introduction documentation.
+
+ .. _`c_api.cpp`: https://github.com/rouault/proj.4/blob/iso19111/src/c_api.cpp
+ .. _`common.cpp`: https://github.com/rouault/proj.4/blob/iso19111/src/common.cpp
+ .. _`coordinateoperation.cpp`: https://github.com/rouault/proj.4/blob/iso19111/src/coordinateoperation.cpp
+ .. _`coordinatesystem.cpp`: https://github.com/rouault/proj.4/blob/iso19111/src/coordinatesystem.cpp
+ .. _`crs.cpp`: https://github.com/rouault/proj.4/blob/iso19111/src/crs.cpp
+ .. _`datum.cpp`: https://github.com/rouault/proj.4/blob/iso19111/src/datum.cpp
+ .. _`factory.cpp`: https://github.com/rouault/proj.4/blob/iso19111/src/factory.cpp
+ .. _`internal.cpp`: https://github.com/rouault/proj.4/blob/iso19111/src/internal.cpp
+ .. _`io.cpp`: https://github.com/rouault/proj.4/blob/iso19111/src/io.cpp
+ .. _`metadata.cpp`: https://github.com/rouault/proj.4/blob/iso19111/src/metadata.cpp
+ .. _`projinfo.cpp`: https://github.com/rouault/proj.4/blob/iso19111/src/projinfo.cpp
+ .. _`static.cpp`: https://github.com/rouault/proj.4/blob/iso19111/src/static.cpp
+ .. _`util.cpp`: https://github.com/rouault/proj.4/blob/iso19111/src/util.cpp
+ .. _`general.dox`: https://github.com/rouault/proj.4/blob/iso19111/src/general.dox
+
+ * data/sql/:
+ - `area.sql`_: generated by `build_db.py`_
+ - `axis.sql`_: generated by `build_db.py`_
+ - `begin.sql`_: hand generated (trivial)
+ - `commit.sql`_: hand generated (trivial)
+ - `compound_crs.sql`_: generated by `build_db.py`_
+ - `concatenated_operation.sql`_: generated by `build_db.py`_
+ - `conversion.sql`_: generated by `build_db.py`_
+ - `coordinate_operation.sql`_: generated by `build_db.py`_
+ - `coordinate_system.sql`_: generated by `build_db.py`_
+ - `crs.sql`_: generated by `build_db.py`_
+ - `customizations.sql`_: hand generated (empty)
+ - `ellipsoid.sql`_: generated by `build_db.py`_
+ - `geodetic_crs.sql`_: generated by `build_db.py`_
+ - `geodetic_datum.sql`_: generated by `build_db.py`_
+ - `grid_alternatives.sql`_: hand-generated. Contains links between official registry grid names and PROJ ones
+ - `grid_transformation.sql`_: generated by `build_db.py`_
+ - `grid_transformation_custom.sql`_: hand-generated
+ - `helmert_transformation.sql`_: generated by `build_db.py`_
+ - `ignf.sql`_: generated by `build_db_create_ignf.py`_
+ - `esri.sql`_: generated by `build_db_from_esri.py`_
+ - `metadata.sql`_: hand-generated
+ - `other_transformation.sql`_: generated by `build_db.py`_
+ - `prime_meridian.sql`_: generated by `build_db.py`_
+ - `proj_db_table_defs.sql`_: hand-generated. Database structure: CREATE TABLE / CREATE VIEW / CREATE TRIGGER
+ - `projected_crs.sql`_: generated by `build_db.py`_
+ - `unit_of_measure.sql`_: generated by `build_db.py`_
+ - `vertical_crs.sql`_: generated by `build_db.py`_
+ - `vertical_datum.sql`_: generated by `build_db.py`_
+
+ .. _`area.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/area.sql
+ .. _`axis.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/axis.sql
+ .. _`begin.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/begin.sql
+ .. _`commit.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/commit.sql
+ .. _`compound_crs.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/compound_crs.sql
+ .. _`concatenated_operation.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/concatenated_operation.sql
+ .. _`conversion.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/conversion.sql
+ .. _`coordinate_operation.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/coordinate_operation.sql
+ .. _`coordinate_system.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/coordinate_system.sql
+ .. _`crs.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/crs.sql
+ .. _`customizations.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/customizations.sql
+ .. _`ellipsoid.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/ellipsoid.sql
+ .. _`geodetic_crs.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/geodetic_crs.sql
+ .. _`geodetic_datum.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/geodetic_datum.sql
+ .. _`grid_alternatives.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/grid_alternatives.sql
+ .. _`grid_transformation_custom.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/grid_transformation_custom.sql
+ .. _`grid_transformation.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/grid_transformation.sql
+ .. _`helmert_transformation.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/helmert_transformation.sql
+ .. _`ignf.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/ignf.sql
+ .. _`esri.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/esri.sql
+ .. _`metadata.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/metadata.sql
+ .. _`other_transformation.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/other_transformation.sql
+ .. _`prime_meridian.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/prime_meridian.sql
+ .. _`proj_db_table_defs.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/proj_db_table_defs.sql
+ .. _`projected_crs.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/projected_crs.sql
+ .. _`unit_of_measure.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/unit_of_measure.sql
+ .. _`vertical_crs.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/vertical_crs.sql
+ .. _`vertical_datum.sql`: https://github.com/rouault/proj.4/blob/iso19111/data/sql/vertical_datum.sql
+
+ * scripts/:
+ - `build_db.py`_: generates .sql files from EPSG database dumps
+ - `build_db_create_ignf.py`_: generates data/sql/`ignf.sql`_
+ - `build_db_from_esri.py`_: generates data/sql/`esri.sql`_
+ - `doxygen.sh`_: generates Doxygen documentation
+ - `gen_html_coverage.sh`_: generates HTML report of the coverage for --coverage build
+ - `filter_lcov_info.py`_: utility used by gen_html_coverage.sh
+ - `reformat.sh`_: used by reformat_cpp.sh
+ - `reformat_cpp.sh`_: reformat all .cpp/.hpp files according to LLVM-style formatting rules
+
+ .. _`build_db.py`: https://github.com/rouault/proj.4/blob/iso19111/scripts/build_db.py
+ .. _`build_db_create_ignf.py`: https://github.com/rouault/proj.4/blob/iso19111/scripts/build_db_create_ignf.py
+ .. _`build_db_from_esri.py`: https://github.com/rouault/proj.4/blob/iso19111/scripts/build_db_from_esri.py
+ .. _`doxygen.sh`: https://github.com/rouault/proj.4/blob/iso19111/scripts/doxygen.sh
+ .. _`gen_html_coverage.sh`: https://github.com/rouault/proj.4/blob/iso19111/scripts/gen_html_coverage.sh
+ .. _`filter_lcov_info.py`: https://github.com/rouault/proj.4/blob/iso19111/scripts/filter_lcov_info.py
+ .. _`reformat.sh`: https://github.com/rouault/proj.4/blob/iso19111/scripts/reformat.sh
+ .. _`reformat_cpp.sh`: https://github.com/rouault/proj.4/blob/iso19111/scripts/reformat_cpp.sh
+
+ * test/unit/
+ - `test_c_api.cpp`_: test of src/c_api.cpp
+ - `test_common.cpp`_: test of src/common.cpp
+ - `test_util.cpp`_: test of src/util.cpp
+ - `test_crs.cpp`_: test of src/crs.cpp
+ - `test_datum.cpp`_: test of src/datum.cpp
+ - `test_factory.cpp`_: test of src/factory.cpp
+ - `test_io.cpp`_: test of src/io.cpp
+ - `test_metadata.cpp`_: test of src/metadata.cpp
+ - `test_operation.cpp`_: test of src/operation.cpp
+
+ .. _`test_c_api.cpp`: https://github.com/rouault/proj.4/blob/iso19111/test/unit/test_c_api.cpp
+ .. _`test_common.cpp`: https://github.com/rouault/proj.4/blob/iso19111/test/unit/test_common.cpp
+ .. _`test_util.cpp`: https://github.com/rouault/proj.4/blob/iso19111/test/unit/test_util.cpp
+ .. _`test_crs.cpp`: https://github.com/rouault/proj.4/blob/iso19111/test/unit/test_crs.cpp
+ .. _`test_datum.cpp`: https://github.com/rouault/proj.4/blob/iso19111/test/unit/test_datum.cpp
+ .. _`test_factory.cpp`: https://github.com/rouault/proj.4/blob/iso19111/test/unit/test_factory.cpp
+ .. _`test_io.cpp`: https://github.com/rouault/proj.4/blob/iso19111/test/unit/test_io.cpp
+ .. _`test_metadata.cpp`: https://github.com/rouault/proj.4/blob/iso19111/test/unit/test_metadata.cpp
+ .. _`test_operation.cpp`: https://github.com/rouault/proj.4/blob/iso19111/test/unit/test_operation.cpp
+
+C API
+-----
+
+`proj.h`_ has been extended to bind a number of C++ classes/methods to a C API.
+
+The main structure is an opaque PJ_OBJ*, roughly encapsulating an osgeo::proj::BaseObject,
+that can represent a CRS or a CoordinateOperation object. A number of the
+C functions will work only if the right type of underlying C++ object is used
+with them; misuse is handled properly at runtime. If a user passes
+a PJ_OBJ* representing a coordinate operation to a pj_obj_crs_xxxx() function,
+it will error out cleanly. This design has been chosen over creating a
+dedicated PJ_xxx object for each C++ class, because such an approach would
+require adding many conversion and free functions for little benefit.
+
+This C API is incomplete. In particular, it does not yet allow building
+ISO19111 objects by hand. However, it currently permits a number of
+actions:
+
+ - building CRS and coordinate operations from WKT and PROJ strings, or
+ from the proj.db database
+ - exporting CRS and coordinate operations as WKT and PROJ strings
+ - querying main attributes of those objects
+ - finding coordinate operations between two CRS.
+
+`test_c_api.cpp`_ demonstrates simple usage of the API. (Note: for the
+convenience of writing the tests in C++, test_c_api.cpp wraps the C PJ_OBJ*
+instances in C++ 'keeper' objects that automatically call the pj_obj_unref()
+function when they go out of scope. In pure C use, the caller must call
+pj_obj_unref() explicitly to prevent leaks.)
+
+.. _`proj.h`: http://even.rouault.free.fr/proj_cpp_api/html/proj_8h.html
+
+
+Documentation
+-------------
+
+All public C++ classes and methods and C functions are documented with
+Doxygen.
+
+`Current snapshot of Class list`_
+
+`Spaghetti inheritance diagram`_
+
+.. _`Current snapshot of Class list`: http://even.rouault.free.fr/proj_cpp_api/html/annotated.html
+.. _`Spaghetti inheritance diagram`: http://even.rouault.free.fr/proj_cpp_api/html/inherits.html
+
+A basic integration of the Doxygen XML output into the general PROJ
+documentation (using reStructuredText format) has been done with the
+Sphinx extension `Breathe`_, producing:
+
+ * `One section with the C++ API`_
+ * `One section with the C API`_
+
+.. _`Breathe`: https://breathe.readthedocs.io/en/latest/
+.. _`One section with the C++ API`: http://even.rouault.free.fr/proj_cpp_api/rst_generated/html/development/reference/cpp/index.html
+.. _`One section with the C API`: http://even.rouault.free.fr/proj_cpp_api/rst_generated/html/development/reference/functions.html#c-api-for-iso-19111-functionality
+
+Testing
+-------
+
+Nearly all exported methods are covered by a unit test. Global line coverage
+of the new files is 92%. Those tests represent 16k lines of code.
+
+
+Build requirements
+------------------
+
+The new code leverages a number of C++11 features (auto keyword, constexpr,
+initializer lists, std::shared_ptr, lambda functions, etc.), which means that
+a C++11-compliant compiler must be used to build PROJ:
+
+ * gcc >= 4.8
+ * clang >= 3.3
+ * Visual Studio >= 2015.
+
+Compilers tested by the Travis-CI and AppVeyor continuous integration
+environments:
+
+ * GCC 4.8
+ * mingw-w64-x86-64 4.8
+ * clang 5.0
+ * Apple LLVM version 9.1.0 (clang-902.0.39.2)
+ * MSVC 2015 32 and 64 bit
+ * MSVC 2017 32 and 64 bit
+
+The libsqlite3 >= 3.7 development package must also be available, and the
+sqlite3 binary is needed at build time to generate the proj.db file from the
+.sql files.
+
+Runtime requirements
+--------------------
+
+* libc++/libstdc++/MSVC runtime consistent with the compiler used
+* libsqlite3 >= 3.7
+
+
+Backward compatibility
+----------------------
+
+At this stage, no backward compatibility issue is foreseen, as no
+existing functional C code has been modified to use the new capabilities.
+
+Future work
+-----------
+
+The work described in this RFC will be pursued in a number of directions.
+Non-exhaustively:
+
+ - Support for the ESRI WKT1 dialect (PROJ currently ingests the ProjectedCRS in
+   `esri.sql`_ in that dialect, but there is no mapping between it and EPSG
+   operation and parameter names, so conversion to PROJ strings does not
+   always work).
+
+ - Closer integration with the existing code base. In particular, the +init=dict:code
+   syntax should now go first to the database (then the `epsg` and `IGNF`
+   files can be removed). Similarly, proj_create_crs_to_crs() could use the
+   new capabilities to find an appropriate coordinate transformation.
+
+ - Whatever other changes are needed to address GDAL and libgeotiff needs.
+
+
+Adoption status
+---------------
+
+The RFC has been adopted with support from PSC members Kurt Schwehr, Kristian
+Evers, Howard Butler and Even Rouault.
diff --git a/_sources/community/rfc/rfc-3.rst.txt b/_sources/community/rfc/rfc-3.rst.txt
new file mode 100644
index 00000000..439f22d1
--- /dev/null
+++ b/_sources/community/rfc/rfc-3.rst.txt
@@ -0,0 +1,151 @@
+.. _rfc3:
+
+====================================================================
+PROJ RFC 3: Dependency management
+====================================================================
+
+:Author: Kristian Evers
+:Contact: kreve@sdfe.dk
+:Status: Adopted
+:Last Updated: 2019-01-16
+
+Summary
+-------------------------------------------------------------------------------
+
+This document defines a set of guidelines for dependency management in PROJ.
+With PROJ being a core component of many downstream software packages, clearly
+stating which dependencies the library has is of great value. This document
+concerns both programming language standards and the minimum required
+versions of library dependencies and build tools.
+
+It is proposed to adopt a rolling update scheme that ensures that PROJ stays
+sufficiently accessible, even on older systems, while keeping up with
+technological evolution. The scheme is divided in two parts: one concerning
+versions of the programming languages used within PROJ, and the other concerning
+software packages that PROJ depends on.
+
+With adoption of this RFC, versions used for
+
+1. programming languages will always be at least two revisions behind the most
+ recent standard
+2. software packages will always be at least two years old
+ (patch releases are exempt)
+
+A change in programming language standard can only be introduced with a new
+major version release of PROJ. Changes for software package dependencies can be
+introduced with minor version releases of PROJ. Changing the version
+requirements for a dependency needs to be approved by the PSC.
+
+Following the above rule set will ensure that all but the most conservative
+users of PROJ will be able to build and use the most recent version of the
+library.
+
+The sections below outline details concerning programming languages and software
+dependencies. The RFC concludes with a bootstrapping section
+that details the state of dependencies after acceptance of the RFC.
+
+
+Background
+-------------------------------------------------------------------------------
+
+PROJ has traditionally been written in C89. Until recently, no formal
+requirements on e.g. build systems had been defined and formally accepted by
+the project. :ref:`RFC2 <rfc2>` formally introduces dependencies on C++11 and
+SQLite 3.7.
+
+In this RFC a rolling update of version or standard requirements is described.
+The reasoning behind a rolling update scheme is that it has become increasingly
+evident that C89 is becoming outdated and creating a less than optimal
+development environment for contributors. It has been noted that the most
+commonly used compilers all now support more recent versions of C, so the
+strict usage of C89 is no longer as critical as it used to be.
+
+Similarly, rolling updates to other tools and libraries that PROJ depends on
+will ensure that the code base can be kept modern and in line with the rest of
+the open source software ecosystem.
+
+
+C and C++
+-------------------------------------------------------------------------------
+
+Following :ref:`RFC2 <rfc2>`, PROJ is written in both C and C++. At the time of
+writing, the core library is C based and the code described in RFC2 is written
+in C++. While the core library is mostly written in C, it is compiled as C++.
+Minor sections of PROJ, like the geodesic algorithms, are still compiled as C
+since there is no apparent benefit to compiling them with a C++ compiler. This
+may change in the future.
+
+Both the C and C++ standards are updated at regular intervals. After an
+update of a standard it takes time for compiler vendors to implement the
+standard fully, which makes adoption of new standards potentially troublesome
+if done too soon. On the other hand, waiting too long to adopt new standards
+will eventually make the code base feel old, and new contributors are more
+likely to stay away because they don't want to work using tools of the past.
+With a rolling update scheme both concerns can be managed by always staying
+behind the most recent standard, but not so far behind that potential
+contributors are scared away. Keeping a policy of always lagging two
+iterations behind the most recent standard is thought to be the best
+compromise between the two concerns.
+
+C comes in four ISO standardised varieties: C89, C99, C11, C18. In this
+document we refer to their informal names for ease of reading. C++ exists in
+five varieties: C++98, C++03, C++11, C++14, C++17. Before adoption of this RFC,
+PROJ uses C89 and C++11. For C, that means that the used standard is three
+iterations behind the most recent standard; C++ is two iterations behind.
+Following the rules in this RFC, the required C standard used in PROJ is
+allowed to be two iterations behind the most recent standard. That means that a
+change to C99 is possible, as long as the PROJ PSC acknowledges such a change.
+
+When a new standard for either C or C++ is released, PROJ should consider
+changing its requirement to the next standard in line. For C++ that means a
+change in standard roughly every three years; for C the periods between
+standard updates are expected to be longer. Adoption of new programming
+language standards should be coordinated with a major version release of PROJ.
+
+
+Software dependencies
+-------------------------------------------------------------------------------
+
+At the time of writing, PROJ depends on very few external packages. In
+fact, only one runtime dependency is present: SQLite. Building PROJ also
+requires one of two external dependencies for configuration: Autotools or
+CMake.
+
+As with programming language standards, it is preferable that software
+dependencies are a bit behind the most recent development. For this reason it
+is required that the minimum supported version of a PROJ dependency is at least
+two years old, preferably more. It is not a requirement that the minimum
+version of dependencies is always kept strictly two years behind current
+development, but raising it is allowed in case future development of PROJ
+warrants an update. Changes in minimum version requirements are allowed to
+happen with minor version releases of PROJ.
+
+At the time of writing, the minimum version required for SQLite is 3.7, which
+was released in 2010. CMake is currently required to be at least version 2.8.3,
+which was also released in 2010.
+
+
+Bootstrapping
+-------------------------------------------------------------------------------
+
+This RFC comes with a set of guidelines for handling dependencies for PROJ in
+the future. Up until now, dependencies haven't been handled consistently, with
+some dependencies not being formally approved by the project's governing body.
+Therefore, minimum versions of PROJ dependencies are proposed, so that upon
+acceptance of this RFC PROJ will have the following external requirements:
+
+* C99 (was C89)
+* C++11 (already approved in :ref:`RFC2 <rfc2>`)
+* SQLite 3.7 (already approved in :ref:`RFC2 <rfc2>`)
+* CMake 3.5 (was 2.8.3)
+
+
+Adoption status
+-------------------------------------------------------------------------------
+
+The RFC was adopted on 2018-01-19 with +1's from the following PSC members:
+
+* Kristian Evers
+* Even Rouault
+* Thomas Knudsen
+* Howard Butler
diff --git a/_sources/community/rfc/rfc-4.rst.txt b/_sources/community/rfc/rfc-4.rst.txt
new file mode 100644
index 00000000..9673c898
--- /dev/null
+++ b/_sources/community/rfc/rfc-4.rst.txt
@@ -0,0 +1,804 @@
+.. _rfc4:
+
+====================================================================
+PROJ RFC 4: Remote access to grids and GeoTIFF grids
+====================================================================
+
+:Author: Even Rouault, Howard Butler
+:Contact: even.rouault@spatialys.com, howard@hobu.co
+:Status: Adopted
+:Implementation target: PROJ 7
+:Last Updated: 2020-01-10
+
+Motivation
+-------------------------------------------------------------------------------
+
+PROJ 6 brings undeniable advances in the management of coordinate
+transformations between datums by relying on and applying information available
+in the PROJ database. PROJ's rapid evolution from a cartographic projections
+library with a little bit of geodetic capability to a full geodetic
+transformation and description environment has highlighted the importance of
+the support data. Users desire the convenience of software doing the right
+thing with the least amount of fuss, and survey organizations wish to deliver
+their models across as wide a software footprint as possible. To get results
+with the highest precision, a grid file that defines a model that provides
+dimension shifts is often needed. The proj-datumgrid project centralizes grids
+available under an open data license and bundles them in different archives
+split along major geographical regions of the world.
+
+It is assumed that a PROJ user has downloaded and installed grid files that are
+referred to in the PROJ database. These files can be quite large in aggregate,
+and packaging support by major distribution channels is somewhat uneven due to
+their size, sometimes ambiguous licensing story, and difficult-to-track
+versioning and lineage. It is not always clear to the user, especially to
+those who may not be so familiar with geodetic operations, that the highest
+precision transformation may not always be applied if grid data is not
+available. Users want both convenience and correctness, and management of the
+shift files can be challenging to those who may not be aware of their
+importance to the process.
+
+The computing environment in which PROJ operates is also changing. Because the
+shift data can be so large (currently more than 700 MB of uncompressed data,
+and growing), deployment of high accuracy operations can be limited due to
+deployment size constraints (serverless operations, for example). Changing to a
+delivery format that supports incremental access over a network along with
+convenient access and compression will ease the resource burden the shift files
+present while allowing the project to deliver transformation capability with
+the highest known precision provided by the survey organizations.
+
+Adjustment grids also tend to be provided in many different formats depending
+on the organization and country that produced them. In PROJ, we have over time
+"standardized" on horizontal shift grids in NTv2 format and vertical shift
+grids in GTX format. Both have poor general support as dedicated formats and
+limited metadata capabilities, and neither is necessarily "cloud optimized" for
+incremental access across a network.
+
+Summary of work planned by this RFC
+-------------------------------------------------------------------------------
+
+- Grids will be hosted by one or several Content Delivery Networks (CDNs)
+- The grid loading mechanism will be reworked to be able to download grids, or
+  parts of grids, from an online repository. When opted in, users will no longer
+  have to manually fetch grid files and place them in PROJ_LIB.
+  Full and accurate capability of the software will no longer require hundreds
+  of megabytes of grid shift files in advance, even if only a few of them
+  are needed for the transformations done by the user.
+- Local caching of grid files, or even parts of files, so that users end up
+  mirroring what they actually use.
+- A grid shift format, for both horizontal and vertical shift grids (and in
+  potential future steps, for other needs, such as deformation models) will be
+  implemented.
+
+The use of locally available grids will of course still be supported, and will
+remain the default behavior.
+
+Network access to grids
+-------------------------------------------------------------------------------
+
+curl will be an optional build dependency of PROJ, added to the autoconf and
+cmake build systems. It can be disabled at build time, but this must be
+an explicit setting of configure/cmake, as the resulting builds have less functionality.
+When curl is enabled at build time, download of the grids themselves will not be
+enabled by default at runtime. It will require explicit consent of the user, either
+through the API
+(:c:func:`proj_context_set_enable_network`), through the PROJ_NETWORK=ON
+environment variable, or through the ``network = on`` setting of proj.ini.
+
+Regarding the minimum version of libcurl required, given GDAL's experience of
+building with rather ancient libcurl versions for similar functionality, we can
+aim for libcurl >= 7.29.0 (as available in RHEL 7).
+
+An alternate pluggable network interface can also be set by the user, in case
+support for libcurl was not built in, or if, for the desired context of use, the
+user wishes to provide the network implementation (a typical use case could be
+QGIS, which would use its Qt-based networking facilities to solve issues with
+SSL, proxy, authentication, etc.).
+
+A text configuration file, installed in ${installation_prefix}/share/proj/proj.ini
+(or ${PROJ_LIB}/proj.ini)
+will contain the URL of the CDN that will be used.
+The user may also override this setting with the
+:c:func:`proj_context_set_url_endpoint` or through the PROJ_NETWORK_ENDPOINT
+environment variable.
+
+The rationale for putting proj.ini in that location is
+that it is a place well known to PROJ users, with the existing PROJ_LIB
+mechanics for systems like Windows where hardcoded paths at runtime aren't
+generally usable.
+
+C API
++++++
+
+The preliminary C API for the above is:
+
+.. code-block:: c
+
+ /** Enable or disable network access.
+ *
+ * @param ctx PROJ context, or NULL
+     * @param enable TRUE to enable network access.
+ * @return TRUE if network access is possible. That is either libcurl is
+ * available, or an alternate interface has been set.
+ */
+ int proj_context_set_enable_network(PJ_CONTEXT* ctx, int enable);
+
+ /** Define URL endpoint to query for remote grids.
+ *
+ * This overrides the default endpoint in the PROJ configuration file or with
+ * the PROJ_NETWORK_ENDPOINT environment variable.
+ *
+ * @param ctx PROJ context, or NULL
+ * @param url Endpoint URL. Must NOT be NULL.
+ */
+ void proj_context_set_url_endpoint(PJ_CONTEXT* ctx, const char* url);
+
+ /** Opaque structure for PROJ. Implementations might cast it to their
+ * structure/class of choice. */
+ typedef struct PROJ_NETWORK_HANDLE PROJ_NETWORK_HANDLE;
+
+ /** Network access: open callback
+ *
+ * Should try to read the size_to_read first bytes at the specified offset of
+ * the file given by URL url,
+ * and write them to buffer. *out_size_read should be updated with the actual
+ * amount of bytes read (== size_to_read if the file is larger than size_to_read).
+ * During this read, the implementation should make sure to store the HTTP
+ * headers from the server response to be able to respond to
+ * proj_network_get_header_value_cbk_type callback.
+ *
+ * error_string_max_size should be the maximum size that can be written into
+ * the out_error_string buffer (including terminating nul character).
+ *
+ * @return a non-NULL opaque handle in case of success.
+ */
+ typedef PROJ_NETWORK_HANDLE* (*proj_network_open_cbk_type)(
+ PJ_CONTEXT* ctx,
+ const char* url,
+ unsigned long long offset,
+ size_t size_to_read,
+ void* buffer,
+ size_t* out_size_read,
+ size_t error_string_max_size,
+ char* out_error_string,
+ void* user_data);
+
+ /** Network access: close callback */
+ typedef void (*proj_network_close_cbk_type)(PJ_CONTEXT* ctx,
+ PROJ_NETWORK_HANDLE* handle,
+ void* user_data);
+
+ /** Network access: get HTTP headers */
+ typedef const char* (*proj_network_get_header_value_cbk_type)(
+ PJ_CONTEXT* ctx,
+ PROJ_NETWORK_HANDLE* handle,
+ const char* header_name,
+ void* user_data);
+
+ /** Network access: read range
+ *
+ * Read size_to_read bytes from handle, starting at offset, into
+ * buffer.
+ * During this read, the implementation should make sure to store the HTTP
+ * headers from the server response to be able to respond to
+ * proj_network_get_header_value_cbk_type callback.
+ *
+ * error_string_max_size should be the maximum size that can be written into
+ * the out_error_string buffer (including terminating nul character).
+ *
+ * @return the number of bytes actually read (0 in case of error)
+ */
+ typedef size_t (*proj_network_read_range_type)(
+ PJ_CONTEXT* ctx,
+ PROJ_NETWORK_HANDLE* handle,
+ unsigned long long offset,
+ size_t size_to_read,
+ void* buffer,
+ size_t error_string_max_size,
+ char* out_error_string,
+ void* user_data);
+
+ /** Define a custom set of callbacks for network access.
+ *
+ * All callbacks should be provided (non NULL pointers).
+ *
+ * @param ctx PROJ context, or NULL
+ * @param open_cbk Callback to open a remote file given its URL
+ * @param close_cbk Callback to close a remote file.
+ * @param get_header_value_cbk Callback to get HTTP headers
+ * @param read_range_cbk Callback to read a range of bytes inside a remote file.
+ * @param user_data Arbitrary pointer provided by the user, and passed to the
+ * above callbacks. May be NULL.
+ * @return TRUE in case of success.
+ */
+ int proj_context_set_network_callbacks(
+ PJ_CONTEXT* ctx,
+ proj_network_open_cbk_type open_cbk,
+ proj_network_close_cbk_type close_cbk,
+ proj_network_get_header_value_cbk_type get_header_value_cbk,
+ proj_network_read_range_type read_range_cbk,
+ void* user_data);
+
+
+To make network access efficient, PROJ will internally have an in-memory cache
+of file ranges so as to only issue network requests in chunks of 16 KB, or
+multiples thereof, to limit the number of HTTP GET requests and minimize
+latency caused by network access. This is very similar to the behavior of the GDAL
+`/vsicurl/ <https://gdal.org/user/virtual_file_systems.html#vsicurl-http-https-ftp-files-random-access>`_
+I/O layer. The plan is to mostly copy GDAL's vsicurl implementation inside PROJ, with
+the needed adjustments and proper namespacing.
+
+A retry strategy (typically a delay with an exponential back-off and some random
+jitter) will be added to account for intermittent network or server-side failure.
+
+URL building
+++++++++++++
+
+The PROJ database has a ``grid_transformation`` table whose column ``grid_name``
+(and possibly ``grid2_name``) contains the name of the grid as indicated by the
+authority having registered the transformation (typically EPSG). As those
+grid names are not generally directly usable by PROJ, the PROJ database also has
+a ``grid_alternatives`` table that links original grid names to the ones used
+by PROJ. When network access is available and needed due to lack of a
+local grid, the full URL will be composed of the
+endpoint from the configuration (or set by the user), the basename of the PROJ
+usable filename, and the "tif" suffix. So if the CDN is at http://example.com
+and the name from ``grid_alternatives`` is egm96_15.gtx, then the URL will
+be http://example.com/egm96_15.tif
+
+Grid loading
+++++++++++++
+
+The following files will be affected, in one way or another, by the changes
+described above:
+nad_cvt.cpp, nad_intr.cpp, nad_init.cpp, grid_info.cpp, grid_list.cpp, apply_gridshift.cpp,
+apply_vgridshift.cpp.
+
+In particular, the current logic that consists of ingesting all the values of a
+grid/subgrid into the ct->cvs array will be completely modified, to enable
+access to grid values at a specified (x,y) location.
+
+proj_create_crs_to_crs() / proj_create_operations() impacts
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+
+Once network access is available, all grids known to the PROJ database
+(grid_transformation + grid_alternatives tables) will be assumed to be
+available when computing the potential pipelines between two CRS.
+
+Concretely, this will be equivalent to calling
+:cpp:func:`proj_operation_factory_context_set_grid_availability_use`
+with the ``use`` argument set to a new enumeration value
+
+.. code-block:: c
+
+ /** Results will be presented as if grids known to PROJ (that is
+ * registered in the grid_alternatives table of its database) were
+ * available. Used typically when networking is enabled.
+ */
+ PROJ_GRID_AVAILABILITY_KNOWN_AVAILABLE
+
+
+Local on-disk caching of remote grids
++++++++++++++++++++++++++++++++++++++
+
+As many workflows will tend to use the same grids over and over, a local
+on-disk cache of remote grids will be added. The cache will be a single
+SQLite3 database, in a user-writable directory shared by all applications using
+PROJ.
+
+Its total size will be configurable, with a default maximum size of 100 MB
+in proj.ini. The cache will also keep the timestamp of the last time it checked
+various global properties of the file (its size, Last-Modified and ETag headers).
+A time-to-live parameter, with a default of 1 day in proj.ini, will be used to
+determine whether the CDN should be hit to verify if the information in the
+cache is still up-to-date.
+
+.. code-block:: c
+
+ /** Enable or disable the local cache of grid chunks
+ *
+ * This overrides the setting in the PROJ configuration file.
+ *
+ * @param ctx PROJ context, or NULL
+ * @param enabled TRUE if the cache is enabled.
+ */
+ void proj_grid_cache_set_enable(PJ_CONTEXT *ctx, int enabled);
+
+ /** Override, for the considered context, the path and file of the local
+ * cache of grid chunks.
+ *
+ * @param ctx PROJ context, or NULL
+ * @param fullname Full name to the cache (encoded in UTF-8). If set to NULL,
+ * caching will be disabled.
+ */
+ void proj_grid_cache_set_filename(PJ_CONTEXT* ctx, const char* fullname);
+
+ /** Override, for the considered context, the maximum size of the local
+ * cache of grid chunks.
+ *
+ * @param ctx PROJ context, or NULL
+ * @param max_size_MB Maximum size, in mega-bytes (1024*1024 bytes), or
+ * negative value to set unlimited size.
+ */
+ void proj_grid_cache_set_max_size(PJ_CONTEXT* ctx, int max_size_MB);
+
+ /** Override, for the considered context, the time-to-live delay for
+ * re-checking if the cached properties of files are still up-to-date.
+ *
+ * @param ctx PROJ context, or NULL
+ * @param ttl_seconds Delay in seconds. Use negative value for no expiration.
+ */
+ void proj_grid_cache_set_ttl(PJ_CONTEXT* ctx, int ttl_seconds);
+
+ /** Clear the local cache of grid chunks.
+ *
+ * @param ctx PROJ context, or NULL.
+ */
+ void proj_grid_cache_clear(PJ_CONTEXT* ctx);
+
+The planned database structure is:
+
+.. code-block:: sql
+
+ -- General properties on a file
+ CREATE TABLE properties(
+ url TEXT PRIMARY KEY NOT NULL,
+ lastChecked TIMESTAMP NOT NULL,
+ fileSize INTEGER NOT NULL,
+ lastModified TEXT,
+ etag TEXT
+ );
+
+ -- Store chunks of data. To avoid any potential fragmentation of the
+ -- cache, the data BLOB is always set to the maximum chunk size of 16 KB
+ -- (right padded with 0-byte)
+ -- The actual size is stored in chunks.data_size
+ CREATE TABLE chunk_data(
+ id INTEGER PRIMARY KEY AUTOINCREMENT CHECK (id > 0),
+ data BLOB NOT NULL
+ );
+
+ -- Record chunks of data by (url, offset)
+ CREATE TABLE chunks(
+ id INTEGER PRIMARY KEY AUTOINCREMENT CHECK (id > 0),
+ url TEXT NOT NULL,
+ offset INTEGER NOT NULL,
+ data_id INTEGER NOT NULL,
+ data_size INTEGER NOT NULL,
+ CONSTRAINT fk_chunks_url FOREIGN KEY (url) REFERENCES properties(url),
+ CONSTRAINT fk_chunks_data FOREIGN KEY (data_id) REFERENCES chunk_data(id)
+ );
+ CREATE INDEX idx_chunks ON chunks(url, offset);
+
+ -- Doubly linked list of chunks. The next link is to go to the least-recently
+ -- used entries.
+ CREATE TABLE linked_chunks(
+ id INTEGER PRIMARY KEY AUTOINCREMENT CHECK (id > 0),
+ chunk_id INTEGER NOT NULL,
+ prev INTEGER,
+ next INTEGER,
+ CONSTRAINT fk_links_chunkid FOREIGN KEY (chunk_id) REFERENCES chunks(id),
+ CONSTRAINT fk_links_prev FOREIGN KEY (prev) REFERENCES linked_chunks(id),
+ CONSTRAINT fk_links_next FOREIGN KEY (next) REFERENCES linked_chunks(id)
+ );
+ CREATE INDEX idx_linked_chunks_chunk_id ON linked_chunks(chunk_id);
+
+ -- Head and tail pointers of the linked_chunks. The head pointer is for
+ -- the most-recently used chunk.
+ -- There should be just one row in this table.
+ CREATE TABLE linked_chunks_head_tail(
+ head INTEGER,
+ tail INTEGER,
+ CONSTRAINT lht_head FOREIGN KEY (head) REFERENCES linked_chunks(id),
+ CONSTRAINT lht_tail FOREIGN KEY (tail) REFERENCES linked_chunks(id)
+ );
+ INSERT INTO linked_chunks_head_tail VALUES (NULL, NULL);
+
+The chunks table will store 16 KB chunks (or less for terminating chunks).
+The linked_chunks and linked_chunks_head_tail tables will act as a doubly linked
+list of chunks, with the least recently used ones at the end of the list, which
+will be evicted when the cache saturates.
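The intended LRU bookkeeping can be sketched as follows. This is a simplified, hypothetical Python model against a reduced version of the schema; the real implementation will be C code against the full schema, and will also need to clear the head pointer when the list empties:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE linked_chunks(
    id INTEGER PRIMARY KEY AUTOINCREMENT CHECK (id > 0),
    chunk_id INTEGER NOT NULL, prev INTEGER, next INTEGER);
CREATE TABLE linked_chunks_head_tail(head INTEGER, tail INTEGER);
INSERT INTO linked_chunks_head_tail VALUES (NULL, NULL);
""")

def touch(chunk_id):
    """Link chunk_id at the head (most-recently used) end of the list."""
    head, _tail = con.execute(
        "SELECT head, tail FROM linked_chunks_head_tail").fetchone()
    cur = con.execute(
        "INSERT INTO linked_chunks(chunk_id, prev, next) VALUES (?, NULL, ?)",
        (chunk_id, head))
    new_id = cur.lastrowid
    if head is not None:
        con.execute("UPDATE linked_chunks SET prev = ? WHERE id = ?",
                    (new_id, head))
    con.execute(
        "UPDATE linked_chunks_head_tail SET head = ?, tail = COALESCE(tail, ?)",
        (new_id, new_id))

def evict():
    """Unlink the tail (least-recently used) entry and return its chunk_id."""
    tail, = con.execute("SELECT tail FROM linked_chunks_head_tail").fetchone()
    chunk_id, prev = con.execute(
        "SELECT chunk_id, prev FROM linked_chunks WHERE id = ?",
        (tail,)).fetchone()
    con.execute("DELETE FROM linked_chunks WHERE id = ?", (tail,))
    if prev is not None:
        con.execute("UPDATE linked_chunks SET next = NULL WHERE id = ?", (prev,))
    con.execute("UPDATE linked_chunks_head_tail SET tail = ?", (prev,))
    return chunk_id

touch(101); touch(102); touch(103)  # 103 is now the most-recently used
assert evict() == 101               # the least-recently used chunk goes first
assert evict() == 102
```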
+
+The directory used to locate this database will be ${XDG_DATA_HOME}/proj
+(per https://specifications.freedesktop.org/basedir-spec/basedir-spec-latest.html)
+where ${XDG_DATA_HOME} defaults to ${HOME}/.local/share on Unix builds
+and ${LOCALAPPDATA} on Windows builds. Exact details to be sorted out, but
+https://github.com/ActiveState/appdirs/blob/a54ea98feed0a7593475b94de3a359e9e1fe8fdb/appdirs.py#L45-L97
+can be a good reference.
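A sketch of the directory resolution, mirroring the defaults stated above (the helper name is hypothetical, and exact details remain to be sorted out):

```python
import os
import sys

def proj_cache_dir():
    """Resolve the per-user PROJ cache directory: ${XDG_DATA_HOME}/proj on
    Unix (XDG_DATA_HOME defaulting to ~/.local/share), and
    ${LOCALAPPDATA}/proj on Windows."""
    if sys.platform == "win32":
        base = os.environ.get("LOCALAPPDATA", os.path.expanduser("~"))
    else:
        base = os.environ.get(
            "XDG_DATA_HOME",
            os.path.join(os.path.expanduser("~"), ".local", "share"))
    return os.path.join(base, "proj")

assert os.path.basename(proj_cache_dir()) == "proj"
```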
+
+As this database might be accessed by several threads or processes at the same
+time, the code accessing it will carefully honour SQLite3 errors regarding
+locks, doing appropriate retries if another thread/process is currently
+locking the database. Accesses requiring a modification of the database will
+start with a BEGIN IMMEDIATE transaction so as to acquire a write lock.
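The retry logic could resemble the following sketch (a hypothetical Python model; the actual implementation will use the SQLite3 C API):

```python
import sqlite3
import time

def write_with_retry(db_path, sql, params=(), attempts=5, delay=0.05):
    """Run a write statement inside a BEGIN IMMEDIATE transaction, retrying
    with a linear backoff while another thread/process holds the lock."""
    con = sqlite3.connect(db_path, isolation_level=None)  # manual transactions
    try:
        for attempt in range(attempts):
            try:
                con.execute("BEGIN IMMEDIATE")  # acquire the write lock up front
                con.execute(sql, params)
                con.execute("COMMIT")
                return True
            except sqlite3.OperationalError as exc:
                try:
                    con.execute("ROLLBACK")
                except sqlite3.OperationalError:
                    pass  # no transaction was actually started
                if "locked" not in str(exc):
                    raise
                time.sleep(delay * (attempt + 1))
        return False  # still locked after all attempts
    finally:
        con.close()

assert write_with_retry(":memory:", "CREATE TABLE chunks_test(x INTEGER)")
```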
+
+.. note:: This database should be hosted on a local disk, not a network one.
+ Otherwise SQLite3 locking issues are to be expected.
+
+CDN provider
+++++++++++++
+
+`Amazon Public Datasets <https://aws.amazon.com/opendata/public-datasets/>`_
+has offered to be a storage and CDN provider.
+
+The program covers storage and egress (bandwidth) of the data.
+They generally don't allow usage of CloudFront
+(their CDN) as part of the program (we would usually look to have it covered
+by credits), but in this instance they are fine with providing it.
+They only ask that we keep the CloudFront URL "visible" (as appropriate for
+the use case) so people can see where the data is hosted, in case they go looking.
+Their terms can be seen at https://aws.amazon.com/service-terms/ and CloudFront
+has its own, small section. Those terms may change a bit from time to time in
+minor ways; major changes to the service terms are assumed to be infrequent.
+There are also the Public Dataset Program terms at http://aws.amazon.com/public-datasets/terms/.
+Those also do not effectively change over time and are renewed on a 2-year basis.
+
+Criteria for grid hosting
++++++++++++++++++++++++++
+
+The grids hosted on the CDN will be exactly the ones collected,
+currently and in the future, by the `proj-datumgrid <https://github.com/OSGeo/proj-datumgrid/>`_
+initiative. In particular, new grids are accepted as long as
+they are released under a license that is compatible with the
+`Open Source Definition <https://opensource.org/osd-annotated>`_ and the source
+of the grid is clearly stated and verifiable. Suitable licenses include:
+
+- Public domain
+- X/MIT
+- BSD 2/3/4 clause
+- CC0
+- CC-BY (v3.0 or later)
+- CC-BY-SA (v3.0 or later)
+
+For new grids to be transparently used by the proj_create_crs_to_crs() mechanics,
+they must be registered in the PROJ database (proj.db) in the ``grid_transformation`` and
+``grid_alternatives`` tables. The nominal path to getting a new record in grid_transformation
+is to have a transformation registered in the EPSG dataset (if there is no
+existing one), which will subsequently be imported into the PROJ database.
+
+Versioning, historical preservation of grids
+++++++++++++++++++++++++++++++++++++++++++++
+
+The policy regarding this should be similar to the one applied to
+`proj-datumgrid <https://github.com/OSGeo/proj-datumgrid/>`_, which even if
+not formalized, is around the following lines:
+
+- Geodetic agencies regularly release new versions of grids. Typically for the
+  USA, NOAA has released GEOID99, GEOID03, GEOID06, GEOID09, GEOID12A, GEOID12B,
+  GEOID18 for the NAVD88 to NAD83/NAD83(2011) vertical adjustments. Each of these
+  grids is considered by EPSG and PROJ as a separate object, with distinct filenames.
+  The release of a new version does not cause the old grid to be automatically removed.
+  That said, due to advertised accuracies and supersession rules of the EPSG dataset, the
+  most recent grid will generally be used for a CRS -> CRS transformation if the
+  user uses proj_create_crs_to_crs() (with the exception that if a VERT_CRS WKT
+  includes a GEOID_MODEL known to PROJ, an old version of the grid will be used).
+  If the user specifies a whole pipeline with an explicit grid name, it will of
+  course be strictly honoured.
+  As time goes by, the size of the datasets managed by proj-datumgrid will increase,
+  and we will have to explore how to manage that for the distributed .zip / .tar.gz
+  archives. This should not be a concern for CDN-hosted content.
+
+- Should software-related conversion errors from the original grid format to the
+  one used by PROJ (be it GTX, NTv2 or GeoTIFF) occur, the previous erroneous
+  version of the dataset would be replaced by the corrected one. In that situation,
+  this might have an effect on the local on-disk caching of remote grids. We will
+  have to check with the CDN provider whether we can use, for example, the ETag HTTP
+  header on the client side to detect a change, so that old cached content is not
+  erroneously reused (if not possible, we will have to use a text file listing the
+  grid names and their current md5sum)
+
+
+Grids in GeoTIFF format
+-------------------------------------------------------------------------------
+
+Limitations of current formats
+++++++++++++++++++++++++++++++
+
+Several formats exist, depending on the ad hoc needs and ideas of the original
+data producer. It would be appropriate to converge on a common format able to
+address the different use cases. Limitations of the existing formats include:
+
+- Not tiled. Tiling is a nice-to-have property for cloud-friendly access to
+  large files.
+- No support for compression.
+- The NTv2 structure is roughly: header of main grid, data of main grid,
+  header of subgrid 1, data of subgrid 1, header of subgrid 2, data of subgrid 2,
+  etc. Due to the headers being scattered throughout the file, it is not possible
+  to retrieve all header information with a single HTTP GET request.
+- The GTX format has no provision to store metadata besides the minimum georeferencing
+  of the grid. NTv2 is a bit richer, but does not allow extensible metadata.
+
+Discussion on choice of format
+++++++++++++++++++++++++++++++
+
+We have recently been made aware of other initiatives from the industry to come
+up with a common format to store geodetic adjustment data. Some discussions have
+happened recently within the OGC CRS Working Group. Past efforts include
+Esri's proposed Geodetic data Grid eXchange Format, GGXF, briefly mentioned at
+page 86 of
+https://iag.dgfi.tum.de/fileadmin/IAG-docs/Travaux2015/01_Travaux_Template_Comm_1_tvd.pdf
+and page 66 of ftp://ftp.iaspei.org/pub/meetings/2010-2019/2015-Prague/IAG-Geodesy.pdf
+The current trend of those works would be to use a netCDF / HDF5 container.
+
+So, for the sake of completeness, we list hereafter a few potential candidate
+formats and their pros and cons.
+
+TIFF/GeoTIFF
+************
+
+Strong points:
+
+* TIFF is a well-known and widespread format.
+
+* The GeoTIFF encoding is a widely supported industry scheme to encode georeferencing.
+  It is now an `OGC standard <https://www.opengeospatial.org/standards/geotiff>`_
+
+* There are independent initiatives to share grids as GeoTIFF, such as
+  `this one <https://www.agisoft.com/downloads/geoids/>`_
+
+* TIFF can contain multiple images (IFD: Image File Directory) chained together.
+ This is the mechanism used for multiple-page scanned TIFF files, or in the
+ geospatial field to store multi-resolution/pyramid rasters. So it can be
+ used with sub-grids as in the NTv2 format.
+
+* Extensive experience with the TIFF format, and its appropriateness for network
+  access, in particular through the `Cloud Optimized GeoTIFF initiative <https://www.cogeo.org/>`_,
+  whose layout can make the use of sub-grids efficient from a network access
+  perspective, because grid headers can be put at the beginning of the file and
+  so be retrieved in a single HTTP GET request.
+
+* TIFF can be tiled.
+
+* TIFF can be compressed. Commonly found compression schemes are DEFLATE and LZW,
+  combined with differential integer or floating-point predictors.
+
+* A TIFF image can contain a configurable number of channels/bands/samples.
+ In the rest of the document, we will use the sample terminology for this concept.
+
+* TIFF sample organization can be configured: either the values of different
+ samples are packed together (`PlanarConfiguration <https://www.awaresystems.be/imaging/tiff/tifftags/planarconfiguration.html>`_ = Contig), or put in separate tiles/strips
+ (PlanarConfiguration = Separate)
+
+* libtiff is a dependency commonly found in binary distributions of the
+  "ecosystem" to which PROJ belongs
+
+* libtiff benefits from many years of efforts to increase its security, for
+  example by being integrated into the oss-fuzz initiative. Given the potential
+  fetching of grids, using security-tested components is an important concern.
+
+* Browser-side: there are "ports" of libtiff/libgeotiff in the browser such
+ as https://geotiffjs.github.io/ which could potentially make a port of PROJ
+ easier.
+
+Weak points:
+
+* we cannot use libgeotiff, since it itself depends on PROJ (to resolve CRS
+  or components of CRS from their EPSG codes). That said, for PROJ's intended
+  use, we only need to decode the ModelTiepointTag and ModelPixelScaleTag TIFF
+  tags, so this can be done "by hand"
+
+* the metadata capabilities of baseline TIFF are limited. The TIFF format comes
+  with a predefined set of metadata items whose keys have numeric values. That
+  said, GDAL has used for the last 20 years or so a dedicated tag,
+  `GDAL_METADATA <https://www.awaresystems.be/imaging/tiff/tifftags/gdal_metadata.html>`_
+  of code 42112, that holds an XML-formatted string able to store arbitrary
+  key-value pairs.
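To illustrate the first weak point above, decoding the georeferencing "by hand" from the two TIFF tags is indeed straightforward in the common axis-aligned case. The helper below is hypothetical and the tag values illustrative:

```python
def georef_from_tags(model_tiepoint, model_pixel_scale):
    """Derive (origin_x, origin_y, res_x, res_y) from raw
    ModelTiepointTag (I, J, K, X, Y, Z) and ModelPixelScaleTag
    (ScaleX, ScaleY, ScaleZ) values, for the common axis-aligned case."""
    i, j, _k, x, y, _z = model_tiepoint[:6]
    scale_x, scale_y, _scale_z = model_pixel_scale
    origin_x = x - i * scale_x  # X of the corner of pixel (0, 0)
    origin_y = y + j * scale_y  # row index J grows downward in pixel space
    return origin_x, origin_y, scale_x, -scale_y

# A tiepoint anchoring pixel (0, 0) at (2.0, 49.0) with a 0.1 degree step:
assert georef_from_tags([0, 0, 0, 2.0, 49.0, 0], [0.1, 0.1, 0]) == \
    (2.0, 49.0, 0.1, -0.1)
```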
+
+netCDF v3
+*********
+
+Strong points:
+
+* The binary format description as given in
+ `OGC 10-092r3 <http://portal.opengeospatial.org/files/?artifact_id=43734>`_ is relatively simple,
+ but it would still probably be necessary to use libnetcdf-c to access it
+
+* Metadata can be stored easily in netCDF attributes
+
+
+Weak points:
+
+* No compression in netCDF v3
+
+* No tiling in netCDF v3
+
+* Multi-sample variables are located in different sections of the file
+  (corresponding to TIFF PlanarConfiguration = Separate)
+
+* No natural way of having hierarchical / multigrids. They must be encoded as
+ separate variables
+
+* georeferencing in netCDF is somewhat less standardized than TIFF/GeoTIFF.
+ The generally used model is `the conventions for CF (Climate and Forecast)
+ metadata <http://cfconventions.org/>`_
+ but there is nothing really handy in them for simple georeferencing with
+ the coordinate of the upper-left pixel and the resolution. The practice is
+ to write explicit lon and lat variables with all values taken by the grid.
+ GDAL has for many years supported a simpler syntax, using a GeoTransform
+ attribute.
+
+* From the format description, its layout could be relatively cloud friendly,
+ except that libnetcdf has no API to plug an alternate I/O layer.
+
+* Most binary distributions of netCDF nowadays are based on libnetcdf v4, which
+ implies the HDF5 dependency.
+
+* From a few issues we identified a few years ago regarding crashes on corrupted
+ datasets, we contacted libnetcdf upstream, but they did not seem to be
+ interested in addressing those security issues.
+
+netCDF v4 / HDF5
+****************
+
+Note: The netCDF v4 format is a profile of the HDF5 file format.
+
+Strong points:
+
+* Compression supported (ZLIB and SZIP predefined)
+
+* Tiling (chunking) supported
+
+* Values of Multi-sample variables can be interleaved together (similarly
+ to TIFF PlanarConfiguration = Contig) by using compound data types.
+
+* Hierarchical organization with groups
+
+* While the netCDF API does not provide an alternate I/O layer, this is
+ possible with the HDF5 API.
+
+* Grids can be indexed by more than 2 dimensions (for current needs, we
+ don't need more than 2D support)
+
+Weak points:
+
+* The `HDF 5 File format <https://support.hdfgroup.org/HDF5/doc/H5.format.html>`_
+ is more complex than netCDF v3, and likely more than TIFF. We do not have
+ in-depth expertise of it to assess its cloud-friendliness.
+
+* The weak points mentioned for netCDF v3 regarding georeferencing and security also apply.
+
+
+GeoPackage
+**********
+
+As PROJ has already a SQLite3 dependency, GeoPackage could be examined as a
+potential solution.
+
+Strong points:
+
+* SQLite3 dependency
+
+* OGC standard
+
+* Multi-grid capabilities
+
+* Tiling
+
+* Compression
+
+* Metadata capabilities
+
+Weak points:
+
+* GeoPackage mostly addresses the RGB(A) Byte use case, or, via the tiled gridded
+  data extension, single-sample non-Byte data. There is no native support for
+  multi-sample non-Byte data: each sample would have to be put in a separate raster table.
+
+* Experience shows that SQLite3 layout (at least the layout adopted when using
+ the standard libsqlite3) is not cloud friendly. Indices may be scattered in
+ different places of the file.
+
+Conclusions
+***********
+
+The two major contenders regarding our goals and constraints are GeoTIFF and HDF5.
+Given past positive experience and its long history, GeoTIFF remains our preferred
+choice.
+
+
+.. _description_geotiff_format:
+
+Format description
+++++++++++++++++++
+
+The format description is available in a dedicated :ref:`geodetictiffgrids`
+document.
+
+Tooling
++++++++
+
+A script will be developed to accept a list of individual grids to combine
+together into a single file.
+
+An ntv2_to_gtiff.py convenience script will be created to convert NTv2 grids,
+including their subgrids, to the GeoTIFF layout described above.
+
+A validation Python script will be created to check that a file meets the
+requirements and recommendations described above.
+
+Build requirements
+++++++++++++++++++
+
+The minimum libtiff version will be 4.0 (RHEL 7 ships with libtiff 4.0.3).
+To be able to read grids stored on the CDN, libtiff will need to build against
+zlib to have DEFLATE and LZW support, which is met by all known binary distributions
+of libtiff.
+
+The libtiff dependency can be disabled at build time, but this must be
+an explicit setting of configure/cmake as the resulting builds have less functionality.
+
+Dropping grid catalog functionality
+-------------------------------------------------------------------------------
+
+While digging through the existing code, I more or less discovered that the PROJ
+code base has the concept of a grid catalog. This feature is apparently triggered by
+using +catalog=somefilename.csv in a PROJ string, where the CSV file lists
+grid names, their extent, priority and date. It seems to be an alternative to using
++nadgrids with multiple grids, with the extra ability to interpolate shift values between
+several grids if a +date parameter is provided and the grid catalog mentions a
+date for each grid.
+It was added in June 2012 per `commit fcb186942ec8532655ff6cf4cc990e5da669a3bc
+<https://github.com/OSGeo/PROJ/commit/fcb186942ec8532655ff6cf4cc990e5da669a3bc>`_
+
+This feature is likely unknown to most users, as there is no known documentation for
+it (neither in the current documentation, nor in the `historic one <https://web.archive.org/web/20160601000000*/http://trac.osgeo.org/proj/wiki/GenParms>`_).
+Nor is it tested by PROJ tests, so its working status is unknown. It would
+likely make the implementation of this RFC easier if it were removed. This would mean
+completely dropping the gridcatalog.cpp and gc_reader.cpp files, their call sites,
+and the catalog_name and datum_date parameters from the PJ structure.
+
+Should similar functionality be needed later, it might be reintroduced
+as an extra mode of :ref:`hgridshift`, or using a dedicated transformation method
+similar to the :ref:`deformation` one,
+possibly combining the several grids to interpolate among into the same file,
+with a date metadata item.
+
+Backward compatibility issues
+-------------------------------------------------------------------------------
+
+None anticipated, except the removal of the (presumably little used) grid catalog
+functionality.
+
+Potential future related work
+-----------------------------
+
+The foundations set in the definition of the GeoTIFF grid format should hopefully
+be reusable to extend support to deformation models (as initially discussed
+in https://github.com/OSGeo/PROJ/issues/1001).
+
+Definition of such an extension is out of scope of this RFC.
+
+Documentation
+-------------------------------------------------------------------------------
+
+- New API functions will be documented.
+- A dedicated documentation page will be created to explain the working of
+ network-based access.
+- A dedicated documentation page will be created to describe the GeoTIFF-based
+  grid format, mostly reusing the above material.
+
+Testing
+-------------------------------------------------------------------------------
+
+A number of GeoTIFF formulations (tiled vs untiled, PlanarConfiguration Separate vs
+Contig, data types, scale+offset vs not, etc.) will be tested.
+
+For testing of network capabilities, a mix of real hits to the CDN and use of
+the alternate pluggable network interface to test edge cases will be used.
+
+Proposed implementation
+-------------------------------------------------------------------------------
+
+A proposed implementation is available at https://github.com/OSGeo/PROJ/pull/1817
+
+Tooling scripts are currently available at https://github.com/rouault/sample_proj_gtiff_grids/
+(they will ultimately be stored in the PROJ repository)
+
+Adoption status
+-------------------------------------------------------------------------------
+
+The RFC was adopted on 2020-01-10 with +1's from the following PSC members
+
+* Kristian Evers
+* Even Rouault
+* Thomas Knudsen
+* Howard Butler
+* Kurt Schwehr
diff --git a/_sources/community/rfc/rfc-5.rst.txt b/_sources/community/rfc/rfc-5.rst.txt
new file mode 100644
index 00000000..44811d58
--- /dev/null
+++ b/_sources/community/rfc/rfc-5.rst.txt
@@ -0,0 +1,135 @@
+.. _rfc5:
+
+====================================================================
+PROJ RFC 5: Adopt GeoTIFF-based grids for grids delivered with PROJ
+====================================================================
+
+:Author: Even Rouault
+:Contact: even.rouault@spatialys.com
+:Status: Adopted
+:Implementation target: PROJ 7
+:Last Updated: 2020-01-28
+
+Motivation
+-------------------------------------------------------------------------------
+
+This RFC is a continuation of :ref:`rfc4`. With RFC4, PROJ can, upon request
+of the user, download grids from a CDN in a progressive way. There is also an API,
+such as :cpp:func:`proj_download_file`, to download a GeoTIFF grid into
+the user-writable directory. The content of the CDN at https://cdn.proj.org
+is https://github.com/OSGeo/PROJ-data , which has the same content
+as https://github.com/OSGeo/proj-datumgrid converted to GeoTIFF files. In the
+current state, there could be some inconsistency between users relying on
+the proj-datumgrid, proj-datumgrid-[world,northamerica,oceania,europe] packages
+of mostly NTv2 and GTX files, and what is shipped through the CDN. Maintaining
+two repositories is also a maintenance burden in the long term.
+
+It is thus desirable to have a single source of truth, and we propose it to be
+based on the GeoTIFF grids.
+
+Summary of work planned by this RFC and related decisions
+-------------------------------------------------------------------------------
+
+- https://github.com/OSGeo/PROJ-data/ will be used, starting with
+ PROJ 7.0, to create "static" grid packages.
+
+- For now, a single package of mostly GeoTIFF grids (plus a few text files for
+  PROJ init-style files, as well as a few edge cases for deformation models whose
+  grids have not been converted) will be delivered.
+  Its size at the time of writing is 486 MB (compared to 1.5 GB of uncompressed
+  NTv2 + GTX content, currently compressed to ~ 700 MB)
+
+- The content of this archive will be flat, i.e. no subdirectories
+
+- Each file will be named according to the following pattern
+  ``${agency_name}_$(unknown)[.ext]``. For example ``fr_ign_ntf_r93.tif``.
+  This convention should allow packagers, should the need arise, to
+  split the monolithic package into smaller ones, based on criteria related to
+  the country.
+
+  The agency name is the one visible in the directory names at
+  https://github.com/OSGeo/PROJ-data/.
+  ``${agency_name}`` itself is structured as ``${two_letter_country_code_of_agency_nationality}_${some_abbreviation}``
+  (with the exception of eur_nkg, for the Nordic Geodetic Commission, which
+  isn't affiliated with a single country but with several European countries, yet
+  follows the general scheme)
+
+- https://github.com/OSGeo/proj-datumgrid and related packages will only be
+ maintained during the remaining lifetime of PROJ 6.x. After that, the
+ repository will no longer receive any update and will be put in archiving
+ state (see https://help.github.com/en/github/creating-cloning-and-archiving-repositories/about-archiving-repositories)
+
+- PROJ database ``grid_alternatives`` table will be updated to point to the new
+ TIFF filenames. It will also maintain the old names as used by current
+ proj-datumgrid packages to be able to provide backward compatibility when
+ a PROJ string refers to a grid by its previous name.
+
+- Upon adoption of this RFC, new grids referenced by the PROJ database will only
+  point to GeoTIFF grid names.
+
+- Related to the above point, if a PROJ string refers to a grid name, say
+  foo.gsb, this grid will first be looked for under this name in all the relevant
+  locations. If no match is found, a lookup in the
+  ``grid_alternatives`` table will be done to retrieve the potential new name
+  (GeoTIFF file), and if there is such a match, a new look-up in the file system
+  will be done with the name of that GeoTIFF file.
+
+- The ``package_name`` column of grid_alternatives will no longer be filled.
+  Instead, ``url`` will be filled with the direct URL to the grid on the CDN, for
+  example: https://cdn.proj.org/fr_ign_ntf_r93.tif
+
+- The Python scripts to convert grids (NTv2, GTX) to GeoTIFF, currently available at
+  https://github.com/rouault/sample_proj_gtiff_grids/, will be moved to a
+  grid_tools/ subdirectory of https://github.com/OSGeo/PROJ-data/.
+  Documentation for those utilities will be added to the PROJ documentation.
+
+- Obviously, all the above assumes PROJ builds have libtiff enabled.
+  Non-libtiff builds are not considered nominal PROJ builds (if a PROJ master
+  build is attempted and libtiff is not detected, it fails; the user has to
+  explicitly ask to disable TIFF support), and users deciding to go down that
+  route will have to deal with the consequences (namely that
+  grid-based transformations generated by PROJ will likely be non-working)
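The grid-name fallback lookup described above (the requested name first, then its grid_alternatives replacement) can be sketched as follows. This is a hypothetical Python model, not PROJ's actual code; the table and column names mirror the PROJ database schema:

```python
import os
import sqlite3
import tempfile

def find_grid(name, search_dirs, db):
    """Try the requested name (e.g. foo.gsb) in the relevant directories
    first; if not found, consult grid_alternatives for the new GeoTIFF name
    and look that up in the same directories."""
    for d in search_dirs:
        path = os.path.join(d, name)
        if os.path.exists(path):
            return path
    row = db.execute(
        "SELECT proj_grid_name FROM grid_alternatives WHERE original_grid_name = ?",
        (name,)).fetchone()
    if row is not None:
        for d in search_dirs:
            path = os.path.join(d, row[0])
            if os.path.exists(path):
                return path
    return None

# Minimal demonstration with a mock database and a temporary directory:
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE grid_alternatives(original_grid_name TEXT, proj_grid_name TEXT)")
db.execute("INSERT INTO grid_alternatives VALUES ('ntf_r93.gsb', 'fr_ign_ntf_r93.tif')")
with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, "fr_ign_ntf_r93.tif"), "w").close()
    assert find_grid("ntf_r93.gsb", [d], db).endswith("fr_ign_ntf_r93.tif")
```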
+
+Backward compatibility
+-------------------------------------------------------------------------------
+
+This change is considered to be *mostly* backward compatible. There might be
+impacts for software using :cpp:func:`proj_coordoperation_get_grid_used` and
+assuming that the url returned is one of the proj-datumgrid-xxx files at
+https://download.osgeo.org. As mentioned in
+https://lists.osgeo.org/pipermail/proj/2020-January/009274.html , this
+assumption was not completely bullet-proof either.
+There will be impacts on software checking the value of PROJ pipeline strings
+resulting from :cpp:func:`proj_create_crs_to_crs`. The new grid names will now
+be returned (the most impacted software will likely be PROJ's own test suite)
+
+Although discouraged, people not using the new proj-datumgrid-geotiff-XXX.zip
+archives should still be able to use the old archives made of NTv2/GTX files,
+at least as long as the PROJ database does not point only to GeoTIFF grids.
+So this might be a partly working short-term solution, but as time goes by, it
+will become increasingly non-working. The nominal combination will be
+PROJ 7.0 + proj-datumgrid-geotiff-1.0.zip
+
+Testing
+-------------------------------------------------------------------------------
+
+The PROJ test suite will have to be adapted for the new TIFF-based filenames.
+
+The mechanism to auto-promote existing NTv2/GTX names to TIFF ones will be exercised.
+
+Proposed implementation
+-------------------------------------------------------------------------------
+
+https://github.com/OSGeo/PROJ/pull/1891 and https://github.com/OSGeo/PROJ-data/pull/5
+
+Adoption status
+-------------------------------------------------------------------------------
+
+The RFC was adopted on 2020-01-28 with +1's from the following PSC members
+
+* Kristian Evers
+* Even Rouault
+* Thomas Knudsen
+* Howard Butler
+* Kurt Schwehr
+
diff --git a/_sources/community/rfc/rfc-6.rst.txt b/_sources/community/rfc/rfc-6.rst.txt
new file mode 100644
index 00000000..ed8115df
--- /dev/null
+++ b/_sources/community/rfc/rfc-6.rst.txt
@@ -0,0 +1,368 @@
+.. _rfc6:
+
+====================================================================
+PROJ RFC 6: Triangulation-based transformations
+====================================================================
+
+:Author: Even Rouault
+:Contact: even.rouault@spatialys.com
+:Status: Adopted
+:Implementation target: PROJ 7.2
+:Last Updated: 2020-09-02
+
+Summary
+-------------------------------------------------------------------------------
+
+This RFC adds a new transformation method, ``tinshift`` (TIN stands for
+Triangulated Irregular Network)
+
+The motivation for this work is to be able to handle the official transformations
+created by National Land Survey of Finland, for:
+
+- horizontal transformation between the KKJ and ETRS89 horizontal datums
+- vertical transformations between N43 and N60 heights, and N60 and N2000 heights.
+
+Such transformations are somewhat related to traditional grid-based transformations,
+except that the correction values are held by the vertices of the triangulation,
+instead of being at the nodes of a grid.
+
+Triangulations are in a number of cases the initial product of a geodetic adjustment,
+with grids being a derived product. The Swiss grids, for example, are
+derived products of an original triangulation.
+
+Grid-based transformations remain very convenient to use because accessing
+correction values is easy and efficient, so triangulation-based transformations
+are not meant to replace them; they are rather a complement, one that is
+sometimes necessary to be able to replicate the results of an officially vetted
+transformation to a millimetric or better precision (speaking here about reproducibility
+of numeric results, rather than the physical accuracy of the transformation, which
+might rather be centimetric). It is always possible to approximate the result of
+the triangulation with a grid, but that may require adopting a small grid step,
+and thus generating a grid that can be much larger than the original triangulation.
+
+Details
+-------------------------------------------------------------------------------
+
+Transformation
+++++++++++++++
+
+A new transformation method, ``tinshift``, is added. It takes one mandatory
+argument, ``file``, that points to a JSON file, which contains the triangulation
+and associated metadata. Input and output coordinates must be geographic or projected.
+Depending on the content
+of the JSON file, horizontal, vertical or both components of the coordinates may
+be transformed.
+
+The transformation is used like:
+
+::
+
+ $ echo 3210000.0000 6700000.0000 0 2020 | cct +proj=tinshift +file=./triangulation_kkj.json
+
+ 209948.3217 6697187.0009 0.0000 2020
+
+The transformation is invertible, with the same computational complexity as
+the forward transformation.
+
+Algorithm
++++++++++
+
+Internally, ``tinshift`` ingests the whole file into memory. It is considered that
+triangulations should be small enough for that. The above-mentioned KKJ to ETRS89
+triangulation fits into 65 KB of JSON, for 1449 triangles and 767 vertices.
+
+When a point is transformed, one must find the triangle into which it falls.
+Instead of iterating over all triangles, we build an in-memory quadtree to speed up
+the identification of candidate triangles. On the above-mentioned KKJ -> ETRS89
+triangulation, this speeds up the whole transformation by a factor of 10. The
+quadtree structure is a very good compromise between the performance gain it brings
+and the simplicity of its implementation (we have ported the implementation coming
+from GDAL, inherited from the one used for shapefile .spx spatial indices).
+
+To determine if a point falls into a triangle, one computes its 3
+`barycentric coordinates <https://en.wikipedia.org/wiki/Barycentric_coordinate_system#Conversion_between_barycentric_and_Cartesian_coordinates>`_
+from its projected coordinates, :math:`\lambda_i` for :math:`i=1,2,3`.
+They are real values (in the [0,1] range for a point inside the triangle),
+giving the weight of each of the 3 vertices of the triangle.
+
+Once those weights are known, interpolating the target horizontal
+coordinate is a matter of doing the linear combination of those weights with
+the target horizontal coordinates at the 3 vertices of the triangle (:math:`Xt_i` and :math:`Yt_i`):
+
+.. math::
+
+ X_{target} = Xt_1 * \lambda_1 + Xt_2 * \lambda_2 + Xt_3 * \lambda_3
+
+ Y_{target} = Yt_1 * \lambda_1 + Yt_2 * \lambda_2 + Yt_3 * \lambda_3
+
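The two steps above (computing the weights, then the linear combination) can be sketched in stdlib-only Python; the function names are illustrative, not the PROJ API:

```python
def barycentric(p, a, b, c):
    """Barycentric coordinates of point p w.r.t. triangle (a, b, c)."""
    (x, y), (x1, y1), (x2, y2), (x3, y3) = p, a, b, c
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    l1 = ((y2 - y3) * (x - x3) + (x3 - x2) * (y - y3)) / det
    l2 = ((y3 - y1) * (x - x3) + (x1 - x3) * (y - y3)) / det
    return l1, l2, 1.0 - l1 - l2


def interpolate(p, src_tri, dst_tri):
    """Map p from the source triangle to the target triangle."""
    l1, l2, l3 = barycentric(p, *src_tri)
    x = l1 * dst_tri[0][0] + l2 * dst_tri[1][0] + l3 * dst_tri[2][0]
    y = l1 * dst_tri[0][1] + l2 * dst_tri[1][1] + l3 * dst_tri[2][1]
    return x, y
```

Note that the point is inside the source triangle exactly when all three weights lie in [0, 1], so the weights double as the point-in-triangle test.
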
+This interpolation is exact at the vertices of the triangulation, and has linear properties
+inside each triangle. It is completely equivalent to other formulations of
+triangular interpolation, such as
+
+.. math::
+
+ X_{target} = A + X_{source} * B + Y_{source} * C
+
+ Y_{target} = D + X_{source} * E + Y_{source} * F
+
+where the A, B, C, D, E, F constants (for a given triangle) are found by solving
+the 2 systems of 3 linear equations, constraint by the source and target coordinate pairs
+of the 3 vertices of the triangle:
+
+.. math::
+
+ Xt_i = A + Xs_i * B + Ys_i * C
+
+ Yt_i = D + Xs_i * E + Ys_i * F
+
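For illustration, here is a stdlib-only sketch (the function name is hypothetical) that recovers the six constants of one triangle by solving the two 3x3 systems with Cramer's rule:

```python
def affine_from_triangle(src, dst):
    """Return (A, B, C, D, E, F) such that xt = A + xs*B + ys*C and
    yt = D + xs*E + ys*F for the 3 vertex pairs of one triangle."""

    def det3(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
                - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
                + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))

    def solve3(m, v):
        # Cramer's rule: replace each column by v in turn
        d = det3(m)
        res = []
        for col in range(3):
            mc = [row[:] for row in m]
            for r in range(3):
                mc[r][col] = v[r]
            res.append(det3(mc) / d)
        return res

    m = [[1.0, sx, sy] for sx, sy in src]
    A, B, C = solve3(m, [x for x, _ in dst])
    D, E, F = solve3(m, [y for _, y in dst])
    return A, B, C, D, E, F
```
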
+
+.. note::
+
+    From experiments, the interpolation using barycentric coordinates is slightly
+    more numerically robust when interpolating projected coordinates with amplitudes
+    on the order of 1e5 / 1e6, due to computations involving differences of
+    coordinates, whereas the formulation with the A, B, C, D, E, F constants tends
+    to have large values for A and D, values close to 0 for C and E, and values
+    close to 1 for B and F. However, the difference between the two approaches is
+    negligible for practical purposes (below micrometre precision).
+
+Similarly for a vertical coordinate transformation, where :math:`Zoff_i` is the vertical
+offset at each vertex of the triangle:
+
+.. math::
+
+ Z_{target} = Z_{source} + Zoff_1 * \lambda_1 + Zoff_2 * \lambda_2 + Zoff_3 * \lambda_3
+
+Constraints on the triangulation
+++++++++++++++++++++++++++++++++
+
+No check is done on the consistency of the triangulation. It is highly
+recommended that triangles do not overlap each other (when considering the
+source coordinates for the forward transformation, or the target coordinates for
+the inverse transformation), otherwise which triangle will be selected is
+unspecified. Apart from that, the triangulation does not need to have particular
+properties (such as being a Delaunay triangulation).
+
+File format
++++++++++++
+
+To the best of our knowledge, there are no established file formats to convey
+geodetic transformations as triangulations. Potential similar formats to store TINs
+are `ITF <http://vterrain.org/Implementation/Formats/ITF.html>`_ or
+`XMS <https://www.xmswiki.com/wiki/TIN_Files>`_.
+Both of them would need to be extended in order to handle datum shift information,
+since they are both intended for mostly DEM use.
+
+We thus propose a text-based format, using JSON as the serialization. A text-based
+format might be seen as a performance limitation compared to binary formats,
+but for the size of the triangulations considered (a few thousand triangles / vertices),
+there is no issue: loading such a file is a matter of 20 milliseconds or so. For reference,
+loading a triangulation of about 115 000 triangles and 71 000 vertices takes 450 ms.
+
+Using JSON provides generic formatting and parsing rules, and makes it convenient to
+create files from a Python script, for example. The format could also easily be
+generated "by hand" by non-JSON-aware writers.
+
+For generic metadata, we closely reuse what has been used for the
+`Deformation model master file <https://github.com/linz/deformation-model-format>`_.
+
+Below is a minimal example, from the KKJ to ETRS89 transformation, with just a
+single triangle:
+
+.. code-block:: json
+
+ {
+ "file_type": "triangulation_file",
+ "format_version": "1.0",
+ "name": "Name",
+ "version": "Version",
+ "publication_date": "2018-07-01T00:00:00Z",
+ "license": "Creative Commons Attribution 4.0 International",
+ "description": "Test triangulation",
+ "authority": {
+ "name": "Authority name",
+ "url": "http://example.com",
+ "address": "Address",
+ "email": "test@example.com"
+ },
+ "links": [
+ {
+ "href": "https://example.com/about.html",
+ "rel": "about",
+ "type": "text/html",
+ "title": "About"
+ },
+ {
+ "href": "https://example.com/download",
+ "rel": "source",
+ "type": "application/zip",
+ "title": "Authoritative source"
+ },
+ {
+ "href": "https://creativecommons.org/licenses/by/4.0/",
+ "rel": "license",
+ "type": "text/html",
+ "title": "Creative Commons Attribution 4.0 International license"
+ },
+ {
+ "href": "https://example.com/metadata.xml",
+ "rel": "metadata",
+ "type": "application/xml",
+ "title": " ISO 19115 XML encoded metadata regarding the triangulation"
+ }
+ ],
+ "input_crs": "EPSG:2393",
+        "output_crs": "EPSG:3067",
+ "transformed_components": [ "horizontal" ],
+ "vertices_columns": [ "source_x", "source_y", "target_x", "target_y" ],
+ "triangles_columns": [ "idx_vertex1", "idx_vertex2", "idx_vertex3" ],
+ "vertices": [ [3244102.707, 6693710.937, 244037.137, 6690900.686],
+ [3205290.722, 6715311.822, 205240.895, 6712492.577],
+ [3218328.492, 6649538.429, 218273.648, 6646745.973] ],
+ "triangles": [ [0, 1, 2] ]
+ }
+
+So, after the generic metadata, we define the input and output CRS (informative
+only), and specify that the transformation affects the horizontal components of the
+coordinates. We name the columns of the ``vertices`` and ``triangles`` arrays.
+We define the source and target coordinates of each vertex, and define a
+triangle by referring to the indices of its vertices in the ``vertices`` array.
+
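To illustrate how a consumer is meant to interpret the file, here is a small stdlib-only Python sketch (the function name is hypothetical, not part of PROJ) that resolves each triangle to its vertex coordinates by column name:

```python
import json


def load_triangulation(path):
    """Return a list of triangles; each triangle is a list of 3 vertices,
    each vertex a pair ((source_x, source_y), (target_x, target_y))."""
    with open(path) as f:
        doc = json.load(f)
    # Column positions are given by name, not fixed by order
    ix = {name: i for i, name in enumerate(doc["vertices_columns"])}

    def vertex(i):
        row = doc["vertices"][i]
        return ((row[ix["source_x"]], row[ix["source_y"]]),
                (row[ix["target_x"]], row[ix["target_y"]]))

    # Each triangle row holds 3 indices into the vertices array
    return [[vertex(i) for i in tri] for tri in doc["triangles"]]
```
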
+More formally, the specific items for the triangulation file are:
+
+input_crs
+    String identifying the CRS of the source coordinates
+    in the vertices. Typically ``EPSG:XXXX``. If the transformation is for the
+    vertical component, this should be the code of a compound CRS (which can be
+    EPSG:XXXX+YYYY, where XXXX is the code of the horizontal CRS and YYYY the
+    code of the vertical CRS).
+    For example, for the KKJ->ETRS89 transformation, this is EPSG:2393
+    ("KKJ / Finland Uniform Coordinate System"). The input coordinates are assumed
+    to be passed in the "normalized for visualisation" / "GIS friendly" order,
+    that is longitude, latitude for geographic coordinates and
+    easting, northing for projected coordinates.
+
+
+output_crs
+    String identifying the CRS of the target coordinates in the vertices.
+    Typically ``EPSG:XXXX``. If the transformation is for the vertical component,
+    this should be the code of a compound CRS (which can be EPSG:XXXX+YYYY, where
+    XXXX is the code of the horizontal CRS and YYYY the code of the vertical CRS).
+    For example, for the KKJ->ETRS89 transformation, this is EPSG:3067
+    ("ETRS89 / TM35FIN(E,N)"). The output coordinates will be returned in
+    the "normalized for visualisation" / "GIS friendly" order,
+    that is longitude, latitude for geographic coordinates and
+    easting, northing for projected coordinates.
+
+
+transformed_components
+ Array which may contain one or two strings: "horizontal" when horizontal
+ components of the coordinates are transformed and/or "vertical" when the
+ vertical component is transformed.
+
+
+vertices_columns
+    Specifies the names of the columns of the rows in the ``vertices``
+    array. There must be exactly as many elements in ``vertices_columns`` as in a
+    row of ``vertices``. The following names have a special meaning: ``source_x``,
+    ``source_y``, ``target_x``, ``target_y``, ``source_z``, ``target_z`` and
+    ``offset_z``. ``source_x`` and ``source_y`` are compulsory.
+    ``source_x`` is the source longitude (in degrees) or easting.
+    ``source_y`` is the source latitude (in degrees) or northing.
+    ``target_x`` and ``target_y`` are compulsory when ``horizontal`` is specified
+    in ``transformed_components``. (``source_z`` and ``target_z``) or
+    ``offset_z`` are compulsory when ``vertical`` is specified in ``transformed_components``.
+
+
+triangles_columns
+    Specifies the names of the columns of the rows in the
+    ``triangles`` array. There must be exactly as many elements in ``triangles_columns``
+    as in a row of ``triangles``. The following names have a special meaning:
+    ``idx_vertex1``, ``idx_vertex2``, ``idx_vertex3``. They are compulsory.
+
+
+vertices
+ An array whose items are themselves arrays with as many columns as
+ described in ``vertices_columns``.
+
+
+triangles
+    An array whose items are themselves arrays with as many columns as
+    described in ``triangles_columns``.
+    The value of the ``idx_vertexN`` columns must be indices
+    (between 0 and len(``vertices``)-1) of items of the ``vertices`` array.
+
+Code impacts
+++++++++++++
+
+The following new files are added in src/transformations:
+
+- tinshift.cpp: PROJ specific code for defining the new operation. Takes care
+ of the input and output coordinate conversions (between input_crs and triangulation_source_crs,
+ and triangulation_target_crs and output_crs), when needed.
+- tinshift.hpp: Header-based implementation. This file contains the API.
+- tinshift_exceptions.hpp: Exceptions that can be raised during file parsing
+- tinshift_impl.hpp: Implementation of file loading, triangle search and interpolation.
+
+This is the approach that was followed for the deformation model implementation,
+and it makes unit testing easier.
+
+src/quadtree.hpp contains a quadtree implementation.
+
+Performance indications
++++++++++++++++++++++++
+
+Tested on an Intel(R) Core(TM) i7-6700HQ CPU @ 2.60GHz, transforming 4 million points.
+
+For the KKJ to ETRS89 transformation (1449 triangles and 767 vertices),
+4.4 million points / sec can be transformed.
+
+For comparison, the Helmert-based KKJ to ETRS89 transformation operates at
+1.6 million points / sec.
+
+A triangulation with about 115 000 triangles and 71 000 vertices
+operates at 2.2 million points / sec
+(throughput on more points would be better, since the initial loading of the
+triangulation is non-negligible here).
+
+Backward compatibility
+-------------------------------------------------------------------------------
+
+This is new functionality and is fully backward compatible.
+
+Testing
+-------------------------------------------------------------------------------
+
+The PROJ test suite will be enhanced to test the new transformation, with a
+new .gie file, and a C++ unit test to test at a lower level.
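
The gie test could look like the following sketch, using the standard gie keywords; the coordinates are taken from the single-triangle example above and the tolerance is illustrative:

```
<gie>
operation +proj=tinshift +file=./triangulation_kkj.json
tolerance 1 mm
accept    3244102.707 6693710.937
expect    244037.137  6690900.686
</gie>
```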
+
+Documentation
+-------------------------------------------------------------------------------
+
+- The tinshift method will be documented.
+- The JSON format will be documented under https://proj.org/specifications/
+- A JSON schema will also be provided.
+
+Proposed implementation
+-------------------------------------------------------------------------------
+
+An initial implementation is available at https://github.com/rouault/PROJ/tree/tinshift
+
+References
+-------------------------------------------------------------------------------
+
+`Finnish coordinate transformation (automated translation to English) <https://translate.google.fr/translate?sl=auto&tl=en&u=https%3A%2F%2Fwww.maanmittauslaitos.fi%2Fkartat-ja-paikkatieto%2Fasiantuntevalle-kayttajalle%2Fkoordinaattimuunnokset>`_
+
+Adoption status
+-------------------------------------------------------------------------------
+
+The RFC was adopted on 2020-09-02 with +1's from the following PSC members:
+
+* Kristian Evers
+* Charles Karney
+* Thomas Knudsen
+* Even Rouault
+
+Funding
+-------------------------------------------------------------------------------
+
+This work is funded by the `National Land Survey of Finland <https://www.maanmittauslaitos.fi/en>`_.
diff --git a/_sources/community/rfc/rfc-7.rst.txt b/_sources/community/rfc/rfc-7.rst.txt
new file mode 100644
index 00000000..b1ca3547
--- /dev/null
+++ b/_sources/community/rfc/rfc-7.rst.txt
@@ -0,0 +1,135 @@
+.. _rfc7:
+
+====================================================================
+PROJ RFC 7: Drop Autotools, maintain CMake
+====================================================================
+
+:Author: Mike Taves
+:Contact: mwtoews@gmail.com
+:Status: Adopted
+:Implementation target: PROJ 9.0
+:Last Updated: 2021-10-27
+
+Summary
+-------------------------------------------------------------------------------
+
+This RFC proposes to drop Autotools for PROJ 9.0, and to maintain CMake
+for build automation, testing and packaging. This will reduce the overall
+maintenance for PROJ and will enable the library to be integrated into other
+projects that use CMake.
+
+Background
+-------------------------------------------------------------------------------
+
+Here is a short timeline of the build tools used for PROJ:
+
+- Throughout the mid-1990s, Gerald Evenden maintained a Unix build system with
+ a few scripts (some derived from Autoconf), and Makefile templates.
+- In 2000, Frank Warmerdam wrote Autoconf and Automake configurations for
+ PROJ 4.4.0.
+- This was followed by a NMake configuration to build PROJ 4.4.2 for Windows.
+- In 2014, a CMake build setup was introduced by Howard Butler for
+ PROJ 4.9.0RC1. The CMake configuration was improved for the 4.9.1 release,
+ but not considered at feature parity with the Autotools builds at the time.
+- The NMake build setup was removed for PROJ 6.0.0, as its functionality had
+ been replaced by CMake.
+
+Motivation
+-------------------------------------------------------------------------------
+
+The primary motivation in removing Autotools is to reduce the burden of
+maintaining multiple build configurations, which requires developers to be
+familiar with different tools and configuration files. There are several other
+benefits in maintaining a single build system:
+
+- Remove extra configuration and m4 macro files from source repository,
+- Simplify scripts used for running tests for CI services (GitHub Actions,
+ TravisCI),
+- Reduce compilation time (and carbon footprint) used for testing on CI
+ services,
+- Ease development effort, particularly with new contributors.
+
+Why drop Autotools?
+-------------------------------------------------------------------------------
+
+The GNU Build System or Autotools consist of a suite of tools including
+Autoconf and Automake, which can be used to build software on Unix-like
+systems. These tools are not cross-platform and do not natively integrate
+with development environments on Microsoft Windows. Furthermore, the existing
+PROJ Autotools builds do not install the CMake configuration files required to
+find PROJ from other projects that use CMake
+(`#2546 <https://github.com/OSGeo/PROJ/issues/2546>`_).
+
+Why use CMake?
+-------------------------------------------------------------------------------
+
+CMake is an open source cross-platform tool for build automation, testing and
+packaging of software. It does not directly compile the software, but manages
+the build process using generators, including Unix Makefiles and Ninja among
+other command-based and IDE tools. The CMake software has been under active
+development since its origins in 2000. The CMake language is carefully
+developed with backwards-compatible policies that aim to provide consistent
+behaviour across different versions. CMake is currently the preferred build
+tool for PROJ for the following reasons:
+
+- It has existed in the PROJ code base since 2014, and is familiar to active
+ PROJ contributors,
+- It can install configuration files that can be used by other software that
+  uses CMake to find PROJ for linking via ``find_package()``,
+- CMake configurations are used in 3rd-party binary packages of PROJ,
+ including conda-forge and vcpkg,
+- It can be used to build PROJ on all major operating systems and compiler
+ combinations (where compatible),
+- It has integration with modern IDEs and tools, including
+ Microsoft Visual Studio and Qt Creator.
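
As an illustration, a downstream project would consume an installed PROJ via the exported package config roughly like this (``myapp`` is a hypothetical project; ``PROJ::proj`` is the imported target name exported by PROJ's CMake build):

```cmake
cmake_minimum_required(VERSION 3.16)
project(myapp LANGUAGES C)

# Locate the CMake package config files installed by a CMake-built PROJ
find_package(PROJ REQUIRED CONFIG)

add_executable(myapp main.c)
# Linking the imported target also propagates include directories
# and usage requirements
target_link_libraries(myapp PRIVATE PROJ::proj)
```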
+
+Why not CMake?
+-------------------------------------------------------------------------------
+
+Other modern cross-platform build systems exist, including Meson and Bazel,
+which have certain advantages over CMake. However, they are currently not widely
+used by active PROJ contributors. This RFC should not restrict future build
+system configurations from being introduced to PROJ, if they prove over time
+to have advantages over CMake.
+
+Potential impacts
+-------------------------------------------------------------------------------
+
+Binary packagers that currently rely on Autotools will need to
+transition to building and testing PROJ with CMake. Issues related to
+multiarch builds of PROJ may become apparent, which can be patched and/or
+reported to PROJ developers. One feature of Autotools is that both static and
+dynamic (shared) libraries are built, which packagers may distribute. This
+is currently not set up for the PROJ CMake build, as the library would need
+to be configured and built twice.
+
+End-users that use binary packages of PROJ should not be impacted. PROJ should
+be discoverable via both pkg-config and CMake's ``find_package()``.
+Other projects that use Autotools will continue to work as expected,
+linking statically or dynamically to PROJ built by CMake.
+
+Transition plan
+-------------------------------------------------------------------------------
+
+If this proposal is approved, the following tasks should be completed:
+
+- Rewrite CI tests to only use CMake for packaging, building, testing,
+ installation and post-install tests,
+- Remove files only used by Autotools, also update ``.gitignore``,
+- Update documentation and ``HOWTORELEASE`` notes.
+
+Related issues will be tracked on GitHub with a tag
+`RFC7: Autotools→CMake <https://github.com/OSGeo/PROJ/labels/RFC7%3A%20Autotools%E2%86%92CMake>`_.
+
+Adoption status
+-------------------------------------------------------------------------------
+
+The RFC was adopted on 2021-10-26 with +1's from the following PSC members:
+
+* Kristian Evers
+* Even Rouault
+* Howard Butler
+* Thomas Knudsen
+* Kurt Schwehr
+* Charles Karney