author     alvyjudy <alvyjudy@gmail.com>   2020-05-11 14:46:49 -0400
committer  alvyjudy <alvyjudy@gmail.com>   2020-05-11 14:46:49 -0400
commit     14bae0541bb17824fa4530cf693ef0093b5bdf48 (patch)
tree       3aed4285364c091684bd5dc496971633e343be45
parent     9ea37eda077883d310bfb7537f58c51cab84ff4c (diff)
download   python-setuptools-git-14bae0541bb17824fa4530cf693ef0093b5bdf48.tar.gz

docs: migrate functionality info to separate file

cut-and-paste from setuptools.txt

-rw-r--r--  docs/setuptools.txt                 1326
-rw-r--r--  docs/userguide/functionalities.txt  1338
2 files changed, 1347 insertions, 1317 deletions
diff --git a/docs/setuptools.txt b/docs/setuptools.txt
index 25af9d33..7daeb3fc 100644
--- a/docs/setuptools.txt
+++ b/docs/setuptools.txt
@@ -61,953 +61,29 @@ Developer's Guide
-Using ``find_packages()``
--------------------------
-
-For simple projects, it's usually easy enough to manually add packages to
-the ``packages`` argument of ``setup()``. However, for very large projects
-(Twisted, PEAK, Zope, Chandler, etc.), it can be a big burden to keep the
-package list updated. That's what ``setuptools.find_packages()`` is for.
-
-``find_packages()`` takes a source directory and two lists of package name
-patterns to exclude and include. If omitted, the source directory defaults to
-the same
-directory as the setup script. Some projects use a ``src`` or ``lib``
-directory as the root of their source tree, and those projects would of course
-use ``"src"`` or ``"lib"`` as the first argument to ``find_packages()``. (And
-such projects also need something like ``package_dir={"": "src"}`` in their
-``setup()`` arguments, but that's just a normal distutils thing.)
-
-Anyway, ``find_packages()`` walks the target directory, filtering by inclusion
-patterns, and finds Python packages (any directory). Packages are only
-recognized if they include an ``__init__.py`` file. Finally, exclusion
-patterns are applied to remove matching packages.
-
-Inclusion and exclusion patterns are package names, optionally including
-wildcards. For
-example, ``find_packages(exclude=["*.tests"])`` will exclude all packages whose
-last name part is ``tests``. Or, ``find_packages(exclude=["*.tests",
-"*.tests.*"])`` will also exclude any subpackages of packages named ``tests``,
-but it still won't exclude a top-level ``tests`` package or the children
-thereof. In fact, if you really want no ``tests`` packages at all, you'll need
-something like this::
-
-    find_packages(exclude=["*.tests", "*.tests.*", "tests.*", "tests"])
-
-in order to cover all the bases. Really, the exclusion patterns are intended
-to cover simpler use cases than this, like excluding a single, specified
-package and its subpackages.
-
-Regardless of the parameters, the ``find_packages()``
-function returns a list of package names suitable for use as the ``packages``
-argument to ``setup()``, and so is usually the easiest way to set that
-argument in your setup script. Especially since it frees you from having to
-remember to modify your setup script whenever your project grows additional
-top-level packages or subpackages.
-
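-As a minimal sketch (the project name and layout here are only illustrative),
-a setup script for a project using the ``src`` layout might look like this::
-
-    from setuptools import setup, find_packages
-
-    setup(
-        name="MyProject",   # hypothetical project name
-        version="0.1",
-        package_dir={"": "src"},
-        packages=find_packages(where="src", exclude=["tests", "tests.*"]),
-    )
-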
-``find_namespace_packages()``
------------------------------
-
-In Python 3.3+, ``setuptools`` also provides the ``find_namespace_packages`` variant
-of ``find_packages``, which has the same function signature as
-``find_packages``, but works with `PEP 420`_ compliant implicit namespace
-packages. Here is a minimal setup script using ``find_namespace_packages``::
-
-    from setuptools import setup, find_namespace_packages
-
-    setup(
-        name="HelloWorld",
-        version="0.1",
-        packages=find_namespace_packages(),
-    )
-
-
-Keep in mind that according to PEP 420, you may have to either re-organize your
-codebase a bit or define a few exclusions, as the definition of an implicit
-namespace package is quite lenient, so for a project organized like so::
-
-
-    ├── namespace
-    │   └── mypackage
-    │       ├── __init__.py
-    │       └── mod1.py
-    ├── setup.py
-    └── tests
-        └── test_mod1.py
-
-A naive ``find_namespace_packages()`` would install both ``namespace.mypackage`` and a
-top-level package called ``tests``! One way to avoid this problem is to use the
-``include`` keyword to whitelist the packages to include, like so::
-
-    from setuptools import setup, find_namespace_packages
-
-    setup(
-        name="namespace.mypackage",
-        version="0.1",
-        packages=find_namespace_packages(include=["namespace.*"])
-    )
-Another option is to use the "src" layout, where all package code is placed in
-the ``src`` directory, like so::
-
-    ├── setup.py
-    ├── src
-    │   └── namespace
-    │       └── mypackage
-    │           ├── __init__.py
-    │           └── mod1.py
-    └── tests
-        └── test_mod1.py
-
-With this layout, the package directory is specified as ``src``, as such::
-
-    setup(name="namespace.mypackage",
-          version="0.1",
-          package_dir={"": "src"},
-          packages=find_namespace_packages(where="src"))
-
-.. _PEP 420: https://www.python.org/dev/peps/pep-0420/
-
-Automatic Script Creation
-=========================
-
-Packaging and installing scripts can be a bit awkward with the distutils. For
-one thing, there's no easy way to have a script's filename match local
-conventions on both Windows and POSIX platforms. For another, you often have
-to create a separate file just for the "main" script, when your actual "main"
-is a function in a module somewhere. And even in Python 2.4, using the ``-m``
-option only works for actual ``.py`` files that aren't installed in a package.
-
-``setuptools`` fixes all of these problems by automatically generating scripts
-for you with the correct extension, and on Windows it will even create an
-``.exe`` file so that users don't have to change their ``PATHEXT`` settings.
-The way to use this feature is to define "entry points" in your setup script
-that indicate what function the generated script should import and run. For
-example, to create two console scripts called ``foo`` and ``bar``, and a GUI
-script called ``baz``, you might do something like this::
-
-    setup(
-        # other arguments here...
-        entry_points={
-            "console_scripts": [
-                "foo = my_package.some_module:main_func",
-                "bar = other_module:some_func",
-            ],
-            "gui_scripts": [
-                "baz = my_package_gui:start_func",
-            ]
-        }
-    )
-
-When this project is installed on non-Windows platforms (using "setup.py
-install", "setup.py develop", or with pip), a set of ``foo``, ``bar``,
-and ``baz`` scripts will be installed that import ``main_func`` and
-``some_func`` from the specified modules. The functions you specify are
-called with no arguments, and their return value is passed to
-``sys.exit()``, so you can return an errorlevel or message to print to
-stderr.
-
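-The referenced callables are ordinary functions; a minimal sketch of the
-hypothetical ``my_package/some_module.py`` from the example above might be::
-
-    def main_func():
-        # read arguments from sys.argv[1:] if needed, then do the work
-        print("Hello from foo")
-        return 0   # the return value is passed to sys.exit()
-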
-On Windows, a set of ``foo.exe``, ``bar.exe``, and ``baz.exe`` launchers are
-created, alongside a set of ``foo.py``, ``bar.py``, and ``baz.pyw`` files. The
-``.exe`` wrappers find and execute the right version of Python to run the
-``.py`` or ``.pyw`` file.
-
-You may define as many "console script" and "gui script" entry points as you
-like, and each one can optionally specify "extras" that it depends on, that
-will be added to ``sys.path`` when the script is run. For more information on
-"extras", see the section below on `Declaring Extras`_. For more information
-on "entry points" in general, see the section below on `Dynamic Discovery of
-Services and Plugins`_.
-
-
-"Eggsecutable" Scripts
-----------------------
-
-.. deprecated:: 45.3.0
-
-Occasionally, there are situations where it's desirable to make an ``.egg``
-file directly executable. You can do this by including an entry point such
-as the following::
-
-    setup(
-        # other arguments here...
-        entry_points={
-            "setuptools.installation": [
-                "eggsecutable = my_package.some_module:main_func",
-            ]
-        }
-    )
-
-Any eggs built from the above setup script will include a short executable
-prelude that imports and calls ``main_func()`` from ``my_package.some_module``.
-The prelude can be run on Unix-like platforms (including Mac and Linux) by
-invoking the egg with ``/bin/sh``, or by enabling execute permissions on the
-``.egg`` file. For the executable prelude to run, the appropriate version of
-Python must be available via the ``PATH`` environment variable, under its
-"long" name. That is, if the egg is built for Python 2.3, there must be a
-``python2.3`` executable present in a directory on ``PATH``.
-
-IMPORTANT NOTE: Eggs with an "eggsecutable" header cannot be renamed, or
-invoked via symlinks. They *must* be invoked using their original filename, in
-order to ensure that, once running, ``pkg_resources`` will know what project
-and version is in use. The header script will check this and exit with an
-error if the ``.egg`` file has been renamed or is invoked via a symlink that
-changes its base name.
-
-
-Declaring Dependencies
-======================
-
-``setuptools`` supports automatically installing dependencies when a package is
-installed, and including information about dependencies in Python Eggs (so that
-package management tools like pip can use the information).
-
-``setuptools`` and ``pkg_resources`` use a common syntax for specifying a
-project's required dependencies. This syntax consists of a project's PyPI
-name, optionally followed by a comma-separated list of "extras" in square
-brackets, optionally followed by a comma-separated list of version
-specifiers. A version specifier is one of the operators ``<``, ``>``, ``<=``,
-``>=``, ``==`` or ``!=``, followed by a version identifier. Tokens may be
-separated by whitespace, but any whitespace or nonstandard characters within a
-project name or version identifier must be replaced with ``-``.
-
-Version specifiers for a given project are internally sorted into ascending
-version order, and used to establish what ranges of versions are acceptable.
-Adjacent redundant conditions are also consolidated (e.g. ``">1, >2"`` becomes
-``">2"``, and ``"<2,<3"`` becomes ``"<2"``). ``"!="`` versions are excised from
-the ranges they fall within. A project's version is then checked for
-membership in the resulting ranges. (Note that providing conflicting conditions
-for the same version (e.g. "<2,>=2" or "==2,!=2") is meaningless and may
-therefore produce bizarre results.)
-
-Here are some example requirement specifiers::
-
-    docutils >= 0.3
-
-    # comment lines and \ continuations are allowed in requirement strings
-    BazSpam ==1.1, ==1.2, ==1.3, ==1.4, ==1.5, \
-        ==1.6, ==1.7  # and so are line-end comments
-
-    PEAK[FastCGI, reST]>=0.5a4
-
-    setuptools==0.5a7
-
-The simplest way to include requirement specifiers is to use the
-``install_requires`` argument to ``setup()``. It takes a string or list of
-strings containing requirement specifiers. If you include more than one
-requirement in a string, each requirement must begin on a new line.
-
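-For example, a minimal sketch reusing the specifiers above (the project name
-is hypothetical)::
-
-    from setuptools import setup, find_packages
-
-    setup(
-        name="Project",
-        version="0.1",
-        packages=find_packages(),
-        install_requires=["docutils>=0.3", "PEAK[FastCGI, reST]>=0.5a4"],
-    )
-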
-This has three effects:
-
-1. When your project is installed, either by using pip, ``setup.py install``,
-   or ``setup.py develop``, all of the dependencies not already installed will
-   be located (via PyPI), downloaded, built (if necessary), and installed.
-
-2. Any scripts in your project will be installed with wrappers that verify
-   the availability of the specified dependencies at runtime, and ensure that
-   the correct versions are added to ``sys.path`` (e.g. if multiple versions
-   have been installed).
-
-3. Python Egg distributions will include a metadata file listing the
-   dependencies.
-
-Note, by the way, that if you declare your dependencies in ``setup.py``, you do
-*not* need to use the ``require()`` function in your scripts or modules, as
-long as you either install the project or use ``setup.py develop`` to do
-development work on it. (See `"Development Mode"`_ below for more details on
-using ``setup.py develop``.)
-
-
-Dependencies that aren't in PyPI
---------------------------------
-
-.. warning::
- Dependency links support has been dropped by pip starting with version
- 19.0 (released 2019-01-22).
-
-If your project depends on packages that don't exist on PyPI, you may still be
-able to depend on them, as long as they are available for download as:
-
-- an egg, in the standard distutils ``sdist`` format,
-- a single ``.py`` file, or
-- a VCS repository (Subversion, Mercurial, or Git).
-
-You just need to add some URLs to the ``dependency_links`` argument to
-``setup()``.
-
-The URLs must be either:
-
-1. direct download URLs,
-2. the URLs of web pages that contain direct download links, or
-3. the repository's URL
-
-In general, it's better to link to web pages, because it is usually less
-complex to update a web page than to release a new version of your project.
-You can also use a SourceForge ``showfiles.php`` link in the case where a
-package you depend on is distributed via SourceForge.
-
-If you depend on a package that's distributed as a single ``.py`` file, you
-must include an ``"#egg=project-version"`` suffix to the URL, to give a project
-name and version number. (Be sure to escape any dashes in the name or version
-by replacing them with underscores.) EasyInstall will recognize this suffix
-and automatically create a trivial ``setup.py`` to wrap the single ``.py`` file
-as an egg.
-
-In the case of a VCS checkout, you should also append ``#egg=project-version``
-in order to identify for what package that checkout should be used. You can
-append ``@REV`` to the URL's path (before the fragment) to specify a revision.
-Additionally, you can also force the VCS being used by prepending the URL with
-a certain prefix. Currently available are:
-
-- ``svn+URL`` for Subversion,
-- ``git+URL`` for Git, and
-- ``hg+URL`` for Mercurial
-
-A more complete example would be:
-
- ``vcs+proto://host/path@revision#egg=project-version``
-
-Be careful with the version. It should match the one inside the project files.
-If you want to disregard the version, you have to omit it both in the
-``requires`` and in the URL's fragment.
-
-This will do a checkout (or a clone, in Git and Mercurial parlance) to a
-temporary folder and run ``setup.py bdist_egg``.
-
-The ``dependency_links`` option takes the form of a list of URL strings. For
-example, this will cause a search of the specified page for eggs or source
-distributions, if the package's dependencies aren't already installed::
-
-    setup(
-        ...
-        dependency_links=[
-            "http://peak.telecommunity.com/snapshots/"
-        ],
-    )
-
-
-.. _Declaring Extras:
-
-
-Declaring "Extras" (optional features with their own dependencies)
-------------------------------------------------------------------
-
-Sometimes a project has "recommended" dependencies, that are not required for
-all uses of the project. For example, a project might offer optional PDF
-output if ReportLab is installed, and reStructuredText support if docutils is
-installed. These optional features are called "extras", and setuptools allows
-you to define their requirements as well. In this way, other projects that
-require these optional features can force the additional requirements to be
-installed, by naming the desired extras in their ``install_requires``.
-
-For example, let's say that Project A offers optional PDF and reST support::
-
-    setup(
-        name="Project-A",
-        ...
-        extras_require={
-            "PDF": ["ReportLab>=1.2", "RXP"],
-            "reST": ["docutils>=0.3"],
-        }
-    )
-
-As you can see, the ``extras_require`` argument takes a dictionary mapping
-names of "extra" features, to strings or lists of strings describing those
-features' requirements. These requirements will *not* be automatically
-installed unless another package depends on them (directly or indirectly) by
-including the desired "extras" in square brackets after the associated project
-name. (Or if the extras were listed in a requirement spec on the "pip install"
-command line.)
-
-Extras can be used by a project's `entry points`_ to specify dynamic
-dependencies. For example, if Project A includes a "rst2pdf" script, it might
-declare it like this, so that the "PDF" requirements are only resolved if the
-"rst2pdf" script is run::
-
-    setup(
-        name="Project-A",
-        ...
-        entry_points={
-            "console_scripts": [
-                "rst2pdf = project_a.tools.pdfgen [PDF]",
-                "rst2html = project_a.tools.htmlgen",
-                # more script entry points ...
-            ],
-        }
-    )
-
-Projects can also use another project's extras when specifying dependencies.
-For example, if project B needs "project A" with PDF support installed, it
-might declare the dependency like this::
-
-    setup(
-        name="Project-B",
-        install_requires=["Project-A[PDF]"],
-        ...
-    )
-
-This will cause ReportLab to be installed along with project A, if project B is
-installed -- even if project A was already installed. In this way, a project
-can encapsulate groups of optional "downstream dependencies" under a feature
-name, so that packages that depend on it don't have to know what the downstream
-dependencies are. If a later version of Project A builds in PDF support and
-no longer needs ReportLab, or if it ends up needing other dependencies besides
-ReportLab in order to provide PDF support, Project B's setup information does
-not need to change, but the right packages will still be installed if needed.
-
-Note, by the way, that if a project ends up not needing any other packages to
-support a feature, it should keep an empty requirements list for that feature
-in its ``extras_require`` argument, so that packages depending on that feature
-don't break (due to an invalid feature name). For example, if Project A above
-builds in PDF support and no longer needs ReportLab, it could change its
-setup to this::
-
-    setup(
-        name="Project-A",
-        ...
-        extras_require={
-            "PDF": [],
-            "reST": ["docutils>=0.3"],
-        }
-    )
-
-so that Package B doesn't have to remove the ``[PDF]`` from its requirement
-specifier.
-
-
-.. _Platform Specific Dependencies:
-
-
-Declaring platform specific dependencies
-----------------------------------------
-
-Sometimes a project might require a dependency to run on a specific platform.
-This could be a package that backports a module so that it can be used in
-older Python versions, or a package that is required only on a specific
-operating system. Declaring such dependencies conditionally allows a project
-to work on multiple platforms without installing dependencies that are not
-required on the platform where it is being installed.
-
-For example, here is a project that uses the ``enum`` module and ``pywin32``::
-
-    setup(
-        name="Project",
-        ...
-        install_requires=[
-            "enum34;python_version<'3.4'",
-            "pywin32 >= 1.0;platform_system=='Windows'"
-        ]
-    )
-
-Since the ``enum`` module was added in Python 3.4, it should only be installed
-if the Python version is earlier than 3.4. Since ``pywin32`` will only be used
-on Windows, it should only be installed when the operating system is Windows.
-Specifying version requirements for the dependencies is supported as normal.
-
-The environmental markers that may be used for testing platform types are
-detailed in `PEP 508`_.
-
-.. _PEP 508: https://www.python.org/dev/peps/pep-0508/
-
-Including Data Files
-====================
-
-The distutils have traditionally allowed installation of "data files", which
-are placed in a platform-specific location. However, the most common use case
-for data files distributed with a package is for use *by* the package, usually
-by including the data files in the package directory.
-
-Setuptools offers three ways to specify data files to be included in your
-packages. First, you can simply use the ``include_package_data`` keyword,
-e.g.::
-
-    from setuptools import setup, find_packages
-
-    setup(
-        ...
-        include_package_data=True
-    )
-
-This tells setuptools to install any data files it finds in your packages.
-The data files must be specified via the distutils' ``MANIFEST.in`` file.
-(They can also be tracked by a revision control system, using an appropriate
-plugin. See the section below on `Adding Support for Revision Control
-Systems`_ for information on how to write such plugins.)
-
-If you want finer-grained control over what files are included (for example,
-if you have documentation files in your package directories and want to exclude
-them from installation), then you can also use the ``package_data`` keyword,
-e.g.::
-
-    from setuptools import setup, find_packages
-
-    setup(
-        ...
-        package_data={
-            # If any package contains *.txt or *.rst files, include them:
-            "": ["*.txt", "*.rst"],
-            # And include any *.msg files found in the "hello" package, too:
-            "hello": ["*.msg"],
-        }
-    )
-
-The ``package_data`` argument is a dictionary that maps from package names to
-lists of glob patterns. The globs may include subdirectory names, if the data
-files are contained in a subdirectory of the package. For example, if the
-package tree looks like this::
-
-    setup.py
-    src/
-        mypkg/
-            __init__.py
-            mypkg.txt
-            data/
-                somefile.dat
-                otherdata.dat
-
-The setuptools setup file might look like this::
-
-    from setuptools import setup, find_packages
-
-    setup(
-        ...
-        packages=find_packages("src"),  # include all packages under src
-        package_dir={"": "src"},        # tell distutils packages are under src
-
-        package_data={
-            # If any package contains *.txt files, include them:
-            "": ["*.txt"],
-            # And include any *.dat files found in the "data" subdirectory
-            # of the "mypkg" package, also:
-            "mypkg": ["data/*.dat"],
-        }
-    )
-
-Notice that if you list patterns in ``package_data`` under the empty string,
-these patterns are used to find files in every package, even ones that also
-have their own patterns listed. Thus, in the above example, the ``mypkg.txt``
-file gets included even though it's not listed in the patterns for ``mypkg``.
-
-Also notice that if you use paths, you *must* use a forward slash (``/``) as
-the path separator, even if you are on Windows. Setuptools automatically
-converts slashes to appropriate platform-specific separators at build time.
-
-If datafiles are contained in a subdirectory of a package that isn't a package
-itself (no ``__init__.py``), then the subdirectory names (or ``*``) are required
-in the ``package_data`` argument (as shown above with ``"data/*.dat"``).
-
-When building an ``sdist``, the data files are also drawn from the
-``package_name.egg-info/SOURCES.txt`` file, so make sure that this file is
-removed if the ``package_data`` list in ``setup.py`` has been updated before
-running ``setup.py`` again.
-
-(Note: although the ``package_data`` argument was previously only available in
-``setuptools``, it was also added to the Python ``distutils`` package as of
-Python 2.4; there is `some documentation for the feature`__ available on the
-python.org website. If using the setuptools-specific ``include_package_data``
-argument, files specified by ``package_data`` will *not* be automatically
-added to the manifest unless they are listed in the MANIFEST.in file.)
-
-__ https://docs.python.org/3/distutils/setupscript.html#installing-package-data
-
-Sometimes, the ``include_package_data`` or ``package_data`` options alone
-aren't sufficient to precisely define what files you want included. For
-example, you may want to include package README files in your revision control
-system and source distributions, but exclude them from being installed. So,
-setuptools offers an ``exclude_package_data`` option as well, that allows you
-to do things like this::
-
-    from setuptools import setup, find_packages
-
-    setup(
-        ...
-        packages=find_packages("src"),  # include all packages under src
-        package_dir={"": "src"},        # tell distutils packages are under src
-
-        include_package_data=True,      # include everything in source control
-
-        # ...but exclude README.txt from all packages
-        exclude_package_data={"": ["README.txt"]},
-    )
-
-The ``exclude_package_data`` option is a dictionary mapping package names to
-lists of wildcard patterns, just like the ``package_data`` option. And, just
-as with that option, a key of ``""`` will apply the given pattern(s) to all
-packages. However, any files that match these patterns will be *excluded*
-from installation, even if they were listed in ``package_data`` or were
-included as a result of using ``include_package_data``.
-
-In summary, the three options allow you to:
-
-``include_package_data``
- Accept all data files and directories matched by ``MANIFEST.in``.
-
-``package_data``
- Specify additional patterns to match files that may or may
- not be matched by ``MANIFEST.in`` or found in source control.
-
-``exclude_package_data``
- Specify patterns for data files and directories that should *not* be
- included when a package is installed, even if they would otherwise have
- been included due to the use of the preceding options.
-
-NOTE: Due to the way the distutils build process works, a data file that you
-include in your project and then stop including may be "orphaned" in your
-project's build directories, requiring you to run ``setup.py clean --all`` to
-fully remove them. This may also be important for your users and contributors
-if they track intermediate revisions of your project using Subversion; be sure
-to let them know when you make changes that remove files from inclusion so they
-can run ``setup.py clean --all``.
-
-
-Accessing Data Files at Runtime
--------------------------------
-
-Typically, existing programs manipulate a package's ``__file__`` attribute in
-order to find the location of data files. However, this manipulation isn't
-compatible with PEP 302-based import hooks, including importing from zip files
-and Python Eggs. It is strongly recommended that, if you are using data files,
-you should use the :ref:`ResourceManager API` of ``pkg_resources`` to access
-them. The ``pkg_resources`` module is distributed as part of setuptools, so if
-you're using setuptools to distribute your package, there is no reason not to
-use its resource management API. See also `Importlib Resources`_ for
-a quick example of converting code that uses ``__file__`` to use
-``pkg_resources`` instead.
-
-.. _Importlib Resources: https://docs.python.org/3/library/importlib.html#module-importlib.resources
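-
-As a rough sketch (the package and file names are only illustrative), resource
-access through ``pkg_resources`` looks like this::
-
-    import pkg_resources
-
-    # read the raw bytes of a data file shipped inside the "mypkg" package
-    data = pkg_resources.resource_string("mypkg", "data/somefile.dat")
-
-    # or obtain a real filesystem path (extracted from a zip/egg if necessary)
-    path = pkg_resources.resource_filename("mypkg", "data/somefile.dat")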
-
-
-Non-Package Data Files
-----------------------
-
-Historically, ``setuptools`` by way of ``easy_install`` would encapsulate data
-files from the distribution into the egg (see `the old docs
-<https://github.com/pypa/setuptools/blob/52aacd5b276fedd6849c3a648a0014f5da563e93/docs/setuptools.txt#L970-L1001>`_). As eggs are deprecated and pip-based installs
-fall back to the platform-specific location for installing data files, there is
-no supported facility to reliably retrieve these resources.
-
-Instead, the PyPA recommends that any data files you wish to be accessible at
-run time be included in the package.
-
-
-Automatic Resource Extraction
------------------------------
-
-If you are using tools that expect your resources to be "real" files, or your
-project includes non-extension native libraries or other files that your C
-extensions expect to be able to access, you may need to list those files in
-the ``eager_resources`` argument to ``setup()``, so that the files will be
-extracted together, whenever a C extension in the project is imported.
-
-This is especially important if your project includes shared libraries *other*
-than distutils-built C extensions, and those shared libraries use file
-extensions other than ``.dll``, ``.so``, or ``.dylib``, which are the
-extensions that setuptools 0.6a8 and higher automatically detects as shared
-libraries and adds to the ``native_libs.txt`` file for you. Any shared
-libraries whose names do not end with one of those extensions should be listed
-as ``eager_resources``, because they need to be present in the filesystem when
-the C extensions that link to them are used.
-
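-For example, a project shipping such a library might declare (the file names
-are only illustrative)::
-
-    setup(
-        name="Project",
-        ...
-        eager_resources=["mypkg/lib/libfoobar.so.1", "mypkg/native_data"],
-    )
-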
-The ``pkg_resources`` runtime for compressed packages will automatically
-extract *all* C extensions and ``eager_resources`` at the same time, whenever
-*any* C extension or eager resource is requested via the ``resource_filename()``
-API. (C extensions are imported using ``resource_filename()`` internally.)
-This ensures that C extensions will see all of the "real" files that they
-expect to see.
-
-Note also that you can list directory resource names in ``eager_resources`` as
-well, in which case the directory's contents (including subdirectories) will be
-extracted whenever any C extension or eager resource is requested.
-
-Please note that if you're not sure whether you need to use this argument, you
-don't! It's really intended to support projects with lots of non-Python
-dependencies and as a last resort for crufty projects that can't otherwise
-handle being compressed. If your package is pure Python, Python plus data
-files, or Python plus C, you really don't need this. You've got to be using
-either C or an external program that needs "real" files in your project before
-there's any possibility of ``eager_resources`` being relevant to your project.
-
-
-Extensible Applications and Frameworks
-======================================
-
-
-.. _Entry Points:
-
-Dynamic Discovery of Services and Plugins
------------------------------------------
-
-``setuptools`` supports creating libraries that "plug in" to extensible
-applications and frameworks, by letting you register "entry points" in your
-project that can be imported by the application or framework.
-
-For example, suppose that a blogging tool wants to support plugins
-that provide translation for various file types to the blog's output format.
-The framework might define an "entry point group" called ``blogtool.parsers``,
-and then allow plugins to register entry points for the file extensions they
-support.
-
-This would allow people to create distributions that contain one or more
-parsers for different file types, and then the blogging tool would be able to
-find the parsers at runtime by looking up an entry point for the file
-extension (or mime type, or however it wants to).
-
-Note that if the blogging tool includes parsers for certain file formats, it
-can register these as entry points in its own setup script, which means it
-doesn't have to special-case its built-in formats. They can just be treated
-the same as any other plugin's entry points would be.
-
-If you're creating a project that plugs in to an existing application or
-framework, you'll need to know what entry points or entry point groups are
-defined by that application or framework. Then, you can register entry points
-in your setup script. Here are a few examples of ways you might register an
-``.rst`` file parser entry point in the ``blogtool.parsers`` entry point group,
-for our hypothetical blogging tool::
-
-    setup(
-        # ...
-        entry_points={"blogtool.parsers": ".rst = some_module:SomeClass"}
-    )
-
-    setup(
-        # ...
-        entry_points={"blogtool.parsers": [".rst = some_module:a_func"]}
-    )
-
-    setup(
-        # ...
-        entry_points="""
-            [blogtool.parsers]
-            .rst = some.nested.module:SomeClass.some_classmethod [reST]
-        """,
-        extras_require=dict(reST="Docutils>=0.3.5")
-    )
-
-The ``entry_points`` argument to ``setup()`` accepts either a string with
-``.ini``-style sections, or a dictionary mapping entry point group names to
-either strings or lists of strings containing entry point specifiers. An
-entry point specifier consists of a name and value, separated by an ``=``
-sign. The value consists of a dotted module name, optionally followed by a
-``:`` and a dotted identifier naming an object within the module. It can
-also include a bracketed list of "extras" that are required for the entry
-point to be used. When the invoking application or framework requests loading
-of an entry point, any requirements implied by the associated extras will be
-passed to ``pkg_resources.require()``, so that an appropriate error message
-can be displayed if the needed package(s) are missing. (Of course, the
-invoking app or framework can ignore such errors if it wants to make an entry
-point optional if a requirement isn't installed.)
-
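-On the consuming side, the hypothetical blogging tool could look parsers up
-through the ``pkg_resources`` entry point API; a minimal sketch::
-
-    import pkg_resources
-
-    def get_parser(extension):
-        # scan installed distributions for matching "blogtool.parsers" entries
-        for ep in pkg_resources.iter_entry_points("blogtool.parsers", extension):
-            return ep.load()   # import and return the registered object
-        raise LookupError("no parser registered for %r" % extension)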
-
-Defining Additional Metadata
-----------------------------
-
-Some extensible applications and frameworks may need to define their own kinds
-of metadata to include in eggs, which they can then access using the
-``pkg_resources`` metadata APIs. Ordinarily, this is done by having plugin
-developers include additional files in their ``ProjectName.egg-info``
-directory. However, since it can be tedious to create such files by hand, you
-may want to create a distutils extension that will create the necessary files
-from arguments to ``setup()``, in much the same way that ``setuptools`` does
-for many of the ``setup()`` arguments it adds. See the section below on
-`Creating distutils Extensions`_ for more details, especially the subsection on
-`Adding new EGG-INFO Files`_.
-
-
-"Development Mode"
-==================
-
-Under normal circumstances, the ``distutils`` assume that you are going to
-build a distribution of your project, not use it in its "raw" or "unbuilt"
-form. If you were to use the ``distutils`` that way, you would have to rebuild
-and reinstall your project every time you made a change to it during
-development.
-
-Another problem that sometimes comes up with the ``distutils`` is that you may
-need to do development on two related projects at the same time. You may need
-to put both projects' packages in the same directory to run them, but need to
-keep them separate for revision control purposes. How can you do this?
-
-Setuptools allows you to deploy your projects for use in a common directory or
-staging area, but without copying any files. Thus, you can edit each project's
-code in its checkout directory, and only need to run build commands when you
-change a project's C extensions or similarly compiled files. You can even
-deploy a project into another project's checkout directory, if that's your
-preferred way of working (as opposed to using a common independent staging area
-or the site-packages directory).
-
-To do this, use the ``setup.py develop`` command. It works very similarly to
-``setup.py install``, except that it doesn't actually install anything.
-Instead, it creates a special ``.egg-link`` file in the deployment directory,
-that links to your project's source code. And, if your deployment directory is
-Python's ``site-packages`` directory, it will also update the
-``easy-install.pth`` file to include your project's source code, thereby making
-it available on ``sys.path`` for all programs using that Python installation.
-
-If you have enabled the ``use_2to3`` flag, then of course the ``.egg-link``
-will not link directly to your source code when run under Python 3, since
-that source code would be made for Python 2 and not work under Python 3.
-Instead the ``setup.py develop`` will build Python 3 code under the ``build``
-directory, and link there. This means that after doing code changes you will
-have to run ``setup.py build`` before these changes are picked up by your
-Python 3 installation.
-
-In addition, the ``develop`` command creates wrapper scripts in the target
-script directory that will run your in-development scripts after ensuring that
-all your ``install_requires`` packages are available on ``sys.path``.
-
-You can deploy the same project to multiple staging areas, e.g. if you have
-multiple projects on the same machine that all depend on the project you're
-doing development work on.
-
-When you're done with a given development task, you can remove the project
-source from a staging area using ``setup.py develop --uninstall``, specifying
-the desired staging area if it's not the default.
-
-There are several options to control the precise behavior of the ``develop``
-command; see the section on the `develop`_ command below for more details.
-
-Note that you can also apply setuptools commands to non-setuptools projects,
-using commands like this::
-
-    python -c "import setuptools; exec(compile(open('setup.py').read(), 'setup.py', 'exec'))" develop
-
-That is, you can simply list the normal setup commands and options following
-the quoted part.
-
-
-Distributing a ``setuptools``-based project
-===========================================
-
-Detailed instructions to distribute a setuptools project can be found at
-`Packaging project tutorials`_.
-
-.. _Packaging project tutorials: https://packaging.python.org/tutorials/packaging-projects/#generating-distribution-archives
-
-Before you begin, make sure you have the latest versions of setuptools and wheel::
-
- pip install --upgrade setuptools wheel
-
-To build a setuptools project, run this command from the same directory where
-setup.py is located::
-
- setup.py sdist bdist_wheel
-
-This will generate distribution archives in the ``dist`` directory.
-
-Before you upload the generated archives make sure you're registered on
-https://test.pypi.org/account/register/. You will also need to verify your email
-to be able to upload any packages.
-You should install twine to be able to upload packages::
-
- pip install --upgrade twine
-
-Now, to upload these archives, run::
-
- twine upload --repository-url https://test.pypi.org/legacy/ dist/*
-
-To install your newly uploaded package ``example_pkg``, you can use pip::
-
- pip install --index-url https://test.pypi.org/simple/ example_pkg
-
-If you have issues at any point, please refer to `Packaging project tutorials`_
-for clarification.
-
-Setting the ``zip_safe`` flag
------------------------------
-
-For some use cases (such as bundling as part of a larger application), Python
-packages may be run directly from a zip file.
-Not all packages, however, are capable of running in compressed form, because
-they may expect to be able to access either source code or data files as
-normal operating system files. So, ``setuptools`` can install your project
-as a zipfile or a directory, and its default choice is determined by the
-project's ``zip_safe`` flag.
-
-You can pass a True or False value for the ``zip_safe`` argument to the
-``setup()`` function, or you can omit it. If you omit it, the ``bdist_egg``
-command will analyze your project's contents to see if it can detect any
-conditions that would prevent it from working in a zipfile. It will output
-notices to the console about any such conditions that it finds.
-
-Currently, this analysis is extremely conservative: it will consider the
-project unsafe if it contains any C extensions or datafiles whatsoever. This
-does *not* mean that the project can't or won't work as a zipfile! It just
-means that the ``bdist_egg`` authors aren't yet comfortable asserting that
-the project *will* work. If the project contains no C or data files, and does
-no ``__file__`` or ``__path__`` introspection or source code manipulation, then
-there is an extremely solid chance the project will work when installed as a
-zipfile. (And if the project uses ``pkg_resources`` for all its data file
-access, then C extensions and other data files shouldn't be a problem at all.
-See the `Accessing Data Files at Runtime`_ section above for more information.)
-
-However, if ``bdist_egg`` can't be *sure* that your package will work, but
-you've checked over all the warnings it issued, and you are either satisfied it
-*will* work (or if you want to try it for yourself), then you should set
-``zip_safe`` to ``True`` in your ``setup()`` call. If it turns out that it
-doesn't work, you can always change it to ``False``, which will force
-``setuptools`` to install your project as a directory rather than as a zipfile.
-
-In the future, as we gain more experience with different packages and become
-more satisfied with the robustness of the ``pkg_resources`` runtime, the
-"zip safety" analysis may become less conservative. However, we strongly
-recommend that you determine for yourself whether your project functions
-correctly when installed as a zipfile, correct any problems if you can, and
-then make an explicit declaration of ``True`` or ``False`` for the ``zip_safe``
-flag, so that it will not be necessary for ``bdist_egg`` to try to guess
-whether your project can work as a zipfile.
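-
-For example, a project that has verified it cannot safely run from a zip file
-might declare the flag explicitly (a minimal, illustrative sketch)::
-
-    setup(
-        name="Project",
-        ...
-        zip_safe=False,   # always install unzipped, as a directory
-    )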
-
-.. _Namespace Packages:
-
-Namespace Packages
-------------------
-
-Sometimes, a large package is more useful if distributed as a collection of
-smaller eggs. However, Python does not normally allow the contents of a
-package to be retrieved from more than one location. "Namespace packages"
-are a solution for this problem. When you declare a package to be a namespace
-package, it means that the package has no meaningful contents in its
-``__init__.py``, and that it is merely a container for modules and subpackages.
-The ``pkg_resources`` runtime will then automatically ensure that the contents
-of namespace packages that are spread over multiple eggs or directories are
-combined into a single "virtual" package.
-
-The ``namespace_packages`` argument to ``setup()`` lets you declare your
-project's namespace packages, so that they will be included in your project's
-metadata. The argument should list the namespace packages that the egg
-participates in. For example, the ZopeInterface project might do this::
-
-    setup(
-        # ...
-        namespace_packages=["zope"]
-    )
-
-because it contains a ``zope.interface`` package that lives in the ``zope``
-namespace package. Similarly, a project for a standalone ``zope.publisher``
-would also declare the ``zope`` namespace package. When these projects are
-installed and used, Python will see them both as part of a "virtual" ``zope``
-package, even though they will be installed in different locations.
-
-Namespace packages don't have to be top-level packages. For example, Zope 3's
-``zope.app`` package is a namespace package, and in the future PEAK's
-``peak.util`` package will be too.
-
-Note, by the way, that your project's source tree must include the namespace
-packages' ``__init__.py`` files (and the ``__init__.py`` of any parent
-packages), in a normal Python package layout. These ``__init__.py`` files
-*must* contain the line::
-
- __import__("pkg_resources").declare_namespace(__name__)
-
-This code ensures that the namespace package machinery is operating and that
-the current package is registered as a namespace package.
-
-You must NOT include any other code and data in a namespace package's
-``__init__.py``. Even though it may appear to work during development, or when
-projects are installed as ``.egg`` files, it will not work when the projects
-are installed using "system" packaging tools -- in such cases the
-``__init__.py`` files will not be installed, let alone executed.
-
-You must include the ``declare_namespace()`` line in the ``__init__.py`` of
-*every* project that has contents for the namespace package in question, in
-order to ensure that the namespace will be declared regardless of which
-project's copy of ``__init__.py`` is loaded first. If the first loaded
-``__init__.py`` doesn't declare it, it will never *be* declared, because no
-other copies will ever be loaded!
+
+
+
+
TRANSITIONAL NOTE
@@ -1033,164 +109,14 @@ Our apologies for the inconvenience, and thank you for your patience.
-Tagging and "Daily Build" or "Snapshot" Releases
-------------------------------------------------
-
-When a set of related projects are under development, it may be important to
-track finer-grained version increments than you would normally use for e.g.
-"stable" releases. While stable releases might be measured in dotted numbers
-with alpha/beta/etc. status codes, development versions of a project often
-need to be tracked by revision or build number or even build date. This is
-especially true when projects in development need to refer to one another, and
-therefore may literally need an up-to-the-minute version of something!
-
-To support these scenarios, ``setuptools`` allows you to "tag" your source and
-egg distributions by adding one or more of the following to the project's
-"official" version identifier:
-
-* A manually-specified pre-release tag, such as "build" or "dev", or a
- manually-specified post-release tag, such as a build or revision number
- (``--tag-build=STRING, -bSTRING``)
-
-* An 8-character representation of the build date (``--tag-date, -d``), as
- a postrelease tag
-
-You can add these tags by adding ``egg_info`` and the desired options to
-the command line ahead of the ``sdist`` or ``bdist`` commands that you want
-to generate a daily build or snapshot for. See the section below on the
-`egg_info`_ command for more details.
-
-(Also, before you release your project, be sure to see the section above on
-`Specifying Your Project's Version`_ for more information about how pre- and
-post-release tags affect how version numbers are interpreted. This is
-important in order to make sure that dependency processing tools will know
-which versions of your project are newer than others.)
-
-Finally, if you are creating builds frequently, and either building them in a
-downloadable location or are copying them to a distribution server, you should
-probably also check out the `rotate`_ command, which lets you automatically
-delete all but the N most-recently-modified distributions matching a glob
-pattern. So, you can use a command line like::
-
- setup.py egg_info -rbDEV bdist_egg rotate -m.egg -k3
-
-to build an egg whose version info includes "DEV-rNNNN" (where NNNN is the
-most recent Subversion revision that affected the source tree), and then
-delete any egg files from the distribution directory except for the three
-that were built most recently.
-
-If you have to manage automated builds for multiple packages, each with
-different tagging and rotation policies, you may also want to check out the
-`alias`_ command, which would let each package define an alias like ``daily``
-that would perform the necessary tag, build, and rotate commands. Then, a
-simpler script or cron job could just run ``setup.py daily`` in each project
-directory. (And, you could also define sitewide or per-user default versions
-of the ``daily`` alias, so that projects that didn't define their own would
-use the appropriate defaults.)
-
-
-Generating Source Distributions
--------------------------------
-
-``setuptools`` enhances the distutils' default algorithm for source file
-selection with pluggable endpoints for looking up files to include. If you are
-using a revision control system, and your source distributions only need to
-include files that you're tracking in revision control, use a corresponding
-plugin instead of writing a ``MANIFEST.in`` file. See the section below on
-`Adding Support for Revision Control Systems`_ for information on plugins.
-
-If you need to include automatically generated files, or files that are kept in
-an unsupported revision control system, you'll need to create a ``MANIFEST.in``
-file to specify any files that the default file location algorithm doesn't
-catch. See the distutils documentation for more information on the format of
-the ``MANIFEST.in`` file.
-
-But, be sure to ignore any part of the distutils documentation that deals with
-``MANIFEST`` or how it's generated from ``MANIFEST.in``; setuptools shields you
-from these issues and doesn't work the same way in any case. Unlike the
-distutils, setuptools regenerates the source distribution manifest file
-every time you build a source distribution, and it builds it inside the
-project's ``.egg-info`` directory, out of the way of your main project
-directory. You therefore need not worry about whether it is up-to-date or not.
-
-Indeed, because setuptools' approach to determining the contents of a source
-distribution is so much simpler, its ``sdist`` command omits nearly all of
-the options that the distutils' more complex ``sdist`` process requires. For
-all practical purposes, you'll probably use only the ``--formats`` option, if
-you use any option at all.
-
-Making "Official" (Non-Snapshot) Releases
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-When you make an official release, creating source or binary distributions,
-you will need to override the tag settings from ``setup.cfg``, so that you
-don't end up registering versions like ``foobar-0.7a1.dev-r34832``. This is
-easy to do if you are developing on the trunk and using tags or branches for
-your releases - just make the change to ``setup.cfg`` after branching or
-tagging the release, so the trunk will still produce development snapshots.
-
-Alternately, if you are not branching for releases, you can override the
-default version options on the command line, using something like::
-
-    setup.py egg_info -Db "" sdist bdist_egg
-
-The first part of this command (``egg_info -Db ""``) will override the
-configured tag information, before creating source and binary eggs. Thus, these
-commands will use the plain version from your ``setup.py``, without adding the
-build designation string.
-
-Of course, if you will be doing this a lot, you may wish to create a personal
-alias for this operation, e.g.::
-
- setup.py alias -u release egg_info -Db ""
-
-You can then use it like this::
-
- setup.py release sdist bdist_egg
-
-Or of course you can create more elaborate aliases that do all of the above.
-See the sections below on the `egg_info`_ and `alias`_ commands for more ideas.
-
-
-
-Distributing Extensions compiled with Cython
---------------------------------------------
-
-``setuptools`` will detect at build time whether Cython is installed or not.
-If Cython is not found ``setuptools`` will ignore pyx files.
-
-To ensure Cython is available, include Cython in the ``requires`` list of the
-``[build-system]`` table in your ``pyproject.toml``::
-
-    [build-system]
-    requires = [..., "cython"]
-
-When building with pip 10 or later, that declaration is sufficient to include
-Cython in the build. For broader compatibility, also declare the dependency
-under ``setup_requires`` in your ``setup.cfg``::
-
-    [options]
-    setup_requires =
-        ...
-        cython
-
-As long as Cython is present in the build environment, ``setuptools`` includes
-transparent support for building Cython extensions, provided the extensions
-are defined using ``setuptools.Extension``.
-
-If you follow these rules, you can safely list ``.pyx`` files as the sources
-of your ``Extension`` objects in the setup script; when Cython is available,
-``setuptools`` will use it to compile them.
-
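-For instance, a minimal sketch (the module and file names are only
-illustrative)::
-
-    from setuptools import setup, Extension
-
-    setup(
-        name="Project",
-        version="0.1",
-        ext_modules=[Extension("mypkg.fast", sources=["mypkg/fast.pyx"])],
-    )
-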
-Of course, for this to work, your source distributions must include the C
-code generated by Cython, as well as your original ``.pyx`` files. This means
-that you will probably want to include current ``.c`` files in your revision
-control system, rebuilding them whenever you check changes in for the ``.pyx``
-source files. This will ensure that people tracking your project in a revision
-control system will be able to build it even if they don't have Cython
-installed, and that your source releases will be similarly usable with or
-without Cython.
@@ -1466,247 +392,13 @@ by directives, such as ``attr:`` and others, will be silently ignored.
As a consequence, the resulting dictionary will include no such options.
---------------------------------
-Extending and Reusing Setuptools
---------------------------------
-
-Creating ``distutils`` Extensions
-=================================
-
-It can be hard to add new commands or setup arguments to the distutils. But
-the ``setuptools`` package makes it a bit easier, by allowing you to distribute
-a distutils extension as a separate project, and then have projects that need
-the extension just refer to it in their ``setup_requires`` argument.
-
-With ``setuptools``, your distutils extension projects can hook in new
-commands and ``setup()`` arguments just by defining "entry points". These
-are mappings from command or argument names to a specification of where to
-import a handler from. (See the section on `Dynamic Discovery of Services and
-Plugins`_ above for some more background on entry points.)
-
-
-Adding Commands
----------------
-
-You can add new ``setup`` commands by defining entry points in the
-``distutils.commands`` group. For example, if you wanted to add a ``foo``
-command, you might add something like this to your distutils extension
-project's setup script::
-
-    setup(
-        # ...
-        entry_points={
-            "distutils.commands": [
-                "foo = mypackage.some_module:foo",
-            ],
-        },
-    )
-
-(Assuming, of course, that the ``foo`` class in ``mypackage.some_module`` is
-a ``setuptools.Command`` subclass.)
-
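-Such a command class might look roughly like this (a minimal sketch; the body
-is purely illustrative)::
-
-    from setuptools import Command
-
-    class foo(Command):
-        description = "run the hypothetical foo command"
-        user_options = []   # no command-line options in this sketch
-
-        def initialize_options(self):
-            pass
-
-        def finalize_options(self):
-            pass
-
-        def run(self):
-            print("running foo")
-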
-Once a project containing such entry points has been activated on ``sys.path``,
-(e.g. by running "install" or "develop" with a site-packages installation
-directory) the command(s) will be available to any ``setuptools``-based setup
-scripts. It is not necessary to use the ``--command-packages`` option or
-to monkeypatch the ``distutils.command`` package to install your commands;
-``setuptools`` automatically adds a wrapper to the distutils to search for
-entry points in the active distributions on ``sys.path``. In fact, this is
-how setuptools' own commands are installed: the setuptools project's setup
-script defines entry points for them!
-
-
-Adding ``setup()`` Arguments
-----------------------------
-
-.. warning:: Adding arguments to setup is discouraged as such arguments
- are only supported through imperative execution and not supported through
- declarative config.
-
-Sometimes, your commands may need additional arguments to the ``setup()``
-call. You can enable this by defining entry points in the
-``distutils.setup_keywords`` group. For example, if you wanted a ``setup()``
-argument called ``bar_baz``, you might add something like this to your
-distutils extension project's setup script::
-
-    setup(
-        # ...
-        entry_points={
-            "distutils.commands": [
-                "foo = mypackage.some_module:foo",
-            ],
-            "distutils.setup_keywords": [
-                "bar_baz = mypackage.some_module:validate_bar_baz",
-            ],
-        },
-    )
-
-The idea here is that the entry point defines a function that will be called
-to validate the ``setup()`` argument, if it's supplied. The ``Distribution``
-object will have the initial value of the attribute set to ``None``, and the
-validation function will only be called if the ``setup()`` call sets it to
-a non-None value. Here's an example validation function::
-
-    def assert_bool(dist, attr, value):
-        """Verify that value is True, False, 0, or 1"""
-        if bool(value) != value:
-            raise DistutilsSetupError(
-                "%r must be a boolean value (got %r)" % (attr, value)
-            )
-
-Your function should accept three arguments: the ``Distribution`` object,
-the attribute name, and the attribute value. It should raise a
-``DistutilsSetupError`` (from the ``distutils.errors`` module) if the argument
-is invalid. Remember, your function will only be called with non-None values,
-and the default value of arguments defined this way is always None. So, your
-commands should always be prepared for the possibility that the attribute will
-be ``None`` when they access it later.
-
-If more than one active distribution defines an entry point for the same
-``setup()`` argument, *all* of them will be called. This allows multiple
-distutils extensions to define a common argument, as long as they agree on
-what values of that argument are valid.
-
-Also note that as with commands, it is not necessary to subclass or monkeypatch
-the distutils ``Distribution`` class in order to add your arguments; it is
-sufficient to define the entry points in your extension, as long as any setup
-script using your extension lists your project in its ``setup_requires``
-argument.
-
-
-Customizing Distribution Options
---------------------------------
-
-Plugins may wish to extend or alter the options on a Distribution object to
-suit the purposes of that project. For example, a tool that infers the
-``Distribution.version`` from SCM-metadata may need to hook into the
-option finalization. To enable this feature, Setuptools offers an entry
-point "setuptools.finalize_distribution_options". That entry point must
-be a callable taking one argument (the Distribution instance).
-
-If the callable has an ``.order`` property, that value will be used to
-determine the order in which the hook is called. Lower numbers are called
-first and the default is zero (0).
-
-Plugins may read, alter, and set properties on the distribution, but each
-plugin is encouraged to load the configuration/settings for their behavior
-independently.
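-
-A rough sketch of such a plugin (module and hook names here are only
-illustrative)::
-
-    # in the plugin project's setup.py
-    setup(
-        # ...
-        entry_points={
-            "setuptools.finalize_distribution_options": [
-                "infer_version = myplugin:infer_version",
-            ],
-        },
-    )
-
-    # in myplugin.py
-    def infer_version(dist):
-        # inspect the SCM metadata and set the version on the Distribution
-        dist.metadata.version = dist.metadata.version or "0.0.0"
-
-    infer_version.order = -10   # run before hooks with the default order of 0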
-
-
-Adding new EGG-INFO Files
--------------------------
-
-Some extensible applications or frameworks may want to allow third parties to
-develop plugins with application or framework-specific metadata included in
-the plugins' EGG-INFO directory, for easy access via the ``pkg_resources``
-metadata API. The easiest way to allow this is to create a distutils extension
-to be used from the plugin projects' setup scripts (via ``setup_requires``)
-that defines a new setup keyword, and then uses that data to write an EGG-INFO
-file when the ``egg_info`` command is run.
-
-The ``egg_info`` command looks for extension points in an ``egg_info.writers``
-group, and calls them to write the files. Here's a simple example of a
-distutils extension defining a setup argument ``foo_bar``, which is a list of
-lines that will be written to ``foo_bar.txt`` in the EGG-INFO directory of any
-project that uses the argument::
-
- setup(
- # ...
- entry_points={
- "distutils.setup_keywords": [
- "foo_bar = setuptools.dist:assert_string_list",
- ],
- "egg_info.writers": [
- "foo_bar.txt = setuptools.command.egg_info:write_arg",
- ],
- },
- )
-
-This simple example makes use of two utility functions defined by setuptools
-for its own use: a routine to validate that a setup keyword is a sequence of
-strings, and another one that looks up a setup argument and writes it to
-a file. Here's what the writer utility looks like::
-
- def write_arg(cmd, basename, filename):
- argname = os.path.splitext(basename)[0]
- value = getattr(cmd.distribution, argname, None)
- if value is not None:
- value = "\n".join(value) + "\n"
- cmd.write_or_delete_file(argname, filename, value)
-
-As you can see, ``egg_info.writers`` entry points must be a function taking
-three arguments: a ``egg_info`` command instance, the basename of the file to
-write (e.g. ``foo_bar.txt``), and the actual full filename that should be
-written to.
-
-In general, writer functions should honor the command object's ``dry_run``
-setting when writing files, and use the ``distutils.log`` object to do any
-console output. The easiest way to conform to this requirement is to use
-the ``cmd`` object's ``write_file()``, ``delete_file()``, and
-``write_or_delete_file()`` methods exclusively for your file operations. See
-those methods' docstrings for more details.
-
-
-Adding Support for Revision Control Systems
--------------------------------------------------
-
-If the files you want to include in the source distribution are tracked using
-Git, Mercurial or SVN, you can use the following packages to achieve that:
-
-- Git and Mercurial: `setuptools_scm <https://pypi.org/project/setuptools_scm/>`_
-- SVN: `setuptools_svn <https://pypi.org/project/setuptools_svn/>`_
-
-If you would like to create a plugin for ``setuptools`` to find files tracked
-by another revision control system, you can do so by adding an entry point to
-the ``setuptools.file_finders`` group. The entry point should be a function
-accepting a single directory name, and should yield all the filenames within
-that directory (and any subdirectories thereof) that are under revision
-control.
-
-For example, if you were going to create a plugin for a revision control system
-called "foobar", you would write a function something like this:
-
-.. code-block:: python
-
-    def find_files_for_foobar(dirname):
-        # loop to yield paths that start with `dirname`
-
-And you would register it in a setup script using something like this::
-
- entry_points={
- "setuptools.file_finders": [
- "foobar = my_foobar_module:find_files_for_foobar",
- ]
- }
-
-Then, anyone who wants to use your plugin can simply install it, and their
-local setuptools installation will be able to find the necessary files.
-
-It is not necessary to distribute source control plugins with projects that
-simply use the other source control system, or to specify the plugins in
-``setup_requires``. When you create a source distribution with the ``sdist``
-command, setuptools automatically records what files were found in the
-``SOURCES.txt`` file. That way, recipients of source distributions don't need
-to have revision control at all. However, if someone is working on a package
-by checking out with that system, they will need the same plugin(s) that the
-original author is using.
-
-A few important points for writing revision control file finders:
-
-* Your finder function MUST return relative paths, created by appending to the
- passed-in directory name. Absolute paths are NOT allowed, nor are relative
- paths that reference a parent directory of the passed-in directory.
-
-* Your finder function MUST accept an empty string as the directory name,
- meaning the current directory. You MUST NOT convert this to a dot; just
- yield relative paths. So, yielding a subdirectory named ``some/dir`` under
- the current directory should NOT be rendered as ``./some/dir`` or
- ``/somewhere/some/dir``, but *always* as simply ``some/dir``
-
-* Your finder function SHOULD NOT raise any errors, and SHOULD deal gracefully
- with the absence of needed programs (i.e., ones belonging to the revision
-  control system itself). It *may*, however, use ``distutils.log.warn()`` to
- inform the user of the missing program(s).
+
+
+
+
+
Mailing List and Bug Tracker
diff --git a/docs/userguide/functionalities.txt b/docs/userguide/functionalities.txt
new file mode 100644
index 00000000..a310da97
--- /dev/null
+++ b/docs/userguide/functionalities.txt
@@ -0,0 +1,1338 @@
+Using ``find_packages()``
+-------------------------
+
+For simple projects, it's usually easy enough to manually add packages to
+the ``packages`` argument of ``setup()``. However, for very large projects
+(Twisted, PEAK, Zope, Chandler, etc.), it can be a big burden to keep the
+package list updated. That's what ``setuptools.find_packages()`` is for.
+
+``find_packages()`` takes a source directory and two lists of package name
+patterns to exclude and include. If omitted, the source directory defaults to
+the same
+directory as the setup script. Some projects use a ``src`` or ``lib``
+directory as the root of their source tree, and those projects would of course
+use ``"src"`` or ``"lib"`` as the first argument to ``find_packages()``. (And
+such projects also need something like ``package_dir={"": "src"}`` in their
+``setup()`` arguments, but that's just a normal distutils thing.)
+
+Anyway, ``find_packages()`` walks the target directory, filtering by inclusion
+patterns, and finds Python packages (any directory). Packages are only
+recognized if they include an ``__init__.py`` file. Finally, exclusion
+patterns are applied to remove matching packages.
+
+Inclusion and exclusion patterns are package names, optionally including
+wildcards. For
+example, ``find_packages(exclude=["*.tests"])`` will exclude all packages whose
+last name part is ``tests``. Or, ``find_packages(exclude=["*.tests",
+"*.tests.*"])`` will also exclude any subpackages of packages named ``tests``,
+but it still won't exclude a top-level ``tests`` package or the children
+thereof. In fact, if you really want no ``tests`` packages at all, you'll need
+something like this::
+
+ find_packages(exclude=["*.tests", "*.tests.*", "tests.*", "tests"])
+
+in order to cover all the bases. Really, the exclusion patterns are intended
+to cover simpler use cases than this, like excluding a single, specified
+package and its subpackages.
+
+Regardless of the parameters, the ``find_packages()``
+function returns a list of package names suitable for use as the ``packages``
+argument to ``setup()``, and so is usually the easiest way to set that
+argument in your setup script. Especially since it frees you from having to
+remember to modify your setup script whenever your project grows additional
+top-level packages or subpackages.
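+
+As a minimal sketch (the project name and ``src`` layout here are only
+illustrative), a setup script using ``find_packages()`` might look like this::
+
+    from setuptools import setup, find_packages
+
+    setup(
+        name="MyProject",
+        version="0.1",
+        package_dir={"": "src"},
+        packages=find_packages(where="src", exclude=["tests", "tests.*"]),
+    )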
+
+``find_namespace_packages()``
+-----------------------------
+In Python 3.3+, ``setuptools`` also provides the ``find_namespace_packages`` variant
+of ``find_packages``, which has the same function signature as
+``find_packages``, but works with `PEP 420`_ compliant implicit namespace
+packages. Here is a minimal setup script using ``find_namespace_packages``::
+
+ from setuptools import setup, find_namespace_packages
+ setup(
+ name="HelloWorld",
+ version="0.1",
+ packages=find_namespace_packages(),
+ )
+
+
+Keep in mind that the definition of an implicit namespace package under PEP
+420 is quite lenient, so you may have to either re-organize your codebase a
+bit or define a few exclusions. For example, for a project organized like so::
+
+
+ ├── namespace
+ │   └── mypackage
+ │   ├── __init__.py
+ │   └── mod1.py
+ ├── setup.py
+ └── tests
+ └── test_mod1.py
+
+A naive ``find_namespace_packages()`` would install both ``namespace.mypackage`` and a
+top-level package called ``tests``! One way to avoid this problem is to use the
+``include`` keyword to whitelist the packages to include, like so::
+
+ from setuptools import setup, find_namespace_packages
+
+ setup(
+ name="namespace.mypackage",
+ version="0.1",
+ packages=find_namespace_packages(include=["namespace.*"])
+ )
+
+Another option is to use the "src" layout, where all package code is placed in
+the ``src`` directory, like so::
+
+
+ ├── setup.py
+ ├── src
+ │   └── namespace
+ │   └── mypackage
+ │   ├── __init__.py
+ │   └── mod1.py
+ └── tests
+ └── test_mod1.py
+
+With this layout, the package directory is specified as ``src``, as such::
+
+ setup(name="namespace.mypackage",
+ version="0.1",
+ package_dir={"": "src"},
+ packages=find_namespace_packages(where="src"))
+
+.. _PEP 420: https://www.python.org/dev/peps/pep-0420/
+
+Automatic Script Creation
+=========================
+
+Packaging and installing scripts can be a bit awkward with the distutils. For
+one thing, there's no easy way to have a script's filename match local
+conventions on both Windows and POSIX platforms. For another, you often have
+to create a separate file just for the "main" script, when your actual "main"
+is a function in a module somewhere. And even in Python 2.4, using the ``-m``
+option only works for actual ``.py`` files that aren't installed in a package.
+
+``setuptools`` fixes all of these problems by automatically generating scripts
+for you with the correct extension, and on Windows it will even create an
+``.exe`` file so that users don't have to change their ``PATHEXT`` settings.
+The way to use this feature is to define "entry points" in your setup script
+that indicate what function the generated script should import and run. For
+example, to create two console scripts called ``foo`` and ``bar``, and a GUI
+script called ``baz``, you might do something like this::
+
+ setup(
+ # other arguments here...
+ entry_points={
+ "console_scripts": [
+ "foo = my_package.some_module:main_func",
+ "bar = other_module:some_func",
+ ],
+ "gui_scripts": [
+ "baz = my_package_gui:start_func",
+ ]
+ }
+ )
+
+When this project is installed on non-Windows platforms (using "setup.py
+install", "setup.py develop", or with pip), a set of ``foo``, ``bar``,
+and ``baz`` scripts will be installed that import ``main_func`` and
+``some_func`` from the specified modules. The functions you specify are
+called with no arguments, and their return value is passed to
+``sys.exit()``, so you can return an errorlevel or message to print to
+stderr.
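+
+For example, a ``main_func`` suitable for the ``foo`` script above (the body
+shown here is only an illustrative sketch) could look like this in
+``my_package/some_module.py``::
+
+    def main_func():
+        # Do the actual work of the script here.
+        print("Hello from foo")
+        # The generated wrapper passes this return value to sys.exit(),
+        # so return 0 (or None) for success, or a message / error code
+        # to signal failure.
+        return 0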
+
+On Windows, a set of ``foo.exe``, ``bar.exe``, and ``baz.exe`` launchers are
+created, alongside a set of ``foo.py``, ``bar.py``, and ``baz.pyw`` files. The
+``.exe`` wrappers find and execute the right version of Python to run the
+``.py`` or ``.pyw`` file.
+
+You may define as many "console script" and "gui script" entry points as you
+like, and each one can optionally specify "extras" that it depends on, that
+will be added to ``sys.path`` when the script is run. For more information on
+"extras", see the section below on `Declaring Extras`_. For more information
+on "entry points" in general, see the section below on `Dynamic Discovery of
+Services and Plugins`_.
+
+"Eggsecutable" Scripts
+----------------------
+
+.. deprecated:: 45.3.0
+
+Occasionally, there are situations where it's desirable to make an ``.egg``
+file directly executable. You can do this by including an entry point such
+as the following::
+
+ setup(
+ # other arguments here...
+ entry_points={
+ "setuptools.installation": [
+ "eggsecutable = my_package.some_module:main_func",
+ ]
+ }
+ )
+
+Any eggs built from the above setup script will include a short executable
+prelude that imports and calls ``main_func()`` from ``my_package.some_module``.
+The prelude can be run on Unix-like platforms (including Mac and Linux) by
+invoking the egg with ``/bin/sh``, or by enabling execute permissions on the
+``.egg`` file. For the executable prelude to run, the appropriate version of
+Python must be available via the ``PATH`` environment variable, under its
+"long" name. That is, if the egg is built for Python 2.3, there must be a
+``python2.3`` executable present in a directory on ``PATH``.
+
+IMPORTANT NOTE: Eggs with an "eggsecutable" header cannot be renamed, or
+invoked via symlinks. They *must* be invoked using their original filename, in
+order to ensure that, once running, ``pkg_resources`` will know what project
+and version is in use. The header script will check this and exit with an
+error if the ``.egg`` file has been renamed or is invoked via a symlink that
+changes its base name.
+
+Declaring Dependencies
+======================
+
+``setuptools`` supports automatically installing dependencies when a package is
+installed, and including information about dependencies in Python Eggs (so that
+package management tools like pip can use the information).
+
+``setuptools`` and ``pkg_resources`` use a common syntax for specifying a
+project's required dependencies. This syntax consists of a project's PyPI
+name, optionally followed by a comma-separated list of "extras" in square
+brackets, optionally followed by a comma-separated list of version
+specifiers. A version specifier is one of the operators ``<``, ``>``, ``<=``,
+``>=``, ``==`` or ``!=``, followed by a version identifier. Tokens may be
+separated by whitespace, but any whitespace or nonstandard characters within a
+project name or version identifier must be replaced with ``-``.
+
+Version specifiers for a given project are internally sorted into ascending
+version order, and used to establish what ranges of versions are acceptable.
+Adjacent redundant conditions are also consolidated (e.g. ``">1, >2"`` becomes
+``">2"``, and ``"<2,<3"`` becomes ``"<2"``). ``"!="`` versions are excised from
+the ranges they fall within. A project's version is then checked for
+membership in the resulting ranges. (Note that providing conflicting conditions
+for the same version (e.g. "<2,>=2" or "==2,!=2") is meaningless and may
+therefore produce bizarre results.)
+
+Here are some example requirement specifiers::
+
+ docutils >= 0.3
+
+ # comment lines and \ continuations are allowed in requirement strings
+ BazSpam ==1.1, ==1.2, ==1.3, ==1.4, ==1.5, \
+ ==1.6, ==1.7 # and so are line-end comments
+
+ PEAK[FastCGI, reST]>=0.5a4
+
+ setuptools==0.5a7
+
+The simplest way to include requirement specifiers is to use the
+``install_requires`` argument to ``setup()``. It takes a string or list of
+strings containing requirement specifiers. If you include more than one
+requirement in a string, each requirement must begin on a new line.
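+
+For example (the project name and the pinned versions below are only
+illustrative)::
+
+    setup(
+        name="Project",
+        ...
+        install_requires=[
+            "docutils>=0.3",
+            "BazSpam==1.1",
+        ],
+    )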
+
+This has three effects:
+
+1. When your project is installed, either by using pip, ``setup.py install``,
+ or ``setup.py develop``, all of the dependencies not already installed will
+ be located (via PyPI), downloaded, built (if necessary), and installed.
+
+2. Any scripts in your project will be installed with wrappers that verify
+ the availability of the specified dependencies at runtime, and ensure that
+ the correct versions are added to ``sys.path`` (e.g. if multiple versions
+ have been installed).
+
+3. Python Egg distributions will include a metadata file listing the
+ dependencies.
+
+Note, by the way, that if you declare your dependencies in ``setup.py``, you do
+*not* need to use the ``require()`` function in your scripts or modules, as
+long as you either install the project or use ``setup.py develop`` to do
+development work on it. (See `"Development Mode"`_ below for more details on
+using ``setup.py develop``.)
+
+Dependencies that aren't in PyPI
+--------------------------------
+
+.. warning::
+ Dependency links support has been dropped by pip starting with version
+ 19.0 (released 2019-01-22).
+
+If your project depends on packages that don't exist on PyPI, you may still be
+able to depend on them, as long as they are available for download as:
+
+- an egg, in the standard distutils ``sdist`` format,
+- a single ``.py`` file, or
+- a VCS repository (Subversion, Mercurial, or Git).
+
+You just need to add some URLs to the ``dependency_links`` argument to
+``setup()``.
+
+The URLs must be either:
+
+1. direct download URLs,
+2. the URLs of web pages that contain direct download links, or
+3. the repository's URL
+
+In general, it's better to link to web pages, because it is usually less
+complex to update a web page than to release a new version of your project.
+You can also use a SourceForge ``showfiles.php`` link in the case where a
+package you depend on is distributed via SourceForge.
+
+If you depend on a package that's distributed as a single ``.py`` file, you
+must include an ``"#egg=project-version"`` suffix to the URL, to give a project
+name and version number. (Be sure to escape any dashes in the name or version
+by replacing them with underscores.) EasyInstall will recognize this suffix
+and automatically create a trivial ``setup.py`` to wrap the single ``.py`` file
+as an egg.
+
+In the case of a VCS checkout, you should also append ``#egg=project-version``
+in order to identify for what package that checkout should be used. You can
+append ``@REV`` to the URL's path (before the fragment) to specify a revision.
+Additionally, you can also force the VCS being used by prepending the URL with
+a certain prefix. Currently available are:
+
+- ``svn+URL`` for Subversion,
+- ``git+URL`` for Git, and
+- ``hg+URL`` for Mercurial
+
+A more complete example would be:
+
+ ``vcs+proto://host/path@revision#egg=project-version``
+
+Be careful with the version. It should match the one inside the project files.
+If you want to disregard the version, you have to omit it both in the
+``requires`` and in the URL's fragment.
+
+This will do a checkout (or a clone, in Git and Mercurial parlance) to a
+temporary folder and run ``setup.py bdist_egg``.
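+
+For instance, a dependency link for a Git-hosted project (the host, path,
+revision, and project name below are purely hypothetical) might look like::
+
+    git+https://example.com/someproject.git@v1.0#egg=someproject-1.0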
+
+The ``dependency_links`` option takes the form of a list of URL strings. For
+example, this will cause a search of the specified page for eggs or source
+distributions, if the package's dependencies aren't already installed::
+
+ setup(
+ ...
+ dependency_links=[
+ "http://peak.telecommunity.com/snapshots/"
+ ],
+ )
+
+
+.. _Declaring Extras:
+
+
+Declaring "Extras" (optional features with their own dependencies)
+------------------------------------------------------------------
+
+Sometimes a project has "recommended" dependencies, that are not required for
+all uses of the project. For example, a project might offer optional PDF
+output if ReportLab is installed, and reStructuredText support if docutils is
+installed. These optional features are called "extras", and setuptools allows
+you to define their requirements as well. In this way, other projects that
+require these optional features can force the additional requirements to be
+installed, by naming the desired extras in their ``install_requires``.
+
+For example, let's say that Project A offers optional PDF and reST support::
+
+ setup(
+ name="Project-A",
+ ...
+ extras_require={
+ "PDF": ["ReportLab>=1.2", "RXP"],
+ "reST": ["docutils>=0.3"],
+ }
+ )
+
+As you can see, the ``extras_require`` argument takes a dictionary mapping
+names of "extra" features, to strings or lists of strings describing those
+features' requirements. These requirements will *not* be automatically
+installed unless another package depends on them (directly or indirectly) by
+including the desired "extras" in square brackets after the associated project
+name. (Or if the extras were listed in a requirement spec on the "pip install"
+command line.)
+
+Extras can be used by a project's `entry points`_ to specify dynamic
+dependencies. For example, if Project A includes a "rst2pdf" script, it might
+declare it like this, so that the "PDF" requirements are only resolved if the
+"rst2pdf" script is run::
+
+ setup(
+ name="Project-A",
+ ...
+ entry_points={
+ "console_scripts": [
+ "rst2pdf = project_a.tools.pdfgen [PDF]",
+ "rst2html = project_a.tools.htmlgen",
+ # more script entry points ...
+ ],
+ }
+ )
+
+Projects can also use another project's extras when specifying dependencies.
+For example, if project B needs "project A" with PDF support installed, it
+might declare the dependency like this::
+
+ setup(
+ name="Project-B",
+ install_requires=["Project-A[PDF]"],
+ ...
+ )
+
+This will cause ReportLab to be installed along with project A, if project B is
+installed -- even if project A was already installed. In this way, a project
+can encapsulate groups of optional "downstream dependencies" under a feature
+name, so that packages that depend on it don't have to know what the downstream
+dependencies are. If a later version of Project A builds in PDF support and
+no longer needs ReportLab, or if it ends up needing other dependencies besides
+ReportLab in order to provide PDF support, Project B's setup information does
+not need to change, but the right packages will still be installed if needed.
+
+Note, by the way, that if a project ends up not needing any other packages to
+support a feature, it should keep an empty requirements list for that feature
+in its ``extras_require`` argument, so that packages depending on that feature
+don't break (due to an invalid feature name). For example, if Project A above
+builds in PDF support and no longer needs ReportLab, it could change its
+setup to this::
+
+ setup(
+ name="Project-A",
+ ...
+ extras_require={
+ "PDF": [],
+ "reST": ["docutils>=0.3"],
+ }
+ )
+
+so that Project B doesn't have to remove the ``[PDF]`` from its requirement
+specifier.
+
+.. _Platform Specific Dependencies:
+
+
+Declaring platform specific dependencies
+----------------------------------------
+
+Sometimes a project might require a dependency to run on a specific platform.
+This could be a package that backports a module so that it can be used in
+older Python versions, or a package that is required to run on a specific
+operating system. Declaring such dependencies conditionally allows a project
+to work on multiple platforms without installing dependencies that are not
+required on the platform that is installing the project.
+
+For example, here is a project that uses the ``enum`` module and ``pywin32``::
+
+ setup(
+ name="Project",
+ ...
+ install_requires=[
+ "enum34;python_version<'3.4'",
+ "pywin32 >= 1.0;platform_system=='Windows'"
+ ]
+ )
+
+Since the ``enum`` module was added in Python 3.4, it should only be installed
+if the Python version is earlier than 3.4. Since ``pywin32`` will only be used
+on Windows, it should only be installed when the operating system is Windows.
+Specifying version requirements for the dependencies is supported as normal.
+
+The environmental markers that may be used for testing platform types are
+detailed in `PEP 508`_.
+
+.. _PEP 508: https://www.python.org/dev/peps/pep-0508/
+
+Including Data Files
+====================
+
+The distutils have traditionally allowed installation of "data files", which
+are placed in a platform-specific location. However, the most common use case
+for data files distributed with a package is for use *by* the package, usually
+by including the data files in the package directory.
+
+Setuptools offers three ways to specify data files to be included in your
+packages. First, you can simply use the ``include_package_data`` keyword,
+e.g.::
+
+ from setuptools import setup, find_packages
+ setup(
+ ...
+ include_package_data=True
+ )
+
+This tells setuptools to install any data files it finds in your packages.
+The data files must be specified via the distutils' ``MANIFEST.in`` file.
+(They can also be tracked by a revision control system, using an appropriate
+plugin. See the section below on `Adding Support for Revision Control
+Systems`_ for information on how to write such plugins.)
+
+If you want finer-grained control over what files are included (for example,
+if you have documentation files in your package directories and want to exclude
+them from installation), then you can also use the ``package_data`` keyword,
+e.g.::
+
+ from setuptools import setup, find_packages
+ setup(
+ ...
+ package_data={
+ # If any package contains *.txt or *.rst files, include them:
+ "": ["*.txt", "*.rst"],
+ # And include any *.msg files found in the "hello" package, too:
+ "hello": ["*.msg"],
+ }
+ )
+
+The ``package_data`` argument is a dictionary that maps from package names to
+lists of glob patterns. The globs may include subdirectory names, if the data
+files are contained in a subdirectory of the package. For example, if the
+package tree looks like this::
+
+ setup.py
+ src/
+ mypkg/
+ __init__.py
+ mypkg.txt
+ data/
+ somefile.dat
+ otherdata.dat
+
+The setuptools setup file might look like this::
+
+ from setuptools import setup, find_packages
+ setup(
+ ...
+ packages=find_packages("src"), # include all packages under src
+ package_dir={"": "src"}, # tell distutils packages are under src
+
+ package_data={
+ # If any package contains *.txt files, include them:
+ "": ["*.txt"],
+ # And include any *.dat files found in the "data" subdirectory
+ # of the "mypkg" package, also:
+ "mypkg": ["data/*.dat"],
+ }
+ )
+
+Notice that if you list patterns in ``package_data`` under the empty string,
+these patterns are used to find files in every package, even ones that also
+have their own patterns listed. Thus, in the above example, the ``mypkg.txt``
+file gets included even though it's not listed in the patterns for ``mypkg``.
+
+Also notice that if you use paths, you *must* use a forward slash (``/``) as
+the path separator, even if you are on Windows. Setuptools automatically
+converts slashes to appropriate platform-specific separators at build time.
+
+If datafiles are contained in a subdirectory of a package that isn't a package
+itself (no ``__init__.py``), then the subdirectory names (or ``*``) are required
+in the ``package_data`` argument (as shown above with ``"data/*.dat"``).
+
+When building an ``sdist``, the data files are also drawn from the
+``package_name.egg-info/SOURCES.txt`` file, so make sure that this file is
+removed whenever the ``package_data`` list in ``setup.py`` is updated, before
+``setup.py`` is run again.
+
+(Note: although the ``package_data`` argument was previously only available in
+``setuptools``, it was also added to the Python ``distutils`` package as of
+Python 2.4; there is `some documentation for the feature`__ available on the
+python.org website. If using the setuptools-specific ``include_package_data``
+argument, files specified by ``package_data`` will *not* be automatically
+added to the manifest unless they are listed in the MANIFEST.in file.)
+
+__ https://docs.python.org/3/distutils/setupscript.html#installing-package-data
+
+Sometimes, the ``include_package_data`` or ``package_data`` options alone
+aren't sufficient to precisely define what files you want included. For
+example, you may want to include package README files in your revision control
+system and source distributions, but exclude them from being installed. So,
+setuptools offers an ``exclude_package_data`` option as well, that allows you
+to do things like this::
+
+ from setuptools import setup, find_packages
+ setup(
+ ...
+ packages=find_packages("src"), # include all packages under src
+ package_dir={"": "src"}, # tell distutils packages are under src
+
+ include_package_data=True, # include everything in source control
+
+ # ...but exclude README.txt from all packages
+ exclude_package_data={"": ["README.txt"]},
+ )
+
+The ``exclude_package_data`` option is a dictionary mapping package names to
+lists of wildcard patterns, just like the ``package_data`` option. And, just
+as with that option, a key of ``""`` will apply the given pattern(s) to all
+packages. However, any files that match these patterns will be *excluded*
+from installation, even if they were listed in ``package_data`` or were
+included as a result of using ``include_package_data``.
+
+In summary, the three options allow you to:
+
+``include_package_data``
+ Accept all data files and directories matched by ``MANIFEST.in``.
+
+``package_data``
+ Specify additional patterns to match files that may or may
+ not be matched by ``MANIFEST.in`` or found in source control.
+
+``exclude_package_data``
+ Specify patterns for data files and directories that should *not* be
+ included when a package is installed, even if they would otherwise have
+ been included due to the use of the preceding options.
+
+NOTE: Due to the way the distutils build process works, a data file that you
+include in your project and then stop including may be "orphaned" in your
+project's build directories, requiring you to run ``setup.py clean --all`` to
+fully remove them. This may also be important for your users and contributors
+if they track intermediate revisions of your project using Subversion; be sure
+to let them know when you make changes that remove files from inclusion so they
+can run ``setup.py clean --all``.
+
+Accessing Data Files at Runtime
+-------------------------------
+
+Typically, existing programs manipulate a package's ``__file__`` attribute in
+order to find the location of data files. However, this manipulation isn't
+compatible with PEP 302-based import hooks, including importing from zip files
+and Python Eggs. It is strongly recommended that, if you are using data files,
+you should use the :ref:`ResourceManager API` of ``pkg_resources`` to access
+them. The ``pkg_resources`` module is distributed as part of setuptools, so if
+you're using setuptools to distribute your package, there is no reason not to
+use its resource management API. See also `Importlib Resources`_ for
+a quick example of converting code that uses ``__file__`` to use
+``pkg_resources`` instead.
+
+.. _Importlib Resources: https://docs.python.org/3/library/importlib.html#module-importlib.resources
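+
+As a brief sketch (assuming the ``mypkg/data/somefile.dat`` layout shown
+earlier), code that previously built a path from ``__file__`` could instead
+read the resource through ``pkg_resources``::
+
+    import pkg_resources
+
+    # Read the data file as bytes, relative to the "mypkg" package
+    data = pkg_resources.resource_string("mypkg", "data/somefile.dat")
+
+    # Or obtain a real filesystem path (extracting from a zip/egg if needed)
+    path = pkg_resources.resource_filename("mypkg", "data/somefile.dat")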
+
+
+Non-Package Data Files
+----------------------
+
+Historically, ``setuptools`` by way of ``easy_install`` would encapsulate data
+files from the distribution into the egg (see `the old docs
+<https://github.com/pypa/setuptools/blob/52aacd5b276fedd6849c3a648a0014f5da563e93/docs/setuptools.txt#L970-L1001>`_). As eggs are deprecated and pip-based installs
+fall back to the platform-specific location for installing data files, there is
+no supported facility to reliably retrieve these resources.
+
+Instead, the PyPA recommends that any data files you wish to be accessible at
+run time be included in the package.
+
+
+Automatic Resource Extraction
+-----------------------------
+
+If you are using tools that expect your resources to be "real" files, or your
+project includes non-extension native libraries or other files that your C
+extensions expect to be able to access, you may need to list those files in
+the ``eager_resources`` argument to ``setup()``, so that the files will be
+extracted together, whenever a C extension in the project is imported.
+
+This is especially important if your project includes shared libraries *other*
+than distutils-built C extensions, and those shared libraries use file
+extensions other than ``.dll``, ``.so``, or ``.dylib``, which are the
+extensions that setuptools 0.6a8 and higher automatically detects as shared
+libraries and adds to the ``native_libs.txt`` file for you. Any shared
+libraries whose names do not end with one of those extensions should be listed
+as ``eager_resources``, because they need to be present in the filesystem when
+the C extensions that link to them are used.
+
+The ``pkg_resources`` runtime for compressed packages will automatically
+extract *all* C extensions and ``eager_resources`` at the same time, whenever
+*any* C extension or eager resource is requested via the ``resource_filename()``
+API. (C extensions are imported using ``resource_filename()`` internally.)
+This ensures that C extensions will see all of the "real" files that they
+expect to see.
+
+Note also that you can list directory resource names in ``eager_resources`` as
+well, in which case the directory's contents (including subdirectories) will be
+extracted whenever any C extension or eager resource is requested.
+
+Please note that if you're not sure whether you need to use this argument, you
+don't! It's really intended to support projects with lots of non-Python
+dependencies and as a last resort for crufty projects that can't otherwise
+handle being compressed. If your package is pure Python, Python plus data
+files, or Python plus C, you really don't need this. You've got to be using
+either C or an external program that needs "real" files in your project before
+there's any possibility of ``eager_resources`` being relevant to your project.
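+
+If you do need it, the argument is just a list of resource names; a minimal
+sketch (with a purely hypothetical library name) would be::
+
+    setup(
+        # other arguments here...
+        eager_resources=["mypkg/lib/libfoobar.so.1"],
+    )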
+
+
+Extensible Applications and Frameworks
+======================================
+
+
+.. _Entry Points:
+
+Dynamic Discovery of Services and Plugins
+-----------------------------------------
+
+``setuptools`` supports creating libraries that "plug in" to extensible
+applications and frameworks, by letting you register "entry points" in your
+project that can be imported by the application or framework.
+
+For example, suppose that a blogging tool wants to support plugins
+that provide translation for various file types to the blog's output format.
+The framework might define an "entry point group" called ``blogtool.parsers``,
+and then allow plugins to register entry points for the file extensions they
+support.
+
+This would allow people to create distributions that contain one or more
+parsers for different file types, and then the blogging tool would be able to
+find the parsers at runtime by looking up an entry point for the file
+extension (or mime type, or however it wants to).
+
+Note that if the blogging tool includes parsers for certain file formats, it
+can register these as entry points in its own setup script, which means it
+doesn't have to special-case its built-in formats. They can just be treated
+the same as any other plugin's entry points would be.
+
+If you're creating a project that plugs in to an existing application or
+framework, you'll need to know what entry points or entry point groups are
+defined by that application or framework. Then, you can register entry points
+in your setup script. Here are a few examples of ways you might register an
+``.rst`` file parser entry point in the ``blogtool.parsers`` entry point group,
+for our hypothetical blogging tool::
+
+ setup(
+ # ...
+ entry_points={"blogtool.parsers": ".rst = some_module:SomeClass"}
+ )
+
+ setup(
+ # ...
+ entry_points={"blogtool.parsers": [".rst = some_module:a_func"]}
+ )
+
+ setup(
+ # ...
+ entry_points="""
+ [blogtool.parsers]
+ .rst = some.nested.module:SomeClass.some_classmethod [reST]
+ """,
+ extras_require=dict(reST="Docutils>=0.3.5")
+ )
+
+The ``entry_points`` argument to ``setup()`` accepts either a string with
+``.ini``-style sections, or a dictionary mapping entry point group names to
+either strings or lists of strings containing entry point specifiers. An
+entry point specifier consists of a name and value, separated by an ``=``
+sign. The value consists of a dotted module name, optionally followed by a
+``:`` and a dotted identifier naming an object within the module. It can
+also include a bracketed list of "extras" that are required for the entry
+point to be used. When the invoking application or framework requests loading
+of an entry point, any requirements implied by the associated extras will be
+passed to ``pkg_resources.require()``, so that an appropriate error message
+can be displayed if the needed package(s) are missing. (Of course, the
+invoking app or framework can ignore such errors if it wants to make an entry
+point optional if a requirement isn't installed.)
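+
+On the consuming side, a rough sketch of how the hypothetical blogging tool
+could look up a parser registered under ``blogtool.parsers`` at runtime
+(using the ``pkg_resources`` API) might be::
+
+    import pkg_resources
+
+    def load_parser(extension):
+        # Find the first plugin registered for this file extension
+        for entry_point in pkg_resources.iter_entry_points(
+                "blogtool.parsers", extension):
+            return entry_point.load()  # imports and returns the parser object
+        raise LookupError("no parser registered for %r" % extension)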
+
+Defining Additional Metadata
+----------------------------
+
+Some extensible applications and frameworks may need to define their own kinds
+of metadata to include in eggs, which they can then access using the
+``pkg_resources`` metadata APIs. Ordinarily, this is done by having plugin
+developers include additional files in their ``ProjectName.egg-info``
+directory. However, since it can be tedious to create such files by hand, you
+may want to create a distutils extension that will create the necessary files
+from arguments to ``setup()``, in much the same way that ``setuptools`` does
+for many of the ``setup()`` arguments it adds. See the section below on
+`Creating distutils Extensions`_ for more details, especially the subsection on
+`Adding new EGG-INFO Files`_.
+
+
+"Development Mode"
+==================
+
+Under normal circumstances, the ``distutils`` assume that you are going to
+build a distribution of your project, not use it in its "raw" or "unbuilt"
+form. If you were to use the ``distutils`` that way, you would have to rebuild
+and reinstall your project every time you made a change to it during
+development.
+
+Another problem that sometimes comes up with the ``distutils`` is that you may
+need to do development on two related projects at the same time. You may need
+to put both projects' packages in the same directory to run them, but need to
+keep them separate for revision control purposes. How can you do this?
+
+Setuptools allows you to deploy your projects for use in a common directory or
+staging area, but without copying any files. Thus, you can edit each project's
+code in its checkout directory, and only need to run build commands when you
+change a project's C extensions or similarly compiled files. You can even
+deploy a project into another project's checkout directory, if that's your
+preferred way of working (as opposed to using a common independent staging area
+or the site-packages directory).
+
+To do this, use the ``setup.py develop`` command. It works very similarly to
+``setup.py install``, except that it doesn't actually install anything.
+Instead, it creates a special ``.egg-link`` file in the deployment directory,
+that links to your project's source code. And, if your deployment directory is
+Python's ``site-packages`` directory, it will also update the
+``easy-install.pth`` file to include your project's source code, thereby making
+it available on ``sys.path`` for all programs using that Python installation.
+
+If you have enabled the ``use_2to3`` flag, then of course the ``.egg-link``
+will not link directly to your source code when run under Python 3, since
+that source code would be made for Python 2 and not work under Python 3.
+Instead the ``setup.py develop`` will build Python 3 code under the ``build``
+directory, and link there. This means that after doing code changes you will
+have to run ``setup.py build`` before these changes are picked up by your
+Python 3 installation.
+
+In addition, the ``develop`` command creates wrapper scripts in the target
+script directory that will run your in-development scripts after ensuring that
+all your ``install_requires`` packages are available on ``sys.path``.
+
+You can deploy the same project to multiple staging areas, e.g. if you have
+multiple projects on the same machine that depend on the project you're doing
+development work on.
+
+When you're done with a given development task, you can remove the project
+source from a staging area using ``setup.py develop --uninstall``, specifying
+the desired staging area if it's not the default.
+
+There are several options to control the precise behavior of the ``develop``
+command; see the section on the `develop`_ command below for more details.
+
+Note that you can also apply setuptools commands to non-setuptools projects,
+using commands like this::
+
+    python -c "import setuptools; exec(compile(open('setup.py').read(), 'setup.py', 'exec'))" develop
+
+That is, you can simply list the normal setup commands and options following
+the quoted part.
+
+
+Distributing a ``setuptools``-based project
+===========================================
+
+Detailed instructions to distribute a setuptools project can be found at
+`Packaging project tutorials`_.
+
+.. _Packaging project tutorials: https://packaging.python.org/tutorials/packaging-projects/#generating-distribution-archives
+
+Before you begin, make sure you have the latest versions of setuptools and wheel::
+
+ pip install --upgrade setuptools wheel
+
+To build a setuptools project, run this command from the same directory where
+``setup.py`` is located::
+
+ setup.py sdist bdist_wheel
+
+This will generate distribution archives in the ``dist`` directory.
+
+Before you upload the generated archives, make sure you're registered on
+https://test.pypi.org/account/register/. You will also need to verify your
+email address to be able to upload any packages.
+You should install twine to be able to upload packages::
+
+ pip install --upgrade twine
+
+Now, to upload these archives, run::
+
+ twine upload --repository-url https://test.pypi.org/legacy/ dist/*
+
+To install your newly uploaded package ``example_pkg``, you can use pip::
+
+ pip install --index-url https://test.pypi.org/simple/ example_pkg
+
+If you have issues at any point, please refer to `Packaging project tutorials`_
+for clarification.
+
+
+Setting the ``zip_safe`` flag
+-----------------------------
+
+For some use cases (such as bundling as part of a larger application), Python
+packages may be run directly from a zip file.
+Not all packages, however, are capable of running in compressed form, because
+they may expect to be able to access either source code or data files as
+normal operating system files. So, ``setuptools`` can install your project
+as a zipfile or a directory, and its default choice is determined by the
+project's ``zip_safe`` flag.
+
+You can pass a True or False value for the ``zip_safe`` argument to the
+``setup()`` function, or you can omit it. If you omit it, the ``bdist_egg``
+command will analyze your project's contents to see if it can detect any
+conditions that would prevent it from working in a zipfile. It will output
+notices to the console about any such conditions that it finds.
+
+Currently, this analysis is extremely conservative: it will consider the
+project unsafe if it contains any C extensions or datafiles whatsoever. This
+does *not* mean that the project can't or won't work as a zipfile! It just
+means that the ``bdist_egg`` authors aren't yet comfortable asserting that
+the project *will* work. If the project contains no C or data files, and does
+no ``__file__`` or ``__path__`` introspection or source code manipulation, then
+there is an extremely solid chance the project will work when installed as a
+zipfile. (And if the project uses ``pkg_resources`` for all its data file
+access, then C extensions and other data files shouldn't be a problem at all.
+See the `Accessing Data Files at Runtime`_ section above for more information.)
+
+However, if ``bdist_egg`` can't be *sure* that your package will work, but
+you've checked over all the warnings it issued, and you are either satisfied it
+*will* work (or if you want to try it for yourself), then you should set
+``zip_safe`` to ``True`` in your ``setup()`` call. If it turns out that it
+doesn't work, you can always change it to ``False``, which will force
+``setuptools`` to install your project as a directory rather than as a zipfile.
+
+In the future, as we gain more experience with different packages and become
+more satisfied with the robustness of the ``pkg_resources`` runtime, the
+"zip safety" analysis may become less conservative. However, we strongly
+recommend that you determine for yourself whether your project functions
+correctly when installed as a zipfile, correct any problems if you can, and
+then make an explicit declaration of ``True`` or ``False`` for the ``zip_safe``
+flag, so that it will not be necessary for ``bdist_egg`` to try to guess
+whether your project can work as a zipfile.
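+
+Making that declaration is just a matter of passing the flag explicitly, for
+example::
+
+    setup(
+        # other arguments here...
+        zip_safe=False,  # always install this project as a directory
+    )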
+
+.. _Namespace Packages:
+
+Namespace Packages
+------------------
+
+Sometimes, a large package is more useful if distributed as a collection of
+smaller eggs. However, Python does not normally allow the contents of a
+package to be retrieved from more than one location. "Namespace packages"
+are a solution for this problem. When you declare a package to be a namespace
+package, it means that the package has no meaningful contents in its
+``__init__.py``, and that it is merely a container for modules and subpackages.
+
+The ``pkg_resources`` runtime will then automatically ensure that the contents
+of namespace packages that are spread over multiple eggs or directories are
+combined into a single "virtual" package.
+
+The ``namespace_packages`` argument to ``setup()`` lets you declare your
+project's namespace packages, so that they will be included in your project's
+metadata. The argument should list the namespace packages that the egg
+participates in. For example, the ZopeInterface project might do this::
+
+ setup(
+ # ...
+ namespace_packages=["zope"]
+ )
+
+because it contains a ``zope.interface`` package that lives in the ``zope``
+namespace package. Similarly, a project for a standalone ``zope.publisher``
+would also declare the ``zope`` namespace package. When these projects are
+installed and used, Python will see them both as part of a "virtual" ``zope``
+package, even though they will be installed in different locations.
+
+Namespace packages don't have to be top-level packages. For example, Zope 3's
+``zope.app`` package is a namespace package, and in the future PEAK's
+``peak.util`` package will be too.
+
+Note, by the way, that your project's source tree must include the namespace
+packages' ``__init__.py`` files (and the ``__init__.py`` of any parent
+packages), in a normal Python package layout. These ``__init__.py`` files
+*must* contain the line::
+
+ __import__("pkg_resources").declare_namespace(__name__)
+
+This code ensures that the namespace package machinery is operating and that
+the current package is registered as a namespace package.
+
+You must NOT include any other code and data in a namespace package's
+``__init__.py``. Even though it may appear to work during development, or when
+projects are installed as ``.egg`` files, it will not work when the projects
+are installed using "system" packaging tools -- in such cases the
+``__init__.py`` files will not be installed, let alone executed.
+
+You must include the ``declare_namespace()`` line in the ``__init__.py`` of
+*every* project that has contents for the namespace package in question, in
+order to ensure that the namespace will be declared regardless of which
+project's copy of ``__init__.py`` is loaded first. If the first loaded
+``__init__.py`` doesn't declare it, it will never *be* declared, because no
+other copies will ever be loaded!
+
+Tagging and "Daily Build" or "Snapshot" Releases
+------------------------------------------------
+
+When a set of related projects are under development, it may be important to
+track finer-grained version increments than you would normally use for e.g.
+"stable" releases. While stable releases might be measured in dotted numbers
+with alpha/beta/etc. status codes, development versions of a project often
+need to be tracked by revision or build number or even build date. This is
+especially true when projects in development need to refer to one another, and
+therefore may literally need an up-to-the-minute version of something!
+
+To support these scenarios, ``setuptools`` allows you to "tag" your source and
+egg distributions by adding one or more of the following to the project's
+"official" version identifier:
+
+* A manually-specified pre-release tag, such as "build" or "dev", or a
+ manually-specified post-release tag, such as a build or revision number
+ (``--tag-build=STRING, -bSTRING``)
+
+* An 8-character representation of the build date (``--tag-date, -d``), as
+ a postrelease tag
+
+You can add these tags by adding ``egg_info`` and the desired options to
+the command line ahead of the ``sdist`` or ``bdist`` commands that you want
+to generate a daily build or snapshot for. See the section below on the
+`egg_info`_ command for more details.
+
+(Also, before you release your project, be sure to see the section above on
+`Specifying Your Project's Version`_ for more information about how pre- and
+post-release tags affect how version numbers are interpreted. This is
+important in order to make sure that dependency processing tools will know
+which versions of your project are newer than others.)
+
+Finally, if you are creating builds frequently, and either building them in a
+downloadable location or are copying them to a distribution server, you should
+probably also check out the `rotate`_ command, which lets you automatically
+delete all but the N most-recently-modified distributions matching a glob
+pattern. So, you can use a command line like::
+
+ setup.py egg_info -rbDEV bdist_egg rotate -m.egg -k3
+
+to build an egg whose version info includes "DEV-rNNNN" (where NNNN is the
+most recent Subversion revision that affected the source tree), and then
+delete any egg files from the distribution directory except for the three
+that were built most recently.
+
+If you have to manage automated builds for multiple packages, each with
+different tagging and rotation policies, you may also want to check out the
+`alias`_ command, which would let each package define an alias like ``daily``
+that would perform the necessary tag, build, and rotate commands. Then, a
+simpler script or cron job could just run ``setup.py daily`` in each project
+directory. (And, you could also define sitewide or per-user default versions
+of the ``daily`` alias, so that projects that didn't define their own would
+use the appropriate defaults.)
+
+Generating Source Distributions
+-------------------------------
+
+``setuptools`` enhances the distutils' default algorithm for source file
+selection with pluggable endpoints for looking up files to include. If you are
+using a revision control system, and your source distributions only need to
+include files that you're tracking in revision control, use a corresponding
+plugin instead of writing a ``MANIFEST.in`` file. See the section below on
+`Adding Support for Revision Control Systems`_ for information on plugins.
+
+If you need to include automatically generated files, or files that are kept in
+an unsupported revision control system, you'll need to create a ``MANIFEST.in``
+file to specify any files that the default file location algorithm doesn't
+catch. See the distutils documentation for more information on the format of
+the ``MANIFEST.in`` file.
+
+But, be sure to ignore any part of the distutils documentation that deals with
+``MANIFEST`` or how it's generated from ``MANIFEST.in``; setuptools shields you
+from these issues and doesn't work the same way in any case. Unlike the
+distutils, setuptools regenerates the source distribution manifest file
+every time you build a source distribution, and it builds it inside the
+project's ``.egg-info`` directory, out of the way of your main project
+directory. You therefore need not worry about whether it is up-to-date or not.
+
+Indeed, because setuptools' approach to determining the contents of a source
+distribution is so much simpler, its ``sdist`` command omits nearly all of
+the options that the distutils' more complex ``sdist`` process requires. For
+all practical purposes, you'll probably use only the ``--formats`` option, if
+you use any option at all.
+
+
+Making "Official" (Non-Snapshot) Releases
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+When you make an official release, creating source or binary distributions,
+you will need to override the tag settings from ``setup.cfg``, so that you
+don't end up registering versions like ``foobar-0.7a1.dev-r34832``. This is
+easy to do if you are developing on the trunk and using tags or branches for
+your releases - just make the change to ``setup.cfg`` after branching or
+tagging the release, so the trunk will still produce development snapshots.
+
+Alternately, if you are not branching for releases, you can override the
+default version options on the command line, using something like::
+
+ setup.py egg_info -Db "" sdist bdist_egg
+
+The first part of this command (``egg_info -Db ""``) will override the
+configured tag information, before creating source and binary eggs. Thus, these
+commands will use the plain version from your ``setup.py``, without adding the
+build designation string.
+
+Of course, if you will be doing this a lot, you may wish to create a personal
+alias for this operation, e.g.::
+
+ setup.py alias -u release egg_info -Db ""
+
+You can then use it like this::
+
+ setup.py release sdist bdist_egg
+
+Or of course you can create more elaborate aliases that do all of the above.
+See the sections below on the `egg_info`_ and `alias`_ commands for more ideas.
+
+Distributing Extensions compiled with Cython
+--------------------------------------------
+
+``setuptools`` will detect at build time whether Cython is installed or not.
+If Cython is not found, ``setuptools`` will ignore ``.pyx`` files.
+
+To ensure Cython is available, include Cython in the ``requires`` list of the
+``[build-system]`` table in your ``pyproject.toml``::
+
+ [build-system]
+ requires=[..., "cython"]
+
+When building with pip 10 or later, that declaration is sufficient to make
+Cython available in the build environment. For broader compatibility, you can
+also declare the dependency in the ``setup_requires`` option of ``setup.cfg``::
+
+ [options]
+ setup_requires =
+ ...
+ cython
+
+As long as Cython is present in the build environment, ``setuptools`` provides
+transparent support for building Cython extensions, provided the extensions
+are defined using ``setuptools.Extension``.
+
+If you follow these rules, you can safely list ``.pyx`` files as the sources
+of your ``Extension`` objects in the setup script; if Cython is available at
+build time, ``setuptools`` will use it to compile them.
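+
+For example (the module and file names here are only illustrative)::
+
+    from setuptools import setup, Extension
+
+    setup(
+        name="Project",
+        ...
+        ext_modules=[
+            Extension("mypackage.fastmod", ["mypackage/fastmod.pyx"]),
+        ],
+    )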
+
+Of course, for this to work, your source distributions must include the C
+code generated by Cython, as well as your original ``.pyx`` files. This means
+that you will probably want to include current ``.c`` files in your revision
+control system, rebuilding them whenever you check changes in for the ``.pyx``
+source files. This will ensure that people tracking your project in a revision
+control system will be able to build it even if they don't have Cython
+installed, and that your source releases will be similarly usable with or
+without Cython.
+
+--------------------------------
+Extending and Reusing Setuptools
+--------------------------------
+
+Creating ``distutils`` Extensions
+=================================
+
+It can be hard to add new commands or setup arguments to the distutils. But
+the ``setuptools`` package makes it a bit easier, by allowing you to distribute
+a distutils extension as a separate project, and then have projects that need
+the extension just refer to it in their ``setup_requires`` argument.
+
+With ``setuptools``, your distutils extension projects can hook in new
+commands and ``setup()`` arguments just by defining "entry points". These
+are mappings from command or argument names to a specification of where to
+import a handler from. (See the section on `Dynamic Discovery of Services and
+Plugins`_ above for some more background on entry points.)
+
+
+Adding Commands
+---------------
+
+You can add new ``setup`` commands by defining entry points in the
+``distutils.commands`` group. For example, if you wanted to add a ``foo``
+command, you might add something like this to your distutils extension
+project's setup script::
+
+ setup(
+ # ...
+ entry_points={
+ "distutils.commands": [
+ "foo = mypackage.some_module:foo",
+ ],
+ },
+ )
+
+(Assuming, of course, that the ``foo`` class in ``mypackage.some_module`` is
+a ``setuptools.Command`` subclass.)
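+
+A bare-bones sketch of such a command class (the behavior here is purely
+illustrative) might look like::
+
+    from setuptools import Command
+
+    class foo(Command):
+        description = "run the hypothetical foo command"
+        user_options = []  # no command-line options for this example
+
+        def initialize_options(self):
+            pass
+
+        def finalize_options(self):
+            pass
+
+        def run(self):
+            print("Hello from the foo command")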
+
+Once a project containing such entry points has been activated on ``sys.path``,
+(e.g. by running "install" or "develop" with a site-packages installation
+directory) the command(s) will be available to any ``setuptools``-based setup
+scripts. It is not necessary to use the ``--command-packages`` option or
+to monkeypatch the ``distutils.command`` package to install your commands;
+``setuptools`` automatically adds a wrapper to the distutils to search for
+entry points in the active distributions on ``sys.path``. In fact, this is
+how setuptools' own commands are installed: the setuptools project's setup
+script defines entry points for them!
+
+Adding ``setup()`` Arguments
+----------------------------
+
+.. warning:: Adding arguments to setup is discouraged as such arguments
+ are only supported through imperative execution and not supported through
+ declarative config.
+
+Sometimes, your commands may need additional arguments to the ``setup()``
+call. You can enable this by defining entry points in the
+``distutils.setup_keywords`` group. For example, if you wanted a ``setup()``
+argument called ``bar_baz``, you might add something like this to your
+distutils extension project's setup script::
+
+ setup(
+ # ...
+ entry_points={
+ "distutils.commands": [
+ "foo = mypackage.some_module:foo",
+ ],
+ "distutils.setup_keywords": [
+ "bar_baz = mypackage.some_module:validate_bar_baz",
+ ],
+ },
+ )
+
+The idea here is that the entry point defines a function that will be called
+to validate the ``setup()`` argument, if it's supplied. The ``Distribution``
+object will have the initial value of the attribute set to ``None``, and the
+validation function will only be called if the ``setup()`` call sets it to
+a non-None value. Here's an example validation function::
+
+    from distutils.errors import DistutilsSetupError
+
+    def assert_bool(dist, attr, value):
+        """Verify that value is True, False, 0, or 1"""
+        if bool(value) != value:
+            raise DistutilsSetupError(
+                "%r must be a boolean value (got %r)" % (attr, value)
+            )
+
+Your function should accept three arguments: the ``Distribution`` object,
+the attribute name, and the attribute value. It should raise a
+``DistutilsSetupError`` (from the ``distutils.errors`` module) if the argument
+is invalid. Remember, your function will only be called with non-None values,
+and the default value of arguments defined this way is always None. So, your
+commands should always be prepared for the possibility that the attribute will
+be ``None`` when they access it later.
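+
+As a brief illustration, the ``run()`` method of a command that consumes the
+hypothetical ``bar_baz`` argument might read it defensively like this::
+
+    def run(self):
+        # bar_baz defaults to None unless the setup script supplies it
+        bar_baz = getattr(self.distribution, "bar_baz", None)
+        if bar_baz:
+            print("bar_baz is set to %r" % (bar_baz,))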
+
+If more than one active distribution defines an entry point for the same
+``setup()`` argument, *all* of them will be called. This allows multiple
+distutils extensions to define a common argument, as long as they agree on
+what values of that argument are valid.
+
+Also note that as with commands, it is not necessary to subclass or monkeypatch
+the distutils ``Distribution`` class in order to add your arguments; it is
+sufficient to define the entry points in your extension, as long as any setup
+script using your extension lists your project in its ``setup_requires``
+argument.
+
+
+Customizing Distribution Options
+--------------------------------
+
+Plugins may wish to extend or alter the options on a Distribution object to
+suit the purposes of that project. For example, a tool that infers the
+``Distribution.version`` from SCM metadata may need to hook into the
+option finalization. To enable this feature, ``setuptools`` offers an entry
+point group, ``setuptools.finalize_distribution_options``. Each entry point
+in that group must be a callable taking one argument (the ``Distribution``
+instance).
+
+If the callable has an ``.order`` property, that value will be used to
+determine the order in which the hook is called. Lower numbers are called
+first and the default is zero (0).
+
+Plugins may read, alter, and set properties on the distribution, but each
+plugin is encouraged to load the configuration/settings for its behavior
+independently.
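+
+A minimal sketch of such a hook is shown below; the module path, the hook's
+behavior, and the fallback version string are all invented for this example::
+
+    # in the plugin project, e.g. mypackage/hooks.py (hypothetical)
+
+    def set_fallback_version(dist):
+        """Give the distribution a version if the setup script omitted one."""
+        if dist.metadata.version is None:
+            dist.metadata.version = "0.0.0.dev0"
+
+    # Optional: lower numbers run earlier; the default order is zero.
+    set_fallback_version.order = 0
+
+The hook would then be registered in the plugin project's setup script::
+
+    setup(
+        # ...
+        entry_points={
+            "setuptools.finalize_distribution_options": [
+                "fallback_version = mypackage.hooks:set_fallback_version",
+            ],
+        },
+    )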
+
+
+Adding new EGG-INFO Files
+-------------------------
+
+Some extensible applications or frameworks may want to allow third parties to
+develop plugins with application or framework-specific metadata included in
+the plugins' EGG-INFO directory, for easy access via the ``pkg_resources``
+metadata API. The easiest way to allow this is to create a distutils extension
+to be used from the plugin projects' setup scripts (via ``setup_requires``)
+that defines a new setup keyword, and then uses that data to write an EGG-INFO
+file when the ``egg_info`` command is run.
+
+The ``egg_info`` command looks for entry points in an ``egg_info.writers``
+group, and calls them to write the files. Here's a simple example of a
+distutils extension defining a setup argument ``foo_bar``, which is a list of
+lines that will be written to ``foo_bar.txt`` in the EGG-INFO directory of any
+project that uses the argument::
+
+ setup(
+ # ...
+ entry_points={
+ "distutils.setup_keywords": [
+ "foo_bar = setuptools.dist:assert_string_list",
+ ],
+ "egg_info.writers": [
+ "foo_bar.txt = setuptools.command.egg_info:write_arg",
+ ],
+ },
+ )
+
+This simple example makes use of two utility functions defined by setuptools
+for its own use: a routine to validate that a setup keyword is a sequence of
+strings, and another one that looks up a setup argument and writes it to
+a file. Here's what the writer utility looks like::
+
+ def write_arg(cmd, basename, filename):
+ argname = os.path.splitext(basename)[0]
+ value = getattr(cmd.distribution, argname, None)
+ if value is not None:
+ value = "\n".join(value) + "\n"
+ cmd.write_or_delete_file(argname, filename, value)
+
+As you can see, each ``egg_info.writers`` entry point must be a function taking
+three arguments: an ``egg_info`` command instance, the basename of the file to
+write (e.g. ``foo_bar.txt``), and the actual full filename that should be
+written to.
+
+In general, writer functions should honor the command object's ``dry_run``
+setting when writing files, and use the ``distutils.log`` object to do any
+console output. The easiest way to conform to this requirement is to use
+the ``cmd`` object's ``write_file()``, ``delete_file()``, and
+``write_or_delete_file()`` methods exclusively for your file operations. See
+those methods' docstrings for more details.
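+
+For instance, a third-party plugin could follow the same pattern with its own
+writer; the ``plugin_manifest`` keyword and the JSON format below are invented
+for this sketch::
+
+    import json
+
+    def write_plugin_manifest(cmd, basename, filename):
+        """Hypothetical writer for a ``plugin_manifest`` setup() keyword."""
+        manifest = getattr(cmd.distribution, "plugin_manifest", None)
+        value = None if manifest is None else json.dumps(manifest, indent=2) + "\n"
+        # write_or_delete_file() honors --dry-run and reports through
+        # distutils.log, so no extra handling is needed here.
+        cmd.write_or_delete_file("plugin manifest", filename, value)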
+
+Adding Support for Revision Control Systems
+-------------------------------------------------
+
+If the files you want to include in the source distribution are tracked using
+Git, Mercurial or SVN, you can use the following packages to achieve that:
+
+- Git and Mercurial: `setuptools_scm <https://pypi.org/project/setuptools_scm/>`_
+- SVN: `setuptools_svn <https://pypi.org/project/setuptools_svn/>`_
+
+If you would like to create a plugin for ``setuptools`` to find files tracked
+by another revision control system, you can do so by adding an entry point to
+the ``setuptools.file_finders`` group. The entry point should be a function
+accepting a single directory name, and should yield all the filenames within
+that directory (and any subdirectories thereof) that are under revision
+control.
+
+For example, if you were going to create a plugin for a revision control system
+called "foobar", you would write a function something like this:
+
+.. code-block:: python
+
+ def find_files_for_foobar(dirname):
+ # loop to yield paths that start with `dirname`
+
+And you would register it in a setup script using something like this::
+
+ entry_points={
+ "setuptools.file_finders": [
+ "foobar = my_foobar_module:find_files_for_foobar",
+ ]
+ }
+
+Then, anyone who wants to use your plugin can simply install it, and their
+local setuptools installation will be able to find the necessary files.
+
+It is not necessary to distribute source control plugins with projects that
+are simply tracked by that source control system, or to specify the plugins in
+``setup_requires``. When you create a source distribution with the ``sdist``
+command, setuptools automatically records what files were found in the
+``SOURCES.txt`` file. That way, recipients of source distributions don't need
+to have revision control at all. However, if someone is working on a package
+by checking out with that system, they will need the same plugin(s) that the
+original author is using.
+
+A few important points for writing revision control file finders (a sketch
+that follows these rules appears after the list):
+
+* Your finder function MUST return relative paths, created by appending to the
+ passed-in directory name. Absolute paths are NOT allowed, nor are relative
+ paths that reference a parent directory of the passed-in directory.
+
+* Your finder function MUST accept an empty string as the directory name,
+ meaning the current directory. You MUST NOT convert this to a dot; just
+ yield relative paths. So, yielding a subdirectory named ``some/dir`` under
+ the current directory should NOT be rendered as ``./some/dir`` or
+  ``/somewhere/some/dir``, but *always* as simply ``some/dir``.
+
+* Your finder function SHOULD NOT raise any errors, and SHOULD deal gracefully
+  with the absence of needed programs (i.e., ones belonging to the revision
+  control system itself). It *may*, however, use ``distutils.log.warn()`` to
+  inform the user of the missing program(s).
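+
+As an illustration only, a finder for the hypothetical "foobar" system might
+look like the sketch below. The ``foobar ls-files`` command and its
+newline-separated output are invented for this example; a real plugin would
+query its own tool instead::
+
+    import os
+    import subprocess
+
+    from distutils import log
+
+    def find_files_for_foobar(dirname=""):
+        """Yield paths under ``dirname`` that the "foobar" tool tracks."""
+        try:
+            output = subprocess.check_output(
+                ["foobar", "ls-files"],  # hypothetical listing command
+                cwd=dirname or ".",
+            )
+        except (OSError, subprocess.CalledProcessError):
+            # Missing or failing tool: warn, but never raise.
+            log.warn("foobar not available; no files found for %r", dirname)
+            return
+        for line in output.decode("utf-8").splitlines():
+            # Join onto the passed-in name so results stay relative
+            # (an empty dirname yields plain relative paths).
+            yield os.path.join(dirname, line.strip())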