From 6e48dcfe8c6f4918724f6300762bfae1ccfb76b0 Mon Sep 17 00:00:00 2001 From: Dimitri Papadopoulos <3234522+DimitriPapadopoulos@users.noreply.github.com> Date: Wed, 8 Nov 2023 09:49:28 +0100 Subject: [PATCH 1/2] MNT: do not refer to the optional data packages The rationale is that script `doc/source/devel/register_me.py` is a Python 2 script that is not compatible with Python 3. Looks like the whole machinery has not been used for ages. --- doc/source/devel/data_pkg_design.rst | 298 --------------------------- doc/source/devel/devdiscuss.rst | 1 - doc/source/devel/register_me.py | 47 ----- doc/source/installing_data.rst | 80 ------- 4 files changed, 426 deletions(-) delete mode 100644 doc/source/devel/data_pkg_design.rst delete mode 100644 doc/source/devel/register_me.py delete mode 100644 doc/source/installing_data.rst diff --git a/doc/source/devel/data_pkg_design.rst b/doc/source/devel/data_pkg_design.rst deleted file mode 100644 index eabf2ea7e8..0000000000 --- a/doc/source/devel/data_pkg_design.rst +++ /dev/null @@ -1,298 +0,0 @@ -.. _data-package-design: - -Design of data packages for the nibabel and the nipy suite -========================================================== - -See :ref:`data-package-discuss` for a more general discussion of design -issues. - -When developing or using nipy, many data files can be useful. We divide the -data files nipy uses into at least 3 categories - -#. *test data* - data files required for routine code testing -#. *template data* - data files required for algorithms to function, - such as templates or atlases -#. *example data* - data files for running examples, or optional tests - -Files used for routine testing are typically very small data files. They are -shipped with the software, and live in the code repository. For example, in -the case of ``nipy`` itself, there are some test files that live in the module -path ``nipy.testing.data``. Nibabel ships data files in -``nibabel.tests.data``. See :doc:`add_test_data` for discussion. - -*template data* and *example data* are example of *data packages*. What -follows is a discussion of the design and use of data packages. - -.. testsetup:: - - # Make fake data and template directories - import os - from os.path import join as pjoin - import tempfile - tmpdir = tempfile.mkdtemp() - os.environ['NIPY_USER_DIR'] = tmpdir - for subdir in ('data', 'templates'): - files_dir = pjoin(tmpdir, 'nipy', subdir) - os.makedirs(files_dir) - with open(pjoin(files_dir, 'config.ini'), 'wt') as fobj: - fobj.write( - """[DEFAULT] - version = 0.2 - """) - -Use cases for data packages -+++++++++++++++++++++++++++ - -Using the data package -`````````````````````` - -The programmer can use the data like this: - -.. testcode:: - - from nibabel.data import make_datasource - - templates = make_datasource(dict(relpath='nipy/templates')) - fname = templates.get_filename('ICBM152', '2mm', 'T1.nii.gz') - -where ``fname`` will be the absolute path to the template image -``ICBM152/2mm/T1.nii.gz``. - -The programmer can insist on a particular version of a ``datasource``: - ->>> if templates.version < '0.4': -... raise ValueError('Need datasource version at least 0.4') -Traceback (most recent call last): -... -ValueError: Need datasource version at least 0.4 - -If the repository cannot find the data, then: - ->>> make_datasource(dict(relpath='nipy/implausible')) -Traceback (most recent call last): - ... -nibabel.data.DataError: ... 
- -where ``DataError`` gives a helpful warning about why the data was not -found, and how it should be installed. - -Warnings during installation -```````````````````````````` - -The example data and template data may be important, and so we want to warn -the user if NIPY cannot find either of the two sets of data when installing -the package. Thus:: - - python setup.py install - -will import nipy after installation to check whether these raise an error: - ->>> from nibabel.data import make_datasource ->>> templates = make_datasource(dict(relpath='nipy/templates')) ->>> example_data = make_datasource(dict(relpath='nipy/data')) - -and warn the user accordingly, with some basic instructions for how to -install the data. - -.. _find-data: - -Finding the data -```````````````` - -The routine ``make_datasource`` will look for data packages that have been -installed. For the following call: - ->>> templates = make_datasource(dict(relpath='nipy/templates')) - -the code will: - -#. Get a list of paths where data is known to be stored with - ``nibabel.data.get_data_path()`` -#. For each of these paths, search for directory ``nipy/templates``. If - found, and of the correct format (see below), return a datasource, - otherwise raise an Exception - -The paths collected by ``nibabel.data.get_data_paths()`` are constructed from -':' (Unix) or ';' separated strings. The source of the strings (in the order -in which they will be used in the search above) are: - -#. The value of the ``NIPY_DATA_PATH`` environment variable, if set -#. A section = ``DATA``, parameter = ``path`` entry in a - ``config.ini`` file in ``nipy_dir`` where ``nipy_dir`` is - ``$HOME/.nipy`` or equivalent. -#. Section = ``DATA``, parameter = ``path`` entries in configuration - ``.ini`` files, where the ``.ini`` files are found by - ``glob.glob(os.path.join(etc_dir, '*.ini')`` and ``etc_dir`` is - ``/etc/nipy`` on Unix, and some suitable equivalent on Windows. -#. The result of ``os.path.join(sys.prefix, 'share', 'nipy')`` -#. If ``sys.prefix`` is ``/usr``, we add ``/usr/local/share/nipy``. We - need this because Python >= 2.6 in Debian / Ubuntu does default installs to - ``/usr/local``. -#. The result of ``get_nipy_user_dir()`` - -Requirements for a data package -``````````````````````````````` - -To be a valid NIPY project data package, you need to satisfy: - -#. The installer installs the data in some place that can be found using - the method defined in :ref:`find-data`. - -We recommend that: - -#. By default, you install data in a standard location such as - ``/share/nipy`` where ```` is the standard Python - prefix obtained by ``>>> import sys; print sys.prefix`` - -Remember that there is a distinction between the NIPY project - the -umbrella of neuroimaging in python - and the NIPY package - the main -code package in the NIPY project. Thus, if you want to install data -under the NIPY *package* umbrella, your data might go to -``/usr/share/nipy/nipy/packagename`` (on Unix). Note ``nipy`` twice - -once for the project, once for the package. If you want to install data -under - say - the ``pbrain`` package umbrella, that would go in -``/usr/share/nipy/pbrain/packagename``. 
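
As a rough illustration of the search order described in :ref:`find-data`
above - this is not the actual code in ``nibabel.data``, and the helper name
``candidate_data_paths`` is invented for this sketch - the path list could be
assembled with nothing more than the standard library::

    import glob
    import os
    import sys
    from configparser import ConfigParser

    def candidate_data_paths():
        # Name invented for this sketch; nibabel's real entry point is
        # nibabel.data.get_data_path().
        paths = []
        # 1. NIPY_DATA_PATH environment variable (':' or ';' separated)
        env_path = os.environ.get('NIPY_DATA_PATH')
        if env_path:
            paths += env_path.split(os.pathsep)
        # 2. [DATA] path entry in the per-user config.ini, then
        # 3. [DATA] path entries in system-wide /etc/nipy/*.ini files
        user_ini = os.path.join(os.path.expanduser('~'), '.nipy', 'config.ini')
        for ini_fname in [user_ini] + sorted(glob.glob('/etc/nipy/*.ini')):
            config = ConfigParser()
            config.read(ini_fname)  # silently ignores missing files
            data_path = config.get('DATA', 'path', fallback=None)
            if data_path:
                paths += data_path.split(os.pathsep)
        # 4. <prefix>/share/nipy, plus /usr/local/share/nipy when the
        #    prefix is /usr (Debian / Ubuntu default installs to /usr/local)
        paths.append(os.path.join(sys.prefix, 'share', 'nipy'))
        if sys.prefix == '/usr':
            paths.append(os.path.join('/usr', 'local', 'share', 'nipy'))
        # 5. The per-user nipy directory (get_nipy_user_dir() in nibabel)
        paths.append(os.environ.get('NIPY_USER_DIR',
                                    os.path.join(os.path.expanduser('~'), '.nipy')))
        return paths

``make_datasource`` would then walk such a list looking for the requested
``relpath`` (for example ``nipy/templates``) and read the ``config.ini``
version file described in the next section.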
- -Data package format -``````````````````` - -The following tree is an example of the kind of pattern we would expect -in a data directory, where the ``nipy-data`` and ``nipy-templates`` -packages have been installed:: - - - `-- nipy - |-- data - | |-- config.ini - | `-- placeholder.txt - `-- templates - |-- ICBM152 - | `-- 2mm - | `-- T1.nii.gz - |-- colin27 - | `-- 2mm - | `-- T1.nii.gz - `-- config.ini - -The ```` directory is the directory that will appear somewhere in -the list from ``nibabel.data.get_data_path()``. The ``nipy`` subdirectory -signifies data for the ``nipy`` package (as opposed to other -NIPY-related packages such as ``pbrain``). The ``data`` subdirectory of -``nipy`` contains files from the ``nipy-data`` package. In the -``nipy/data`` or ``nipy/templates`` directories, there is a -``config.ini`` file, that has at least an entry like this:: - - [DEFAULT] - version = 0.2 - -giving the version of the data package. - -.. _data-package-design-install: - -Installing the data -``````````````````` - -We use python distutils to install data packages, and the ``data_files`` -mechanism to install the data. On Unix, with the following command:: - - python setup.py install --prefix=/my/prefix - -data will go to:: - - /my/prefix/share/nipy - -For the example above this will result in these subdirectories:: - - /my/prefix/share/nipy/nipy/data - /my/prefix/share/nipy/nipy/templates - -because ``nipy`` is both the project, and the package to which the data -relates. - -If you install to a particular location, you will need to add that location to -the output of ``nibabel.data.get_data_path()`` using one of the mechanisms -above, for example, in your system configuration:: - - export NIPY_DATA_PATH=/my/prefix/share/nipy - -Packaging for distributions -``````````````````````````` - -For a particular data package - say ``nipy-templates`` - distributions -will want to: - -#. Install the data in set location. The default from ``python setup.py - install`` for the data packages will be ``/usr/share/nipy`` on Unix. -#. Point a system installation of NIPY to these data. - -For the latter, the most obvious route is to copy an ``.ini`` file named for -the data package into the NIPY ``etc_dir``. In this case, on Unix, we will -want a file called ``/etc/nipy/nipy_templates.ini`` with contents:: - - [DATA] - path = /usr/share/nipy - -Current implementation -`````````````````````` - -This section describes how we (the nipy community) implement data packages at -the moment. - -The data in the data packages will not usually be under source control. This -is because images don't compress very well, and any change in the data will -result in a large extra storage cost in the repository. If you're pretty -clear that the data files aren't going to change, then a repository could work -OK. - -The data packages will be available at a central release location. For now -this will be: http://nipy.org/data-packages/ . - -A package, such as ``nipy-templates-0.2.tar.gz`` will have the following sort -of structure:: - - - - |-- setup.py - |-- README.txt - |-- MANIFEST.in - `-- templates - |-- ICBM152 - | |-- 1mm - | | `-- T1_brain.nii.gz - | `-- 2mm - | `-- T1.nii.gz - |-- colin27 - | `-- 2mm - | `-- T1.nii.gz - `-- config.ini - - -There should be only one ``nipy/packagename`` directory delivered by a -particular package. For example, this package installs ``nipy/templates``, -but does not contain ``nipy/data``. - -Making a new package tarball is simply: - -#. Downloading and unpacking e.g. 
``nipy-templates-0.1.tar.gz`` to form the - directory structure above; -#. Making any changes to the directory; -#. Running ``setup.py sdist`` to recreate the package. - -The process of making a release should be: - -#. Increment the major or minor version number in the ``config.ini`` file; -#. Make a package tarball as above; -#. Upload to distribution site. - -There is an example nipy data package ``nipy-examplepkg`` in the -``examples`` directory of the NIPY repository. - -The machinery for creating and maintaining data packages is available at -https://github.com/nipy/data-packaging. - -See the ``README.txt`` file there for more information. - -.. testcleanup:: - - import shutil - shutil.rmtree(tmpdir) diff --git a/doc/source/devel/devdiscuss.rst b/doc/source/devel/devdiscuss.rst index c864928d60..8383558838 100644 --- a/doc/source/devel/devdiscuss.rst +++ b/doc/source/devel/devdiscuss.rst @@ -21,7 +21,6 @@ progress. spm_use modified_images - data_pkg_design data_pkg_discuss data_pkg_uses scaling diff --git a/doc/source/devel/register_me.py b/doc/source/devel/register_me.py deleted file mode 100644 index 017f873abf..0000000000 --- a/doc/source/devel/register_me.py +++ /dev/null @@ -1,47 +0,0 @@ -import configparser as cfp -import sys -from os.path import abspath, dirname, expanduser -from os.path import join as pjoin - -if sys.platform == 'win32': - HOME_INI = pjoin(expanduser('~'), '_dpkg', 'local.dsource') -else: - HOME_INI = pjoin(expanduser('~'), '.dpkg', 'local.dsource') -SYS_INI = pjoin(abspath('etc'), 'dpkg', 'local.dsource') -OUR_PATH = dirname(__file__) -OUR_META = pjoin(OUR_PATH, 'meta.ini') -DISCOVER_INIS = {'user': HOME_INI, 'system': SYS_INI} - - -def main(): - # Get ini file to which to write - try: - reg_to = sys.argv[1] - except IndexError: - reg_to = 'user' - if reg_to in ('user', 'system'): - ini_fname = DISCOVER_INIS[reg_to] - else: # it is an ini file name - ini_fname = reg_to - - # Read parameters for our distribution - meta = cfp.ConfigParser() - files = meta.read(OUR_META) - if len(files) == 0: - raise RuntimeError('Missing meta.ini file') - name = meta.get('DEFAULT', 'name') - version = meta.get('DEFAULT', 'version') - - # Write into ini file - dsource = cfp.ConfigParser() - dsource.read(ini_fname) - if not dsource.has_section(name): - dsource.add_section(name) - dsource.set(name, version, OUR_PATH) - dsource.write(file(ini_fname, 'wt')) - - print(f'Registered package {name}, {version} to {ini_fname}') - - -if __name__ == '__main__': - main() diff --git a/doc/source/installing_data.rst b/doc/source/installing_data.rst deleted file mode 100644 index ce32de2375..0000000000 --- a/doc/source/installing_data.rst +++ /dev/null @@ -1,80 +0,0 @@ -:orphan: - -.. _installing-data: - -Installing data packages -======================== - -nibabel includes some machinery for using optional data packages. We use data -packages for some of the DICOM tests in nibabel. There are also data packages -for standard template images, and other packages for components of nipy, -including the main nipy package. - -For more details on data package design, see :ref:`data-package-design`. - -We haven't yet made a nice automated way of downloading and installing the -packages. For the moment you can find packages for the data and template files -at http://nipy.org/data-packages. - -Data package installation as an administrator ---------------------------------------------- - -The installation procedure, for now, is very basic. 
For example, let us -say that you want the 'nipy-templates' package at -http://nipy.org/data-packages/nipy-templates-0.1.tar.gz -. You simply download this archive, unpack it, and then run the standard -``python setup.py install`` on it. On a unix system this might look -like:: - - curl -O http://nipy.org/data-packages/nipy-templates-0.1.tar.gz - tar zxvf nipy-templates-0.1.tar.gz - cd nipy-templates-0.1 - sudo python setup.py install - -On windows, download the file, extract the archive to a folder using the -GUI, and then, using the windows shell or similar:: - - cd c:\path\to\extracted\files - python setup.py install - -Non-administrator data package installation -------------------------------------------- - -The commands above assume you are installing into the default system -directories. If you want to install into a custom directory, then (in -python, or ipython, or a text editor) look at the help for -``nipy.utils.data.get_data_path()`` . There are instructions there for -pointing your nipy installation to the installed data. - -On unix -~~~~~~~ - -For example, say you installed with:: - - cd nipy-templates-0.1 - python setup.py install --prefix=/home/my-user/some-dir - -Then you may want to do make a file ``~/.nipy/config.ini`` with the -following contents:: - - [DATA] - /home/my-user/some-dir/share/nipy - -On windows -~~~~~~~~~~ - -Say you installed with (windows shell):: - - cd nipy-templates-0.1 - python setup.py install --prefix=c:\some\path - -Then first, find out your home directory:: - - python -c "import os; print os.path.expanduser('~')" - -Let's say that was ``c:\Documents and Settings\My User``. Then, make a -new file called ``c:\Documents and Settings\My User\_nipy\config.ini`` -with contents:: - - [DATA] - c:\some\path\share\nipy From 7155e772c53d28a9fc4ffdbf4640b8ef3867ab3b Mon Sep 17 00:00:00 2001 From: Dimitri Papadopoulos <3234522+DimitriPapadopoulos@users.noreply.github.com> Date: Sun, 3 Dec 2023 21:57:59 +0100 Subject: [PATCH 2/2] MNT: remove more stuff about optional data package --- doc/source/devel/data_pkg_uses.rst | 255 ----------------------------- doc/source/devel/devdiscuss.rst | 2 - 2 files changed, 257 deletions(-) delete mode 100644 doc/source/devel/data_pkg_uses.rst diff --git a/doc/source/devel/data_pkg_uses.rst b/doc/source/devel/data_pkg_uses.rst deleted file mode 100644 index 8573e06cb7..0000000000 --- a/doc/source/devel/data_pkg_uses.rst +++ /dev/null @@ -1,255 +0,0 @@ -.. _data-pkg-uses: - -######################################## -Data package usecases and implementation -######################################## - -******** -Usecases -******** - -We are here working from :doc:`data_pkg_discuss` - -Prundles -======== - -See :ref:`prundle`. - -An *local path* format prundle is a directory on the local file system with prundle data stored in files in a -on the local filesystem. - -Examples -======== - -We'll call our package `dang` - data package new generation. - -Create local-path prundle -------------------------- - -:: - - >>> import os - >>> import tempfile - >>> pth = tempfile.mkdtemp() # temporary directory - -Make a pinstance object:: - - >>> from dang import Pinstance - >>> pri = Prundle(name='my-package') - >>> pri.pkg_name - 'my-package' - >>> pri.meta - {} - -Now we make a prundle. 
First a directory to contain it:: - - >>> import os - >>> import tempfile - >>> pth = tempfile.mkdtemp() # temporary directory - - >>> from dang.prundle import LocalPathPrundle - >>> prun = LocalPathPrundle(pri, pth) - -At the moment there's nothing in the directory. The 'write' method will write -the meta information - here just the package name:: - - >>> prun.write() # writes meta.ini file - >>> os.listdir(pth) - ['meta.ini'] - -The local path prundle data is just the set of files in the temporary directory -named in ``pth`` above. - -Now we've written the package, we can get it by a single call that reads in the -``meta.ini`` file:: - - >>> prun_back = LocalPathPrundle.from_path(pth) - >>> prun_back.pkg_name - 'my-package' - -Getting prundle data --------------------- - -The file-system prundle formats can return content by file names. - -For example, for the local path ``prun`` distribution objects we have seen so -far, the following should work:: - - >>> fobj = prun.get_fileobj('a_file.txt') - -In fact, local path distribution objects also have a ``path`` attribute:: - - >>> fname = os.path.join(prun.path, 'a_file.txt') - -The ``path`` attribute might not make sense for objects with greater -abstraction over the file-system - for example objects encapsulating web -content. - -********* -Discovery -********* - -So far, in order to create a prundle object, we have to know where the prundle -is (the path). - -We want to be able to tell the system where prundles are - and the system will -then be able to return a prundle on request - perhaps by package name. The -system here is answering a :ref:`prundle-discovery` query. - -We will then want to ask our packaging system whether it knows about the -prundle we are interested in. - -Discovery sources -================= - -A discovery source is an object that can answer a discovery query. -Specifically, it is an object with a ``discover`` method, like this:: - - >>> import dang - >>> dsrc = dang.get_source('local-system') - >>> dquery_result = dsrc.discover('my-package', version='0') - >>> dquery_result[0].pkg_name - 'my-package' - >>> dquery_result = dsrc.discover('implausible-pkg', version='0') - >>> len(dquery_result) - 0 - -The discovery version number spec may allow comparison operators, as for -``distutils.version.LooseVersion``:: - - >>> res = dsrc.discover(name='my-package', version='>=0') - >>> prun = rst[0] - >>> prun.pkg_name - 'my-package' - >>> prun.meta['version'] - '0' - -Default discovery sources -========================= - -We've used the ``local-system`` discovery source in this call:: - - >>> dsrc = dpkg.get_source('local-system') - -The ``get_source`` function is a convenience function that returns default -discovery sources by name. There are at least two named discovery sources, -``local-system``, and ``local-user``. ``local-system`` is a discovery source -for packages that are installed system-wide (``/usr/share/data`` type -installation in \*nix). ``local-user`` is for packages installed for this user -only (``/home/user/data`` type installations in \*nix). - -Discovery source pools -====================== - -We'll typically have more than one source from which we'd like to query. The -obvious case is where we want to look for both system and local sources. For -this we have a *source pool* which simply returns the first known distribution -from a list of sources. 
Something like this:: - - >>> local_sys = dpkg.get_source('local-system') - >>> local_usr = dpkg.get_source('local-user') - >>> src_pool = dpkg.SourcePool((local_usr, local_sys)) - >>> dq_res = src_pool.discover('my-package', version='0') - >>> dq_res[0].pkg_name - 'my-package' - -We'll often want to do exactly this, so we'll add this source pool to those -that can be returned from our ``get_source`` convenience function:: - - >>> src_pool = dpkg.get_source('local-pool') - -Register a prundle -================== - -In order to register a prundle, we need a prundle object and a -discovery source:: - - >>> from dang.prundle import LocalPathPrundle - >>> prun = LocalPathDistribution.from_path(path=/a/path') - >>> local_usr = dang.get_source('local-user') - >>> local_usr.register(prun) - -Let us then write the source to disk:: - - >>> local_usr.write() - -Now, when we start another process as the same user, we can do this:: - - >>> import dang - >>> local_usr = dang.get_source('local-user') - >>> prun = local_usr.discover('my-package', '0')[0] - -************** -Implementation -************** - -Here are some notes. We had the hope that we could implement something that -would be simple enough that someone using the system would not need our code, -but could work from the specification. - -Local path prundles -=================== - -These are directories accessible on the local filesystem. The directory needs -to give information about the prundle name and optionally, version, tag, -revision id and maybe other metadata. An ``ini`` file is probably enough for -this - something like a ``meta.ini`` file in the directory with:: - - [DEFAULT] - name = my-package - version = 0 - -might be enough to get started. - -Discovery sources -================= - -The discovery source has to be able to return prundle objects for the -prundles it knows about:: - - [my-package] - 0 = /some/path - 0.1 = /another/path - [another-package] - 0 = /further/path - -Registering a package -===================== - -So far we have a local path distribution, that is a directory with some files -in it, and our own ``meta.ini`` file, containing the package name and version. -How does this package register itself to the default sources? Of course, we -could use ``dpkg`` as above:: - - >>> dst = dpkg.LocalPathDistribution.from_path(path='/a/path') - >>> local_usr = dpkg.get_source('local-user') - >>> local_usr.register(dst) - >>> local_usr.save() - -but we wanted to be able to avoid using ``dpkg``. To do this, there might be -a supporting script, in the distribution directory, called ``register_me.py``, -of form given in :download:`register_me.py`. - -Using discovery sources without dpkg -==================================== - -The local discovery sources are ini files, so it would be easy to read and use -these outside the dpkg system, as long as the locations of the ini files are -well defined. Here is the code from ``register_me.py`` defining these files:: - - import os - import sys - - if sys.platform == 'win32': - _home_dpkg_sdir = '_dpkg' - _sys_drive, _ = os.path.splitdrive(sys.prefix) - else: - _home_dpkg_sdir = '.dpkg' - _sys_drive = '/' - # Can we get the user directory? 
- _home = os.path.expanduser('~') - if _home == '~': # if not, the user ini file is undefined - HOME_INI = None - else: - HOME_INI = os.path.join(_home, _home_dpkg_sdir, 'local.dsource') - SYS_INI = os.path.join(_sys_drive, 'etc', 'dpkg', 'local.dsource') diff --git a/doc/source/devel/devdiscuss.rst b/doc/source/devel/devdiscuss.rst index 8383558838..bc23e823c2 100644 --- a/doc/source/devel/devdiscuss.rst +++ b/doc/source/devel/devdiscuss.rst @@ -21,7 +21,5 @@ progress. spm_use modified_images - data_pkg_discuss - data_pkg_uses scaling bv_formats