
Commit 0fe4518
Merge pull request #9 from nipy/master: Update from master
2 parents: e40e305 + 47587b9


62 files changed: 3406 additions, 822 deletions

.noserc

Lines changed: 11 additions & 0 deletions

@@ -0,0 +1,11 @@
+[nosetests]
+verbosity=3
+
+with-coverage=1
+cover-branches=1
+cover-xml=1
+cover-xml-file=./coverage.xml
+cover-min-percentage=50
+
+
+with-xunit=1
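The added `.noserc` is standard INI syntax, so a quick sanity check that it parses the way nose will read it can be sketched with Python's `configparser` (the inline string mirrors the file above; the check itself is illustrative, not part of the commit):

```python
from configparser import ConfigParser

# The .noserc added in this commit, reproduced inline for the sketch.
NOSERC = """\
[nosetests]
verbosity=3

with-coverage=1
cover-branches=1
cover-xml=1
cover-xml-file=./coverage.xml
cover-min-percentage=50

with-xunit=1
"""

parser = ConfigParser()
parser.read_string(NOSERC)
opts = dict(parser["nosetests"])

# nose treats these as command-line defaults, e.g. --verbosity=3
print(opts["verbosity"])       # -> 3
print(opts["cover-xml-file"])  # -> ./coverage.xml
```

Each key maps one-to-one to a nose command-line flag, which is why `circle.yml` below can point nosetests at this file with `-c ./.noserc`.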

.travis.yml

Lines changed: 21 additions & 2 deletions

@@ -16,24 +16,43 @@ before_install:
 - if [ ${TRAVIS_PYTHON_VERSION:0:1} == "2" ]; then export PATH=/home/travis/miniconda2/bin:$PATH; else export PATH=/home/travis/miniconda3/bin:$PATH; fi
 - if $INSTALL_DEB_DEPENDECIES; then sudo rm -rf /dev/shm; fi
 - if $INSTALL_DEB_DEPENDECIES; then sudo ln -s /run/shm /dev/shm; fi
-- if $INSTALL_DEB_DEPENDECIES; then bash <(wget -q -O- http://neuro.debian.net/_files/neurodebian-travis.sh);
-  fi
+- bash <(wget -q -O- http://neuro.debian.net/_files/neurodebian-travis.sh)
+- sudo apt-get update
+- sudo apt-get install xvfb
 - if $INSTALL_DEB_DEPENDECIES; then travis_retry sudo apt-get install -qq --no-install-recommends
   fsl afni elastix; fi
 - if $INSTALL_DEB_DEPENDECIES; then travis_retry sudo apt-get install -qq fsl-atlases;
   fi
 - if $INSTALL_DEB_DEPENDECIES; then source /etc/fsl/fsl.sh; fi
 - if $INSTALL_DEB_DEPENDECIES; then source /etc/afni/afni.sh; fi
 - export FSLOUTPUTTYPE=NIFTI_GZ
+# Install vtk and fix numpy installation problem
+# Fix numpy problem: https://github.com/enthought/enable/issues/34#issuecomment-2029381
+- if [ ${TRAVIS_PYTHON_VERSION:0:1} == "2" ]; then travis_retry sudo apt-get install -qq libx11-dev swig;
+  echo '[x11]' >> $HOME/.numpy-site.cfg;
+  echo 'library_dirs = /usr/lib64:/usr/lib:/usr/lib/x86_64-linux-gnu' >> $HOME/.numpy-site.cfg;
+  echo 'include_dirs = /usr/include:/usr/include/X11' >> $HOME/.numpy-site.cfg;
+  fi
 install:
 - conda update --yes conda
 - conda create -n testenv --yes pip python=$TRAVIS_PYTHON_VERSION
 - source activate testenv
 - if [ ${TRAVIS_PYTHON_VERSION:0:1} == "2" ]; then pip install ordereddict; fi
 - conda install --yes numpy scipy nose networkx dateutil
 - if [ ${TRAVIS_PYTHON_VERSION:0:1} == "2" ]; then conda install --yes traits; else pip install traits; fi
+- if [ ${TRAVIS_PYTHON_VERSION:0:1} == "2" ]; then conda install --yes vtk; fi
 - pip install python-coveralls
 - pip install nose-cov
+# Add tvtk (PIL is required by blockcanvas)
+# Install mayavi (see https://github.com/enthought/mayavi/issues/271)
+- if [ ${TRAVIS_PYTHON_VERSION:0:1} == "2" ]; then
+  pip install http://effbot.org/downloads/Imaging-1.1.7.tar.gz;
+  pip install -e git+https://github.com/enthought/etsdevtools.git#egg=etsdevtools;
+  pip install -e git+https://github.com/enthought/blockcanvas.git#egg=blockcanvas;
+  pip install -e git+https://github.com/enthought/etsproxy.git#egg=etsproxy;
+  pip install https://github.com/dmsurti/mayavi/archive/4d4aaf315a29d6a86707dd95149e27d9ed2225bf.zip;
+  pip install -e git+https://github.com/enthought/ets.git#egg=ets;
+  fi
 - pip install -r requirements.txt # finish remaining requirements
 - python setup.py install
script:
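The `.numpy-site.cfg` workaround above appends an `[x11]` section one `echo` at a time. The resulting INI fragment can be rendered in one go with a small sketch (the paths are copied from the diff; the helper itself is hypothetical and not part of the commit):

```python
import configparser
import io

def render_numpy_site_cfg():
    """Build the [x11] fragment the Travis step writes to $HOME/.numpy-site.cfg."""
    cfg = configparser.ConfigParser()
    cfg["x11"] = {
        "library_dirs": "/usr/lib64:/usr/lib:/usr/lib/x86_64-linux-gnu",
        "include_dirs": "/usr/include:/usr/include/X11",
    }
    buf = io.StringIO()
    cfg.write(buf)
    return buf.getvalue()

print(render_numpy_site_cfg())
```

numpy's build reads this file to locate the X11 headers and libraries, which is what the linked enthought/enable issue works around.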

CHANGES

Lines changed: 8 additions & 0 deletions

@@ -1,6 +1,14 @@
 Next release
 ============
 
+* FIX: Prevent crash when tvtk is loaded - ETS_TOOLKIT=null (https://github.com/nipy/nipype/pull/973)
+* ENH: New interfaces in dipy: RESTORE, EstimateResponseSH, CSD and StreamlineTractography
+  (https://github.com/nipy/nipype/pull/1090)
+* ENH: Added interfaces of AFNI (https://github.com/nipy/nipype/pull/1360,
+  https://github.com/nipy/nipype/pull/1361)
+* ENH: Provides a Nipype wrapper for antsJointFusion (https://github.com/nipy/nipype/pull/1351)
+* ENH: Added support for PETPVC (https://github.com/nipy/nipype/pull/1335)
+* ENH: Merge S3DataSink into DataSink, added AWS documentation (https://github.com/nipy/nipype/pull/1316)
 * TST: Cache APT in CircleCI (https://github.com/nipy/nipype/pull/1333)
 * ENH: Add new flags to the BRAINSABC for new features (https://github.com/nipy/nipype/pull/1322)
 * ENH: Provides a Nipype wrapper for ANTs DenoiseImage (https://github.com/nipy/nipype/pull/1291)

circle.yml

Lines changed: 17 additions & 8 deletions

@@ -9,19 +9,25 @@ dependencies:
   pre:
     # Let CircleCI cache the apt archive
     - sudo rm -rf /var/cache/apt/archives && sudo ln -s ~/.apt-cache /var/cache/apt/archives && mkdir -p ~/.apt-cache/partial
-    - wget -O- http://neuro.debian.net/lists/precise.us-ca.full | sudo tee /etc/apt/sources.list.d/neurodebian.sources.list
-    - sudo apt-key adv --recv-keys --keyserver hkp://pgp.mit.edu:80 0xA5D32F012649A5A9
-    - sudo apt-get update
+    - bash <(wget -q -O- http://neuro.debian.net/_files/neurodebian-travis.sh)
   override:
     # Install apt packages
-    - sudo apt-get install -y fsl-core fsl-atlases fsl-mni152-templates fsl-feeds afni
-    - echo "source /etc/fsl/fsl.sh" >> $HOME/.profile
-    - echo "source /etc/afni/afni.sh" >> $HOME/.profile
+    - sudo apt-get install -y fsl-core fsl-atlases fsl-mni152-templates fsl-feeds afni swig python-vtk xvfb
+    - echo 'source /etc/fsl/fsl.sh' >> $HOME/.profile
+    - echo 'source /etc/afni/afni.sh' >> $HOME/.profile
     - mkdir -p ~/examples/ && ln -sf /usr/share/fsl-feeds/ ~/examples/feeds
+    # Enable system-wide vtk
+    - ln -sf /usr/lib/pymodules/python2.7/vtk ~/virtualenvs/venv-system/lib/python2.7/site-packages/
     # Set up python environment
     - pip install --upgrade pip
     - pip install -e .
-    - pip install matplotlib sphinx ipython boto
+    - pip install matplotlib sphinx ipython boto coverage dipy
+    # Add tvtk
+    - pip install http://effbot.org/downloads/Imaging-1.1.7.tar.gz
+    - pip install -e git+https://github.com/enthought/etsdevtools.git#egg=etsdevtools
+    - pip install -e git+https://github.com/enthought/blockcanvas.git#egg=blockcanvas
+    - pip install -e git+https://github.com/enthought/etsproxy.git#egg=etsproxy
+    - pip install -e git+https://github.com/enthought/ets.git#egg=ets
     - gem install fakes3
     - if [[ ! -d ~/examples/data ]]; then wget "http://tcpdiag.dl.sourceforge.net/project/nipy/nipype/nipype-0.2/nipype-tutorial.tar.bz2" && tar jxvf nipype-tutorial.tar.bz2 && mv nipype-tutorial/* ~/examples/; fi
     - if [[ ! -d ~/examples/fsl_course_data ]]; then wget -c "http://fsl.fmrib.ox.ac.uk/fslcourse/fdt1.tar.gz" && wget -c "http://fsl.fmrib.ox.ac.uk/fslcourse/fdt2.tar.gz" && wget -c "http://fsl.fmrib.ox.ac.uk/fslcourse/tbss.tar.gz" && mkdir ~/examples/fsl_course_data && tar zxvf fdt1.tar.gz -C ~/examples/fsl_course_data && tar zxvf fdt2.tar.gz -C ~/examples/fsl_course_data && tar zxvf tbss.tar.gz -C ~/examples/fsl_course_data; fi
@@ -32,7 +38,8 @@ machine:
     FSLOUTPUTTYPE: NIFTI_GZ
 test:
   override:
-    - source $HOME/.profile; nosetests --with-doctest --logging-level=DEBUG --verbosity=3:
+    - mkdir -p ${CIRCLE_TEST_REPORTS}/nose
+    - source $HOME/.profile; nosetests --with-doctest --xunit-file="${CIRCLE_TEST_REPORTS}/nose/${CIRCLE_PROJECT_REPONAME}.xml" -c ./.noserc --logging-level=DEBUG --verbosity=3:
   environment:
     SPMMCRCMD: "$HOME/spm12/run_spm12.sh $HOME/mcr/v85/ script"
     FORCE_SPMMCR: 1
@@ -66,3 +73,5 @@ general:
   artifacts:
     - "doc/_build/html"
     - "~/log.txt"
+    - "nosetests.xml"
+    - "coverage.xml"

doc/users/aws.rst

Lines changed: 102 additions & 0 deletions

@@ -0,0 +1,102 @@ (new file; contents shown below)

.. _aws:

============================================
 Using Nipype with Amazon Web Services (AWS)
============================================

Several groups have been successfully using Nipype on AWS. This procedure
involves setting up a temporary cluster using StarCluster and potentially
transferring files to/from S3. The latter is supported by Nipype through
DataSink and S3DataGrabber.


Using DataSink with S3
======================

The DataSink class now supports sending output data directly to an AWS S3
bucket. It does this through the introduction of several input attributes to
the DataSink interface and by parsing the ``base_directory`` attribute. This
class uses the `boto3 <https://boto3.readthedocs.org/en/latest/>`_ and
`botocore <https://botocore.readthedocs.org/en/latest/>`_ Python packages to
interact with AWS. To configure the DataSink to write data to S3, the user
must set the ``base_directory`` property to an S3-style filepath. For example:

::

    import nipype.interfaces.io as nio
    ds = nio.DataSink()
    ds.inputs.base_directory = 's3://mybucket/path/to/output/dir'

With the "s3://" prefix in the path, the DataSink knows that the output
directory to send files to is on S3 in the bucket "mybucket".
"path/to/output/dir" is the relative directory path within the bucket
"mybucket" where output data will be uploaded (NOTE: if the relative path
specified contains folders that don't exist in the bucket, the DataSink will
create them). The DataSink treats the S3 base directory exactly as it would a
local directory, maintaining support for containers, substitutions,
subfolders, "." notation, etc. to route output data appropriately.

There are four new attributes introduced with S3 compatibility:
``creds_path``, ``encrypt_bucket_keys``, ``local_copy``, and ``bucket``.

::

    ds.inputs.creds_path = '/home/user/aws_creds/credentials.csv'
    ds.inputs.encrypt_bucket_keys = True
    ds.inputs.local_copy = '/home/user/workflow_outputs/local_backup'

``creds_path`` is a file path where the user's AWS credentials file (typically
a csv) is stored. This credentials file should contain the AWS access key id
and secret access key, and should be formatted as one of the following (these
formats are how Amazon provides the credentials file by default when first
downloaded).

Root-account user:

::

    AWSAccessKeyID=ABCDEFGHIJKLMNOP
    AWSSecretKey=zyx123wvu456/ABC890+gHiJk

IAM user:

::

    User Name,Access Key Id,Secret Access Key
    "username",ABCDEFGHIJKLMNOP,zyx123wvu456/ABC890+gHiJk

The ``creds_path`` is necessary when writing files to a bucket that has
restricted access (almost no buckets are publicly writable). If ``creds_path``
is not specified, the DataSink will check the ``AWS_ACCESS_KEY_ID`` and
``AWS_SECRET_ACCESS_KEY`` environment variables and use those values for
bucket access.

``encrypt_bucket_keys`` is a boolean flag that indicates whether to encrypt
the output data on S3, using server-side AES-256 encryption. This is useful if
the data being output is sensitive and one desires an extra layer of security
on the data. By default, this is turned off.

``local_copy`` is a string of the filepath where local copies of the output
data are stored in addition to those sent to S3. This is useful if one wants
to keep a backup version of the data stored on their local computer. By
default, this is turned off.

``bucket`` is a boto3 Bucket object that the user can use to override the
bucket specified in their ``base_directory``. This can be useful if one has to
manually create a bucket instance on their own using special credentials (or
using a mock server like `fakes3 <https://github.com/jubos/fake-s3>`_). This
is typically used by developers unit-testing the DataSink class. Most users do
not need this attribute for actual workflows. It is an optional argument.

Finally, the user needs only to specify the input attributes for any incoming
data to the node, and the outputs will be written to their S3 bucket.

::

    workflow.connect(inputnode, 'subject_id', ds, 'container')
    workflow.connect(realigner, 'realigned_files', ds, 'motion')

So, for example, the output for sub001's realigned_file1.nii.gz will be at:
s3://mybucket/path/to/output/dir/sub001/motion/realigned_file1.nii.gz


Using S3DataGrabber
===================

Coming soon...
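The two credentials-file formats described in the new doc above could be distinguished and parsed with a sketch like this (a hypothetical helper for illustration, not DataSink's actual implementation; the sample values are the placeholders from the doc):

```python
import csv
import io

def parse_aws_creds(text):
    """Return (access_key_id, secret_access_key) from either format
    described in doc/users/aws.rst."""
    if "AWSAccessKeyID" in text:
        # Root-account format: one KEY=VALUE pair per line.
        pairs = dict(line.split("=", 1) for line in text.splitlines() if "=" in line)
        return pairs["AWSAccessKeyID"], pairs["AWSSecretKey"]
    # IAM format: csv with a header row.
    row = next(csv.DictReader(io.StringIO(text)))
    return row["Access Key Id"], row["Secret Access Key"]

root_style = "AWSAccessKeyID=ABCDEFGHIJKLMNOP\nAWSSecretKey=zyx123wvu456/ABC890+gHiJk"
iam_style = ('User Name,Access Key Id,Secret Access Key\n'
             '"username",ABCDEFGHIJKLMNOP,zyx123wvu456/ABC890+gHiJk')

print(parse_aws_creds(root_style))
print(parse_aws_creds(iam_style))
```

Both formats yield the same key pair, which is why DataSink can accept either file via a single ``creds_path`` attribute.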

doc/users/index.rst

Lines changed: 1 addition & 0 deletions

@@ -38,6 +38,7 @@
    spmmcr
    mipav
    nipypecmd
+   aws

0 commit comments