"""

- # Set the way matlab should be called
+ # Set the way Matlab should be called
mlab.MatlabCommand.set_default_matlab_cmd("matlab -nodesktop -nosplash")

"""

Setting up workflows
--------------------
- In this tutorial we will be setting up a hierarchical workflow for spm
- analysis. This will demonstrate how pre-defined workflows can be setup
+ In this tutorial we will be setting up a hierarchical workflow for SPM
+ analysis. This will demonstrate how predefined workflows can be set up
and shared across users, projects and labs.

Setup preprocessing workflow
"""

preproc = pe.Workflow(name='preproc')

- """We strongly encourage to use 4D files insteead of series of 3D for fMRI analyses
+ """We strongly encourage using 4D files instead of series of 3D files for fMRI analyses
for many reasons (cleanliness and saving filesystem inodes are among them). However,
the workflow presented in the SPM8 manual on which this tutorial is based
uses 3D files. Therefore we leave converting to 4D as an option. We use the ``merge_to_4d``
- variable, because switching between 3d and 4d requires some additional steps (explauned later on).
+ variable, because switching between 3D and 4D requires some additional steps (explained later on).
Use :ref:`nipype.interfaces.fsl.utils.Merge` to merge a series
of 3D files along the time dimension, creating a 4D file.
"""
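Conceptually, merging a series of 3D volumes along time is just stacking them into a single 4D array. A minimal NumPy sketch of that idea (shapes hypothetical, not the actual ``fsl.Merge`` call):

```python
import numpy as np

# Five hypothetical 3D volumes of shape (2, 2, 2).
vols = [np.zeros((2, 2, 2)) for _ in range(5)]

# Stacking along a new trailing (time) axis yields one 4D array,
# analogous to what merging along dimension 't' produces on disk.
merged = np.stack(vols, axis=-1)
print(merged.shape)  # (2, 2, 2, 5)
```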
@@ -118,8 +118,8 @@ def get_vox_dims(volume):
"""Here we are connecting all the nodes together.
Notice that we add the merge node only if you choose to use 4D.
- Also ``get_vox_dims`` function is passed along the input volume of normalise to set the optimal
- voxel sizes.
+ Also, the ``get_vox_dims`` function is passed along the input volume of
+ :ref:`nipype.interfaces.spm.preprocess.Normalize` to set the optimal voxel sizes.
"""
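As a sketch of what such a helper computes: an image's voxel sizes are the Euclidean norms of the first three columns of its affine (the tutorial's ``get_vox_dims`` reads them from the image header via nibabel; the affine below is hypothetical):

```python
import numpy as np

# Hypothetical affine of a 3 x 3 x 3 mm image; a NIfTI image carries a
# 4x4 affine whose first three columns encode the voxel axes in mm.
affine = np.diag([3.0, 3.0, 3.0, 1.0])

# Voxel sizes are the column norms of the 3x3 rotation/zoom block.
voxdims = [float(n) for n in np.sqrt((affine[:3, :3] ** 2).sum(axis=0))]
print(voxdims)  # [3.0, 3.0, 3.0]
```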

if merge_to_4d:
@@ -185,8 +185,8 @@ def get_vox_dims(volume):
('spmT_images', 'stat_image')]),
])

"""
- Preproc + Analysis pipeline
- ---------------------------
+ Preprocessing and analysis pipeline
+ -----------------------------------
"""

l1pipeline = pe.Workflow(name='firstlevel')
@@ -195,7 +195,7 @@ def get_vox_dims(volume):
'modelspec.realignment_parameters')])])

"""
- Pluging in ``functional_runs`` is a bit more complicated,
+ Plugging in ``functional_runs`` is a bit more complicated,
because the model spec expects a list of ``runs``.
Every run can be a 4D file or a list of 3D files.
Therefore, for 3D analysis we need a list of lists, and to make one we need a helper function.
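A minimal sketch of such a helper (mirroring the ``makelist`` this tutorial defines): wrapping each run's files in an outer list gives the model spec the list-of-runs shape it expects.

```python
def makelist(item):
    """Wrap one run's file(s) in a list, so that collecting the
    results over runs yields a list of lists."""
    return [item]

# A hypothetical run given as a series of 3D files:
print(makelist(['f001.nii', 'f002.nii']))  # [['f001.nii', 'f002.nii']]
```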
@@ -252,10 +252,7 @@ def makelist(item):
"""
Now we create a :ref:`nipype.interfaces.io.DataGrabber`
- object and fill in the information from above about the layout of our data. The
- :class:`nipype.pipeline.NodeWrapper` module wraps the interface object
- and provides additional housekeeping and pipeline specific
- functionality.
+ object and fill in the information from above about the layout of our data.
"""
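DataGrabber locates files by filling a printf-style template with per-subject arguments and globbing the result. A pure-Python sketch of that substitution step (templates and subject ID hypothetical):

```python
# Hypothetical field templates mapping output names to glob patterns;
# '%s' is substituted with the subject identifier at run time.
field_template = dict(func='%s/f*.nii', struct='%s/s*.nii')

subject_id = 'M00223'
patterns = {field: tmpl % subject_id for field, tmpl in field_template.items()}
print(patterns['func'])  # M00223/f*.nii
```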

datasource = pe.Node(
@@ -317,18 +314,26 @@ def makelist(item):
set up the connections between the nodes such that appropriate outputs
from nodes are piped into appropriate inputs of other nodes.

- Use the :class:`nipype.pipeline.engine.Pipeline` to create a
- graph-based execution pipeline for first level analysis. The config
- options tells the pipeline engine to use `workdir` as the disk
- location to use when running the processes and keeping their
- outputs. The `use_parameterized_dirs` tells the engine to create
- sub-directories under `workdir` corresponding to the iterables in the
- pipeline. Thus for this pipeline there will be subject specific
- sub-directories.
+ Use the :class:`~nipype.pipeline.engine.workflows.Workflow` to create a
+ graph-based execution pipeline for first level analysis.
+ Set the :py:attr:`~nipype.pipeline.engine.workflows.Workflow.base_dir`
+ option to instruct the pipeline engine to use ``spm_auditory_tutorial/workingdir``
+ as the filesystem location to use when running the processes and keeping their
+ outputs.
+ Other options can be set via `the configuration file
+ <https://miykael.github.io/nipype_tutorial/notebooks/basic_execution_configuration.html>`__.
+ For example, ``use_parameterized_dirs`` tells the engine to create
+ sub-directories under :py:attr:`~nipype.pipeline.engine.workflows.Workflow.base_dir`,
+ corresponding to the iterables in the pipeline.
+ Thus, for this pipeline there will be subject-specific sub-directories.
+
+ When building a workflow, interface objects are wrapped within
+ a :class:`~nipype.pipeline.engine.nodes.Node` so that they can be inserted
+ in the workflow.

The :func:`~nipype.pipeline.engine.workflows.Workflow.connect` method creates the
- links between the processes , i.e., how data should flow in and out of
- the processing nodes.
+ links between :class:`~nipype.pipeline.engine.nodes.Node` instances, i.e.,
+ how data should flow in and out of the processing nodes.
"""

level1 = pe.Workflow(name="level1")
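The edges handed to ``connect`` have a fixed shape: ``(source_node, target_node, [(source_output, target_input), ...])``. A plain-data sketch of one hypothetical edge (node and field names illustrative only):

```python
# One hypothetical connection: pipe realignment parameters from a
# preprocessing node into the model-specification node.
edge = ('preproc_realign', 'modelspec',
        [('realignment_parameters', 'realignment_parameters')])

source, target, links = edge
for src_field, dst_field in links:
    # Each pair names an output on the source and an input on the target.
    print(f'{source}.{src_field} -> {target}.{dst_field}')
```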