diff --git a/CHANGES b/CHANGES
index 337ae51f42..5c927ac77c 100644
--- a/CHANGES
+++ b/CHANGES
@@ -1,6 +1,7 @@
Next release
============
+* ENH: Add nipype_crash_search command (https://github.com/nipy/nipype/pull/1422)
* ENH: Created interface for BrainSuite Cortical Surface Extraction command line tools (https://github.com/nipy/nipype/pull/1305)
* FIX: job execution on systems/approaches where locale is undefined (https://github.com/nipy/nipype/pull/1401)
* FIX: Clean up byte/unicode issues using subprocess (https://github.com/nipy/nipype/pull/1394)
@@ -28,6 +29,7 @@ Next release
* FIX: Correct linking/copying fallback behavior (https://github.com/nipy/nipype/pull/1391)
* ENH: Nipype workflow and interfaces for FreeSurfer's recon-all (https://github.com/nipy/nipype/pull/1326)
* FIX: Permit relative path for concatenated_file input to Concatenate() (https://github.com/nipy/nipype/pull/1411)
+* ENH: Make ReconAll workflow backwards compatible with FreeSurfer 5.3.0 (https://github.com/nipy/nipype/pull/1434)
Release 0.11.0 (September 15, 2015)
============
diff --git a/CHANGES.orig b/CHANGES.orig
deleted file mode 100644
index b7a849dd7e..0000000000
--- a/CHANGES.orig
+++ /dev/null
@@ -1,538 +0,0 @@
-Next release
-============
-
-* FIX: job execution on systems/approaches where locale is undefined (https://github.com/nipy/nipype/pull/1401)
-* FIX: Clean up byte/unicode issues using subprocess (https://github.com/nipy/nipype/pull/1394)
-* FIX: Prevent crash when tvtk is loaded - ETS_TOOLKIT=null (https://github.com/nipy/nipype/pull/973)
-* ENH: New interfaces in dipy: RESTORE, EstimateResponseSH, CSD and StreamlineTractography
- (https://github.com/nipy/nipype/pull/1090)
-* ENH: Added interfaces of AFNI (https://github.com/nipy/nipype/pull/1360,
- https://github.com/nipy/nipype/pull/1361, https://github.com/nipy/nipype/pull/1382)
-* ENH: Provides a Nipype wrapper for antsJointFusion (https://github.com/nipy/nipype/pull/1351)
-* ENH: Added support for PETPVC (https://github.com/nipy/nipype/pull/1335)
-* ENH: Merge S3DataSink into DataSink, added AWS documentation (https://github.com/nipy/nipype/pull/1316)
-* TST: Cache APT in CircleCI (https://github.com/nipy/nipype/pull/1333)
-* ENH: Add new flags to the BRAINSABC for new features (https://github.com/nipy/nipype/pull/1322)
-* ENH: Provides a Nipype wrapper for ANTs DenoiseImage (https://github.com/nipy/nipype/pull/1291)
-* FIX: Minor bugfix logging hash differences (https://github.com/nipy/nipype/pull/1298)
-* FIX: Use released Prov python library (https://github.com/nipy/nipype/pull/1279)
-* ENH: Support for Python 3 (https://github.com/nipy/nipype/pull/1221)
-* FIX: VTK version check missing when using tvtk (https://github.com/nipy/nipype/pull/1219)
-* ENH: Added an OAR scheduler plugin (https://github.com/nipy/nipype/pull/1259)
-* ENH: New ANTs interface: antsBrainExtraction (https://github.com/nipy/nipype/pull/1231)
-* API: Default model level for the bedpostx workflow has been set to "2" following FSL 5.0.9 lead
-* ENH: New interfaces for interacting with AWS S3: S3DataSink and S3DataGrabber (https://github.com/nipy/nipype/pull/1201)
-* ENH: Interfaces for MINC tools (https://github.com/nipy/nipype/pull/1304)
-* FIX: Use realpath to determine hard link source (https://github.com/nipy/nipype/pull/1388)
-* FIX: Correct linking/copying fallback behavior (https://github.com/nipy/nipype/pull/1391)
-* ENH: Nipype workflow and interfaces for FreeSurfer's recon-all (https://github.com/nipy/nipype/pull/1326)
-=======
-
-Release 0.11.0 (September 15, 2015)
-============
-
-* API: Change how hash values are computed (https://github.com/nipy/nipype/pull/1174)
-* ENH: New algorithm: mesh.WarpPoints applies displacements fields to point sets
- (https://github.com/nipy/nipype/pull/889).
-* ENH: New interfaces for MRTrix3 (https://github.com/nipy/nipype/pull/1126)
-* ENH: New option in afni.3dRefit - zdel, ydel, zdel etc. (https://github.com/nipy/nipype/pull/1079)
-* FIX: ants.Registration composite transform outputs are no longer returned as lists (https://github.com/nipy/nipype/pull/1183)
-* BUG: ANTs Registration interface failed with multi-modal inputs
- (https://github.com/nipy/nipype/pull/1176) (https://github.com/nipy/nipype/issues/1175)
-* ENH: dipy.TrackDensityMap interface now accepts a reference image (https://github.com/nipy/nipype/pull/1091)
-* FIX: Bug in XFibres5 (https://github.com/nipy/nipype/pull/1168)
-* ENH: Attempt to use hard links for data sink.
- (https://github.com/nipy/nipype/pull/1161)
-* FIX: Updates to SGE Plugins
- (https://github.com/nipy/nipype/pull/1129)
-* ENH: Add ants JointFusion() node with testing
- (https://github.com/nipy/nipype/pull/1160)
-* ENH: Add --float option for antsRegistration calls
- (https://github.com/nipy/nipype/pull/1159)
-* ENH: Added interface to simulate DWIs using the multi-tensor model
- (https://github.com/nipy/nipype/pull/1085)
-* ENH: New interface for FSL fslcpgeom utility (https://github.com/nipy/nipype/pull/1152)
-* ENH: Added SLURMGraph plugin for submitting jobs to SLURM with dependencies (https://github.com/nipy/nipype/pull/1136)
-* FIX: Enable absolute path definitions in DCMStack (https://github.com/nipy/nipype/pull/1089,
- replaced by https://github.com/nipy/nipype/pull/1093)
-* ENH: New mesh.MeshWarpMaths to operate on surface-defined warpings
- (https://github.com/nipy/nipype/pull/1016)
-* FIX: Refactor P2PDistance, change name to ComputeMeshWarp, add regression tests,
- fix bug in area weighted distance, and added optimizations
- (https://github.com/nipy/nipype/pull/1016)
-* ENH: Add an option not to resubmit Nodes that finished running when using SGEGraph (https://github.com/nipy/nipype/pull/1002)
-* FIX: FUGUE is now properly listing outputs. (https://github.com/nipy/nipype/pull/978)
-* ENH: Improved FieldMap-Based (FMB) workflow for correction of susceptibility distortions in EPI seqs.
- (https://github.com/nipy/nipype/pull/1019)
-* FIX: In the FSLXcommand _list_outputs function fixed for loop range (https://github.com/nipy/nipype/pull/1071)
-* ENH: Dropped support for now 7 years old Python 2.6 (https://github.com/nipy/nipype/pull/1069)
-* FIX: terminal_output is not mandatory anymore (https://github.com/nipy/nipype/pull/1070)
-* ENH: Added "nipype_cmd" tool for running interfaces from the command line (https://github.com/nipy/nipype/pull/795)
-* FIX: Fixed Camino output naming (https://github.com/nipy/nipype/pull/1061)
-* ENH: Add the average distance to ErrorMap (https://github.com/nipy/nipype/pull/1039)
-* ENH: Inputs with name_source can be now chained in cascade (https://github.com/nipy/nipype/pull/938)
-* ENH: Improve JSON interfaces: default settings when reading and consistent output creation
- when writing (https://github.com/nipy/nipype/pull/1047)
-* FIX: AddCSVRow problems when using infields (https://github.com/nipy/nipype/pull/1028)
-* FIX: Removed unused ANTS registration flag (https://github.com/nipy/nipype/pull/999)
-* FIX: Amend create_tbss_non_fa() workflow to match FSL's tbss_non_fa command. (https://github.com/nipy/nipype/pull/1033)
-* FIX: remove unused mandatory flag from spm normalize (https://github.com/nipy/nipype/pull/1048)
-* ENH: Update ANTSCorticalThickness interface (https://github.com/nipy/nipype/pull/1013)
-* FIX: Edge case with sparsemodels and PEP8 cleanup (https://github.com/nipy/nipype/pull/1046)
-* ENH: New io interfaces for JSON files reading/writing (https://github.com/nipy/nipype/pull/1020)
-* ENH: Enhanced openfmri script to support freesurfer linkage (https://github.com/nipy/nipype/pull/1037)
-* BUG: matplotlib is supposed to be optional (https://github.com/nipy/nipype/pull/1003)
-* FIX: Fix split_filename behaviour when path has no file component (https://github.com/nipy/nipype/pull/1035)
-* ENH: Updated FSL dtifit to include option for grad non-linearities (https://github.com/nipy/nipype/pull/1032)
-* ENH: Updated Camino tracking interfaces, which can now use FSL bedpostx output.
- New options also include choice of tracker, interpolator, stepsize and
- curveinterval for angle threshold (https://github.com/nipy/nipype/pull/1029)
-* FIX: Interfaces redirecting X crashed if $DISPLAY not defined (https://github.com/nipy/nipype/pull/1027)
-* FIX: Bug crashed 'make api' (https://github.com/nipy/nipype/pull/1026)
-* ENH: Updated antsIntroduction to handle RA and RI registrations (https://github.com/nipy/nipype/pull/1009)
-* ENH: Updated N4BiasCorrection input spec to include weight image and spline order. Made
- argument formatting consistent. Cleaned ants.segmentation according to PEP8.
- (https://github.com/nipy/nipype/pull/990/files)
-* ENH: SPM12 Normalize interface (https://github.com/nipy/nipype/pull/986)
-* FIX: Utility interface test dir (https://github.com/nipy/nipype/pull/986)
-* FIX: IPython engine directory reset after crash (https://github.com/nipy/nipype/pull/987)
-* ENH: Resting state fMRI example with NiPy realignment and no SPM (https://github.com/nipy/nipype/pull/992)
-* FIX: Corrected Freesurfer SegStats _list_outputs to avoid error if summary_file is
- undefined (issue #994)(https://https://github.com/nipy/nipype/pull/996)
-* FIX: OpenfMRI support and FSL 5.0.7 changes (https://github.com/nipy/nipype/pull/1006)
-* FIX: Output prefix in SPM Normalize with modulation (https://github.com/nipy/nipype/pull/1023)
-* ENH: Usability improvements in cluster environments (https://github.com/nipy/nipype/pull/1025)
-* ENH: ANTs JointFusion() (https://github.com/nipy/nipype/pull/1042)
-* ENH: Added csvReader() utility (https://github.com/nipy/nipype/pull/1044)
-* FIX: typo in nipype.interfaces.freesurfer.utils.py Tkregister2 (https://github.com/nipy/nipype/pull/1083)
-* FIX: SSHDataGrabber outputs now return full path to the grabbed/downloaded files. (https://github.com/nipy/nipype/pull/1086)
-* FIX: Add QA output for TSNR to resting workflow (https://github.com/nipy/nipype/pull/1088)
-* FIX: Change N4BiasFieldCorrection to use short tag for dimensionality (backward compatible) (https://github.com/nipy/nipype/pull/1096)
-* ENH: Added -newgrid input to Warp in AFNI (3dWarp wrapper) (https://github.com/nipy/nipype/pull/1128)
-* FIX: Fixed AFNI Copy interface to use positional inputs as required (https://github.com/nipy/nipype/pull/1131)
-* ENH: Added a check in Dcm2nii to check if nipype created the config.ini file and remove if true (https://github.com/nipy/nipype/pull/1132)
-* ENH: Use a while loop to wait for Xvfb (up to a max wait time "xvfb_max_wait" in config file, default 10)
- (https://github.com/nipy/nipype/pull/1142)
-
-Release 0.10.0 (October 10, 2014)
-============
-
-* ENH: New miscelaneous interfaces: SplitROIs (mapper), MergeROIs (reducer)
- to enable parallel processing of very large images.
-* ENH: Updated FSL interfaces: BEDPOSTX and XFibres, former interfaces are still
- available with the version suffix: BEDPOSTX4 and XFibres4. Added gpu
- versions of BEDPOSTX: BEDPOSTXGPU, BEDPOSTX5GPU, and BEDPOSTX4GPU
-* ENH: Added experimental support for MIPAV algorithms thorugh JIST plugins
-* ENH: New dipy interfaces: Denoise, Resample
-* ENH: New Freesurfer interfaces: Tkregister2 (for conversion of fsl style matrices to freesurfer format), MRIPretess
-* ENH: New FSL interfaces: WarpPoints, WarpPointsToStd, EpiReg, ProbTrackX2, WarpUtils, ConvertWarp
-* ENH: New miscelaneous interfaces: AddCSVRow, NormalizeProbabilityMapSet, AddNoise
-* ENH: New AFNI interfaces: Eval, Means, SVMTest, SVMTrain
-* ENH: FUGUE interface has been refactored to use the name_template system, 3 examples
- added to doctests, some bugs solved.
-* API: Interfaces to external packages are no longer available in the top-level
- ``nipype`` namespace, and must be imported directly (e.g.
- ``from nipype.interfaces import fsl``).
-* ENH: Support for elastix via a set of new interfaces: Registration, ApplyWarp,
- AnalyzeWarp, PointsWarp, and EditTransform
-* ENH: New ANTs interface: ApplyTransformsToPoints, LaplacianThickness
-* ENH: New Diffusion Toolkit interface: TrackMerge
-* ENH: New MRtrix interface: FilterTracks
-* ENH: New metrics group in algorithms. Now Distance, Overlap, and FuzzyOverlap
- are found in nipype.algorithms.metrics instead of misc. Overlap interface
- extended to allow files containing multiple ROIs and volume physical units.
-* ENH: New interface in algorithms.metrics: ErrorMap (a voxel-wise diff map).
-* ENH: New FreeSurfer workflow: create_skullstripped_recon_flow()
-* ENH: Deep revision of workflows for correction of dMRI artifacts. New dmri_preprocessing
- example.
-* ENH: New data grabbing interface that works over SSH connections, SSHDataGrabber
-* ENH: New color mode for write_graph
-* ENH: You can now force MapNodes to be run serially
-* ENH: Added ANTS based openfmri workflow
-* ENH: MapNode now supports flattening of nested lists
-* ENH: Support for headless mode using Xvfb
-* ENH: nipype_display_crash has a debugging mode
-* FIX: MRTrix tracking algorithms were ignoring mask parameters.
-* FIX: FNIRT registration pathway and associated OpenFMRI example script
-* FIX: spm12b compatibility for Model estimate
-* FIX: Batch scheduler controls the number of maximum jobs properly
-* FIX: Update for FSL 5.0.7 which deprecated Contrast Manager
-
-Release 0.9.2 (January 31, 2014)
-============
-
-* FIX: DataFinder was broken due to a typo
-* FIX: Order of DataFinder outputs was not guaranteed, it's human sorted now
-* ENH: New interfaces: Vnifti2Image, VtoMat
-
-Release 0.9.1 (December 25, 2013)
-============
-
-* FIX: installation issues
-
-Release 0.9.0 (December 20, 2013)
-============
-
-* ENH: SelectFiles: a streamlined version of DataGrabber
-* ENH: new tools for defining workflows: JoinNode, synchronize and itersource
-* ENH: W3C PROV support with optional RDF export built into Nipype
-* ENH: Added support for Simple Linux Utility Resource Management (SLURM)
-* ENH: AFNI interfaces refactor, prefix, suffix are replaced by
- "flexible_%s_templates"
-* ENH: New SPM interfaces:
- - spm.ResliceToReference,
- - spm.DicomImport
-* ENH: New AFNI interfaces:
- - afni.AFNItoNIFTI
- - afni.TCorr1D
-* ENH: Several new interfaces related to Camino were added:
- - camino.SFPICOCalibData
- - camino.Conmat
- - camino.QBallMX
- - camino.LinRecon
- - camino.SFPeaks
- One outdated interface no longer part of Camino was removed:
- - camino.Conmap
-* ENH: Three new mrtrix interfaces were added:
- - mrtrix.GenerateDirections
- - mrtrix.FindShPeaks
- - mrtrix.Directions2Amplitude
-* ENH: New FSL interfaces:
- - fsl.PrepareFieldmap
- - fsl.TOPUP
- - fsl.ApplyTOPUP
- - fsl.Eddy
-* ENH: New misc interfaces:
- - FuzzyOverlap,
- - P2PDistance
-* ENH: New workflows: nipype.workflows.dmri.fsl.epi.[fieldmap_correction&topup_correction]
-* ENH: Added simplified outputname generation for command line interfaces.
-* ENH: Allow ants use a single mask image
-* ENH: Create configuration option for parameterizing directories with hashes
-* ENH: arrange nodes by topological sort with disconnected subgraphs
-* ENH: uses the nidm iri namespace for uuids
-* ENH: remove old reporting webpage
-* ENH: Added support for Vagrant
-
-* API: 'name' is now a positional argument for Workflow, Node, and MapNode constructors
-* API: SPM now defaults to SPM8 or SPM12b job format
-* API: DataGrabber and SelectFiles use human (or natural) sort now
-
-* FIX: Several fixes related to Camino interfaces:
- - ProcStreamlines would ignore many arguments silently (target, waypoint, exclusion ROIS, etc.)
- - DTLUTGen would silently round the "step", "snr" and "trace" parameters to integers
- - PicoPDFs would not accept more than one lookup table
- - PicoPDFs default pdf did not correspond to Camino default
- - Track input model names were outdated (and would generate an error)
- - Track numpds parameter could not be set for deterministic tractography
- - FA created output files with erroneous extension
-* FIX: Deals properly with 3d files in SPM Realign
-* FIX: SPM with MCR fixed
-* FIX: Cleaned up input and output spec metadata
-* FIX: example openfmri script now makes the contrast spec a hashed input
-* FIX: FILMGLS compatibility with FSL 5.0.5
-* FIX: Freesurfer recon-all resume now avoids setting inputs
-* FIX: File removal from node respects file associations img/hdr/mat, BRIK/HEAD
-
-Release 0.8.0 (May 8, 2013)
-===========================
-
-* ENH: New interfaces: nipy.Trim, fsl.GLM, fsl.SigLoss, spm.VBMSegment, fsl.InvWarp,
- dipy.TensorMode
-* ENH: Allow control over terminal output for commandline interfaces
-* ENH: Added preliminary support for generating Python code from Workflows.
-* ENH: New workflows for dMRI and fMRI pre-processing: added motion artifact correction
- with rotation of the B-matrix, and susceptibility correction for EPI imaging using
- fieldmaps. Updated eddy_correct pipeline to support both dMRI and fMRI, and new parameters.
-* ENH: Minor improvements to FSL's FUGUE and FLIRT interfaces
-* ENH: Added optional dilation of parcels in cmtk.Parcellate
-* ENH: Interpolation mode added to afni.Resample
-* ENH: Function interface can accept a list of strings containing import statements
- that allow external functions to run without their imports defined in the
- function body
-* ENH: Allow node configurations to override master configuration
-
-* FIX: SpecifyModel works with 3D files correctly now.
-
-Release 0.7.0 (Dec 18, 2012)
-============================
-
-* ENH: Add basic support for LSF plugin.
-* ENH: New interfaces: ICC, Meshfix, ants.Register, C3dAffineTool, ants.JacobianDeterminant,
- afni.AutoTcorrelate, DcmStack
-* ENH: New workflows: ants template building (both using 'ANTS' and the new 'antsRegistration')
-* ENH: New examples: how to use ANTS template building workflows (smri_ants_build_tmeplate),
- how to set SGE specific options (smri_ants_build_template_new)
-* ENH: added no_flatten option to Merge
-* ENH: added versioning option and checking to traits
-* ENH: added deprecation metadata to traits
-* ENH: Slicer interfaces were updated to version 4.1
-
-Release 0.6.0 (Jun 30, 2012)
-============================
-
-* API: display variable no longer encoded as inputs in commandline interfaces
-
-* ENH: input hash not modified when environment DISPLAY is changed
-* ENH: support for 3d files for TSNR calculation
-* ENH: Preliminary support for graph submission with SGE, PBS and Soma Workflow
-* ENH: New interfaces: MySQLSink, nipy.Similarity, WatershedBEM, MRIsSmooth,
- NetworkBasedStatistic, Atropos, N4BiasFieldCorrection, ApplyTransforms,
- fs.MakeAverageSubject, epidewarp.fsl, WarpTimeSeriesImageMultiTransform,
- AVScale, mri_ms_LDA
-* ENH: simple interfaces for spm
-
-* FIX: CompCor component calculation was erroneous
-* FIX: filename generation for AFNI and PRELUDE
-* FIX: improved slicer module autogeneration
-* FIX: added missing options for BBRegsiter
-* FIX: functionality of remove_unnecessary_ouputs cleaned up
-* FIX: local hash check works with appropriate inputs
-* FIX: Captures all stdout from commandline programs
-* FIX: Afni outputs should inherit from TraitedSpec
-
-Release 0.5.3 (Mar 23, 2012)
-============================
-
-* FIX: SPM model generation when output units is in scans
-
-Release 0.5.2 (Mar 14, 2012)
-============================
-
-* API: Node now allows specifying node level configuration for SGE/PBS clusters
-* API: Logging to file is disabled by default
-* API: New location of log file -> .nipype/nipype.cfg
-
-* ENH: Changing logging options via config works for distributed processing
-
-* FIX: Unittests on debian (logging and ipython)
-
-Release 0.5 (Mar 10, 2012)
-==========================
-
-* API: FSL defaults to Nifti when OUTPUTTYPE environment variable not found
-* API: By default inputs are removed from Node working directory
-* API: InterfaceResult class is now versioned and stores class type not instance
-* API: Added FIRST interface
-* API: Added max_jobs paramter to plugin_args. limits the number of jobs
- executing at any given point in time
-* API: crashdump_dir is now a config execution option
-* API: new config execution options for controlling hash checking, execution and
- logging behavior when running in distributed mode.
-* API: Node/MapNode has new attribute that allows it to run on master thread.
-* API: IPython plugin now invokes IPython 0.11 or greater
-* API: Canned workflows are now all under a different package structure
-* API: SpecifyModel event_info renamed to event_files
-* API: DataGrabber is always being rerun (unless overwrite is set to False on
- Node level)
-* API: "stop_on_first_rerun" does not stop for DataGrabber (unless overwrite is
- set to True on Node level)
-* API: Output prefix can be set for spm nodes (SliceTiming, Realign, Coregister,
- Normalize, Smooth)
-
-* ENH: Added fsl resting state workflow based on behzadi 2007 CompCorr method.
-* ENH: TSNR node produces mean and std-dev maps; allows polynomial detrending
-* ENH: IdentityNodes are removed prior to execution
-* ENH: Added Michael Notter's beginner's guide
-* ENH: Added engine support for status callback functions
-* ENH: SPM create warped node
-* ENH: All underlying interfaces (including python ones) are now optional
-* ENH: Added imperative programming option with Nodes and caching
-* ENH: Added debug mode to configuration
-* ENH: Results can be stored and loaded without traits exceptions
-* ENH: Added concurrent log handler for distributed writing to log file
-* ENH: Reporting can be turned off using config
-* ENH: Added stats files to FreeSurferOutput
-* ENH: Support for Condor through qsub emulation
-* ENH: IdentityNode with iterable expansion takes place after remaining Identity
- Node removal
-* ENH: Crashfile display script added
-* ENH: Added FmriRealign4d node wrapped from nipy
-* ENH: Added TBSS workflows and examples
-* ENH: Support for openfmri data processing
-* ENH: Package version check
-
-* FIX: Fixed spm preproc workflow to cater to multiple functional runs
-* FIX: Workflow outputs displays nodes with empty outputs
-* FIX: SUSAN workflow works without usans
-* FIX: SGE fixed for reading custom templates
-* FIX: warping in SPM realign, Dartel and interpolation parameters
-* FIX: Fixed voxel size parameter in freesurfer mri_convert
-* FIX: 4D images in spm coregister
-* FIX: Works around matlab tty bug
-* FIX: Overwriting connection raises exception
-* FIX: Outputs are loaded from results and not stored in memory for during
- distributed operation
-* FIX: SPM threshold uses SPM.mat name and improved error detection
-* FIX: Removing directory contents works even when a node has no outputs
-* FIX: DARTEL workflows will run only when SPM 8 is available
-* FIX: SPM Normalize estimate field fixed
-* FIX: hashmethod argument now used for calculating hash of old file
-* FIX: Modelgen now allows FSL style event files
-
-Release 0.4.1 (Jun 16, 2011)
-============================
-
-* Minor bugfixes
-
-Release 0.4 (Jun 11, 2011)
-==========================
-
-* API: Timestamp hashing does not use ctime anymore. Please update your hashes by
- running workflows with updatehash=True option
- NOTE: THIS IS THE DEFAULT CONFIG NOW, so unless you updatehash, workflows will
- rerun
-* API: Workflow run function no longer supports (inseries, createdirsonly).
- Functions used in connect string must be pickleable
-* API: SPM EstimateContrast: ignore_derivs replaced by use_derivs
-* API: All interfaces: added new config option ignore_exception
-* API: SpecifModel no longer supports (concatenate_runs, output_specs). high_pass_filter
- cutoff is mandatory (even if its set to np.inf). Additional interfaces
- SpecifySPMModel and SpecifySparseModel support other types of data.
-* API: fsl.DTIFit input "save" is now called "save_tensor"
-* API: All inputs of IdentityInterfaces are mandatory by default. You can turn
- this off by specifying mandatory_inputs=False to the constructor.
-* API: fsl FILMGLS input "autocorr_estimate" is now called "autocorr_estimate_only"
-* API: fsl ContrastMgr now requires access to specific files (no longer accepts
- the result directory)
-* API: freesurfer.GLMFit input "surf" is now a boolean with three corresponding
- inputs -- subject_id, hemi, and surf_geo
-
-* ENH: All commandline interfaces display stdout and stderr
-* ENH: All interfaces raise exceptions on error with an option to suppress
-* ENH: Supports a plugin interface for execution (current support for multiprocessing,
- IPython, SGE, PBS)
-* ENH: MapNode runs in parallel under IPython, SGE, MultiProc, PBS
-* ENH: Optionally allows keeping only required outputs
-* ENH: New interface: utility.Rename to change the name of files, optionally
- using python string-formatting with inputs or regular expressions matching
-* ENH: New interface: freesurfer.ApplyMask (mri_mask)
-* ENH: New FSL interface -- SwapDimensions (fslswapdim)
-* ENH: Sparse models allow regressor scaling and temporal derivatives
-* ENH: Added support for the component parts of FSL's TBSS workflow (TBSSSkeleton
- and DistanceMap)
-* ENH: dcm2nii interface exposes bvals, bvecs, reoriented and cropped images
-* ENH: Added several higher-level interfaces to the fslmaths command:
- - ChangeDataType, Threshold, MeanImage, IsotropicSmooth, ApplyMask, TemporalFilter
- DilateImage, ErodeImage, SpatialFilter, UnaryMaths, BinaryMaths, MultiImageMaths
-* ENH: added support for networx 1.4 and improved iterable expansion
-* ENH: Replaced BEDPOSTX and EddyCurrent with nipype pipelines
-* ENH: Ability to create a hierarchical dot file
-* ENH: Improved debugging information for rerunning nodes
-* ENH: Added 'stop_on_first_rerun' option
-* ENH: Added support for Camino
-* ENH: Added support for Camino2Trackvis
-* ENH: Added support for Connectome Viewer
-
-* BF: dcm2nii interface handles gzipped files correctly
-* BF: FNIRT generates proper outputs
-* BF: fsl.DTIFit now properly collects tensor volume
-* BF: updatehash now removes old result hash file
-
-Release 0.3.4 (Jan 12, 2011)
-============================
-
-* API: hash values for float use a string conversion up to the 10th decimal place.
-* API: Iterables in output path will always be generated as _var1_val1_var2_val2 pairs
-
-* ENH: Added support to nipy: GLM fit, contrast estimation and calculating mask from EPI
-* ENH: Added support for manipulating surface files in Freesurfer:
- - projecting volume images onto the surface
- - smoothing along the surface
- - transforming a surface image from one subject to another
- - using tksurfer to save pictures of the surface
-* ENH: Added support for flash processing using FreeSurfer
-* ENH: Added support for flirt matrix in BBRegister
-* ENH: Added support for FSL convert_xfm
-* ENH: hashes can be updated again without rerunning all nodes.
-* ENH: Added multiple regression design for FSL
-* ENH: Added SPM based Analyze to Nifti converter
-* ENH: Added increased support for PyXNAT
-* ENH: Added support for MCR-based binary version of SPM
-* ENH: Added SPM node for calculating various threshold statistics
-* ENH: Added distance and dissimilarity measurements
-
-* BF: Diffusion toolkit gets installed
-* BF: Changed FNIRT interface to accept flexible lists (rather than 4-tuples)
- on all options specific to different subsampling levels
-
-Release 0.3.3 (Sep 16, 2010)
-============================
-
-* API: subject_id in ModelSpec is now deprecated
-* API: spm.Threshold
- - does not need mask, beta, RPV anymore
- - takes only one image (stat_image - mind the name change)
- - works with SPM2 SPM.mat
- - returns additional map - pre topological FDR
-
-* ENH: Added support for Diffusion toolkit
-* ENH: Added support for FSL slicer and overlay
-* ENH: Added support for dcm2nii
-
-* BF: DataSink properly handles lists of lists now
-* BF: DataGrabber has option for raising Exception on getting empty lists
-* BF: Traits logic for 'requires' metadata
-* BF: allows workflows to be relocatable
-* BF: nested workflows with connections don't raise connection not found error
-* BF: multiple workflows with identical nodenames and iterables do not create nestsed workflows
-
-Release 0.3.2 (Aug 03, 2010)
-============================
-
-Enhancements
-------------
-
- - all outputs from nodes are now pickled as part of workflow processing
- - added git developer docs
-
-Bugs fixed
-----------
-
-* FreeSurfer
-
- - Fixed bugs in SegStats doctest
-
-Release 0.3.1 (Jul 29, 2010)
-============================
-
-Bugs fixed
-----------
-
-* FreeSurfer
-
- - Fixed bugs in glmfit and concatenate
- - Added group t-test to freesurfer tutorial
-
-Release 0.3 (Jul 27, 2010)
-==========================
-
-Incompatible changes
---------------------
-
-* Complete redesign of the Interface class - heavy use of Traits.
-
-* Changes in the engine API - added Workflow and MapNode. Compulsory name argument.
-
-Features added
---------------
-
-* General:
-
- - Type checking of inputs and outputs using Traits from ETS_.
- - Support for nested workflows.
- - Preliminary Slicer and AFNI support.
- - New flexible DataGrabber node.
- - AtlasPick and Threshold nodes.
- - Preliminary support for XNAT.
- - Doubled number of the tutorials.
-
-* FSL:
-
- - Added DTI processing nodes (note that TBSS nodes are still experimental).
- - Recreated FEAT workflow.
-
-* SPM:
-
- - Added New Segment and many other nodes.
- - Redesigned second level analysis.
diff --git a/bin/nipype_crash_search.py b/bin/nipype_crash_search.py
new file mode 100755
index 0000000000..068e5d600a
--- /dev/null
+++ b/bin/nipype_crash_search.py
@@ -0,0 +1,72 @@
+#!/usr/bin/env python
+"""Search for tracebacks inside a folder of nipype crash
+log files that match a given regular expression.
+
+Examples:
+nipype_crash_search -l nipype/wd/log -r '.*subject123.*'
+"""
+import re
+import os.path as op
+from glob import glob
+
+from traits.trait_errors import TraitError
+from nipype.utils.filemanip import loadcrash
+
+
+def load_pklz_traceback(crash_filepath):
+    """Return the traceback message stored in the given crash file."""
+    try:
+        data = loadcrash(crash_filepath)
+    except TraitError as te:
+        return str(te)
+    else:
+        return '\n'.join(data['traceback'])
+
+
+def iter_tracebacks(logdir):
+    """Iterate over the crash files inside `logdir`, yielding
+    each file path together with its traceback.
+
+    Parameters
+    ----------
+    logdir: str
+        Path to the log folder.
+
+    Yields
+    ------
+    path_file: str
+
+    traceback: str
+    """
+    crash_files = sorted(glob(op.join(logdir, '*.pkl*')))
+
+    for cf in crash_files:
+        yield cf, load_pklz_traceback(cf)
+
+
+def display_crash_search(logdir, regex):
+    rex = re.compile(regex, re.IGNORECASE)
+    for fname, trace in iter_tracebacks(logdir):
+        if rex.search(trace):
+            print("-" * len(fname))
+            print(fname)
+            print("-" * len(fname))
+            print(trace)
+
+
+if __name__ == "__main__":
+    from argparse import ArgumentParser, RawTextHelpFormatter
+    defstr = ' (default %(default)s)'
+    parser = ArgumentParser(prog='nipype_crash_search',
+                            description=__doc__,
+                            formatter_class=RawTextHelpFormatter)
+    parser.add_argument('-l', '--logdir', type=str, dest='logdir',
+                        action="store", default='.',
+                        help='Path to the log folder to search.' + defstr)
+    parser.add_argument('-r', '--regex', dest='regex',
+                        default='.*',
+                        help='Regular expression to be searched in each traceback.' + defstr)
+
+    args = parser.parse_args()
+
+    display_crash_search(args.logdir, args.regex)
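The core of the script above is a case-insensitive regex filter over traceback strings. As a standalone sketch (independent of nipype's `loadcrash`), with `fake_tracebacks` as a made-up stand-in for the unpickled crash files:

```python
import re

# Hypothetical stand-in for loadcrash() results: crash-file path -> traceback text.
fake_tracebacks = {
    'crash-node_a.pklz': "Traceback ...\nIOError: file for subject123 not found",
    'crash-node_b.pklz': "Traceback ...\nMemoryError",
}

def search_crashes(tracebacks, regex):
    """Return the paths whose traceback matches `regex`, case-insensitively."""
    rex = re.compile(regex, re.IGNORECASE)
    return [path for path, trace in sorted(tracebacks.items())
            if rex.search(trace)]

print(search_crashes(fake_tracebacks, '.*subject123.*'))
# ['crash-node_a.pklz']
```

Note that `re.search` matches anywhere in the string, so a plain substring like `'memoryerror'` works as well as a full pattern.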
diff --git a/doc/_templates/layout.html b/doc/_templates/layout.html
index 8a3b08697e..a8de8d176b 100644
--- a/doc/_templates/layout.html
+++ b/doc/_templates/layout.html
@@ -14,6 +14,12 @@
ga('create', 'UA-339450-7', 'nipy.org/nipype');
ga('send', 'pageview');
+
+
{% endblock %}
{% block header %}
diff --git a/doc/users/debug.rst b/doc/users/debug.rst
index 91aa3b6f2f..303df9f82a 100644
--- a/doc/users/debug.rst
+++ b/doc/users/debug.rst
@@ -42,6 +42,9 @@ performance issues.
#. All Nipype crashfiles can be inspected with the `nipype_display_crash`
utility.
+#. The `nipype_crash_search` command allows you to search for regular expressions
+   in the tracebacks of the Nipype crashfiles within a log folder.
+
#. Nipype determines the hash of the input state of a node. If any input
contains strings that represent files on the system path, the hash evaluation
mechanism will determine the timestamp or content hash of each of those
diff --git a/nipype/interfaces/freesurfer/model.py b/nipype/interfaces/freesurfer/model.py
index 607586d6f6..5196292835 100644
--- a/nipype/interfaces/freesurfer/model.py
+++ b/nipype/interfaces/freesurfer/model.py
@@ -752,8 +752,9 @@ def _list_outputs(self):
def _format_arg(self, name, spec, value):
if name in ('summary_file', 'avgwf_txt_file'):
- if not os.path.isabs(value):
- value = os.path.join('.', value)
+ if not isinstance(value, bool):
+ if not os.path.isabs(value):
+ value = os.path.join('.', value)
if name in ['avgwf_txt_file', 'avgwf_file', 'sf_avg_file']:
if isinstance(value, bool):
fname = self._list_outputs()[name]
@@ -779,7 +780,7 @@ class SegStatsReconAllInputSpec(SegStatsInputSpec):
# implicit
ribbon = traits.File(mandatory=True, exists=True,
desc="Input file mri/ribbon.mgz")
- presurf_seg = File(mandatory=True, exists=True,
+ presurf_seg = File(exists=True,
desc="Input segmentation volume")
transform = File(mandatory=True, exists=True,
desc="Input transform file")
@@ -795,6 +796,8 @@ class SegStatsReconAllInputSpec(SegStatsInputSpec):
desc="Input file must be /surf/lh.pial")
rh_pial = File(mandatory=True, exists=True,
desc="Input file must be /surf/rh.pial")
+ aseg = File(exists=True,
+ desc="Mandatory implicit input in 5.3")
copy_inputs = traits.Bool(desc="If running as a node, set this to True " +
"otherwise, this will copy the implicit inputs " +
"to the node directory.")
@@ -858,7 +861,7 @@ def run(self, **inputs):
copy2subjdir(self, self.inputs.lh_white,
'surf', 'lh.white')
copy2subjdir(self, self.inputs.rh_white,
- 'Surf', 'Rh.White')
+ 'surf', 'rh.white')
copy2subjdir(self, self.inputs.lh_pial,
'surf', 'lh.pial')
copy2subjdir(self, self.inputs.rh_pial,
@@ -867,12 +870,13 @@ def run(self, **inputs):
'mri', 'ribbon.mgz')
copy2subjdir(self, self.inputs.presurf_seg,
'mri', 'aseg.presurf.mgz')
+ copy2subjdir(self, self.inputs.aseg,
+ 'mri', 'aseg.mgz')
copy2subjdir(self, self.inputs.transform,
os.path.join('mri', 'transforms'),
'talairach.xfm')
copy2subjdir(self, self.inputs.in_intensity, 'mri')
- if isdefined(self.inputs.brainmask_file):
- copy2subjdir(self, self.inputs.brainmask_file, 'mri')
+ copy2subjdir(self, self.inputs.brainmask_file, 'mri')
return super(SegStatsReconAll, self).run(**inputs)
diff --git a/nipype/interfaces/freesurfer/tests/test_auto_Aparc2Aseg.py b/nipype/interfaces/freesurfer/tests/test_auto_Aparc2Aseg.py
index 27ea0eb162..8925054f05 100644
--- a/nipype/interfaces/freesurfer/tests/test_auto_Aparc2Aseg.py
+++ b/nipype/interfaces/freesurfer/tests/test_auto_Aparc2Aseg.py
@@ -19,6 +19,7 @@ def test_Aparc2Aseg_inputs():
environ=dict(nohash=True,
usedefault=True,
),
+ filled=dict(),
hypo_wm=dict(argstr='--hypo-as-wm',
mandatory=False,
),
diff --git a/nipype/interfaces/freesurfer/tests/test_auto_Curvature.py b/nipype/interfaces/freesurfer/tests/test_auto_Curvature.py
index 5be32abc21..851c2acff9 100644
--- a/nipype/interfaces/freesurfer/tests/test_auto_Curvature.py
+++ b/nipype/interfaces/freesurfer/tests/test_auto_Curvature.py
@@ -19,6 +19,7 @@ def test_Curvature_inputs():
usedefault=True,
),
in_file=dict(argstr='%s',
+ copyfile=True,
mandatory=True,
position=-2,
),
diff --git a/nipype/interfaces/freesurfer/tests/test_auto_MakeSurfaces.py b/nipype/interfaces/freesurfer/tests/test_auto_MakeSurfaces.py
index f2ca8c2a64..a34894fe99 100644
--- a/nipype/interfaces/freesurfer/tests/test_auto_MakeSurfaces.py
+++ b/nipype/interfaces/freesurfer/tests/test_auto_MakeSurfaces.py
@@ -35,6 +35,7 @@ def test_MakeSurfaces_inputs():
in_orig=dict(argstr='-orig %s',
mandatory=True,
),
+ in_white=dict(),
in_wm=dict(mandatory=True,
),
longitudinal=dict(argstr='-long',
@@ -66,6 +67,8 @@ def test_MakeSurfaces_inputs():
subjects_dir=dict(),
terminal_output=dict(nohash=True,
),
+ white=dict(argstr='-white %s',
+ ),
white_only=dict(argstr='-whiteonly',
mandatory=False,
),
diff --git a/nipype/interfaces/freesurfer/tests/test_auto_SegStatsReconAll.py b/nipype/interfaces/freesurfer/tests/test_auto_SegStatsReconAll.py
index e15967ffec..4c0f8dcde6 100644
--- a/nipype/interfaces/freesurfer/tests/test_auto_SegStatsReconAll.py
+++ b/nipype/interfaces/freesurfer/tests/test_auto_SegStatsReconAll.py
@@ -10,6 +10,7 @@ def test_SegStatsReconAll_inputs():
),
args=dict(argstr='%s',
),
+ aseg=dict(),
avgwf_file=dict(argstr='--avgwfvol %s',
),
avgwf_txt_file=dict(argstr='--avgwf %s',
@@ -85,8 +86,7 @@ def test_SegStatsReconAll_inputs():
),
partial_volume_file=dict(argstr='--pv %s',
),
- presurf_seg=dict(mandatory=True,
- ),
+ presurf_seg=dict(),
rh_orig_nofix=dict(mandatory=True,
),
rh_pial=dict(mandatory=True,
diff --git a/nipype/interfaces/freesurfer/tests/test_auto_VolumeMask.py b/nipype/interfaces/freesurfer/tests/test_auto_VolumeMask.py
index 81786550f5..c87f8716b5 100644
--- a/nipype/interfaces/freesurfer/tests/test_auto_VolumeMask.py
+++ b/nipype/interfaces/freesurfer/tests/test_auto_VolumeMask.py
@@ -6,6 +6,8 @@
def test_VolumeMask_inputs():
input_map = dict(args=dict(argstr='%s',
),
+ aseg=dict(xor=['in_aseg'],
+ ),
copy_inputs=dict(mandatory=False,
),
environ=dict(nohash=True,
@@ -16,6 +18,7 @@ def test_VolumeMask_inputs():
),
in_aseg=dict(argstr='--aseg_name %s',
mandatory=False,
+ xor=['aseg'],
),
left_ribbonlabel=dict(argstr='--label_left_ribbon %d',
mandatory=True,
diff --git a/nipype/interfaces/freesurfer/utils.py b/nipype/interfaces/freesurfer/utils.py
index 1b4b5e3d15..6f03a11f6e 100644
--- a/nipype/interfaces/freesurfer/utils.py
+++ b/nipype/interfaces/freesurfer/utils.py
@@ -35,21 +35,30 @@
def copy2subjdir(cls, in_file, folder=None, basename=None, subject_id=None):
+ """Copy an input file to the subjects directory"""
+ # check that the input is defined
+ if not isdefined(in_file):
+ return in_file
+ # check that subjects_dir is defined
if isdefined(cls.inputs.subjects_dir):
subjects_dir = cls.inputs.subjects_dir
else:
- subjects_dir = os.getcwd()
+ subjects_dir = os.getcwd() # if not defined, use the cwd
+ # check for subject_id
if not subject_id:
if isdefined(cls.inputs.subject_id):
subject_id = cls.inputs.subject_id
else:
- subject_id = 'subject_id'
+ subject_id = 'subject_id' # default
+ # check for basename
if basename == None:
basename = os.path.basename(in_file)
+ # check which folder to put the file in
if folder != None:
out_dir = os.path.join(subjects_dir, subject_id, folder)
else:
out_dir = os.path.join(subjects_dir, subject_id)
+ # make the output folder if it does not exist
if not os.path.isdir(out_dir):
os.makedirs(out_dir)
out_file = os.path.join(out_dir, basename)
@@ -58,7 +67,7 @@ def copy2subjdir(cls, in_file, folder=None, basename=None, subject_id=None):
return out_file
def createoutputdirs(outputs):
- """create an output directories. If not created, some freesurfer interfaces fail"""
+ """create all output directories. If not created, some freesurfer interfaces fail"""
for output in outputs.itervalues():
dirname = os.path.dirname(output)
if not os.path.isdir(dirname):
@@ -527,11 +536,6 @@ class ApplyMask(FSCommand):
input_spec = ApplyMaskInputSpec
output_spec = ApplyMaskOutputSpec
- def _list_outputs(self):
- outputs = self._outputs().get()
- outputs["out_file"] = os.path.abspath(self.inputs.out_file)
- return outputs
-
class SurfaceSnapshotsInputSpec(FSTraitedSpec):
@@ -1872,6 +1876,7 @@ class MakeSurfacesInputSpec(FSTraitedSpec):
in_filled = File(exists=True, mandatory=True,
desc="Implicit input file filled.mgz")
# optional
+ in_white = File(exists=True, desc="Implicit input that is sometimes used")
in_label = File(exists=True, mandatory=False, xor=['noaparc'],
desc="Implicit input label/.aparc.annot")
orig_white = File(argstr="-orig_white %s", exists=True, mandatory=False,
@@ -1898,6 +1903,8 @@ class MakeSurfacesInputSpec(FSTraitedSpec):
argstr="-max %.1f", desc="No documentation (used for longitudinal processing)")
longitudinal = traits.Bool(
argstr="-long", desc="No documentation (used for longitudinal processing)")
+ white = traits.String(argstr="-white %s",
+ desc="White surface name")
copy_inputs = traits.Bool(mandatory=False,
desc="If running as a node, set this to True." +
"This will copy the input files to the node " +
@@ -1952,15 +1959,15 @@ def run(self, **inputs):
folder='mri', basename='wm.mgz')
copy2subjdir(self, self.inputs.in_filled,
folder='mri', basename='filled.mgz')
+ copy2subjdir(self, self.inputs.in_white,
+ 'surf', '{0}.white'.format(self.inputs.hemisphere))
for originalfile in [self.inputs.in_aseg,
self.inputs.in_T1]:
- if isdefined(originalfile):
- copy2subjdir(self, originalfile, folder='mri')
+ copy2subjdir(self, originalfile, folder='mri')
for originalfile in [self.inputs.orig_white,
self.inputs.orig_pial,
self.inputs.in_orig]:
- if isdefined(originalfile):
- copy2subjdir(self, originalfile, folder='surf')
+ copy2subjdir(self, originalfile, folder='surf')
if isdefined(self.inputs.in_label):
copy2subjdir(self, self.inputs.in_label, 'label',
'{0}.aparc.annot'.format(self.inputs.hemisphere))
@@ -1977,9 +1984,11 @@ def _format_arg(self, name, spec, value):
basename = os.path.basename(value)
# when the -mgz flag is specified, it assumes the mgz extension
if self.inputs.mgz:
- prefix = basename.rstrip('.mgz')
+ prefix = os.path.splitext(basename)[0]
else:
prefix = basename
+ if prefix == 'aseg':
+ return # aseg is already the default
return spec.argstr % prefix
elif name in ['orig_white', 'orig_pial']:
# these inputs do take full file paths or even basenames
@@ -2016,8 +2025,8 @@ def _list_outputs(self):
dest_dir, str(self.inputs.hemisphere) + '.area')
# Something determines when a pial surface and thickness file is generated
# but documentation doesn't say what.
- # The orig_pial flag is just a guess
- if isdefined(self.inputs.orig_pial):
+ # The orig_pial input is just a guess
+ if isdefined(self.inputs.orig_pial) or self.inputs.white == 'NOWRITE':
outputs["out_curv"] = outputs["out_curv"] + ".pial"
outputs["out_area"] = outputs["out_area"] + ".pial"
outputs["out_pial"] = os.path.join(
@@ -2034,7 +2043,7 @@ def _list_outputs(self):
class CurvatureInputSpec(FSTraitedSpec):
in_file = File(argstr="%s", position=-2, mandatory=True, exists=True,
- desc="Input file for Curvature")
+ copyfile=True, desc="Input file for Curvature")
# optional
threshold = traits.Float(
argstr="-thresh %.3f", mandatory=False, desc="Undocumented input threshold")
@@ -2078,7 +2087,6 @@ def _format_arg(self, name, spec, value):
if self.inputs.copy_input:
if name == 'in_file':
basename = os.path.basename(value)
- shutil.copy(value, basename)
return spec.argstr % basename
return super(Curvature, self)._format_arg(name, spec, value)
@@ -2306,11 +2314,16 @@ class VolumeMaskInputSpec(FSTraitedSpec):
desc="Implicit input left white matter surface")
rh_white = File(mandatory=True, exists=True,
desc="Implicit input right white matter surface")
+ aseg = File(exists=True,
+ xor=['in_aseg'],
+ desc="Implicit aseg.mgz segmentation. " +
+ "Specify a different aseg by using the 'in_aseg' input.")
subject_id = traits.String('subject_id', usedefault=True,
position=-1, argstr="%s", mandatory=True,
desc="Subject being processed")
# optional
- in_aseg = File(argstr="--aseg_name %s", mandatory=False, exists=True,
+ in_aseg = File(argstr="--aseg_name %s", mandatory=False,
+ exists=True, xor=['aseg'],
desc="Input aseg file for VolumeMask")
save_ribbon = traits.Bool(argstr="--save_ribbon", mandatory=False,
desc="option to save just the ribbon for the " +
@@ -2369,6 +2382,8 @@ def run(self, **inputs):
copy2subjdir(self, self.inputs.lh_white, 'surf', 'lh.white')
copy2subjdir(self, self.inputs.rh_white, 'surf', 'rh.white')
copy2subjdir(self, self.inputs.in_aseg, 'mri')
+ copy2subjdir(self, self.inputs.aseg, 'mri', 'aseg.mgz')
+
return super(VolumeMask, self).run(**inputs)
def _format_arg(self, name, spec, value):
@@ -2731,6 +2746,8 @@ class Aparc2AsegInputSpec(FSTraitedSpec):
rh_annotation = File(mandatory=True, exists=True,
desc="Input file must be /label/rh.aparc.annot")
# optional
+ filled = File(exists=True,
+ desc="Implicit input filled file. Only required with FS v5.3.")
aseg = File(argstr="--aseg %s", mandatory=False, exists=True,
desc="Input aseg file")
volmask = traits.Bool(argstr="--volmask", mandatory=False,
@@ -2813,6 +2830,7 @@ def run(self, **inputs):
copy2subjdir(self, self.inputs.rh_ribbon, 'mri', 'rh.ribbon.mgz')
copy2subjdir(self, self.inputs.ribbon, 'mri', 'ribbon.mgz')
copy2subjdir(self, self.inputs.aseg, 'mri')
+ copy2subjdir(self, self.inputs.filled, 'mri', 'filled.mgz')
copy2subjdir(self, self.inputs.lh_annotation, 'label')
copy2subjdir(self, self.inputs.rh_annotation, 'label')
@@ -2821,7 +2839,8 @@ def run(self, **inputs):
def _format_arg(self, name, spec, value):
if name == 'aseg':
# aseg does not take a full filename
- return spec.argstr % os.path.basename(value).replace('.mgz', '')
+ basename = os.path.basename(value).replace('.mgz', '')
+ return spec.argstr % basename
elif name == 'out_file':
return spec.argstr % os.path.abspath(value)
diff --git a/nipype/pipeline/plugins/sge.py b/nipype/pipeline/plugins/sge.py
index 547a3cbbb8..bbe5be9db8 100644
--- a/nipype/pipeline/plugins/sge.py
+++ b/nipype/pipeline/plugins/sge.py
@@ -60,7 +60,7 @@ def is_initializing(self):
return self._job_queue_state == "initializing"
def is_zombie(self):
- return self._job_queue_state == "zombie"
+ return self._job_queue_state == "zombie" or self._job_queue_state == "finished"
def is_running(self):
return self._job_queue_state == "running"
diff --git a/nipype/workflows/smri/freesurfer/autorecon1.py b/nipype/workflows/smri/freesurfer/autorecon1.py
index bc4fb8bff2..3b94315218 100644
--- a/nipype/workflows/smri/freesurfer/autorecon1.py
+++ b/nipype/workflows/smri/freesurfer/autorecon1.py
@@ -1,60 +1,40 @@
-import sys
-import os
-import nipype
from nipype.interfaces.utility import Function,IdentityInterface
import nipype.pipeline.engine as pe # pypeline engine
from nipype.interfaces.freesurfer import *
-from .utils import copy_file, copy_files
+from .utils import copy_file
+
def checkT1s(T1_files, cw256=False):
"""Verifying size of inputs and setting workflow parameters"""
- import SimpleITK as sitk
- import os
import sys
- # check that the files are in a list
- if not type(T1_files) == list:
- T1_files = [T1_files]
+ import nibabel as nib
+ from nipype.utils.filemanip import filename_to_list
+
+ T1_files = filename_to_list(T1_files)
if len(T1_files) == 0:
print("ERROR: No T1's Given")
sys.exit(-1)
- for i, t1 in enumerate(T1_files):
- if t1.endswith(".mgz"):
- # convert input fs files to NIFTI
- convert = MRIConvert()
- convert.inputs.in_file = t1
- convert.inputs.out_file = os.path.abspath(os.path.basename(t1).replace('.mgz', '.nii.gz'))
- convert.run()
- T1_files[i] = convert.inputs.out_file
- size = None
- origvol_names = list()
- for i, t1 in enumerate(T1_files):
- # assign an input number
- file_num = str(i + 1)
- while len(file_num) < 3:
- file_num = '0' + file_num
- origvol_names.append("{0}.mgz".format(file_num))
- # check the size of the image
- img = sitk.ReadImage(t1)
- if not size:
- size = img.GetSize()
- elif size != img.GetSize():
- print("ERROR: T1s not the same size. Cannot process {0} {1} together".format(T1_files[0],
- otherFilename))
+
+ shape = nib.load(T1_files[0]).shape
+ for t1 in T1_files[1:]:
+ if nib.load(t1).shape != shape:
+ print("ERROR: T1s not the same size. Cannot process {0} and {1} "
+ "together".format(T1_files[0], t1))
sys.exit(-1)
+
+ origvol_names = ["{0:03d}.mgz".format(i + 1) for i in range(len(T1_files))]
+
# check if cw256 is set to crop the images if size is larger than 256
- if not cw256:
- for dim in size:
- if dim > 256:
- print("Setting MRI Convert to crop images to 256 FOV")
- cw256 = True
- if len(T1_files) > 1:
- resample_type = 'cubic'
- else:
- resample_type = 'interpolate'
+ if not cw256 and any(dim > 256 for dim in shape):
+ print("Setting MRI Convert to crop images to 256 FOV")
+ cw256 = True
+
+ resample_type = 'cubic' if len(T1_files) > 1 else 'interpolate'
return T1_files, cw256, resample_type, origvol_names
-def create_AutoRecon1(name="AutoRecon1", longitudinal=False, field_strength='1.5T',
- custom_atlas=None, plugin_args=None):
+def create_AutoRecon1(name="AutoRecon1", longitudinal=False, distance=None,
+ custom_atlas=None, plugin_args=None, shrink=None, stop=None,
+ fsvernum=5.3):
"""Creates the AutoRecon1 workflow in nipype.
Inputs::
@@ -255,22 +235,11 @@ def createTemplate(in_files, out_file):
bias_correction = pe.Node(MNIBiasCorrection(), name="Bias_correction")
bias_correction.inputs.iterations = 1
bias_correction.inputs.protocol_iterations = 1000
- if field_strength == '3T':
- # 3T params from Zheng, Chee, Zagorodnov 2009 NeuroImage paper
- # "Improvement of brain segmentation accuracy by optimizing
- # non-uniformity correction using N3"
- # namely specifying iterations, proto-iters and distance:
- bias_correction.inputs.distance = 50
- else:
- # 1.5T default
- bias_correction.inputs.distance = 200
- # per c.larsen, decrease convergence threshold (default is 0.001)
- bias_correction.inputs.stop = 0.0001
- # per c.larsen, decrease shrink parameter: finer sampling (default is 4)
- bias_correction.inputs.shrink = 2
- # add the mask, as per c.larsen, bias-field correction is known to work
- # much better when the brain area is properly masked, in this case by
- # brainmask.mgz.
+ bias_correction.inputs.distance = distance
+ if stop:
+ bias_correction.inputs.stop = stop
+ if shrink:
+ bias_correction.inputs.shrink = shrink
bias_correction.inputs.no_rescale = True
bias_correction.inputs.out_file = 'orig_nu.mgz'
@@ -364,6 +333,27 @@ def awkfile(in_file, log_file):
tal_qc = pe.Node(TalairachQC(), name="Detect_Aligment_Failures")
ar1_wf.connect([(awk_logfile, tal_qc, [('log_file', 'log_file')])])
+
+ if fsvernum < 6:
+ # intensity correction is performed before normalization
+ intensity_correction = pe.Node(
+ MNIBiasCorrection(), name="Intensity_Correction")
+ intensity_correction.inputs.out_file = 'nu.mgz'
+ intensity_correction.inputs.iterations = 2
+ ar1_wf.connect([(add_xform_to_orig, intensity_correction, [('out_file', 'in_file')]),
+ (copy_transform, intensity_correction, [('out_file', 'transform')])])
+
+
+ add_to_header_nu = pe.Node(AddXFormToHeader(), name="Add_XForm_to_NU")
+ add_to_header_nu.inputs.copy_name = True
+ add_to_header_nu.inputs.out_file = 'nu.mgz'
+ ar1_wf.connect([(intensity_correction, add_to_header_nu, [('out_file', 'in_file'),
+ ]),
+ (copy_transform, add_to_header_nu,
+ [('out_file', 'transform')])
+ ])
+
+
# Intensity Normalization
# Performs intensity normalization of the orig volume and places the result in mri/T1.mgz.
# Attempts to correct for fluctuations in intensity that would otherwise make intensity-based
@@ -373,10 +363,13 @@ def awkfile(in_file, log_file):
mri_normalize = pe.Node(Normalize(), name="Normalize_T1")
mri_normalize.inputs.gradient = 1
mri_normalize.inputs.out_file = 'T1.mgz'
- ar1_wf.connect([(add_xform_to_orig_nu, mri_normalize, [('out_file', 'in_file')]),
- (copy_transform, mri_normalize,
- [('out_file', 'transform')]),
- ])
+
+ if fsvernum < 6:
+ ar1_wf.connect([(add_to_header_nu, mri_normalize, [('out_file', 'in_file')])])
+ else:
+ ar1_wf.connect([(add_xform_to_orig_nu, mri_normalize, [('out_file', 'in_file')])])
+
+ ar1_wf.connect([(copy_transform, mri_normalize, [('out_file', 'transform')])])
# Skull Strip
"""
@@ -390,8 +383,12 @@ def awkfile(in_file, log_file):
if plugin_args:
mri_em_register.plugin_args = plugin_args
- ar1_wf.connect([(add_xform_to_orig_nu, mri_em_register, [('out_file', 'in_file')]),
- (inputspec, mri_em_register, [('num_threads', 'num_threads'),
+ if fsvernum < 6:
+ ar1_wf.connect(add_to_header_nu, 'out_file', mri_em_register, 'in_file')
+ else:
+ ar1_wf.connect(add_xform_to_orig_nu, 'out_file', mri_em_register, 'in_file')
+
+ ar1_wf.connect([(inputspec, mri_em_register, [('num_threads', 'num_threads'),
('reg_template_withskull', 'template')])])
brainmask = pe.Node(WatershedSkullStrip(),
@@ -446,8 +443,14 @@ def awkfile(in_file, log_file):
'brainmask_auto',
'brainmask',
'braintemplate']
- outputspec = pe.Node(IdentityInterface(fields=outputs),
- name="outputspec")
+
+ if fsvernum < 6:
+ outputspec = pe.Node(IdentityInterface(fields=outputs + ['nu']),
+ name="outputspec")
+ ar1_wf.connect([(add_to_header_nu, outputspec, [('out_file', 'nu')])])
+ else:
+ outputspec = pe.Node(IdentityInterface(fields=outputs),
+ name="outputspec")
ar1_wf.connect([(T1_image_preparation, outputspec, [('out_file', 'origvols')]),
(T2_convert, outputspec, [('out_file', 't2_raw')]),
diff --git a/nipype/workflows/smri/freesurfer/autorecon2.py b/nipype/workflows/smri/freesurfer/autorecon2.py
index 2fd85bce13..41f15a017b 100644
--- a/nipype/workflows/smri/freesurfer/autorecon2.py
+++ b/nipype/workflows/smri/freesurfer/autorecon2.py
@@ -1,5 +1,3 @@
-import os
-import nipype
from nipype.interfaces.utility import Function, IdentityInterface, Merge
import nipype.pipeline.engine as pe # pypeline engine
from nipype.interfaces.freesurfer import *
@@ -11,12 +9,14 @@ def copy_ltas(in_file, subjects_dir, subject_id, long_template):
return out_file
def create_AutoRecon2(name="AutoRecon2", longitudinal=False,
- field_strength="1.5T", plugin_args=None):
+ plugin_args=None, fsvernum=5.3,
+ stop=None, shrink=None, distance=None):
# AutoRecon2
# Workflow
ar2_wf = pe.Workflow(name=name)
inputspec = pe.Node(IdentityInterface(fields=['orig',
+ 'nu', # version < 6
'brainmask',
'transform',
'subject_id',
@@ -47,40 +47,39 @@ def create_AutoRecon2(name="AutoRecon2", longitudinal=False,
inputspec.inputs.timepoints = config['timepoints']
- # NU Intensity Correction
- """
- Non-parametric Non-uniform intensity Normalization (N3), corrects for
- intensity non-uniformity in MR data, making relatively few assumptions about
- the data. This runs the MINC tool 'nu_correct'.
- """
- intensity_correction = pe.Node(
- MNIBiasCorrection(), name="Intensity_Correction")
- intensity_correction.inputs.iterations = 1
- intensity_correction.inputs.protocol_iterations = 1000
- intensity_correction.inputs.stop = 0.0001
- intensity_correction.inputs.shrink = 2
- if field_strength == '3T':
- intensity_correction.inputs.distance = 50
- else:
- # default for 1.5T scans
- intensity_correction.inputs.distance = 200
-
- intensity_correction.inputs.out_file = 'nu.mgz'
- ar2_wf.connect([(inputspec, intensity_correction, [('orig', 'in_file'),
- ('brainmask', 'mask'),
- ('transform', 'transform')
- ])
- ])
-
- add_to_header_nu = pe.Node(AddXFormToHeader(), name="Add_XForm_to_NU")
- add_to_header_nu.inputs.copy_name = True
- add_to_header_nu.inputs.out_file = 'nu.mgz'
- ar2_wf.connect([(intensity_correction, add_to_header_nu, [('out_file', 'in_file'),
+ if fsvernum >= 6:
+ # NU Intensity Correction
+ """
+ Non-parametric Non-uniform intensity Normalization (N3), corrects for
+ intensity non-uniformity in MR data, making relatively few assumptions about
+ the data. This runs the MINC tool 'nu_correct'.
+ """
+ intensity_correction = pe.Node(
+ MNIBiasCorrection(), name="Intensity_Correction")
+ intensity_correction.inputs.out_file = 'nu.mgz'
+ ar2_wf.connect([(inputspec, intensity_correction, [('orig', 'in_file'),
+ ('brainmask', 'mask'),
+ ('transform', 'transform')])])
+
+ # intensity correction parameters are more specific in 6+
+ intensity_correction.inputs.iterations = 1
+ intensity_correction.inputs.protocol_iterations = 1000
+ if stop:
+ intensity_correction.inputs.stop = stop
+ if shrink:
+ intensity_correction.inputs.shrink = shrink
+ intensity_correction.inputs.distance = distance
+
+ add_to_header_nu = pe.Node(AddXFormToHeader(), name="Add_XForm_to_NU")
+ add_to_header_nu.inputs.copy_name = True
+ add_to_header_nu.inputs.out_file = 'nu.mgz'
+ ar2_wf.connect([(intensity_correction, add_to_header_nu, [('out_file', 'in_file'),
]),
- (inputspec, add_to_header_nu,
- [('transform', 'transform')])
+ (inputspec, add_to_header_nu,
+ [('transform', 'transform')])
])
+
# EM Registration
"""
Computes the transform to align the mri/nu.mgz volume to the default GCA
@@ -103,8 +102,12 @@ def create_AutoRecon2(name="AutoRecon2", longitudinal=False,
align_transform.plugin_args = plugin_args
ar2_wf.connect([(inputspec, align_transform, [('brainmask', 'mask'),
('reg_template', 'template'),
- ('num_threads', 'num_threads')]),
- (add_to_header_nu, align_transform, [('out_file', 'in_file')])])
+ ('num_threads', 'num_threads')])])
+ if fsvernum >= 6:
+ ar2_wf.connect([(add_to_header_nu, align_transform, [('out_file', 'in_file')])])
+ else:
+ ar2_wf.connect([(inputspec, align_transform, [('nu', 'in_file')])])
+
# CA Normalize
"""
@@ -128,8 +131,12 @@ def create_AutoRecon2(name="AutoRecon2", longitudinal=False,
ar2_wf.connect([(align_transform, ca_normalize, [('out_file', 'transform')]),
(inputspec, ca_normalize, [('brainmask', 'mask'),
- ('reg_template', 'atlas')]),
- (add_to_header_nu, ca_normalize, [('out_file', 'in_file')])])
+ ('reg_template', 'atlas')])])
+ if fsvernum >= 6:
+ ar2_wf.connect([(add_to_header_nu, ca_normalize, [('out_file', 'in_file')])])
+ else:
+ ar2_wf.connect([(inputspec, ca_normalize, [('nu', 'in_file')])])
+
# CA Register
# Computes a nonlinear transform to align with GCA atlas.
@@ -148,7 +155,7 @@ def create_AutoRecon2(name="AutoRecon2", longitudinal=False,
else:
ca_register.inputs.levels = 2
ca_register.inputs.A = 1
- ar2_wf.connect([(ar1_inputs, ca_register, [('template_talairach_m3z', 'l_files')])])
+ ar2_wf.connect([(inputspec, ca_register, [('template_talairach_m3z', 'l_files')])])
# Remove Neck
"""
@@ -159,8 +166,11 @@ def create_AutoRecon2(name="AutoRecon2", longitudinal=False,
remove_neck.inputs.radius = 25
remove_neck.inputs.out_file = 'nu_noneck.mgz'
ar2_wf.connect([(ca_register, remove_neck, [('out_file', 'transform')]),
- (add_to_header_nu, remove_neck, [('out_file', 'in_file')]),
(inputspec, remove_neck, [('reg_template', 'template')])])
+ if fsvernum >= 6:
+ ar2_wf.connect([(add_to_header_nu, remove_neck, [('out_file', 'in_file')])])
+ else:
+ ar2_wf.connect([(inputspec, remove_neck, [('nu', 'in_file')])])
# SkullLTA (EM Registration, with Skull)
# Computes transform to align volume mri/nu_noneck.mgz with GCA volume
@@ -207,8 +217,9 @@ def create_AutoRecon2(name="AutoRecon2", longitudinal=False,
fuse_segmentations.inputs.out_file = 'aseg.fused.mgz'
ca_label = pe.Node(CALabel(), name='CA_Label')
- ca_label.inputs.relabel_unlikely = (9, .3)
- ca_label.inputs.prior = 0.5
+ if fsvernum >= 6:
+ ca_label.inputs.relabel_unlikely = (9, .3)
+ ca_label.inputs.prior = 0.5
ca_label.inputs.align = True
ca_label.inputs.out_file = 'aseg.auto_noCCseg.mgz'
if plugin_args:
@@ -680,8 +691,14 @@ def create_AutoRecon2(name="AutoRecon2", longitudinal=False,
outputspec = pe.Node(IdentityInterface(fields=outputs),
name="outputspec")
- ar2_wf.connect([(add_to_header_nu, outputspec, [('out_file', 'nu')]),
- (align_transform, outputspec, [('out_file', 'tal_lta')]),
+ if fsvernum >= 6:
+ ar2_wf.connect([(add_to_header_nu, outputspec, [('out_file', 'nu')])])
+ else:
+ # add to outputspec to preserve datasinking
+ ar2_wf.connect([(inputspec, outputspec, [('nu', 'nu')])])
+
+
+ ar2_wf.connect([(align_transform, outputspec, [('out_file', 'tal_lta')]),
(ca_normalize, outputspec, [('out_file', 'norm')]),
(ca_normalize, outputspec, [('control_points', 'ctrl_pts')]),
(ca_register, outputspec, [('out_file', 'tal_m3z')]),
diff --git a/nipype/workflows/smri/freesurfer/autorecon3.py b/nipype/workflows/smri/freesurfer/autorecon3.py
index ffc0eec7b9..daace1c8c7 100644
--- a/nipype/workflows/smri/freesurfer/autorecon3.py
+++ b/nipype/workflows/smri/freesurfer/autorecon3.py
@@ -1,14 +1,11 @@
-import os
-import nipype
-from nipype.interfaces.utility import Function, IdentityInterface, Merge
+from nipype.interfaces.utility import IdentityInterface, Merge
import nipype.pipeline.engine as pe # pypeline engine
from nipype.interfaces.freesurfer import *
from .ba_maps import create_ba_maps_wf
-from .utils import createsrcsubj
from nipype.interfaces.io import DataGrabber
def create_AutoRecon3(name="AutoRecon3", qcache=False, plugin_args=None,
- th3=True):
+ th3=True, exvivo=True, entorhinal=True, fsvernum=5.3):
# AutoRecon3
# Workflow
@@ -165,16 +162,22 @@ def create_AutoRecon3(name="AutoRecon3", qcache=False, plugin_args=None,
# Pial Surface
ar3_pial = pe.Node(MakeSurfaces(), name="Make_Pial_Surface")
- ar3_pial.inputs.no_white = True
ar3_pial.inputs.mgz = True
ar3_pial.inputs.hemisphere = hemisphere
ar3_pial.inputs.copy_inputs = True
+
+ if fsvernum < 6:
+ ar3_pial.inputs.white = 'NOWRITE'
+ hemi_wf.connect(hemi_inputspec1, 'white', ar3_pial, 'in_white')
+ else:
+ ar3_pial.inputs.no_white = True
+ hemi_wf.connect([(hemi_inputspec1, ar3_pial, [('white', 'orig_pial'),
+ ('white', 'orig_white')])])
+
hemi_wf.connect([(hemi_inputspec1, ar3_pial, [('wm', 'in_wm'),
('orig', 'in_orig'),
('filled', 'in_filled'),
- ('white', 'orig_pial'),
- ('white', 'orig_white'),
('brain_finalsurfs', 'in_T1'),
('aseg_presurf', 'in_aseg')]),
(ar3_parcellation, ar3_pial, [('out_file', 'in_label')])
@@ -274,25 +277,24 @@ def create_AutoRecon3(name="AutoRecon3", qcache=False, plugin_args=None,
volume_mask.inputs.copy_inputs = True
- ar3_wf.connect([(inputspec, volume_mask, [('aseg_presurf', 'in_aseg'),
- ('lh_white', 'lh_white'),
- ('rh_white', 'rh_white'),
- ]),
+ ar3_wf.connect([(inputspec, volume_mask, [('lh_white', 'lh_white'),
+ ('rh_white', 'rh_white')]),
(ar3_lh_wf1, volume_mask, [('outputspec.pial', 'lh_pial')]),
(ar3_rh_wf1, volume_mask, [('outputspec.pial', 'rh_pial')]),
])
+ if fsvernum >= 6:
+ ar3_wf.connect([(inputspec, volume_mask, [('aseg_presurf', 'in_aseg')])])
+ else:
+ ar3_wf.connect([(inputspec, volume_mask, [('aseg_presurf', 'aseg')])])
+
ar3_lh_wf2 = pe.Workflow(name="AutoRecon3_Left_2")
ar3_rh_wf2 = pe.Workflow(name="AutoRecon3_Right_2")
for hemisphere, hemiwf2 in [('lh', ar3_lh_wf2), ('rh', ar3_rh_wf2)]:
if hemisphere == 'lh':
- opp_hemi = 'rh'
- opp_wf = ar3_rh_wf2
hemiwf1 = ar3_lh_wf1
else:
- opp_hemi = 'lh'
- opp_wf = ar3_lh_wf2
hemiwf1 = ar3_rh_wf1
hemi_inputs2 = ['wm',
@@ -541,14 +543,6 @@ def create_AutoRecon3(name="AutoRecon3", qcache=False, plugin_args=None,
#End hemisphere2 workflow
- # Relabel Hypointensities
- relabel_hypos = pe.Node(RelabelHypointensities(), name="Relabel_Hypointensities")
- relabel_hypos.inputs.out_file = 'aseg.presurf.hypos.mgz'
- ar3_wf.connect([(inputspec, relabel_hypos, [('aseg_presurf', 'aseg'),
- ('lh_white', 'lh_white'),
- ('rh_white', 'rh_white'),
- ])])
-
# APARC to ASEG
# Adds information from the ribbon into the aseg.mgz (volume parcellation).
aparc_2_aseg = pe.Node(Aparc2Aseg(), name="Aparc2Aseg")
@@ -569,9 +563,17 @@ def create_AutoRecon3(name="AutoRecon3", qcache=False, plugin_args=None,
(volume_mask, aparc_2_aseg, [('rh_ribbon', 'rh_ribbon'),
('lh_ribbon', 'lh_ribbon'),
('out_ribbon', 'ribbon'),
- ]),
- (relabel_hypos, aparc_2_aseg, [('out_file', 'aseg')])
- ])
+ ])])
+ if fsvernum < 6:
+ ar3_wf.connect([(inputspec, aparc_2_aseg, [('aseg_presurf', 'aseg')])])
+ else:
+ # Relabel Hypointensities
+ relabel_hypos = pe.Node(RelabelHypointensities(), name="Relabel_Hypointensities")
+ relabel_hypos.inputs.out_file = 'aseg.presurf.hypos.mgz'
+ ar3_wf.connect([(inputspec, relabel_hypos, [('aseg_presurf', 'aseg'),
+ ('lh_white', 'lh_white'),
+ ('rh_white', 'rh_white')])])
+ ar3_wf.connect([(relabel_hypos, aparc_2_aseg, [('out_file', 'aseg')])])
aparc_2_aseg_2009 = pe.Node(Aparc2Aseg(), name="Aparc2Aseg_2009")
aparc_2_aseg_2009.inputs.volmask = True
@@ -592,14 +594,31 @@ def create_AutoRecon3(name="AutoRecon3", qcache=False, plugin_args=None,
(volume_mask, aparc_2_aseg_2009, [('rh_ribbon', 'rh_ribbon'),
('lh_ribbon',
'lh_ribbon'),
- ('out_ribbon', 'ribbon'),
- ]),
- (relabel_hypos, aparc_2_aseg_2009, [('out_file', 'aseg')])
- ])
+ ('out_ribbon', 'ribbon')])])
+
+ if fsvernum >= 6:
+ apas_2_aseg = pe.Node(Apas2Aseg(), name="Apas_2_Aseg")
+ ar3_wf.connect([(aparc_2_aseg, apas_2_aseg, [('out_file', 'in_file')]),
+ (relabel_hypos, aparc_2_aseg_2009, [('out_file', 'aseg')])])
+ else:
+ # aseg.mgz gets edited in place, so we'll copy and pass it to the
+ # outputspec once aparc_2_aseg has completed
+        def out_aseg(in_aparcaseg, in_aseg, out_file):
+            # in_aparcaseg is unused in the body; it is accepted as an input
+            # only so that this node runs after aparc_2_aseg has finished
+ import shutil
+ import os
+ out_file = os.path.abspath(out_file)
+ shutil.copy(in_aseg, out_file)
+ return out_file
+ apas_2_aseg = pe.Node(Function(['in_aparcaseg', 'in_aseg', 'out_file'],
+ ['out_file'],
+ out_aseg),
+ name="Aseg")
+ ar3_wf.connect([(aparc_2_aseg, apas_2_aseg, [('out_file', 'in_aparcaseg')]),
+ (inputspec, apas_2_aseg, [('aseg_presurf', 'in_aseg')]),
+ (inputspec, aparc_2_aseg_2009, [('aseg_presurf', 'aseg')])])
- apas_2_aseg = pe.Node(Apas2Aseg(), name="Apas_2_Aseg")
apas_2_aseg.inputs.out_file = "aseg.mgz"
- ar3_wf.connect([(aparc_2_aseg, apas_2_aseg, [('out_file', 'in_file')])])
+
# Segmentation Stats
"""
@@ -626,7 +645,6 @@ def create_AutoRecon3(name="AutoRecon3", qcache=False, plugin_args=None,
ar3_wf.connect([(apas_2_aseg, segstats, [('out_file', 'segmentation_file')]),
(inputspec, segstats, [('lh_white', 'lh_white'),
('rh_white', 'rh_white'),
- ('aseg_presurf', 'presurf_seg'),
('transform', 'transform'),
('norm', 'in_intensity'),
('norm', 'partial_volume_file'),
@@ -642,6 +660,12 @@ def create_AutoRecon3(name="AutoRecon3", qcache=False, plugin_args=None,
]),
])
+ if fsvernum >= 6:
+ ar3_wf.connect(inputspec, 'aseg_presurf', segstats, 'presurf_seg')
+ else:
+ ar3_wf.connect(inputspec, 'aseg_presurf', segstats, 'aseg')
+
+
# White Matter Parcellation
# Adds WM Parcellation info into the aseg and computes stat.
@@ -670,8 +694,10 @@ def create_AutoRecon3(name="AutoRecon3", qcache=False, plugin_args=None,
('out_ribbon', 'ribbon'),
]),
(apas_2_aseg, wm_parcellation, [('out_file', 'aseg')]),
- (aparc_2_aseg, wm_parcellation, [('out_file', 'ctxseg')])
- ])
+ (aparc_2_aseg, wm_parcellation, [('out_file', 'ctxseg')])])
+
+ if fsvernum < 6:
+ ar3_wf.connect([(inputspec, wm_parcellation, [('filled', 'filled')])])
# White Matter Segmentation Stats
@@ -686,7 +712,6 @@ def create_AutoRecon3(name="AutoRecon3", qcache=False, plugin_args=None,
ar3_wf.connect([(wm_parcellation, wm_segstats, [('out_file', 'segmentation_file')]),
(inputspec, wm_segstats, [('lh_white', 'lh_white'),
('rh_white', 'rh_white'),
- ('aseg_presurf', 'presurf_seg'),
('transform', 'transform'),
('norm', 'in_intensity'),
('norm', 'partial_volume_file'),
@@ -704,8 +729,15 @@ def create_AutoRecon3(name="AutoRecon3", qcache=False, plugin_args=None,
]),
])
+ if fsvernum >= 6:
+ ar3_wf.connect(inputspec, 'aseg_presurf', wm_segstats, 'presurf_seg')
+ else:
+ ar3_wf.connect(inputspec, 'aseg_presurf', wm_segstats, 'aseg')
+
+
# add brodman area maps to the workflow
- ba_WF, ba_outputs = create_ba_maps_wf(th3=th3)
+ ba_WF, ba_outputs = create_ba_maps_wf(th3=th3, exvivo=exvivo,
+ entorhinal=entorhinal)
ar3_wf.connect([(ar3_lh_wf1, ba_WF, [('outputspec.sphere_reg', 'inputspec.lh_sphere_reg'),
('outputspec.thickness_pial', 'inputspec.lh_thickness'),
@@ -894,11 +926,12 @@ def create_AutoRecon3(name="AutoRecon3", qcache=False, plugin_args=None,
(segstats, outputspec, [('summary_file', 'aseg_stats')]),
(aparc_2_aseg_2009, outputspec, [('out_file', 'aparc_a2009s_aseg')]),
(aparc_2_aseg, outputspec, [('out_file', 'aparc_aseg')]),
- (relabel_hypos, outputspec, [('out_file', 'aseg_presurf_hypos')]),
(volume_mask, outputspec, [('out_ribbon', 'ribbon'),
('lh_ribbon', 'lh_ribbon'),
- ('rh_ribbon', 'rh_ribbon')]),
- ])
+ ('rh_ribbon', 'rh_ribbon')])])
+ if fsvernum >= 6:
+ ar3_wf.connect([(relabel_hypos, outputspec, [('out_file', 'aseg_presurf_hypos')])])
+
for i, outputs in enumerate([hemi_outputs1, hemi_outputs2]):
if i == 0:
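The version gating threaded through the hunks above (segstats inputs, hypointensity relabeling, the `aseg` fallback) all keys off a single numeric version. As a minimal sketch, the selection logic added later in `recon.py` can be summarized as a pure function; `reconall_params` is a hypothetical name, and the parameter values are copied from this diff:

```python
def reconall_params(fs_version_full):
    """Map a FreeSurfer version string to (fsvernum, workflow parameters).

    Illustrative sketch of the gating logic in this PR; this helper does
    not exist in nipype itself.
    """
    if 'v6.0' in fs_version_full or 'dev' in fs_version_full:
        # dev builds are assumed to behave like 6.0
        return 6.0, dict(th3=True, shrink=2, distance=200, stop=0.0001,
                         exvivo=True, entorhinal=True, rb_date='2014-08-21')
    # anything else falls back to the 5.3 defaults
    return 5.3, dict(th3=False, shrink=None, distance=50, stop=None,
                     exvivo=False, entorhinal=False, rb_date='2008-03-26')
```

Every `if fsvernum < 6:` branch in the workflow then reduces to a lookup against one of these two parameter sets.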
diff --git a/nipype/workflows/smri/freesurfer/ba_maps.py b/nipype/workflows/smri/freesurfer/ba_maps.py
index 337e1f40cb..7fa266250c 100644
--- a/nipype/workflows/smri/freesurfer/ba_maps.py
+++ b/nipype/workflows/smri/freesurfer/ba_maps.py
@@ -6,7 +6,8 @@
from nipype.interfaces.io import DataGrabber
from nipype.interfaces.utility import Merge
-def create_ba_maps_wf(name="Brodmann_Area_Maps", th3=True):
+def create_ba_maps_wf(name="Brodmann_Area_Maps", th3=True, exvivo=True,
+ entorhinal=True):
# Brodmann Area Maps (BA Maps) and Hinds V1 Atlas
inputs = ['lh_sphere_reg',
'rh_sphere_reg',
@@ -55,40 +56,54 @@ def create_ba_maps_wf(name="Brodmann_Area_Maps", th3=True):
name="outputspec")
labels = ["BA1", "BA2", "BA3a", "BA3b", "BA4a", "BA4p", "BA6",
- "BA44", "BA45", "V1", "V2", "MT", "entorhinal", "perirhinal"]
+ "BA44", "BA45", "V1", "V2", "MT", "perirhinal"]
+ if entorhinal:
+ labels.insert(-1, 'entorhinal')
for hemisphere in ['lh', 'rh']:
for threshold in [True, False]:
field_template = dict(sphere_reg='surf/{0}.sphere.reg'.format(hemisphere),
white='surf/{0}.white'.format(hemisphere))
out_files = list()
+ source_fields = list()
if threshold:
for label in labels:
- out_file = '{0}.{1}_exvivo.thresh.label'.format(hemisphere, label)
+ if label == 'perirhinal' and not entorhinal:
+ # versions < 6.0 do not use thresh.perirhinal
+ continue
+ if exvivo:
+ out_file = '{0}.{1}_exvivo.thresh.label'.format(hemisphere, label)
+ else:
+ out_file = '{0}.{1}.thresh.label'.format(hemisphere, label)
out_files.append(out_file)
field_template[label] = 'label/' + out_file
- node_name = 'BA_Maps_' + hemisphere + '_Tresh'
+ source_fields.append(label)
+ node_name = 'BA_Maps_' + hemisphere + '_Thresh'
else:
for label in labels:
- out_file = '{0}.{1}_exvivo.label'.format(hemisphere, label)
+ if exvivo:
+ out_file = '{0}.{1}_exvivo.label'.format(hemisphere, label)
+ else:
+ out_file = '{0}.{1}.label'.format(hemisphere, label)
+
out_files.append(out_file)
field_template[label] = 'label/' + out_file
+ source_fields.append(label)
node_name = 'BA_Maps_' + hemisphere
- source_fields = labels + ['sphere_reg', 'white']
- source_subject = pe.Node(DataGrabber(outfields=source_fields),
+ source_subject = pe.Node(DataGrabber(outfields=source_fields + ['sphere_reg', 'white']),
name=node_name + "_srcsubject")
source_subject.inputs.template = '*'
source_subject.inputs.sort_filelist = False
source_subject.inputs.field_template = field_template
ba_WF.connect([(inputspec, source_subject, [('src_subject_dir', 'base_directory')])])
- merge_labels = pe.Node(Merge(len(labels)),
+ merge_labels = pe.Node(Merge(len(out_files)),
name=node_name + "_Merge")
- for i,label in enumerate(labels):
+ for i,label in enumerate(source_fields):
ba_WF.connect([(source_subject, merge_labels, [(label, 'in{0}'.format(i+1))])])
- node = pe.MapNode(Label2Label(), name=node_name,
+ node = pe.MapNode(Label2Label(), name=node_name + '_Label2Label',
iterfield=['source_label', 'out_file'])
node.inputs.hemisphere = hemisphere
node.inputs.out_file = out_files
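The `exvivo`/`entorhinal`/threshold combinations in `ba_maps.py` above produce four label-file naming schemes per hemisphere. A self-contained sketch of that filename construction (the helper name is hypothetical; the patterns mirror the diff):

```python
def ba_label_files(hemisphere, exvivo=True, entorhinal=True, thresh=False):
    """Return the Brodmann-area label filenames for one hemisphere."""
    labels = ["BA1", "BA2", "BA3a", "BA3b", "BA4a", "BA4p", "BA6",
              "BA44", "BA45", "V1", "V2", "MT", "perirhinal"]
    if entorhinal:
        # FreeSurfer 6.0 adds the entorhinal label, kept before perirhinal
        labels.insert(-1, 'entorhinal')
    out_files = []
    for label in labels:
        if thresh and label == 'perirhinal' and not entorhinal:
            # versions < 6.0 ship no thresh.perirhinal label
            continue
        suffix = '_exvivo' if exvivo else ''
        middle = '.thresh' if thresh else ''
        out_files.append('{0}.{1}{2}{3}.label'.format(
            hemisphere, label, suffix, middle))
    return out_files
```

This is why `merge_labels` now sizes itself on `len(out_files)` rather than `len(labels)`: the two lists can differ when the thresholded perirhinal label is skipped.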
diff --git a/nipype/workflows/smri/freesurfer/recode_tables/fs2abc.csv b/nipype/workflows/smri/freesurfer/recode_tables/fs2abc.csv
deleted file mode 100644
index 76035d3b51..0000000000
--- a/nipype/workflows/smri/freesurfer/recode_tables/fs2abc.csv
+++ /dev/null
@@ -1,41 +0,0 @@
-orig_label,target_label
-41,1
-2,1
-42,2
-3,2
-77,2
-251,2
-252,2
-253,2
-254,2
-255,2
-43,4
-44,4
-4,4
-5,4
-14,4
-15,4
-24,4
-58,21
-50,21
-51,21
-52,21
-26,21
-11,21
-12,21
-13,21
-49,24
-60,24
-28,24
-10,24
-63,25
-53,25
-54,25
-31,25
-17,25
-18,25
-47,11
-8,11
-46,12
-7,12
-16,30
diff --git a/nipype/workflows/smri/freesurfer/recon.py b/nipype/workflows/smri/freesurfer/recon.py
index c9206812f3..4010923bc5 100644
--- a/nipype/workflows/smri/freesurfer/recon.py
+++ b/nipype/workflows/smri/freesurfer/recon.py
@@ -4,7 +4,7 @@
from .autorecon1 import create_AutoRecon1
from .autorecon2 import create_AutoRecon2
from .autorecon3 import create_AutoRecon3
-from ....interfaces.freesurfer import AddXFormToHeader
+from ....interfaces.freesurfer import AddXFormToHeader, Info
from ....interfaces.io import DataSink
from .utils import getdefaultconfig
@@ -79,10 +79,9 @@ def link_masks(subjects_dir, subject_id):
wf.connect(autorecon_resume, "subject_id", outputnode, "subject_id")
return wf
-def create_reconall_workflow(name="ReconAll", plugin_args=None,
- recoding_file=None):
+def create_reconall_workflow(name="ReconAll", plugin_args=None):
"""Creates the ReconAll workflow in nipype.
-
+
Example
-------
>>> from nipype.workflows.smri.freesurfer import create_skullstripped_recon_flow
@@ -135,6 +134,39 @@ def create_reconall_workflow(name="ReconAll", plugin_args=None,
run_without_submitting=True,
name='inputspec')
+ # check freesurfer version and set parameters
+ fs_version_full = Info.version()
+ if 'v6.0' in fs_version_full or 'dev' in fs_version_full:
+ # assuming that dev is 6.0
+ fsvernum = 6.0
+ fs_version = 'v6.0'
+ th3 = True
+ shrink = 2
+        distance = 200  # for 3T data this should be 50
+ stop = 0.0001
+ exvivo = True
+ entorhinal = True
+ rb_date = "2014-08-21"
+ else:
+ # 5.3 is default
+ fsvernum = 5.3
+ if 'v5.3' in fs_version_full:
+ fs_version = 'v5.3'
+ else:
+            fs_version = fs_version_full.split('-')[-1]
+ print("Warning: Workflow may not work properly if FREESURFER_HOME " +
+                "environment variable is not set or if you are using an older " +
+ "version of FreeSurfer")
+ th3 = False
+ shrink = None
+ distance = 50
+ stop = None
+ exvivo = False
+ entorhinal = False
+ rb_date = "2008-03-26"
+
+ print("FreeSurfer Version: {0}".format(fs_version))
+
def setconfig(reg_template=None,
reg_template_withskull=None,
lh_atlas=None,
@@ -150,7 +182,8 @@ def setconfig(reg_template=None,
color_table=None,
lookup_table=None,
wm_lookup_table=None,
- awk_file=None):
+ awk_file=None,
+ rb_date=None):
"""Set optional configurations to the default"""
from nipype.workflows.smri.freesurfer.utils import getdefaultconfig
def checkarg(arg, default):
@@ -159,7 +192,7 @@ def checkarg(arg, default):
return arg
else:
return default
- defaultconfig = getdefaultconfig(exitonfail=True)
+ defaultconfig = getdefaultconfig(exitonfail=True, rb_date=rb_date)
# set the default template and classifier files
reg_template = checkarg(reg_template, defaultconfig['registration_template'])
reg_template_withskull = checkarg(reg_template_withskull,
@@ -200,17 +233,21 @@ def checkarg(arg, default):
'lookup_table',
'wm_lookup_table',
'awk_file']
-
- config_node = pe.Node(niu.Function(params,
+
+ config_node = pe.Node(niu.Function(params + ['rb_date'],
params,
setconfig),
name="config")
+ config_node.inputs.rb_date = rb_date
+
for param in params:
reconall.connect(inputspec, param, config_node, param)
# create AutoRecon1
- ar1_wf, ar1_outputs = create_AutoRecon1(plugin_args=plugin_args)
+ ar1_wf, ar1_outputs = create_AutoRecon1(plugin_args=plugin_args, stop=stop,
+ distance=distance, shrink=shrink,
+ fsvernum=fsvernum)
# connect inputs for AutoRecon1
reconall.connect([(inputspec, ar1_wf, [('T1_files', 'inputspec.T1_files'),
('T2_file', 'inputspec.T2_file'),
@@ -220,9 +257,9 @@ def checkarg(arg, default):
(config_node, ar1_wf, [('reg_template_withskull',
'inputspec.reg_template_withskull'),
('awk_file', 'inputspec.awk_file')])])
-
# create AutoRecon2
- ar2_wf, ar2_outputs = create_AutoRecon2(plugin_args=plugin_args)
+ ar2_wf, ar2_outputs = create_AutoRecon2(plugin_args=plugin_args, fsvernum=fsvernum,
+ stop=stop, shrink=shrink, distance=distance)
# connect inputs for AutoRecon2
reconall.connect([(inputspec, ar2_wf, [('num_threads', 'inputspec.num_threads')]),
(config_node, ar2_wf, [('reg_template_withskull',
@@ -231,8 +268,14 @@ def checkarg(arg, default):
(ar1_wf, ar2_wf, [('outputspec.brainmask', 'inputspec.brainmask'),
('outputspec.talairach', 'inputspec.transform'),
('outputspec.orig', 'inputspec.orig')])])
+
+ if fsvernum < 6:
+ reconall.connect([(ar1_wf, ar2_wf, [('outputspec.nu', 'inputspec.nu')])])
+
# create AutoRecon3
- ar3_wf, ar3_outputs = create_AutoRecon3(plugin_args=plugin_args, th3=True)
+ ar3_wf, ar3_outputs = create_AutoRecon3(plugin_args=plugin_args, th3=th3,
+ exvivo=exvivo, entorhinal=entorhinal,
+ fsvernum=fsvernum)
# connect inputs for AutoRecon3
reconall.connect([(config_node, ar3_wf, [('lh_atlas', 'inputspec.lh_atlas'),
('rh_atlas', 'inputspec.rh_atlas'),
@@ -509,15 +552,6 @@ def completemethod(datasinked_files, subject_id):
reconall.connect([(datasink, completion, [('out_file', 'datasinked_files')]),
(inputspec, completion, [('subject_id', 'subject_id')]),
(completion, postds_outputspec, [('subject_id', 'subject_id')])])
-
-
- #### Workflow additions go here
- if recoding_file:
- from utils import create_recoding_wf
- recode = create_recoding_wf(recoding_file)
- reconall.connect([(ar3_wf, recode, [('outputspec.aseg', 'inputspec.labelmap')]),
- (recode, outputspec, [('outputspec.recodedlabelmap', 'recoded_labelmap')])])
-
return reconall
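The `rb_date` parameter introduced in `recon.py` above flows into `getdefaultconfig` below, where it replaces the hard-coded atlas date. A small sketch of the resulting path construction (`registration_templates` is an illustrative name; `fs_home` stands in for `FREESURFER_HOME`):

```python
import os

def registration_templates(fs_home, rb_date="2014-08-21"):
    """Build the GCA atlas paths for a given RB atlas date.

    Sketch of the getdefaultconfig change: the date is now a parameter,
    so 5.3 installs can point at the 2008-03-26 atlases instead.
    """
    avg = os.path.join(fs_home, 'average')
    return (os.path.join(avg, 'RB_all_{0}.gca'.format(rb_date)),
            os.path.join(avg, 'RB_all_withskull_{0}.gca'.format(rb_date)))
```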
diff --git a/nipype/workflows/smri/freesurfer/utils.py b/nipype/workflows/smri/freesurfer/utils.py
index 7ce98a4642..65e352f5e5 100644
--- a/nipype/workflows/smri/freesurfer/utils.py
+++ b/nipype/workflows/smri/freesurfer/utils.py
@@ -420,7 +420,7 @@ def mkdir_p(path):
else:
raise
-def getdefaultconfig(exitonfail=False):
+def getdefaultconfig(exitonfail=False, rb_date="2014-08-21"):
config = { 'custom_atlas' : None,
'cw256' : False,
'field_strength' : '1.5T',
@@ -440,9 +440,9 @@ def getdefaultconfig(exitonfail=False):
config['awk_file'] = os.path.join(config['fs_home'], 'bin',
'extract_talairach_avi_QA.awk')
config['registration_template'] = os.path.join(config['fs_home'], 'average',
- 'RB_all_2014-08-21.gca')
+ 'RB_all_{0}.gca'.format(rb_date))
config['registration_template_withskull'] = os.path.join(config['fs_home'], 'average',
- 'RB_all_withskull_2014-08-21.gca')
+ 'RB_all_withskull_{0}.gca'.format(rb_date))
for hemi in ('lh', 'rh'):
config['{0}_atlas'.format(hemi)] = os.path.join(
config['fs_home'], 'average',
@@ -492,376 +492,3 @@ def checkenv(exitonfail=False):
sys.exit(2)
else:
print("Warning: " + msg)
-
-def center_volume(in_file):
- import SimpleITK as sitk
- import os
- img = sitk.ReadImage(in_file)
- size = img.GetSize()
- origin = img.GetOrigin()
- new_origin = [0,0,0]
- for i, xx in enumerate(origin):
- new_origin[i] = float(size[i])/2
- if xx < 0:
- new_origin[i] = -new_origin[i]
- img.SetOrigin(new_origin)
- out_file = os.path.abspath(os.path.basename(in_file))
- sitk.WriteImage(img, out_file)
- return out_file
-
-
-def recodeLabelMap(in_file, out_file, recode_file):
- """This function has been adapted from BRAINSTools and serves
- as a means to recode a label map based upon an input csv
- file."""
- import SimpleITK as sitk
- import os
- import csv
- import sys
-
- # Convert csv to RECODE_TABLE
- CSVFile = open(recode_file, 'rb')
- reader = csv.reader(CSVFile)
- header = reader.next()
- n_cols = len(header)
- if n_cols == 4:
- # ignore label names
- label_keys = (0, 2)
- elif n_cols == 2:
- # no label names present
- label_keys = (0, 1)
- else:
- # csv does not match format requirements
- print("ERROR: input csv file for label recoding does meet requirements")
- sys.exit()
-
- # read csv file
- RECODE_TABLE = list()
- for line in reader:
- RECODE_TABLE.append((int(line[label_keys[0]]), int(line[label_keys[1]])))
-
- def minimizeSizeOfImage(outlabels):
- """This function will find the largest integer value in the labelmap, and
- cast the image to the smallest possible integer size so that no loss of data
- results."""
- measureFilt = sitk.StatisticsImageFilter()
- measureFilt.Execute(outlabels)
- imgMin=measureFilt.GetMinimum()
- imgMax=measureFilt.GetMaximum()
- if imgMax < (2**8)-1:
- outlabels = sitk.Cast( outlabels, sitk.sitkUInt8 )
- elif imgMax < (2**16)-1:
- outlabels = sitk.Cast( outlabels, sitk.sitkUInt16 )
- elif imgMax < (2**32)-1:
- outlabels = sitk.Cast( outlabels, sitk.sitkUInt32 )
- elif imgMax < (2**64)-1:
- outlabels = sitk.Cast( outlabels, sitk.sitkUInt64 )
- return outlabels
-
- LabelImage=sitk.Cast(sitk.ReadImage(in_file), sitk.sitkUInt32)
- for (old, new) in RECODE_TABLE:
- LabelImage = sitk.Cast((LabelImage == old), sitk.sitkUInt32)*(new - old)+LabelImage
- LabelImage = minimizeSizeOfImage(LabelImage)
- out_file = os.path.abspath(out_file)
- sitk.WriteImage(LabelImage, out_file)
- return out_file
-
-
-def create_recoding_wf(in_file, out_file=None):
- wf = nipype.Workflow(name="RecodeLabels")
-
- inputspec = nipype.pipeline.Node(nipype.IdentityInterface(['labelmap',
- 'recode_file']),
- name="inputspec")
- inputspec.inputs.recode_file = in_file
-
- convert_labelmap = nipype.pipeline.Node(fs.MRIConvert(), name="ConvertLabelMap")
- convert_labelmap.inputs.in_type = 'mgz'
- convert_labelmap.inputs.out_type = 'nii'
- convert_labelmap.inputs.out_orientation = 'RAS'
- convert_labelmap.inputs.out_file = 'labelmap.nii'
- wf.connect([(inputspec, convert_labelmap, [('labelmap', 'in_file')])])
-
- recode = nipype.Node(nipype.Function(['in_file',
- 'out_file',
- 'recode_file'],
- ['out_file'],
- recodeLabelMap),
- name = "RecodeLabelMap")
- if out_file == None:
- recode.inputs.out_file = 'recodedlabelmap.nii'
- else:
- recode.inputs.out_file = out_file
-
- wf.connect([(convert_labelmap, recode, [('out_file', 'in_file')]),
- (inputspec, recode, [('recode_file', 'recode_file')])])
-
- center_labelmap = nipype.Node(nipype.Function(['in_file'], ['out_file'],
- center_volume),
- name="CenterLabelMap")
-
- wf.connect([(recode, center_labelmap, [('out_file', 'in_file')])])
-
- outputspec = nipype.Node(nipype.IdentityInterface(['recodedlabelmap']), name="outputspec")
-
- wf.connect([(center_labelmap, outputspec, [('out_file', 'recodedlabelmap')])])
- return wf
-
-def createsrcsubj(source_directory):
- """
- Returns a node that acts as the datasource for a source subject such as
- 'fsaverage'
- """
- outfields = ['lh_BA1_exvivo',
- 'lh_BA2_exvivo',
- 'lh_BA3a_exvivo',
- 'lh_BA3b_exvivo',
- 'lh_BA4a_exvivo',
- 'lh_BA4p_exvivo',
- 'lh_BA6_exvivo',
- 'lh_BA44_exvivo',
- 'lh_BA45_exvivo',
- 'lh_V1_exvivo',
- 'lh_V2_exvivo',
- 'lh_MT_exvivo',
- 'lh_entorhinal_exvivo',
- 'lh_perirhinal_exvivo',
- 'lh_BA1_exvivo_thresh',
- 'lh_BA2_exvivo_thresh',
- 'lh_BA3a_exvivo_thresh',
- 'lh_BA3b_exvivo_thresh',
- 'lh_BA4a_exvivo_thresh',
- 'lh_BA4p_exvivo_thresh',
- 'lh_BA6_exvivo_thresh',
- 'lh_BA44_exvivo_thresh',
- 'lh_BA45_exvivo_thresh',
- 'lh_V1_exvivo_thresh',
- 'lh_V2_exvivo_thresh',
- 'lh_MT_exvivo_thresh',
- 'lh_entorhinal_exvivo_thresh',
- 'lh_perirhinal_exvivo_thresh',
- 'rh_BA1_exvivo',
- 'rh_BA2_exvivo',
- 'rh_BA3a_exvivo',
- 'rh_BA3b_exvivo',
- 'rh_BA4a_exvivo',
- 'rh_BA4p_exvivo',
- 'rh_BA6_exvivo',
- 'rh_BA44_exvivo',
- 'rh_BA45_exvivo',
- 'rh_V1_exvivo',
- 'rh_V2_exvivo',
- 'rh_MT_exvivo',
- 'rh_entorhinal_exvivo',
- 'rh_perirhinal_exvivo',
- 'rh_BA1_exvivo_thresh',
- 'rh_BA2_exvivo_thresh',
- 'rh_BA3a_exvivo_thresh',
- 'rh_BA3b_exvivo_thresh',
- 'rh_BA4a_exvivo_thresh',
- 'rh_BA4p_exvivo_thresh',
- 'rh_BA6_exvivo_thresh',
- 'rh_BA44_exvivo_thresh',
- 'rh_BA45_exvivo_thresh',
- 'rh_V1_exvivo_thresh',
- 'rh_V2_exvivo_thresh',
- 'rh_MT_exvivo_thresh',
- 'rh_entorhinal_exvivo_thresh',
- 'rh_perirhinal_exvivo_thresh']
- datasource = pe.Node(nio.nio.DataGrabber(outfields=outfields), name="Source_Subject")
- datasource.inputs.base_directory = source_directory
- datasource.inputs.template = '*'
- datasource.inputs.field_template = dict(
- lh_BA1_exvivo='label/lh.BA1_exvivo.label',
- lh_BA2_exvivo='label/lh.BA2_exvivo.label',
- lh_BA3a_exvivo='label/lh.BA3a_exvivo.label',
- lh_BA3b_exvivo='label/lh.BA3b_exvivo.label',
- lh_BA4a_exvivo='label/lh.BA4a_exvivo.label',
- lh_BA4p_exvivo='label/lh.BA4p_exvivo.label',
- lh_BA6_exvivo='label/lh.BA6_exvivo.label',
- lh_BA44_exvivo='label/lh.BA44_exvivo.label',
- lh_BA45_exvivo='label/lh.BA45_exvivo.label',
- lh_V1_exvivo='label/lh.V1_exvivo.label',
- lh_V2_exvivo='label/lh.V2_exvivo.label',
- lh_MT_exvivo='label/lh.MT_exvivo.label',
- lh_entorhinal_exvivo='label/lh.entorhinal_exvivo.label',
- lh_perirhinal_exvivo='label/lh.perirhinal_exvivo.label',
- lh_BA1_exvivo_thresh='label/lh.BA1_exvivo.thresh.label',
- lh_BA2_exvivo_thresh='label/lh.BA2_exvivo.thresh.label',
- lh_BA3a_exvivo_thresh='label/lh.BA3a_exvivo.thresh.label',
- lh_BA3b_exvivo_thresh='label/lh.BA3b_exvivo.thresh.label',
- lh_BA4a_exvivo_thresh='label/lh.BA4a_exvivo.thresh.label',
- lh_BA4p_exvivo_thresh='label/lh.BA4p_exvivo.thresh.label',
- lh_BA6_exvivo_thresh='label/lh.BA6_exvivo.thresh.label',
- lh_BA44_exvivo_thresh='label/lh.BA44_exvivo.thresh.label',
- lh_BA45_exvivo_thresh='label/lh.BA45_exvivo.thresh.label',
- lh_V1_exvivo_thresh='label/lh.V1_exvivo.thresh.label',
- lh_V2_exvivo_thresh='label/lh.V2_exvivo.thresh.label',
- lh_MT_exvivo_thresh='label/lh.MT_exvivo.thresh.label',
- lh_entorhinal_exvivo_thresh='label/lh.entorhinal_exvivo.thresh.label',
- lh_perirhinal_exvivo_thresh='label/lh.perirhinal_exvivo.thresh.label',
- rh_BA1_exvivo='label/rh.BA1_exvivo.label',
- rh_BA2_exvivo='label/rh.BA2_exvivo.label',
- rh_BA3a_exvivo='label/rh.BA3a_exvivo.label',
- rh_BA3b_exvivo='label/rh.BA3b_exvivo.label',
- rh_BA4a_exvivo='label/rh.BA4a_exvivo.label',
- rh_BA4p_exvivo='label/rh.BA4p_exvivo.label',
- rh_BA6_exvivo='label/rh.BA6_exvivo.label',
- rh_BA44_exvivo='label/rh.BA44_exvivo.label',
- rh_BA45_exvivo='label/rh.BA45_exvivo.label',
- rh_V1_exvivo='label/rh.V1_exvivo.label',
- rh_V2_exvivo='label/rh.V2_exvivo.label',
- rh_MT_exvivo='label/rh.MT_exvivo.label',
- rh_entorhinal_exvivo='label/rh.entorhinal_exvivo.label',
- rh_perirhinal_exvivo='label/rh.perirhinal_exvivo.label',
- rh_BA1_exvivo_thresh='label/rh.BA1_exvivo.thresh.label',
- rh_BA2_exvivo_thresh='label/rh.BA2_exvivo.thresh.label',
- rh_BA3a_exvivo_thresh='label/rh.BA3a_exvivo.thresh.label',
- rh_BA3b_exvivo_thresh='label/rh.BA3b_exvivo.thresh.label',
- rh_BA4a_exvivo_thresh='label/rh.BA4a_exvivo.thresh.label',
- rh_BA4p_exvivo_thresh='label/rh.BA4p_exvivo.thresh.label',
- rh_BA6_exvivo_thresh='label/rh.BA6_exvivo.thresh.label',
- rh_BA44_exvivo_thresh='label/rh.BA44_exvivo.thresh.label',
- rh_BA45_exvivo_thresh='label/rh.BA45_exvivo.thresh.label',
- rh_V1_exvivo_thresh='label/rh.V1_exvivo.thresh.label',
- rh_V2_exvivo_thresh='label/rh.V2_exvivo.thresh.label',
- rh_MT_exvivo_thresh='label/rh.MT_exvivo.thresh.label',
- rh_entorhinal_exvivo_thresh='label/rh.entorhinal_exvivo.thresh.label',
- rh_perirhinal_exvivo_thresh='label/rh.perirhinal_exvivo.thresh.label')
- return datasource, outfields
-
-def source_long_files_workflow(name="Source_Longitudinal_Files"):
- """Creates a workflow to source the longitudinal files from a freesurfer directory.
- This should only be used when the files are not in a prexisting workflow"""
-
- wf = Workflow(name=name)
-
- inputspec = Node(IdentityInterface(fields=['subject_id',
- 'subjects_dir',
- 'timepoints']),
- name="inputspec")
-
- # TODO: Create outputspec
-
- # grab files from the initial single session run
- grab_inittp_files = pe.Node(nio.DataGrabber(), name="Grab_Initial_Files",
- infields=['subject_id'],
- outfileds=['inputvols', 'iscales', 'ltas'])
- grab_inittp_files.inputs.template = '*'
- grab_inittp_files.inputs.base_directory = config['subjects_dir']
- grab_inittp_files.inputs.field_template = dict(inputvols='%s/mri/orig/0*.mgz',
- iscales='%s/mri/orig/0*-iscale.txt',
- ltas='%s/mri/orig/0*.lta')
-
- grab_inittp_files.inputs.template_args = dict(inputvols=[['subject_id']],
- iscales=[['subject_id']],
- ltas=[['subject_id']])
-
- wf.connect([(grab_inittp_files, outputspec, [('inputvols', 'inputspec.in_T1s'),
- ('iscales', 'inputspec.iscales'),
- ('ltas', 'inputspec.ltas')])])
-
- merge_norms = pe.Node(Merge(len(config['timepoints'])), name="Merge_Norms")
- merge_segs = pe.Node(Merge(len(config['timepoints'])), name="Merge_Segmentations")
- merge_segs_noCC = pe.Node(Merge(len(config['timepoints'])), name="Merge_Segmentations_noCC")
- merge_template_ltas = pe.Node(Merge(len(config['timepoints'])), name="Merge_Template_ltas")
-
- for i, tp in enumerate(config['timepoints']):
- # datasource timepoint files
- tp_data_source = pe.Node(FreeSurferSource(), name="{0}_DataSource".format(tp))
- tp_data_source.inputs.subject_id = tp
- tp_data_source.inputs.subjects_dir = config['subjects_dir']
-
- tp_data_grabber = pe.Node(nio.DataGrabber(), name="{0}_DataGrabber".format(tp),
- infields=['tp', 'long_tempate'],
- outfileds=['subj_to_template_lta', 'seg_noCC', 'seg_presurf'])
- tp_data_grabber.inputs.template = '*'
- tp_data_grabber.inputs.base_directory = config['subjects_dir']
- tp_data_grabber.inputs.field_template = dict(
- subj_to_template_lta='%s/mri/transforms/%s_to_%s.lta',
- seg_noCC='%s/mri/aseg.auto_noCCseg.mgz',
- seg_presurf='%s/mri/aseg.presurf.mgz',)
-
- tp_data_grabber.inputs.template_args = dict(
- subj_to_template_lta=[['long_template', 'tp', 'long_template']],
- seg_noCC=[['tp']],
- seg_presurf=[['tp']])
-
- wf.connect([(tp_data_source, merge_norms, [('norm',
- 'in{0}'.format(i))]),
- (tp_data_grabber, merge_segs, [('seg_presurf',
- 'in{0}'.format(i))]),
- (tp_data_grabber, merge_segs_noCC, [('seg_noCC',
- 'in{0}'.format(i))]),
- (tp_data_grabber, merge_template_ltas, [('subj_to_template_lta',
- 'in{0}'.format(i))])])
-
- if tp == config['subject_id']:
- wf.connect([(tp_data_source, outputspec, [('wm', 'inputspec.init_wm')]),
- (tp_data_grabber, outputspec, [('subj_to_template_lta',
- 'inputspec.subj_to_template_lta')]),
- (tp_data_grabber, outputspec, [('subj_to_template_lta',
- 'inputspec.subj_to_template_lta')])])
-
- wf.connect([(merge_norms, outputspec, [('out', 'inputspec.alltps_norms')]),
- (merge_segs, outputspec, [('out', 'inputspec.alltps_segs')]),
- (merge_template_ltas, outputspec, [('out', 'inputspec.alltps_to_template_ltas')]),
- (merge_segs_noCC, outputspec, [('out', 'inputspec.alltps_segs_noCC')])])
-
-
-
- # datasource files from the template run
- ds_template_files = pe.Node(FreeSurferSource(), name="Datasource_Template_Files")
- ds_template_files.inputs.subject_id = config['subject_id']
- ds_template_files.inputs.subjects_dir = config['subjects_dir']
-
- wf.connect([(ds_template_files, ar1_wf, [('brainmask', 'inputspec.template_brainmask')]),
- (ds_template_files, outputspec, [('aseg', 'inputspec.template_aseg')])])
-
- # grab files from template run
- grab_template_files = pe.Node(nio.DataGrabber(), name="Grab_Template_Files",
- infields=['subject_id', 'long_template'],
- outfields=['template_talairach_xfm',
- 'template_talairach_lta',
- 'template_talairach_m3z',
- 'template_label_intensities',
- 'template_lh_white',
- 'template_rh_white',
- 'template_lh_pial',
- 'template_rh_pial'])
- grab_template_files.inputs.template = '*'
- grab_template_files.inputs.base_directory = config['subjects_dir']
- grab_template_files.inputs.subject_id = config['subject_id']
- grab_template_files.inputs.long_template = config['long_template']
- grab_template_files.inputs.field_template = dict(
- template_talairach_xfm='%s/mri/transfroms/talairach.xfm',
- template_talairach_lta='%s/mri/transfroms/talairach.lta',
- template_talairach_m3z='%s/mri/transfroms/talairach.m3z',
- template_label_intensities='%s/mri/aseg.auto_noCCseg.label_intensities.txt',
- template_lh_white='%s/surf/lh.white',
- template_rh_white='%s/surf/rh.white',
- template_lh_pial='%s/surf/lh.pial',
- template_rh_pial='%s/surf/rh.pial')
-
- grab_template_files.inputs.template_args = dict(
- template_talairach_xfm=[['long_template']],
- template_talairach_lta=[['long_template']],
- template_talairach_m3z=[['long_template']],
- template_lh_white=[['long_template']],
- template_rh_white=[['long_template']],
- template_lh_pial=[['long_template']],
- template_rh_pial=[['long_template']])
- wf.connect([(grab_template_files, outputspec, [('template_talairach_xfm',
- 'inputspec.template_talairach_xfm'),
- ('template_talairach_lta',
- 'inputspec.template_talairach_lta'),
- ('template_talairach_m3z',
- 'inputspec.template_talairach_m3z'),
- ('template_label_intensities',
- 'inputspec.template_label_intensities'),
- ('template_lh_white', 'inputspec.template_lh_white'),
- ('template_rh_white', 'inputspec.template_rh_white'),
- ('template_lh_pial', 'inputspec.template_lh_pial'),
- ('template_rh_pial', 'inputspec.template_rh_pial')])
- ])
- return wf