Neuroimaging Data Processing Workflow


clinical-atlasreg

Inputs:

  • participants.tsv with target subject IDs

  • bids folder

    • another folder containing bids-like processed data

Singularity containers required:

  • khanlab/neuroglia-core:latest
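When run with --use-singularity, Snakemake pulls the container image referenced by the rules automatically; you can also fetch it ahead of time so the first run does not stall on the download. A minimal sketch, assuming Singularity/Apptainer is available on your system:

singularity pull docker://khanlab/neuroglia-core:latest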

Authors

  • Your name here @yourgithubid

Usage

If you use this workflow in a paper, don't forget to give credit to the authors by citing the URL of this (original) repository and, if available, its DOI (see above).

Step 1: Obtain a copy of this workflow

  1. Create a new GitHub repository using this workflow as a template.

  2. Clone the newly created repository to your local system, into the place where you want to perform the data analysis.

Step 2: Configure workflow

Configure the workflow according to your needs by editing the files in the config/ folder: adjust config.yml to configure the workflow execution, and participants.tsv to specify your subjects.
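For example, a minimal participants.tsv could be created as follows. This is only a sketch: the participant_id column name and the config/ location are assumptions, so match the example file shipped with the workflow and the subject IDs in your BIDS dataset:

cat > config/participants.tsv <<'EOF'
participant_id
sub-001
sub-002
EOF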

Step 3: Install Snakemake

Install Snakemake using conda:

conda create -c bioconda -c conda-forge -n snakemake snakemake

For installation details, see the instructions in the Snakemake documentation.

Step 4: Execute workflow

Activate the conda environment:

conda activate snakemake

Test your configuration by performing a dry-run via

snakemake --use-singularity -n

Execute the workflow locally via

snakemake --use-singularity --cores $N

using $N cores or run it in a cluster environment via

snakemake --use-singularity --cluster qsub --jobs 100

or

snakemake --use-singularity --drmaa --jobs 100
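If your scheduler needs explicit resource requests, the --cluster string can be templated with per-rule values; a rough sketch for a qsub-based cluster (the flags and walltime are illustrative, not taken from this workflow):

snakemake --use-singularity --jobs 100 \
    --cluster "qsub -l nodes=1:ppn={threads} -l walltime=01:00:00"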

If you are using Compute Canada, you can use the cc-slurm profile, which submits jobs and takes care of requesting the correct resources per job (including GPUs). Once it is set up with cookiecutter, run:

snakemake --profile cc-slurm
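If the profile is not installed yet, the one-time setup uses cookiecutter; a hedged sketch, assuming the profile template is published at khanlab/cc-slurm (check the cc-slurm and neuroglia-helpers documentation for the authoritative steps):

pip install cookiecutter
cookiecutter gh:khanlab/cc-slurm -o ~/.config/snakemake -f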

Or, with neuroglia-helpers, you can request an 8-core, 32 GB node and run the workflow locally there. First, get a node (default: 8 cores, 32 GB, 3-hour time limit):

regularInteractive 

Then, run:

snakemake --use-singularity --cores 8 --resources mem=32000

See the Snakemake documentation for further details.

Step 5: Investigate results

After successful execution, you can create a self-contained interactive HTML report with all results via:

snakemake --report report.html

This report can, e.g., be forwarded to your collaborators. An example (using some trivial test data) can be seen here.

Step 6: Commit changes

Whenever you change something, don't forget to commit the changes back to your GitHub copy of the repository:

git commit -a
git push

Step 7: Obtain updates from upstream

Whenever you want to synchronize your workflow copy with new developments from upstream, do the following.

  1. Once, register the upstream repository in your local copy: git remote add -f upstream git@github.com:snakemake-workflows/{{cookiecutter.repo_name}}.git, or git remote add -f upstream https://github.com/snakemake-workflows/{{cookiecutter.repo_name}}.git if you have not set up SSH keys. (The full command sequence is shown after this list.)

  2. Update the upstream version: git fetch upstream.

  3. Create a diff with the current version: git diff HEAD upstream/master workflow > upstream-changes.diff.

  4. Investigate the changes: vim upstream-changes.diff.

  5. Apply the modified diff via: git apply upstream-changes.diff.

  6. Carefully check whether you need to update the config files: git diff HEAD upstream/master config. If so, do it manually, and only where necessary, since you would otherwise likely overwrite your settings and samples.
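Put together, the update sequence looks like this (the upstream URL keeps the template placeholder from item 1):

git remote add -f upstream git@github.com:snakemake-workflows/{{cookiecutter.repo_name}}.git
git fetch upstream
git diff HEAD upstream/master workflow > upstream-changes.diff
vim upstream-changes.diff
git apply upstream-changes.diff
git diff HEAD upstream/master config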

Step 8: Contribute back

In case you have also changed or added steps, please consider contributing them back to the original repository:

  1. Fork the original repo to a personal or lab account.

  2. Clone the fork to your local system, to a different place than where you ran your analysis.

  3. Copy the modified files from your analysis to the clone of your fork, e.g., cp -r workflow path/to/fork. Make sure not to accidentally copy config file contents or sample sheets. Instead, manually update the example config files if necessary. (A condensed sketch follows this list.)

  4. Commit and push your changes to your fork.

  5. Create a pull request against the original repository.
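A condensed sketch of steps 2–4, with placeholder URLs and paths (adapt them to your fork and analysis directory):

git clone git@github.com:<your-account>/<your-fork>.git fork-clone
cp -r path/to/analysis/workflow fork-clone/
cd fork-clone
git add -A
git commit -m "describe your changes"
git push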

Testing

TODO: create some test datasets

Code Snippets
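The snippets below are excerpts from the workflow's Snakemake rules, followed by the helper scripts they call.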

script: '../scripts/vis_electrodes.py'

shell: 'cp {input} {output}'

shell:
    'reg_aladin -flo {input.flo} -ref {input.ref} -res {output.warped_subj} -aff {output.xfm_ras}'

shell:
    'c3d_affine_tool {input}  -oitk {output}'

    shell: 'antsApplyTransforms -d 3 --interpolation NearestNeighbor -i {input.mask} -o {output.mask} -r {input.ref} '
            ' -t [{input.xfm},1] ' #use inverse xfm (going from template to subject)

rule warp_tissue_probseg_from_template_affine:
    input: 
        probseg = config['template_tissue_probseg'],
        ref = bids(root='results',subject='{subject}',suffix='T1w.nii.gz'),
        xfm = bids(root='results',subject='{subject}',suffix='xfm.txt',from_='subject',to='{template}',desc='{desc}',type_='itk'),
    output:
        probseg = bids(root='results',subject='{subject}',suffix='probseg.nii.gz',label='{tissue}',from_='{template}',reg='{desc}'),
    container: config['singularity']['neuroglia']
    group: 'preproc'
    shell: 
        'ITK_GLOBAL_DEFAULT_NUMBER_OF_THREADS={threads} '
        'antsApplyTransforms -d 3 --interpolation Linear -i {input.probseg} -o {output.probseg} -r {input.ref} '
            ' -t [{input.xfm},1]' #use inverse xfm (going from template to subject)


rule n4biasfield:
    input: 
        t1 = bids(root='results',subject='{subject}',suffix='T1w.nii.gz'),
        mask = bids(root='results',subject='{subject}',suffix='mask.nii.gz',from_='{template}'.format(template=config['template']),reg='affine',desc='brain'),
    output:
        t1 = bids(root='results',subject='{subject}',desc='n4', suffix='T1w.nii.gz'),
    threads: 8
    container: config['singularity']['neuroglia']
    group: 'preproc'
    shell:
        'ITK_GLOBAL_DEFAULT_NUMBER_OF_THREADS={threads} '
        'N4BiasFieldCorrection -d 3 -i {input.t1} -x {input.mask} -o {output}'

shell:
    'fslmaths {input.t1} -mas {input.mask} {output}'

shell:
    'fslmaths {input.t1} -mas {input.mask} {output}'

shell:
    'ITK_GLOBAL_DEFAULT_NUMBER_OF_THREADS={threads} '
    'antsRegistration {params.base_opts} {params.intensity_opts} '
    '{params.init_transform} ' #initial xfm  -- rely on this for affine
#    '-t Rigid[0.1] {params.linear_metric} {params.linear_multires} ' # rigid registration
#    '-t Affine[0.1] {params.linear_metric} {params.linear_multires} ' # affine registration
    '{params.deform_model} {params.deform_metric} {params.deform_multires} '  # deformable registration
    '-o [{params.out_prefix},{output.warped_flo}]'

    shell:
        'ITK_GLOBAL_DEFAULT_NUMBER_OF_THREADS={threads} '
        'antsApplyTransforms -d 3 --interpolation NearestNeighbor -i {input.dseg} -o {output.dseg} -r {input.ref} '
            ' -t {input.inv_composite} ' #use inverse xfm (going from template to subject)


rule warp_tissue_probseg_from_template:
    input: 
        probseg = config['template_tissue_probseg'],
        ref = bids(root='results',subject='{subject}',suffix='T1w.nii.gz'),
        inv_composite = bids(root='results',suffix='InverseComposite.h5',from_='subject',to='{template}',subject='{subject}'),
    output:
        probseg = bids(root='results',subject='{subject}',suffix='probseg.nii.gz',label='{tissue}',from_='{template}',reg='SyN'),
    container: config['singularity']['neuroglia']
    group: 'preproc'
    shell: 
        'ITK_GLOBAL_DEFAULT_NUMBER_OF_THREADS={threads} '
        'antsApplyTransforms -d 3 --interpolation Linear -i {input.probseg} -o {output.probseg} -r {input.ref} '
            ' -t {input.inv_composite} ' #use inverse xfm (going from template to subject)

rule warp_brainmask_from_template:
    input: 
        mask = config['template_mask'],
        ref = bids(root='results',subject='{subject}',suffix='T1w.nii.gz'),
        inv_composite = bids(root='results',suffix='InverseComposite.h5',from_='subject',to='{template}',subject='{subject}'),
    output:
        mask = bids(root='results',subject='{subject}',suffix='mask.nii.gz',from_='{template}',reg='SyN',desc='brain'),
    container: config['singularity']['neuroglia']
    group: 'preproc'
    shell: 
        'ITK_GLOBAL_DEFAULT_NUMBER_OF_THREADS={threads} '
        'antsApplyTransforms -d 3 --interpolation NearestNeighbor -i {input.mask} -o {output.mask} -r {input.ref} '
            ' -t {input.inv_composite} ' #use inverse xfm (going from template to subject)

rule dilate_brainmask:
    input:
        mask = bids(root='results',subject='{subject}',suffix='mask.nii.gz',from_='{template}',reg='{desc}',desc='brain'),
    params:
        dil_opt =  ' '.join([ '-dilD' for i in range(config['n_init_mask_dilate'])])
    output:
        mask = bids(root='results',subject='{subject}',suffix='mask.nii.gz',from_='{template}',reg='{desc}',desc='braindilated'),
    container: config['singularity']['neuroglia']
    group: 'preproc'
    shell:
        'fslmaths {input} {params.dil_opt} {output}'

shell:
    'fslmaths {input} {params.dil_opt} {output}'

    shell:
        'Atropos -d 3 -a {input.t1} -i KMeans[{params.k}] -x {input.mask} -o [{output.seg},{params.posterior_fmt}] && '
        'fslmerge -t {output.posteriors} {params.posterior_glob} ' #merge posteriors into a 4d file (intermediate files will be removed b/c shadow)


rule map_channels_to_tissue:
    input:
        tissue_priors = expand(bids(root='results',subject='{subject}',suffix='probseg.nii.gz',label='{tissue}',from_='{template}'.format(template=config['template']),reg='affine'),
                            tissue=config['tissue_labels'],allow_missing=True),
        seg_channels_4d = bids(root='results',subject='{subject}',suffix='probseg.nii.gz',desc='atroposKseg'),
    output:
        mapping_json = bids(root='results',subject='{subject}',suffix='mapping.json',desc='atropos3seg'),
        tissue_segs = expand(bids(root='results',subject='{subject}',suffix='probseg.nii.gz',label='{tissue}',desc='atropos3seg'),
                            tissue=config['tissue_labels'],allow_missing=True),
    group: 'preproc'
    script: '../scripts/map_channels_to_tissue.py'

shell:
    'fslmerge -t {output} {input}'

script: '../scripts/vis_regqc.py'

script: '../scripts/vis_qc_probseg.py'

script: '../scripts/vis_qc_dseg.py'

scripts/map_channels_to_tissue.py:
import numpy as np
import nibabel as nib
import json

#load up tissue probability, warped from template
tissue_prob_vol = dict()

for label,nii in zip(snakemake.config['tissue_labels'], snakemake.input.tissue_priors):
    print(label)
    print(nii)
    tissue_prob_vol[label] = nib.load(nii).get_fdata()


#load up k-class tissue segmentation
tissue_k_seg = nib.load(snakemake.input.seg_channels_4d)
tissue_k_seg.shape

sim_prior_k = np.zeros([len(snakemake.config['tissue_labels']),tissue_k_seg.shape[3]])

#for each prior, need to find the channel that best fits
for i,label in enumerate(snakemake.config['tissue_labels']):
    for k in range(tissue_k_seg.shape[3]):

        print(f'Computing overlap of {label} prior and channel {k}... ')
        #compute intersection over union
        s1 = tissue_prob_vol[label] >0.5
        s2 = tissue_k_seg.slicer[:,:,:,k].get_fdata() >0.5
        sim_prior_k[i,k] = np.sum(np.logical_and(s1,s2).flat) / np.sum(np.logical_or(s1,s2).flat) 


print('Overlap table:')
print(sim_prior_k)

label_to_k_dict = dict()

for i,label in enumerate(snakemake.config['tissue_labels']):
    label_to_k_dict[label] = int(np.argmax(sim_prior_k[i,:]))
    #write nii to file
    print('writing image at channel {} to output file: {}'.format(label_to_k_dict[label], \
                                                    snakemake.output.tissue_segs[i]))
    nib.save(tissue_k_seg.slicer[:,:,:,label_to_k_dict[label]],\
                    snakemake.output.tissue_segs[i])


with open(snakemake.output.mapping_json, 'w') as outfile:
    json.dump(label_to_k_dict, outfile,indent=4)

scripts/vis_electrodes.py:
import pandas as pd
import numpy as np
import matplotlib
matplotlib.use('Agg')

#read fcsv electrodes file
df = pd.read_table(snakemake.input.fcsv,sep=',',header=2)

#load transform from subj to template
sub2template= np.loadtxt(snakemake.input.xfm_ras)

#plot electrodes transformed (affine) to MNI space, with MNI glass brain
from nilearn import plotting

coords = df[['x','y','z']].to_numpy()
#print(coords.shape)

#to plot in mni space, need to transform coords
tcoords = np.zeros(coords.shape)
for i in range(len(coords)):

    vec = np.hstack([coords[i,:],1])
    tvec = np.linalg.inv(sub2template) @ vec.T
    tcoords[i,:] = tvec[:3]


html_view = plotting.view_markers(tcoords)
html_view.save_as_html(snakemake.output.html)

#plot the transformed electrode locations as a static connectome-style figure (markers only, empty adjacency matrix)
adjacency_matrix = np.zeros([len(coords),len(coords)])


display = plotting.plot_connectome(adjacency_matrix, tcoords)
display.savefig(snakemake.output.png)
display.close()

scripts/vis_qc_dseg.py:
from nilearn import plotting
import matplotlib.pyplot as plt


import matplotlib
matplotlib.use('Agg')

html_view = plotting.view_img(stat_map_img=snakemake.input.seg,bg_img=snakemake.input.img,
                              opacity=0.5,cmap='viridis',dim=-1,threshold=0.5,
                              symmetric_cmap=False,title='sub-{subject}'.format(**snakemake.wildcards))

html_view.save_as_html(snakemake.output.html)



display = plotting.plot_roi(roi_img=snakemake.input.seg, bg_img=snakemake.input.img, display_mode='ortho')
display.savefig(snakemake.output.png)
display.close()

scripts/vis_qc_probseg.py:
from nilearn import plotting
import matplotlib.pyplot as plt


import matplotlib
matplotlib.use('Agg')

#html_view = plotting.view_img(stat_map_img=snakemake.input.seg,bg_img=snakemake.input.img,
#                              opacity=0.5,cmap='viridis',dim=-1,threshold=0.5,
#                              symmetric_cmap=False,title='sub-{subject}'.format(**snakemake.wildcards))
#
#html_view.save_as_html(snakemake.output.html)



display = plotting.plot_prob_atlas(bg_img=snakemake.input.img,maps_img=snakemake.input.seg4d,display_mode='ortho')
display.savefig(snakemake.output.png)
display.close()

scripts/vis_regqc.py:
from nilearn import plotting
import matplotlib.pyplot as plt


import matplotlib
matplotlib.use('Agg')

html_view = plotting.view_img(stat_map_img=snakemake.input.ref,bg_img=snakemake.input.flo,
                              opacity=0.5,cmap='viridis',dim=-1,
                              symmetric_cmap=False,title='sub-{subject}'.format(**snakemake.wildcards))

html_view.save_as_html(snakemake.output.html)



display = plotting.plot_anat(snakemake.input.flo,display_mode='ortho')
display.add_contours(snakemake.input.ref,colors='r')
display.savefig(snakemake.output.png)
display.close()

shell: 'tar -cvf {output} {input}'

shell: 'tar -cvf {output} {input}'

URL: https://github.com/akhanf/clinical-atlasreg
License: MIT License
