nnU-Net training workflow for fetal BOLD MRI


Snakemake workflow for fetal BOLD brain segmentation

If you don't have AFNI and FSL installed, use the --use-singularity option when running snakemake.

Training is best with a GPU, but inference can be done reasonably fast with CPU only.

Step 1: Obtain a copy of this workflow

  1. Create a new GitHub repository using this workflow as a template.

  2. Clone the newly created repository to your local system, into the place where you want to perform the data analysis.

Step 2: Configure workflow

Configure the workflow to your needs by editing the config.yml file, in particular the paths to your NIfTI images.
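As an orientation, a config.yml for this kind of workflow might look like the sketch below. Only use_downloaded and download_model are mentioned on this page; the other keys and all paths are illustrative, so check the repository's actual config.yml for the real schema.

```yaml
# Illustrative sketch only -- consult the repository's config.yml for the real keys.
nifti_dir: /path/to/your/nifti/images      # hypothetical key: where your BOLD NIfTIs live
use_downloaded: model_v2                   # which pre-trained model to apply
download_model:
  model_v2: https://example.com/trained_model.tar   # hypothetical URL
```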

Step 3: Install python dependencies

You should install the dependencies in a virtual environment. Once you have activated your virtual environment, install them from the repository root with pip install . (the trailing dot installs the current project).

A recommended alternative that also takes care of creating a virtual environment is to use Poetry. On macOS or Linux, Poetry can be installed with:

curl -sSL https://install.python-poetry.org | python3 -

Once Poetry is installed, use the following to install the dependencies into a virtual environment and activate it:

cd nnunet-fetalbrain
poetry install
poetry shell

Step 4: Execute workflow

To run inference on your test datasets, use:

snakemake all_test --cores all

By default, the trained model in the config will be downloaded and applied.

If you want to train a new model instead, set the use_downloaded config variable to a value that is not listed under download_model, then use:

snakemake all_train --cores all

Code Snippets

import json

#load template json
with open(snakemake.input.template_json) as f:
    dataset = json.load(f)

dataset['training'] = [{'image': img, 'label': lbl} for img,lbl in zip(snakemake.params.training_imgs_nosuffix,snakemake.input.training_lbls)]



dataset['numTraining'] = len(dataset['training'])

#write modified json
with open(snakemake.output.dataset_json, 'w') as f:
    json.dump(dataset, f, indent=4)
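Outside of Snakemake, the same transformation can be sketched standalone. The file names below are hypothetical stand-ins for snakemake.params.training_imgs_nosuffix and snakemake.input.training_lbls, and the minimal dict stands in for the template JSON:

```python
import json

# Hypothetical training pairs (in the workflow these come from snakemake params/inputs)
training_imgs = ["./imagesTr/case_0000", "./imagesTr/case_0001"]
training_lbls = ["./labelsTr/case_0000.nii.gz", "./labelsTr/case_0001.nii.gz"]

dataset = {"name": "fetalbold"}  # stands in for the loaded template JSON

# Pair each image with its label, as the workflow script does
dataset["training"] = [
    {"image": img, "label": lbl}
    for img, lbl in zip(training_imgs, training_lbls)
]
dataset["numTraining"] = len(dataset["training"])

print(json.dumps(dataset, indent=4))
```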
From line 67 of master/Snakefile:
shell: 'wget {params.url}'
From line 73 of master/Snakefile (note: the original mixes trained_model and trained_models; the directory names should match):
shell: 'mkdir -p trained_models && tar -C trained_models -xvf {input}'
From line 84 of master/Snakefile:
shell: 'fslsplit {input} {params.prefix} -t'
From line 96 of master/Snakefile:
shell: 'fslsplit {input} {params.prefix} -t'
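fslsplit -t breaks the 4D BOLD series (x, y, z, t) into one 3D volume per timepoint. Conceptually (array shapes below are toy values, not the real image dimensions):

```python
import numpy as np

# Toy 4D "BOLD" array: 4x4x4 spatial grid, 5 timepoints (shapes are illustrative)
bold_4d = np.random.rand(4, 4, 4, 5)

# One 3D volume per timepoint, like fslsplit's vol0000.nii.gz, vol0001.nii.gz, ...
volumes = [bold_4d[..., t] for t in range(bold_4d.shape[-1])]

print(len(volumes), volumes[0].shape)  # → 5 (4, 4, 4)
```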
From line 103 of master/Snakefile:
shell: '3dresample -dxyz 3.5 3.5 3.5 -prefix {output} -input {input}'
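3dresample regrids each volume to 3.5 mm isotropic voxels. Since the field of view is preserved, the new matrix size can be estimated from the old one (input dimensions and voxel sizes below are hypothetical):

```python
# Field of view is preserved, so new_dim ≈ old_dim * old_voxel / new_voxel
old_dims = (128, 128, 64)   # hypothetical input matrix size
old_vox = (2.0, 2.0, 3.0)   # hypothetical input voxel sizes in mm
new_vox = 3.5               # target isotropic voxel size used by the workflow

new_dims = tuple(round(d * v / new_vox) for d, v in zip(old_dims, old_vox))
print(new_dims)  # → (73, 73, 55)
```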
From line 109 of master/Snakefile:
shell: '3dZeropad -RL 96 -AP 96 -prefix {output} {input}'
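3dZeropad then zero-pads the in-plane right-left and anterior-posterior axes to a fixed 96 x 96 matrix, so every volume entering nnU-Net has the same shape. A NumPy sketch of the padding idea (3dZeropad itself also handles cropping and updates the NIfTI header, which this toy version ignores):

```python
import numpy as np

def pad_to(vol, target_xy=96):
    """Zero-pad the first two (in-plane) axes up to target_xy, centering the data."""
    out = np.zeros((target_xy, target_xy, vol.shape[2]), dtype=vol.dtype)
    x0 = (target_xy - vol.shape[0]) // 2
    y0 = (target_xy - vol.shape[1]) // 2
    out[x0:x0 + vol.shape[0], y0:y0 + vol.shape[1], :] = vol
    return out

vol = np.ones((73, 73, 55))   # hypothetical matrix size after resampling
print(pad_to(vol).shape)      # → (96, 96, 55)
```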
From line 117 of master/Snakefile:
shell: 'cp {input} {output}'
From line 125 of master/Snakefile:
shell: 'cp {input} {output}'
From line 135 of master/Snakefile:
shell: 'cp {input} {output}'
From line 149 of master/Snakefile:
script: 'create_json.py'
From lines 169-171 of master/Snakefile:
shell:
    '{params.nnunet_env_cmd} && '
    'nnUNet_plan_and_preprocess -t {params.task_num} --verify_dataset_integrity'
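In nnU-Net v1, commands like nnUNet_plan_and_preprocess and nnUNet_train read their input and output locations from three environment variables, so {params.nnunet_env_cmd} presumably exports something like the following. The variable names are the standard nnU-Net v1 ones; the paths are placeholders:

```shell
# Standard nnU-Net v1 environment variables (paths are placeholders)
export nnUNet_raw_data_base=/path/to/nnunet/raw
export nnUNet_preprocessed=/path/to/nnunet/preprocessed
export RESULTS_FOLDER=/path/to/nnunet/results
```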
From lines 196-199 of master/Snakefile:
shell:
    '{params.nnunet_env_cmd} && '
    '{params.rsync_to_tmp} && '
    'nnUNet_train {params.checkpoint_opt} {wildcards.arch} {wildcards.trainer} {wildcards.unettask} {wildcards.fold}'
From lines 213-214 of master/Snakefile:
shell:
    'tar -cvf {output} -C {params.trained_model_dir} {params.files_to_tar}'
From lines 242-244 of master/Snakefile:
shell:
    '{params.nnunet_env_cmd} && '
    'nnUNet_predict -chk {wildcards.checkpoint} -i {params.in_folder} -o {params.out_folder} -t {wildcards.unettask}'
From lines 265-266 of master/Snakefile:
shell:
    'fslmerge -t {output} {input}'
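fslmerge -t is the inverse of the earlier fslsplit step: it stacks the per-timepoint segmentations back into a single 4D series. In NumPy terms (toy shapes again):

```python
import numpy as np

# Per-timepoint 3D segmentations (shapes and values are illustrative)
volumes = [np.full((4, 4, 4), t) for t in range(5)]

# Stack along a new trailing time axis, like fslmerge -t
merged = np.stack(volumes, axis=-1)
print(merged.shape)  # → (4, 4, 4, 5)
```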



Created: 1yr ago
Updated: 1yr ago
Maintainers: public
URL: https://github.com/akhanf/nnunet-fetalbrain
Name: nnunet-fetalbrain
Version: 2
Copyright: Public Domain
License: None
