
 Read further if:

  • you develop or run image processing locally in a bash shell;
  • you want to use XNAT as a data source and/or remote archive;
  • you want to automate processing of multiple XNAT sessions and report your results in spreadsheets;
  • you want to focus on local script development and minimize the programming effort needed to interact with XNAT.

xtolm installation

Client machine prerequisites to run xtolm on 64-bit Linux: download xtolm (plus the XNAT tools, and Java if you plan to upload data back to XNAT), and make sure all required components are on your PATH.

Use case: compute whole-brain volume using the Brain Extraction Tool (BET) for all MPRAGE scans in an XNAT MRI project and save the statistics in a spreadsheet.

Prerequisites for this use case: FSL and dcm2nii on your PATH. We'll use data from the surfmask_smpl project on XNAT Central, so you'll need an XNAT Central account.
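Before starting, it helps to verify that the required tools are actually visible on your PATH. A minimal check might look like this (tool names are taken from the prerequisites above; adjust the list for your installation):

```shell
# Check that the tools this use case depends on are reachable on PATH.
for tool in dcm2nii bet fslstats xtolm; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "found: $tool"
    else
        echo "MISSING: $tool"
    fi
done
```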

1. Write a bash script that computes human brain volume from a T1-weighted MR image in DICOM format (assuming FSL v4+ is on the path).
#!/bin/bash
in=$1 #input DICOM directory
#convert DICOM to NIfTI; dcm2nii writes its output into $in
dcm2nii $in
cp $in/[0-9]*.nii head.nii
if [ ! -f "head.nii" ]; then exit 1; fi
#extract the brain and produce a binary mask (brain_mask)
bet head brain -m
#fslstats -V prints voxel count and volume in mm3; keep the mm3 value
vol=(`fslstats brain_mask -V`); vol=${vol[1]}
#report the volume in mm3 to the console
echo $vol

2. Create a spreadsheet listing all scans that you want to process. In many cases you'll already have such a spreadsheet; if not, you can start with one generated by XNAT:

Now, modify the spreadsheet so that column names don't contain spaces, empty and irrelevant cells are removed, and the "Scans" column is set to "SPGR", the scan type that we want to process for each session. If any fields lose their leading zeros (e.g. subject "001" becoming "1"), you'll need to edit the CSV outside of Excel to restore them.
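After cleanup, the spreadsheet (saved as test.csv) might look like the following; the session and subject labels are illustrative, matching the naming used later in this example:

```
MR_ID,Subject,Scans
001_obscured,001,SPGR
002_obscured,002,SPGR
003_obscured,003,SPGR
```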


3. Now we are ready to adapt the script to process all sessions from test.csv.

#this line converts test.csv to a bash source file test.params that assigns the corresponding arrays
rcsv test.csv `pwd`/test.params; source `pwd`/test.params; rm test.params
#as a result, the following arrays are assigned:
#MR_ID=("001_obscured" "002_obscured" ...
#Subject=("001" "002" "003" ...
#Scans=("SPGR" "SPGR" "SPGR" ...
#now, iterate over all MR sessions
for ((i=0; i<${#MR_ID[*]}; i++)); do
    #create a directory named ${MR_ID[i]} and cd to it
    set_context ${Subject[i]} ${MR_ID[i]}
    #load the specified scan from XNAT
    load_type "${Scans[i]}"
    #find the scan directory that was created by load_type
    in=`ls -d study*`
    #compute brain volume, similarly to step 1
    dcm2nii $in
    #select the unmodified NIfTI volume to process
    cp $in/[0-9]*[0-9].nii head.nii
    if [ ! -f "head.nii" ]; then exit 1; fi
    bet head brain -m
    vol=(`fslstats brain_mask -V`); vol=${vol[1]}
    #use a special command to save a variable with this offline session
    save_vars vol
    #break #uncomment to test on a single session first
done
#return to the starting directory
set_context null
#go through all sessions and generate a spreadsheet summarizing variables saved with "save_vars"
summary mpr_vol


4. Test the batch processing.

We will run the developed script on a single session to see if it works. For that, uncomment the break statement at the end of the loop. Now we are ready to run the script:

Running brain_volume.xt:
xtolm -sr -u your_xnat_central_user -pr surfmask_smpl -o brain_volume.xt
cat mpr_vol.csv

After making sure that mpr_vol.csv contains correct output, comment the break statement out again and rerun brain_volume.xt. Now we'll have the BET calculations saved in the mpr_vol.csv file, ready for analysis (note that Excel will again strip the leading zeros from the subject labels if you open and re-save the file).
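The resulting mpr_vol.csv aggregates the variables saved with save_vars across all sessions. Its exact layout depends on your xtolm version, but conceptually it contains one row per session with the saved vol value; the numbers below are made up for illustration:

```
MR_ID,vol
001_obscured,1234567
002_obscured,1301244
003_obscured,1287650
```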

Xtolm Commands:

Online: mostly load/save commands for scans, resource files/directories, and metadata stored as resources.

        load_scans <id1>[,...]  load scan(s) to DICOM by ID to the processing dir

        load_type <type>        load scan(s) DICOM of a given type to the processing dir

        load_dir <dir1>[,...]           load resource dir(s) to the processing dir

        load_file <fil1> [fil2..]       load resource file(s) to the processing dir

        load_workspace                  load context variables from XNAT session (stored under unique analysis ID)

        save_workspace                  save context variables to XNAT session (stored under unique analysis ID)

        save_dir <dir>                  write resource dir to XNAT (overwrites existing)

Offline: the purpose is to enable repeated analysis. Each study is loaded into a separate directory and processed within a 'processing context': an XNAT subject and experiment label. Each study directory can contain DICOM data, resource dirs, and configuration files that store context information. The workspace is defined as all user-defined variables that should be saved as key-value pairs.

        set_context <subject> <session> set current context of subject and session

        save_vars [var1] [,..]          save one or more variables to current context

        summary <label>                 generate a comma-separated summary of existing contexts

        help                            list console commands

        quit                            quit interactive console

Command Details

set_context: move to another processing dir that corresponds to another XNAT session, and load/init workspace variables for that session. Previously computed variables are loaded using this command.
save_vars: save local vars (name+value) to current context (context is saved in a file within processing dir or online in XNAT experiment).
load/save workspace: sync context variables with XNAT session (load/save to designated resource)
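To make the context/workspace idea concrete, here is a rough pure-bash sketch of what save_vars and summary conceptually do. This is an illustration only, not xtolm's actual implementation; the directory and file names (ctx_*, workspace.params) are invented:

```shell
# Each processing context is a directory holding key=value pairs;
# a summary pass aggregates one variable across all contexts into a CSV.
mkdir -p ctx_001 ctx_002
echo "vol=1234567" > ctx_001/workspace.params   # like: save_vars vol (session 001)
echo "vol=1301244" > ctx_002/workspace.params   # like: save_vars vol (session 002)
echo "session,vol" > mpr_vol.csv                # like: summary mpr_vol
for d in ctx_*; do
    source "$d/workspace.params"
    echo "${d#ctx_},$vol" >> mpr_vol.csv
done
cat mpr_vol.csv
```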

Script execution mode: the user writes a script which is interpreted by xtolm, converted to a bash script and executed.

xtolm [options] <script file> [...]

Interactive mode : line-by-line command input.
Offline mode: all code that interacts with XNAT is ignored.
Debug mode:

Useful capabilities

rcsv: convert a csv file into a set of bash arrays - can be used in xtolm to change session context automatically, load specific scans/dirs, etc.
summary: create a summary spreadsheet that aggregates all contexts across the batch processing session.
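As an illustration of what rcsv produces, here is a minimal pure-bash approximation. This is not the real rcsv (which ships with xtolm); the sketch assumes a simple CSV with no quoted fields or embedded commas:

```shell
# Turn each CSV column into a bash array named after its header,
# mimicking the test.params file that rcsv generates.
cat > test.csv <<'EOF'
MR_ID,Subject,Scans
001_obscured,001,SPGR
002_obscured,002,SPGR
EOF
IFS=, read -r -a headers < test.csv
for ((c=0; c<${#headers[@]}; c++)); do
    vals=$(tail -n +2 test.csv | cut -d, -f$((c+1)) | sed 's/.*/"&"/' | tr '\n' ' ')
    echo "${headers[c]}=($vals)"
done > test.params
source test.params
echo "${#MR_ID[@]} sessions, first subject ${Subject[0]}"
```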
xtolm design features:

  • flexible coupling with an online XNAT server. The analysis can be saved to/loaded from XNAT using a unique analysis ID. However, completely offline processing is possible (-o mode) when repeated or advanced analyses are required; context is still saved in session configuration files. The same script runs unchanged in offline mode, with all local<->XNAT transactions ignored.
  • repeatability of analysis - helped by re-using saved session context. Modifying the script and re-running is easy.
  • minimal local data structure requirements.