Read further if:
- you develop or run image processing locally in a bash shell;
- you want to use XNAT as a data source and/or remote archive;
- you want to automate processing of multiple XNAT sessions and report your results in spreadsheets;
- you want to focus on local script development and minimize the programming effort needed to interact with XNAT.
Use case: compute whole brain volume with the Brain Extraction Tool (BET) for all MPRAGE scans in an XNAT MRI project, and save the statistics in a spreadsheet.
1. Write a bash script that computes human brain volume from a T1-weighted MR image in DICOM format (assuming FSL v4+ is on the path).
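Such a script might look like the sketch below. The file names and the use of `dcm2niix` for DICOM-to-NIfTI conversion are assumptions, not part of xtolm; the BET call and volume readout use standard FSL tools (`bet`, `fslstats`):

```shell
#!/bin/bash
# brain_volume.sh -- estimate brain volume from a T1-weighted DICOM series.
# Usage: brain_volume.sh <dicom_dir> <output_prefix>
# Assumes dcm2niix and FSL (bet, fslstats) are on the path.

compute_brain_volume() {
    local dicom_dir="$1" out="$2"
    # Convert the DICOM series to NIfTI (output prefix is hypothetical).
    dcm2niix -o . -f "${out}" "${dicom_dir}"
    # Extract the brain with BET (default fractional intensity threshold 0.5).
    bet "${out}" "${out}_brain" -f 0.5
    # fslstats -V prints voxel count and volume in mm^3; keep the mm^3 figure.
    fslstats "${out}_brain" -V | awk '{print $2}'
}

# Run only when executed directly with both arguments, not when sourced.
if [[ "${BASH_SOURCE[0]}" == "$0" && $# -ge 2 ]]; then
    compute_brain_volume "$@"
fi
```

The volume printed on stdout can then be captured into a workspace variable for the spreadsheet summary.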
2. Create a spreadsheet listing all scans that you want to process. In many cases you'll already have such a spreadsheet; if not, you can start with one generated by XNAT:
Now modify the spreadsheet so that column names contain no spaces, empty and irrelevant cells are removed, and the "Scans" column is set to "SPGR", the scan type that we want to process for each session. If any fields have trailing zeroes, you'll need to modify the CSV outside of Excel to restore them.
3. Now we are ready to adapt brain_volume.sh to process all sessions from test.csv.
4. Test the batch processing.
We will run the developed script on a single session to see if it works. For that, uncomment the break statement on line 26. Now we are ready to run the script:
After verifying that mpr_vol.csv contains correct output, comment out the break statement on line 26 and rerun brain_volume.xt. The BET calculations are now saved in the mpr_vol.csv file, ready for analysis (note that Excel will again strip trailing zeroes from the subject IDs).
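The batch loop at the heart of such a script can be sketched in plain bash (the column names Subject, Session, Scans are illustrative; adapt them to your spreadsheet):

```shell
#!/bin/bash
# Iterate over the sessions listed in a CSV and process each one.
# The marked break limits the run to the first session while testing.

process_sessions() {
    local csv="$1"
    # Skip the header line, then read comma-separated fields.
    tail -n +2 "$csv" | while IFS=, read -r subject session scans; do
        echo "processing ${subject}/${session} (type: ${scans})"
        # ... set_context / load_type / BET calls would go here ...
        break   # TESTING ONLY: stop after the first session; remove for a full run.
    done
}
```

Running `process_sessions test.csv` with the break in place prints only the first session, which is enough to validate the per-session logic before a full run.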
Online: commands that load/save scans, resource files/directories, and metadata (metadata is stored as XNAT resources).
load_scans <id1>[,...] load DICOM files of the listed scan ID(s) into the processing dir
load_type <type> load DICOM files of all scans of a given type into the processing dir
load_dir <dir1>[,...] load resource dir(s) into the processing dir
load_file <fil1> [fil2..] load resource file(s) into the processing dir
load_workspace load context variables from XNAT session (stored under unique analysis ID)
save_workspace save context variables to XNAT session (stored under unique analysis ID)
save_dir <dir> write resource dir to XNAT (overwrites existing)
Offline: The purpose is to enable repeated analysis. Each study is loaded into a separate directory and is processed within a 'processing context': the XNAT subject and experiment label. Each study directory can contain DICOM files, resource dirs, and some configuration files that store context information. The workspace is defined as all user-defined variables that should be saved, stored as key-value pairs.
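A processing directory for one study might look like this (the file and directory names are illustrative only, not prescribed by xtolm):

```
proc/
  S001_MR1/        # one directory per XNAT session
    dicom/         # loaded scan DICOM files
    stats/         # a loaded or generated resource dir
    .context       # subject/experiment labels used by set_context
    .workspace     # saved key-value workspace variables
```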
set_context <subject> <session> set current context of subject and session
save_vars [var1] [,..] save one or more variables to current context
summary <label> generate a comma-separated summary of existing contexts
help list console commands
quit quit interactive console
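Put together, a short console session might look like the following (the prompt, labels, and the `bet_volume` helper are hypothetical; any bash is allowed between console commands):

```
xtolm> set_context S001 S001_MR1
xtolm> load_type SPGR
xtolm> vol=$(bet_volume dicom)
xtolm> save_vars vol
xtolm> summary brain_vol
xtolm> quit
```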
set_context: move to the processing dir that corresponds to another XNAT session, and load/initialize the workspace variables for that session. Previously computed variables are reloaded by this command.
save_vars: save local variables (name and value) to the current context (the context is saved in a file within the processing dir, or online in the XNAT experiment).
load/save_workspace: sync context variables with the XNAT session (loaded from/saved to a designated resource).
Script execution mode: the user writes a script that xtolm interprets, converts to a bash script, and executes.
xtolm [options] <script file> [...]
Interactive mode: line-by-line command input.
Offline mode: all code that interacts with XNAT is ignored.
rcsv: convert a CSV file into a set of bash arrays; can be used in xtolm to change session context automatically, load specific scans/dirs, etc.
summary: create a summary spreadsheet that aggregates all contexts across the batch processing session.
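The effect of rcsv can be approximated in plain bash as follows (a simplified sketch, not xtolm's implementation; it does not handle quoted fields, and it is one reason the spreadsheet's column names must be valid identifiers without spaces):

```shell
#!/bin/bash
# Read a CSV and expose each column as a bash array named after its header,
# e.g. a "Subject" column becomes ${Subject[0]}, ${Subject[1]}, ...
# Simplified: assumes no quoted fields or embedded commas.

rcsv_sketch() {
    local csv="$1" header row i n=0
    IFS=, read -r -a header < "$csv"          # first line: column names
    while IFS=, read -r -a row; do            # remaining lines: data
        for i in "${!header[@]}"; do
            # Append this row's value to the array named after column i.
            eval "${header[$i]}[\$n]=\${row[\$i]}"
        done
        n=$((n+1))
    done < <(tail -n +2 "$csv")
}
```

After `rcsv_sketch test.csv`, a loop over `${Subject[@]}` and `${Session[@]}` can drive `set_context` for each row of the batch.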
xtolm design features:
- flexible coupling with an online XNAT server. The analysis can be saved to and loaded from XNAT using a unique analysis ID. However, completely offline processing (-o mode) is also possible when repeated or advanced analyses are required; context is still saved in session configuration files. The design tolerates running the same script in offline mode by ignoring all local<->XNAT transactions.
- repeatability of analysis, helped by re-using the saved session context. Modifying the script and re-running it is easy.
- minimal local data structure requirements.