Many studies collect imaging data at multiple sites. In such studies, a single XNAT system is often used to aggregate the data in one secure location from which it can be reviewed, processed, and shared. However, getting the data from the remote sites to that central XNAT system can be challenging. XNAT's web-based upload tools provide a manual solution that is often a good choice. But for study sites with high-volume data collection, complex protocols, or large images, a more automated approach can reduce both the effort and the error rate.
Within a local network, this sort of automation is typically achieved by sending data directly from the scanner to XNAT using DICOM network protocols. However, DICOM is not a secure protocol and should not be used on wide-area unsecured networks.
We have implemented XNAT Remote Data Relays to enable DICOM data to be automatically and securely sent from a remote scanner to a central XNAT system.
The XNAT Remote Data Relay receives data from the scanner over standard DICOM protocols and then forwards that data to the central XNAT over the secure, encrypted HTTPS protocol. What's the secret sauce? The relay actually runs a lightweight XNAT instance that includes the Xsync plugin. Xsync is quite flexible and can be configured at the individual project level to relay data to one or more remote XNAT systems. Because the relay server is so lightweight, it can run on very small-footprint hardware such as an Intel NUC.
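The relay's outbound leg can be sketched roughly as follows, using XNAT's REST image-session import service over HTTPS. The hostname, project ID, and file path are placeholders, and Xsync's actual implementation differs; this is only a minimal illustration of the secure-forwarding idea.

```python
# Sketch: forwarding a zipped DICOM session to a central XNAT over HTTPS.
# "central.example.org" and "STUDY01" are hypothetical placeholders.
import urllib.parse
import urllib.request

def build_import_url(base, project):
    """Compose an XNAT import-service URL targeting a project's prearchive."""
    query = urllib.parse.urlencode({"dest": "/prearchive", "project": project})
    return f"{base}/data/services/import?{query}"

def forward_session(zip_path, base="https://central.example.org", project="STUDY01"):
    """POST a DICOM zip to the central XNAT; the payload travels encrypted."""
    req = urllib.request.Request(
        build_import_url(base, project),
        data=open(zip_path, "rb").read(),
        method="POST",
    )
    req.add_header("Content-Type", "application/zip")
    # Authentication (e.g. HTTP Basic or a session cookie) omitted for brevity.
    return urllib.request.urlopen(req).status
```

In the real relay, Xsync manages authentication, retries, and per-project destination configuration; the point here is simply that the wide-area hop happens over HTTPS rather than bare DICOM.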
The XNAT Remote Data Relay can be run on almost any hardware or virtual computing system. In multi-center studies where a central coordinating site is building and shipping the relays, a small-footprint system can be a cost-effective and practical approach. The sample systems illustrated below follow the small-footprint principle. They are relatively cheap (<$600), super portable, and have proven to be quite hardy.
Very little compute power is needed; however, reliability is essential. We have had very good results from the Intel NUC computing platform with SSD storage. Here is a typical build we have used in 2016/2017:
Siemens raw k-space data can be quite high volume and requires more storage. While a NUC outfitted entirely with SSD storage could potentially hold 4TB of data, the cost becomes very high. In that case we build a mini-server instead.
In order to collect raw data, the relay must be connected directly to the scanner's back-end network.