Today’s neuroscientists face an ever-expanding array of neuroimaging data and the processing technologies needed to handle it. This is especially true in brain connectomics, which combines complex processing techniques with state-of-the-art imaging data (such as diffusion-weighted imaging and resting-state functional MRI) to explore how brain networks are organized and affected by various factors. ConnPipe is a brain connectivity pipeline developed by researchers at the IU School of Medicine’s Center for Neuroimaging (CfN) in collaboration with the Scalable Compute Archive (SCA). ConnPipe combines state-of-the-art tools from various image-processing packages with in-house algorithms, making it possible to process these complex data sets within one coherent framework.
In neuroimaging, each step of processing is designed either to improve image quality, standardize the image’s geometric and intensity patterns, or calculate measures for further analysis. However, because preprocessing pipelines vary across neuroimaging research, scientists face a challenge in reproducing results across studies. Andrea Avena-Koenigsberger, Senior Analyst/Programmer & Software Developer, SCA, says ConnPipe is a state-of-the-art, in-house solution to a pervasive problem. “Because many available pipelines are like black boxes, or difficult to update, modify and maintain, the CfN has opted to develop their own pipelines, with the hope of standardizing processing across research groups within CfN, and eventually, across IU,” said Avena-Koenigsberger. Two important features of this pipeline are that it relies on open-source technologies, and that in the near future, it will be available for download within a container, allowing researchers to run the pipeline from any computer while preserving data reproducibility. “The ConnPipe project represents the addition of domain science application development to our ongoing partnership with CfN,” said Arvind Gopu, manager of the SCA.
Standardizing the preprocessing pipeline makes data easier to interpret while also helping to improve scientific consensus. “Utilizing ConnPipe bodes well for everyone,” said Matt Tharp, Data Specialist, CfN. “On one hand, applying the same processing strategies across laboratories helps to ensure consistency and verifiability of results. On the other hand, as methods inevitably evolve, organizing novel applications under a common framework helps to ensure that all laboratories are equipped with a full repertoire of tools for their research,” said Tharp.
At IU, ConnPipe is already being used to explore changes in brain networks in patients undergoing hormone therapy (HT) for breast cancer. HT treatments for breast cancer have been shown to lead to cognitive impairment, but the precise nature of this impairment is an active topic of research. To explore this problem, ConnPipe has been used to process resting-state functional MRI data and construct functional connectivity networks, which can then be analyzed to identify characteristic functional subnetworks affected by HT treatment. “This research is one study among many which hopes to apply ConnPipe strategically to the broad and ever-growing realm of research and discovery within neuroimaging,” said Meichen Yu, postdoctoral researcher, CfN. “Research of this nature helps to uncover reliable indicators of impairment which may provide evidence toward previously undiscovered causal biological and physiological mechanisms underlying the nature of treatment and its side effects,” Yu said.
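The core idea behind a functional connectivity network is straightforward: each brain region yields a BOLD time series, and the network is the matrix of pairwise correlations between those series. The sketch below is a hypothetical illustration of that general technique, not ConnPipe’s actual code; the region count, time-series length, and threshold value are placeholder assumptions.

```python
import numpy as np

# Hypothetical sketch (not ConnPipe's implementation): build a functional
# connectivity network from resting-state fMRI time series. Each row of
# `bold` stands in for one brain region's preprocessed BOLD signal.
rng = np.random.default_rng(0)
n_regions, n_timepoints = 6, 200          # placeholder dimensions
bold = rng.standard_normal((n_regions, n_timepoints))

# Functional connectivity: pairwise Pearson correlation between regions
fc = np.corrcoef(bold)

# Zero the diagonal; self-connections carry no information
np.fill_diagonal(fc, 0.0)

# Threshold weak correlations to obtain a sparse network for analysis
# (0.3 is an arbitrary illustrative cutoff)
adjacency = (np.abs(fc) > 0.3).astype(int)

print(fc.shape)  # one connectivity value per region pair
```

Network measures (degree, modularity, characteristic subnetworks) would then be computed on `fc` or `adjacency`; the thresholding strategy shown here is one of several common choices.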
The collaboration among SCA, CfN, and IU School of Medicine researchers has significantly improved ConnPipe by migrating it from MATLAB to a Bash/Python implementation. This has made the pipeline faster and more efficient, more modular and adaptable to new techniques, and better able to utilize IU’s vast supercomputing resources.