# Neuroimaging Analysis at Scale

## Setup

[Instructions for setting up the workshop](https://docs.google.com/document/d/1HW6fZc54aiYRWtGQ-8dZhFMxnVm9Pt1A2DTEzI70Pzo/edit?usp=sharing)

## Useful Links

[workshop slides](https://docs.google.com/presentation/d/10y_padozF4URvHDzjwFFqhDFBmCLumz2Gfe_3K59Reg/edit?usp=sharing)

[workshop github repo](https://github.com/carpentries-incubator/SDC-BIDS-IntroMRI)

[gather town social](https://gather.town/r1eoLgJHSvMiK8gB/scinet) - password is "brains!"

### Getting help after the workshop

- [Neurostars](https://neurostars.org/)

### DICOM to NIfTI Converter

- [dcm2niix](https://github.com/rordenlab/dcm2niix)

### BIDS

- [BIDS Specification](https://bids-specification.readthedocs.io/en/stable/)
- [OHBM 2020 TrainTrack presentation](https://www.youtube.com/watch?v=8vRU-AgPfbY) - skip to 2:36:37
- [other BIDS presentations on OSF](https://osf.io/yn93h/)
- [BIDS validator](https://bids-standard.github.io/bids-validator/)
- [BIDS starter kit](https://github.com/bids-standard/bids-starter-kit/wiki/Tutorials)
- BIDS converters
  - [heudiconv](https://github.com/nipy/heudiconv)
  - [dcm2bids](https://github.com/cbedetti/Dcm2Bids)
  - [bidscoin](https://github.com/Donders-Institute/bidscoin)
  - [bidskit](https://github.com/jmtyszka/bidskit)

### Datalad

- [Handbook](http://handbook.datalad.org/en/latest/index.html)
- [Cheat Sheet](http://handbook.datalad.org/en/latest/basics/101-136-cheatsheet.html#cheat)

### Open Datasets

- [Google Dataset Search](https://datasetsearch.research.google.com/)
- [OpenNeuro](https://openneuro.org/)
- [Datalad Datasets](http://datasets.datalad.org/)

### Other Neuroimaging Tutorials

- [Neurohackademy Tutorials](https://neurohackademy.org/course_type/lectures/)
- [Learn Neuroimaging](https://learn-neuroimaging.github.io/tutorials-and-resources/)
- [The Princeton Handbook for Reproducible Neuroimaging](https://brainhack-princeton.github.io/handbook/)
- [DartBrains Neuroimaging Analysis Course](https://dartbrains.org/intro)
- [ReproNim Courses](https://courses.repronim.org/)
- [Neuroimaging Analysis Methods For Naturalistic Data](http://naturalistic-data.org)

## BIDS Apps

There are some special considerations for running BIDS Apps on SciNet:

1. Only use Singularity containers - [example instructions for creating an fMRIPrep container](https://fmriprep.org/en/stable/singularity.html#)
2. Manage resources so that all 40 CPUs per node are used maximally - split the subject list into portions and let Nipype-based apps handle parallelization
3. For jobs with heavy input/output, request access to the [burst buffer](https://docs.scinet.utoronto.ca/index.php/Burst_Buffer) and use it as a work directory
4. [Download templateflow templates](https://fmriprep.org/en/stable/faq.html#how-do-you-use-templateflow-in-the-absence-of-access-to-the-internet) before submitting your job, since SciNet's compute nodes don't have internet access and will not let you download them mid-job

Below is an example script:

```
#!/bin/bash
#SBATCH --job-name=fmriprep
#SBATCH --output=${SCRATCH}/%x_%j.out
#SBATCH --error=${SCRATCH}/%x_%j.err
#SBATCH --chdir=${SCRATCH}
#SBATCH --nodes=1
#SBATCH --cpus-per-task=40
#SBATCH --time=24:00:00

STUDY='BIDS_example'

export BIDS_DIR=${SCRATCH}/${STUDY}/data
# x00 is one chunk of the full subject list (e.g. as produced by `split`)
export SUBJECTS=`cat ${SCRATCH}/${STUDY}/code/x00`
export FREESURFER_LICENSE=${SCRATCH}/freesurfer/6.0.0/build/license.txt
export SING_CONTAINER=${SCRATCH}/containers/FMRIPREP/poldracklab_fmriprep_1.1.1-2018-06-07-2f08547a0732.img
export SINGULARITYENV_TEMPLATEFLOW_HOME=/home/fmriprep/.cache/templateflow
export OUTPUT_DIR=${SCRATCH}/${STUDY}/derivatives/baseline
export WORK_DIR=${BBUFFER}/${STUDY}/fmriprep
export TMP_DIR=${SCRATCH}/tmp

mkdir -vp ${TMP_DIR} ${OUTPUT_DIR} ${WORK_DIR}

singularity run --cleanenv \
  -H ${TMP_DIR} \
  -B ${BIDS_DIR}:/bids \
  -B ${OUTPUT_DIR}:/out \
  -B ${WORK_DIR}:/work \
  -B ${TMP_DIR}/templateflow:/home/fmriprep/.cache/templateflow \
  -B ${FREESURFER_LICENSE}:/li \
  ${SING_CONTAINER} \
  /bids /out \
  participant \
  --participant_label ${SUBJECTS} \
  -w /work \
  --fs-license-file /li \
  --low-mem \
  --omp-nthreads 4 \
  --output-space T1w template fsaverage \
  --use-aroma \
  --notrack
```

---

## Assignment

We've downloaded a few DICOM datasets to SciNet's teach cluster. They are located in `/scinet/course/ss2020/5_neuroimaging/data/`

You can symlink the DICOM data to your own folder using:

`ln -s /path/to/original /path/to/link`

eg. `ln -s /scinet/course/ss2020/5_neuroimaging/data/ixi ~/data/ixi`

If you need to access `dcm2niix`, it's available on SciNet via `module load dcm2niix/1.0.20200331`

The assignment is to convert one of these datasets to be BIDS-compatible. Please save your work in a text file or script for submission.

Once you have completed the conversion, you can check whether it's correct by running the BIDS Validator either:

1. in the [browser](https://bids-standard.github.io/bids-validator/)
2. via the command-line script `/scinet/course/ss2020/5_neuroimaging/run_bids_validator.sh <BIDS_FOLDER>`

Keep in mind, your BIDS conversion doesn't have to be completely accurate. We'll take a look at the submissions and guide everyone through any common missteps on Thursday.

### Bonus Section - use pybids and nibabel to understand more about the data

1. Use pybids to check whether the dataset is complete. How many subjects do you have? How many sessions? Are there fMRI tasks? Have all participants completed the whole neuroimaging run, or are some expected scans missing?
2. Use pybids to print the file path of the first subject's anatomical NIfTI file.
3. Use nibabel to get the voxel size of the file from question 2.

## HELP

**Post your questions here**