# UOW Hippocampus Workshop
## Table of Contents
[TOC]
## How to log in to the CVL
---
1. Go to https://desktop.cvl.org.au/ (Chrome is recommended)
1. Select M3 and CVL @MASSIVE
1. Log in with your institution credentials
1. Pick Light compute and project **sk75**, then click **LAUNCH**
1. Then, click **SHOW DESKTOP**
1. On the top menu, click the black **Terminal** button
1. You can also access various neuroimaging software packages from the drop-down menus
## Create a MASSIVE account
https://www.cvl.org.au/cvl-desktop/cvl-accounts
## Course code
---
:::info
These code instructions are case sensitive
:::
> First, let's check our username
```gherkin=
echo ${USER}
```
> Next, let's try some simple commands.
```gherkin=
pwd                     # print the current working directory
ls                      # list the files here
cd /home/${USER}/sk75   # move into the workshop project folder
ls -la                  # long listing, including hidden files
```
> Now let's load FreeSurfer and view a processed subject...
```gherkin=
module load freesurfer/6.0
# export so that programs we launch (like freeview) can see SUBJECTS_DIR
export SUBJECTS_DIR=/home/${USER}/sk75/hippocampus_segmentation/SUBJECTS_DIR
vglrun freeview -recon tshaw_3T &
```
> Now, close Freeview and let's explore the module system for loading software...
```gherkin=
module purge                       # unload all modules
vglrun freeview -recon tshaw_3T &  # this now fails: freeview is no longer on the PATH
module avail                       # list all available software modules
module load freesurfer/6.0
export SUBJECTS_DIR=/home/${USER}/sk75/hippocampus_segmentation/SUBJECTS_DIR
vglrun freeview -recon tshaw_3T &  # and now it works again
```
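A few more handy `module` commands before we move on:
```gherkin=
module list                   # show currently loaded modules
module avail freesurfer       # search for a specific package
module unload freesurfer/6.0  # unload a single module
```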
# Hippocampus Segmentation
Let's start by loading the modules we will be using:
```gherkin=
module load ants/2.3.1 ashs/1.0.0
```
Now let's test that everything is working:
```gherkin=
$ASHS_ROOT/bin/ashs_main.sh -h   # the ashs module sets ${ASHS_ROOT}
antsRegistration                 # check that the ANTs tools are on the PATH
```
Now we can start by inspecting our data.
```gherkin=
module load itksnap/3.8.0-beta
cd /home/${USER}/sk75/hippocampus_segmentation
vglrun itksnap atlas_creation/rawdata/participant_ses-01_7T_T2w_LinMoCo_res-iso.3_N4corrected_denoised_brain_preproc.nii.gz &
vglrun itksnap atlas_creation/rawdata/participant_ses-01_7T_T1w_N4corrected_norm_preproc.nii.gz &
```
### ANTs Multivariate Template Construction
One of the first steps is preprocessing.
One way of reducing movement artefacts within the same participant is to create a template.
We will now build a quick multivariate template to create priors for a segmentation strategy.
```gherkin=
cd atlas_creation/
mkdir ${USER}_template
cd ${USER}_template
# -d 3: dimensionality; -i 3: template-building iterations; -k 2: number of modalities (T1w and T2w)
# -f/-s/-q: shrink factors, smoothing, and iterations per resolution level; -t SyN: transform model
# -r 1: rigidly pre-align the inputs; -n 0: skip N4 bias correction; -g 0.20: gradient step size
# -m CC: cross-correlation metric; -c 0: run serially; -o: output prefix
antsMultivariateTemplateConstruction2.sh -d 3 -i 3 -k 2 -f 4x2x1 -s 2x1x0vox -q 30x20x4 -t SyN -r 1 -n 0 -g 0.20 -m CC -c 0 -o ${USER}_template /home/${USER}/sk75/hippocampus_segmentation/atlas_creation/rawdata/participant_names.csv
```
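Once it finishes, inspect the resulting templates. The file names below assume the ANTs script's usual `<prefix>template<k>.nii.gz` naming convention; with `-k 2` there should be one template per modality:
```gherkin=
ls ${USER}_template*          # the templates are written next to the -o prefix
vglrun itksnap ${USER}_templatetemplate0.nii.gz &
```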
### ASHS
```gherkin=
cd ~/sk75/hippocampus_segmentation/ASHS/
mkdir ${USER}_ASHS
export ASHS_ROOT=/home/${USER}/sk75/hippocampus_segmentation/ASHS/ashs-fastashs_beta
ashs_main.sh -h
# -g: T1w image, -f: T2w image, -a: atlas directory, -w: working/output directory, -I: subject ID
$ASHS_ROOT/bin/ashs_main.sh -g ../rawdata/participant_ses-01_7T_T1w_N4corrected_norm_preproc.nii.gz -f ../rawdata/participant_ses-01_7T_T2w_LinMoCo_res-iso.3_N4corrected_denoised_brain_preproc.nii.gz -a ../ASHS_atlasses/ashs_atlas_umcutrecht_7t_20170810 -w ./${USER}_ASHS -I ${USER}_
```
### Freesurfer hippocampus subfields
```gherkin=
module load freesurfer/devel-20190128
segmentHA_T1.sh tshaw_3T ~/sk75/hippocampus_segmentation/SUBJECTS_DIR
# usage: segmentHA_T2.sh <subject> <additional scan> <analysisID> <useT1: 0 or 1> [SUBJECTS_DIR]
segmentHA_T2.sh tshaw_3T ~/sk75/hippocampus_segmentation/rawdata/participant_ses-01_7T_T2w_LinMoCo_res-iso.3_N4corrected_denoised_brain_preproc.nii.gz T2 1 ~/sk75/hippocampus_segmentation/SUBJECTS_DIR/
```
### Longitudinal
```gherkin=
segmentHA_T1_long.sh <baseID> [SUBJECTS_DIR]
```
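For example, assuming a longitudinal base already built with `recon-all -base` (the base name `tshaw_base` below is hypothetical):
```gherkin=
# hypothetical base subject; requires the longitudinal recon-all stream to have been run first
segmentHA_T1_long.sh tshaw_base ~/sk75/hippocampus_segmentation/SUBJECTS_DIR
```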
However, I was unconvinced by this result...
### LASHiS
```gherkin=
module load ants/2.2.0
cd ~/sk75/hippocampus_segmentation/
./scripts/LASHiS/LASHiS.sh   # print the usage message
# LASHiS expects one T1w/T2w pair per timepoint; the same session is passed twice here as a stand-in
./scripts/LASHiS/LASHiS.sh -a ~/sk75/hippocampus_segmentation/ASHS_atlasses/ashs_atlas_umcutrecht_7t_20170810/ -o ./LASHiS/${USER}_LASHiS ~/sk75/hippocampus_segmentation/rawdata/participant_ses-01_7T_T1w_N4corrected_norm_preproc.nii.gz ~/sk75/hippocampus_segmentation/rawdata/participant_ses-01_7T_T2w_LinMoCo_res-iso.3_N4corrected_denoised_brain_preproc.nii.gz \
~/sk75/hippocampus_segmentation/rawdata/participant_ses-01_7T_T1w_N4corrected_norm_preproc.nii.gz ~/sk75/hippocampus_segmentation/rawdata/participant_ses-01_7T_T2w_LinMoCo_res-iso.3_N4corrected_denoised_brain_preproc.nii.gz
```
### Template-based hippocampus subfield segmentation using Diet LASHiS
```gherkin=
./scripts/LASHiS/LASHiS.sh -a ~/sk75/hippocampus_segmentation/ASHS_atlasses/ashs_atlas_umcutrecht_7t_20170810/ \
-f 1 \
-o ./LASHiS/${USER}_LASHiS \
~/sk75/hippocampus_segmentation/rawdata/participant_ses-01_7T_T1w_N4corrected_norm_preproc.nii.gz ~/sk75/hippocampus_segmentation/rawdata/participant_ses-01_7T_T2w_LinMoCo_res-iso.3_N4corrected_denoised_brain_preproc.nii.gz \
~/sk75/hippocampus_segmentation/rawdata/participant_ses-01_7T_T1w_N4corrected_norm_preproc.nii.gz ~/sk75/hippocampus_segmentation/rawdata/participant_ses-01_7T_T2w_LinMoCo_res-iso.3_N4corrected_denoised_brain_preproc.nii.gz
```
# Freesurfer Code
> Now, let's set up for running the analysis... starting by making our own copies of the scripts we will use.
```gherkin=
cd /projects/sk75/scripts
cp recon-all_example_script_3T.sh recon-all_example_script_3T_${USER}.sh
cp recon-all_example_script_7T.sh recon-all_example_script_7T_${USER}.sh
```
> List the scripts and check that our copies are there
```gherkin=
ls -lthr   # long format, human-readable sizes, sorted by time with newest last
```
> To whom does this script belong?
```gherkin=
ls -la /home/${USER}/sk75/scripts/recon-all_example_script_3T_${USER}.sh
```
> Let's change the permissions...
```gherkin=
chmod 744 /home/${USER}/sk75/scripts/recon-all_example_script_3T_${USER}.sh
ls -la
```
:::info
https://chmod-calculator.com/
:::
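Each octal digit is the sum of read (4), write (2), and execute (1) bits, for owner, group, and others respectively:
```gherkin=
# chmod 744 = rwxr--r--
#   7 (owner)  = 4 (read) + 2 (write) + 1 (execute)
#   4 (group)  = 4 (read)
#   4 (others) = 4 (read)
```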
> Looks good! Let's see what is inside...
```gherkin=
emacs /home/${USER}/sk75/scripts/recon-all_example_script_3T_${USER}.sh &
```
The "&" at the end tells the shell to run emacs in the background. If you forget to include the & you can hit ctrl + z and then type "bg" to background the program.
> Here is our code:
```gherkin=
#!/bin/bash
#SBATCH --job-name=3T_Freesurfer_test
# To set a project account for credit charging,
#SBATCH --account=sk75
# Request CPU resource for a serial job
#SBATCH --ntasks=1
#SBATCH --ntasks-per-node=1
#SBATCH --cpus-per-task=8
# Memory usage (MB)
#SBATCH --mem-per-cpu=8000
# Set your minimum acceptable walltime, format: day-hours:minutes:seconds
#SBATCH --time=0-48:00:00
# Set the file for output (stdout); %u expands to your username and %j to the job ID
#SBATCH --output=/home/%u/sk75/logs/3T_Freesurfer-test-%u%j.out
# Set the file for error log (stderr)
#SBATCH --error=/home/%u/sk75/logs/3T_Freesurfer-test-%u%j.err
# Command to run a serial job
# you will need to load freesurfer (don't worry too much about what this means)
module purge
module load freesurfer/6.0
#change directory so the logs know where to go
PROJECT_FOLDER="sk75"
cd /home/${USER}/${PROJECT_FOLDER}/logs/
# set up some things
#freesurfer set up
source ${FREESURFER_HOME}/SetUpFreeSurfer.sh
#subjects directory set up
SUBJECTS_DIR="/scratch/${PROJECT_FOLDER}/SUBJECTS_DIR"
if [[ ! -e ${SUBJECTS_DIR} ]] ; then
mkdir ${SUBJECTS_DIR} ;
fi
#run freesurfer recon-all
#this is the 3T data
input_file="/home/${USER}/${PROJECT_FOLDER}/rawdata/T1_3T_anon/S01_TS.MR.BRAIN_-_UQ_FMRI.0010.0002.2014.03.12.14.26.21.968750.17558629.IMA.dcm"
echo "recon-all -s ${USER}_3T -i ${input_file} -all -3T -no-isrunning"
recon-all -s ${USER}_3T -i ${input_file} -all -3T -no-isrunning
if grep -q "without error" ${SUBJECTS_DIR}/${USER}_3T/scripts/recon-all-status.log
then echo "Freesurfer ran without any errors!"
else echo "Freesurfer ran into an error, check the output in ${SUBJECTS_DIR}/${USER}_3T/scripts/ "
fi
```
:::info
The actual script is slightly different from what I have shown you here. We will go through why I changed it.
:::
### Running recon-all in the terminal
```gherkin=
module load freesurfer/6.0
export SUBJECTS_DIR=/home/${USER}/sk75/hippocampus_segmentation/SUBJECTS_DIR
PROJECT_FOLDER="sk75"
recon-all        # with no arguments, prints the usage message
recon-all -help
# the glob matches the single T1w file; it expands when the variable is used unquoted
input_file="/home/${USER}/${PROJECT_FOLDER}/hippocampus_segmentation/rawdata/participant*T1*nii.gz"
# -wsless adjusts the skull-strip watershed threshold; -cm conforms to the minimum voxel size
recon-all -s ${USER}_hippo_seg_workshop -i ${input_file} -all -wsless -cm
```
:::info
Once we have run the job from the command line for a while, we can kill it with ctrl + c.
It is then time to submit the job on the cluster!
:::
> One option is to delete the folder we just created from `$SUBJECTS_DIR`:
```gherkin=
rm -rf ${SUBJECTS_DIR}/${USER}_3T
```
:::info
***IMPORTANT***: DO NOT USE `rm -rf` unless you know what you are doing. The shell always does EXACTLY what you ask it to do.
:::
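A good habit before any `rm -rf` is to list exactly what you are about to remove:
```gherkin=
ls -la ${SUBJECTS_DIR}/${USER}_3T   # confirm the target first
```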
> The safer alternative is to add the `-no-isrunning` flag to the end of our recon-all command:
```gherkin=
recon-all -s ${USER}_3T -all -3T -no-isrunning
```
:::info
**Don't forget to save your file if you've changed it!**
:::
:::info
Make sure you remove -no-isrunning if this is the first time you are running the analysis. You'll also need the input flag (-i) with the path to the NIfTI or DICOM series (as above).
:::
> Time for the big moment!
### Submitting a job on the cluster
We need to tell SLURM to submit our script to the cluster with the sbatch command.
```gherkin=
sbatch /home/${USER}/sk75/scripts/recon-all_example_script_3T_${USER}.sh
```
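`sbatch` replies with a line like `Submitted batch job <JOBID>`; note the job ID, as you can use it later with `scontrol show job <JOBID>` or `scancel <JOBID>`.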
> and now we wait...
```gherkin=
squeue -u ${USER}
squeue -u ${USER}
squeue -u ${USER}
# ...ad infinitum. Or follow the log as it is written
# (the log name ends in your username followed by the job ID):
tail -f /home/${USER}/sk75/logs/3T_Freesurfer-test-${USER}<JOBID>.out
```
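Instead of re-running `squeue` by hand, you can let `watch` poll it for you:
```gherkin=
watch -n 10 squeue -u ${USER}   # refresh every 10 seconds; ctrl + c to stop
```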
:::info
*Time to check our output...*
:::
> Replace tshaw with your username (or ${USER}) if you have already run the analysis yourself...
```gherkin=
tree ${SUBJECTS_DIR}/tshaw_3T
cat ${SUBJECTS_DIR}/tshaw_3T/scripts/recon-all-status.log
vglrun freeview -recon tshaw_3T &
```
## References and useful links
---
:::info
MASSIVE M3 Documentation: https://docs.massive.org.au/M3/m3users.html
https://demo.codimd.org/p/SJlh1PQAV#/
:::
Some useful websites:
* How to transfer files to MASSIVE:
https://docs.massive.org.au/M3/transferring-files.html
* N4 bias Correction
https://www.ncbi.nlm.nih.gov/pubmed/20378467
* Some useful resources for simple bash coding
http://swcarpentry.github.io/shell-novice
https://devhints.io/bash
* The most useful resource for all issues related to Freesurfer:
https://mail.nmr.mgh.harvard.edu/pipermail/freesurfer/
* Collaborative editing!
http://brainbox.pasteur.fr/
* BIDS data structure
https://bids.neuroimaging.io/
* BIDSCoin for easy conversion of your DICOMs into BIDS format!
https://github.com/Donders-Institute/bidscoin
### SLURM references
---
> https://slurm.schedmd.com/pdfs/summary.pdf
> https://slurm.schedmd.com/sbatch.html
- Some useful SLURM commands:
```gherkin=
squeue                         # list queued and running jobs
sbatch <slurm_script_file>     # submit a batch script
scontrol show job <JOBID>      # detailed information about one job
scancel <JOBID>                # cancel a job
sinfo -i 5 -S"-O" -o "%.9n %.6t %.10e/%m %.10O %.15C"   # node states, refreshed every 5 s
squeue -o"%.7i %9P %.8j %.8u %.2t %.10M %.6D %C"        # custom-format queue listing
```
- MASSIVE user scripts:
```gherkin=
show_job
show_job <JOBID>
show_cluster
user_info
```
- SLURM sample scripts are here:
```
/usr/local/hpcusr/latest/training/samples/slurm/
```
- We recommend using smux to compile and test code on compute nodes.
- How to use smux: https://docs.massive.org.au/M3/slurm/interactive-jobs.html
For more details, please see:
https://docs.massive.org.au/M3/slurm/slurm-overview.html
### Copying files to MASSIVE
```
# -a archive mode, -u skip files that are newer on the destination, -v verbose, -e ssh: transfer over SSH
rsync -auv -e ssh adirectory username@m3-dtn.massive.org.au:~/destinationdirectory/
```
## Shameless plug for Tom's research
>MND patient hippocampus volume assessment:
![](https://i.imgur.com/8hpJN1D.jpg)
> Longitudinal Automatic Segmentation of Hippocampus subfields using multi-contrast MRI
![](https://i.imgur.com/C6Brftv.jpg)
:::info
https://github.com/thomshaw92/LASHiS
:::
## Want to make a 3D brain?
```gherkin=
for hemi in lh rh ; do   # the ?h glob cannot name the .stl outputs, so loop per hemisphere
  mris_convert ${SUBJECTS_DIR}/${subjName}/surf/${hemi}.pial \
    ${SUBJECTS_DIR}/${subjName}/surf/${hemi}.pial.stl
done
```
These .stl files can then be used by 3D printers.
## FAQ and Feedback
:::info
**Need any help with a particular part?** Leave a comment!
:::
###### tags: `Hippocampus Subfields` `Data analysis` `shape analysis` `CVL`