# Week 1

You can access Zoom meetings [here.](https://ukri.zoom.us/j/99401085708) (All Zoom meetings will have the same link and password, unless you are notified otherwise.) Click [here](https://hackmd.io/@SIRF-CIL-Fully3D/r1AxJKNou) to return to the main note.

## Schedule

### Mon 28th of June

**14:00 – 16:30 GMT + 01:00**

[Local time and calendar invite](https://dateful.com/eventlink/2809051189)

14:00 – 14:55 Joint session
1. [Introduction to the course](https://www.ccpsynerbi.ac.uk/sites/www.ccppetmr.ac.uk/files/SIRFCIL%20CourseIntro%20.pdf)
2. [Overview of topics for the week](https://www.ccpsynerbi.ac.uk/sites/www.ccppetmr.ac.uk/files/Week1_Monday_Overview_CKolbitsch.pdf)
3. [Demonstration of tools used for the course](https://www.ccpsynerbi.ac.uk/sites/www.ccppetmr.ac.uk/files/ToolsDemonstration_EdoAnder.pdf)

5 min break

15:00 – 16:30 Sequential sessions for different modalities: introduction of software concepts and discussion of relevant notebooks
1. 15:00 – 15:30 [MR](https://www.ccpsynerbi.ac.uk/sites/www.ccppetmr.ac.uk/files/Week1_Monday_MR_CKolbitsch.pdf)
2. 15:30 – 16:00 [PET](https://www.ccpsynerbi.ac.uk/sites/www.ccppetmr.ac.uk/files/PET_training_week1.pdf)
3. 16:00 – 16:30 [CT](https://www.ccpsynerbi.ac.uk/sites/www.ccppetmr.ac.uk/files/CIL_training_week1.pdf)

**To access the [recording](https://ukri.zoom.us/rec/share/pN5elCHNwCzym-0Ct_r0SSJhs3zAe-RKbhQ3cXXhEBTRvFI6gl3V8j34cHS3eJ0Y.Z-bt8sBWXeNNvsZ6) of the Zoom meeting, use the passcode: vq=H@9qU**

---

### Tue 29th of June

**14:00 – 15:00 GMT + 01:00**

Technical support session to get started with the tools for the course.

https://ukri.zoom.us/j/99401085708 (Passcode: 1957351)

---

### Wed 30th of June

**14:00 – 16:00 GMT + 01:00**

[Local time and calendar invite](https://dateful.com/eventlink/2132730247)

(Partially overlapping) sessions for support
1. 14:00 – 15:00 PET
2. 14:30 – 15:30 MR
3. 15:00 – 16:00 CT

**To access the [recording](https://ukri.zoom.us/rec/share/c6WGQJRl6P3kp2uoRGmGLyZfhy8NgB8MJVsltzBl57bvLpv9UH8wnEZdeqxGkhom.ikNlVgVrzbhKjRQG) of the Zoom meeting, use the passcode: ?K5Fk2=9**

---

### Thurs 1st of July

**09:00 – 10:00 GMT + 01:00**

Dual aim:
- Technical support session to get started with the tools for the course.
- Three participant-only break-out groups (self-selected), allowing interaction and helping each other.

https://ukri.zoom.us/j/99401085708 (Passcode: 1957351)

---

### Fri 2nd of July

**14:00 – 15:30 GMT + 01:00**

[Local time and calendar invite](https://dateful.com/eventlink/3149332795)

Joint session

**Please fill in [our poll for the week before or during the session](https://www.menti.com/dufbsy7sbf)**.

1. Summary of main learning objectives
2. Example solutions
3. Overview of next week
4. Demonstration of the Gather.Town site ([more info](https://hackmd.io/@SIRF-CIL-Fully3D/SyZHD-sh_))

Some of us plan to be in the Gather.Town site for a chat after the meeting. Feel free to join!

**To access the [recording](https://ukri.zoom.us/rec/share/t9f4W49mu6lGLSmP7AXbDvAv_8FrWcPQRndQQ1j5E2ZvaHIsvee5BFRKP0Jdd3oW.2yVnRu5uBdQZmvJm) of the Zoom meeting, use the passcode: tFg*4=s**

## Supporting Material

Below is a set of links to recorded lectures and reading material which we hope will be useful for course participants. Please note that we will not cover this material ourselves (except very briefly).

### Software and programming

- If you are new to Python and the Unix terminal, please check the excellent material at https://software-carpentry.org/lessons/. There are some additional links in the [Appendix of our starting guide for participants](https://github.com/SyneRBI/SIRF-Exercises/blob/master/DocForParticipants.md#appendix).
- Ben Thomas gave a lecture on Object Oriented Programming for the CCP in 2017.
The associated [notebook](https://github.com/SyneRBI/SIRF-Exercises/blob/master/notebooks/Introductory/object_oriented_programming.ipynb) has links to the video recording etc.
- [SIRF/CIL Docker on Mac video](https://mediacentral.ucl.ac.uk/Player/C1eAhHAi) and [PowerPoint](https://mediacentral.ucl.ac.uk/assoc_files/C1eAhHAi_0.pptx?token=16abhFB8)
- The SIRF paper: Ovtchinnikov, Evgueni, Richard Brown, Christoph Kolbitsch, Edoardo Pasca, Casper da Costa-Luis, Ashley G. Gillman, Benjamin A. Thomas, et al. ‘SIRF: Synergistic Image Reconstruction Framework’. Computer Physics Communications 249 (1 April 2020): 107087. https://doi.org/10.1016/j.cpc.2019.107087. Link for the [accepted version](https://discovery.ucl.ac.uk/id/eprint/10087933/)
- The "SIRF and motion correction" paper: Brown, Richard, Christoph Kolbitsch, Claire Delplancke, Evangelos Papoutsellis, Johannes Mayer, Evgueni Ovtchinnikov, Edoardo Pasca, et al. ‘Motion Estimation and Correction for Simultaneous PET/MR Using SIRF and CIL’. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 379, no. 2204 (23 August 2021): 20200208. https://doi.org/10.1098/rsta.2020.0208.
- [CIL paper 1](https://arxiv.org/abs/2102.04560)
- [CIL paper 2](https://arxiv.org/abs/2102.06126)

### MR

- [Basics of MRI and MR image reconstruction](https://www.youtube.com/watch?v=xCv38thzljw) **Gastao Cruz**. Overview of the basics of MRI, from nuclear spins and magnetic fields to the sampling requirements for k-space. Advanced topics such as reconstruction from undersampled data (GRAPPA, SENSE) are also discussed.
- [MR image reconstruction using SIRF](https://www.youtube.com/watch?v=vC66LgPfRNM) **Christoph Kolbitsch**. MR image reconstruction (k-space sampling, coil sensitivity maps, GRAPPA) and how it can be done in SIRF.
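At its core, the fully sampled Cartesian MR reconstruction covered in these lectures is an inverse Fourier transform of k-space. A minimal NumPy sketch of that idea (illustrative only; the SIRF notebooks work with real multi-coil scanner data, which this toy single-channel example does not attempt to model):

```python
import numpy as np

# A simple "image": a bright square on a dark background
image = np.zeros((64, 64))
image[24:40, 24:40] = 1.0

# Simulate fully sampled Cartesian k-space: 2D FFT, with the
# zero-frequency component shifted to the centre of k-space
kspace = np.fft.fftshift(np.fft.fft2(image))

# Reconstruction: undo the shift, inverse FFT, take the magnitude
recon = np.abs(np.fft.ifft2(np.fft.ifftshift(kspace)))

# For fully sampled data this round trip recovers the image
print(np.allclose(recon, image))  # → True
```

With undersampled k-space (the GRAPPA/SENSE topic in the lectures) this simple inverse FFT would instead produce aliasing artefacts.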
### CT

- [CT basics](https://youtu.be/PI6vgzg5l7E) Elizabeth Edney
- [Physics of Computed Tomography](https://youtu.be/-AfF3O1_duw) Daniel Wessell

### PET

- [PET acquisition, backprojection, sinograms](https://youtu.be/3BC0bnWobLs) Andrew Reader
- [Time of Flight PET](https://kuleuven.mediaspace.kaltura.com/media/Lecture+on+time-of-flight+in+positron+emission+tomography+%28Prof.+Johan+Nuyts%29/1_zqpnc5gw) Johan Nuyts
- [PET acquisition modelling](https://liveuclac-my.sharepoint.com/:v:/g/personal/rmhathi_ucl_ac_uk/ERyfdeFKaptFiVv2hbJt89ABXSU2847gnOTgIGYFVEMATA?e=RedTba), including normalisation, scatter etc. Kris Thielemans

## Notebooks

These are listed in recommended order. Please also check the `README.md` in each folder for more information.

### General

* [object_oriented_programming.ipynb](https://github.com/SyneRBI/SIRF-Exercises/blob/master/notebooks/Introductory/object_oriented_programming.ipynb) (optional)
* [introduction.ipynb](https://github.com/SyneRBI/SIRF-Exercises/blob/master/notebooks/Introductory/introduction.ipynb)
* [acquisition_model_mr_pet_ct.ipynb](https://github.com/SyneRBI/SIRF-Exercises/blob/master/notebooks/Introductory/acquisition_model_mr_pet_ct.ipynb)
* [image geometry notebooks](https://github.com/SyneRBI/SIRF-Exercises/tree/master/notebooks/Geometry#readme) (optional, but recommended for SIRF and medical imaging)
* [registration notebook](https://github.com/SyneRBI/SIRF-Exercises/tree/master/notebooks/Reg) (optional, only if you are interested)

### MR

* [a_fully_sampled.ipynb](https://github.com/SyneRBI/SIRF-Exercises/blob/master/notebooks/MR/a_fully_sampled.ipynb)
* [b_kspace_filter.ipynb](https://github.com/SyneRBI/SIRF-Exercises/blob/master/notebooks/MR/b_kspace_filter.ipynb)
* [c_coil_combination.ipynb](https://github.com/SyneRBI/SIRF-Exercises/blob/master/notebooks/MR/c_coil_combination.ipynb)

### CT

* [00_CIL_geometry.ipynb](https://github.com/TomographicImaging/CIL-Demos/blob/main/training/2021_Fully3D/Week1/00_CIL_geometry.ipynb) (optional, for reference)
* [01_intro_walnut_conebeam.ipynb](https://github.com/TomographicImaging/CIL-Demos/blob/main/training/2021_Fully3D/Week1/01_intro_walnut_conebeam.ipynb)
* [02_intro_sandstone_parallel_roi.ipynb](https://github.com/TomographicImaging/CIL-Demos/blob/main/training/2021_Fully3D/Week1/02_intro_sandstone_parallel_roi.ipynb)
* [03_preprocessing.ipynb](https://github.com/TomographicImaging/CIL-Demos/blob/main/training/2021_Fully3D/Week1/03_preprocessing.ipynb)
* [04_FBP_CGLS_SIRT.ipynb](https://github.com/TomographicImaging/CIL-Demos/blob/main/training/2021_Fully3D/Week1/04_FBP_CGLS_SIRT.ipynb)
* [additional_exercises_data_resources.ipynb](https://github.com/TomographicImaging/CIL-Demos/blob/main/training/2021_Fully3D/Week1/additional_exercises_data_resources.ipynb)

Additional datasets have been pre-downloaded and are available on the STFC cloud at `/mnt/materials/SIRF/Fully3D/CIL`.

If you are **not** running on the STFC cloud, you will need to download the data; see the [CIL README](https://github.com/TomographicImaging/CIL-Demos/blob/c2bae4649b51ecc3f3d0bb77eacc9faeeb5f7503/training/2021_Fully3D/Week1/README.md).

### PET

See also the [PET README](https://github.com/SyneRBI/SIRF-Exercises/tree/master/notebooks/PET#week-1).

* [display_and_projection.ipynb](https://github.com/SyneRBI/SIRF-Exercises/blob/master/notebooks/PET/display_and_projection.ipynb)
* [image_creation_and_simulation.ipynb](https://github.com/SyneRBI/SIRF-Exercises/blob/master/notebooks/PET/image_creation_and_simulation.ipynb)
* [OSEM_reconstruction.ipynb](https://github.com/SyneRBI/SIRF-Exercises/blob/master/notebooks/PET/OSEM_reconstruction.ipynb)
* [reconstruct_measured_data.ipynb](https://github.com/SyneRBI/SIRF-Exercises/blob/master/notebooks/PET/reconstruct_measured_data.ipynb)

## Frequently Asked Questions

Here we will collect questions that are common to
many participants. Please only edit this section if you are part of the organisational team; other users, please use the section further below.

### Q: Geometry B resampling

For reading NIfTI files with sirf.STIR, I got an error when running the line:

```
S_cor = Reg.ImageData('/home/sirfuser/devel/SIRF-Exercises/data/working_folder/Geometry/nifti/STIRnii_Cor_14_1.nii')
Error: Attempting to open a file that is not a NIFTI image.
```

My data folder in SIRF-Exercises doesn't have that file/directory, so I changed the file path to `/home/$username$/SIRF-Exercises/working_dir/Geometry/nifti/OBJECT_phantom_T2W_TSE_Cor_14_1.nii`, and the coronal image read via sirf.STIR now looks the same as the original coronal output. Is this correct?

**A:** (Kris) Apologies. This line and a few others below it were inserted into the notebook by accident (by myself). I have now fixed this on GitHub, but you can simply ignore that line. For your reference, you can see the fixed notebook at https://github.com/SyneRBI/SIRF-Exercises/blob/master/notebooks/Geometry/b_geom_resamp.ipynb

### Q: I am running into errors in PET/reconstruct_measured_data

Errors like:

```
sed: can't read norm.v.hdr: No such file or directory
```

and errors when running this line:

```
asm_norm = AcquisitionSensitivityModel(norm_file)
```

**A:** (Ash) This is a bug I introduced at the last minute; participants who have updated their SIRF-Exercises or JupyterHub instance will experience it.
Please find these lines:

```
!sed -i.bak2 -e "s#\(!name of data file:=\)#\\1${data_path}/#" umap.v.hdr
!sed -i.bak2 -e "s#\(!name of data file:=\)#\\1${data_path}/#" norm.v.hdr
```

and replace them with:

```
!sed -i.bak2 -e "s#\(!name of data file:=\)#\\1{data_path}/#" umap.v.hdr
!sed -i.bak2 -e "s#\(!name of data file:=\)#\\1{data_path}/#" norm.n.hdr
```

(Use copy-paste; there are a few subtle changes: `${data_path}` becomes `{data_path}`, and `norm.v.hdr` becomes `norm.n.hdr`.)

### PET/reconstruct_measured_data reconstructions do not take randoms and scatter into account

We've discovered a bug in SIRF 3.1.0 which means that, in the current exercises, the reconstructions with randoms and/or scatter are actually identical to the ones without. For more information and a work-around, see https://github.com/SyneRBI/SIRF-Exercises/issues/146

### PET/OSEM_reconstruction.ipynb: what is the role of `cmax`?

The `cmax` variable is used to display all images on the same colour scale. There is no deep reason why it is set to `image.max()*.6` but then actually multiplied by `1.2`. It is also used to construct the initial image, which is filled with `cmax/4`. This is again largely arbitrary: we simply want to give the reconstruction an initial image that is roughly on the correct scale, to speed up convergence a bit. Feel free to experiment with other values. (There will be a bit more on initialisation in Week 2.)

## User questions

In this section anybody can ask a question. Please **mention the relevant notebook** in your question to make it easier for us to answer; it will also help others to check previous questions/answers for a specific notebook they are working on. If your question does not fit a specific notebook, please ask it under *General questions*. It is also advisable to state **which set-up you're using (cloud/VM/docker)**.

## General questions

### Your question

### Q: In CIL, can you define helical geometry?
**A:** Currently CIL can only handle circular trajectories (with offsets and tilts), but this will be expanded to per-projection set-ups (including helical) in the near future.

### Q: What image formats are supported by CIL? Is there any common image format for the three platforms (PET, MRI and CT)?

Take a look at [this](https://hackmd.io/kZx9QmOSRUWQEUtGprZujw#Q-What-image-formats-are-supported-by-CIL-Is-there-any-common-image-format-for-the-three-platforms-PET-MRI-and-CT) answer.

## Questions about the notebooks

Please insert the name of the notebook in the title.

## Question about CIL-geometry notebook

I have installed CIL as a conda package on Windows 10. When I tried to run the CIL_geometry notebook I got the following error: cannot import name `show_geometry`. Please, could you help me solve this error?

**A:** Can you confirm the import statement that was used? E.g. `from cil .... import ....` Also, please can you check which version of CIL you installed? You can do this by activating your conda environment and typing `conda list cil`.

If you installed CIL via conda as suggested [here](https://hackmd.io/kZx9QmOSRUWQEUtGprZujw#Q-Is-there-a-way-to-install-CIL-with-GPU-support-on-Windows-10-For-actual-use-after-the-course), you will have created a new environment named `cil` that you have to activate before use: `conda activate cil`.

Could you please try the following from the command line:

```
python -c 'from cil.version import version; print (version)'
```

This should print `v21.2.0` or similar.

## Q: I installed CIL for Windows with the conda package but `show_geometry` does not exist!

The installed CIL version is 21.0.0. I have modified the Intro_walnut_conebeam notebook to use the TIGRE reconstruction engine. The reconstruction runs, but the `show_geometry` method doesn't work. Any suggestions?

**A:** (Gemma) `show_geometry()` was added in v21.1.0. I have now built v21.2.0 for Windows.
Please can you update your version of CIL by recreating the environment as instructed [here](https://hackmd.io/kZx9QmOSRUWQEUtGprZujw#Q-Is-there-a-way-to-install-CIL-with-GPU-support-on-Windows-10-For-actual-use-after-the-course). You should then see CIL v21.2.0 and be able to use `show_geometry()`.

## Q: CIL Additional Exercises Data Resources (Cloud)

I loaded in the egg1 data for reconstruction (like the walnut task), but my reconstructed images look weird. I think it's because my egg1 dataset only has 11 angle slices? The data is in a (11, 1024, 1024) array. Egg2 looks good (1001, 1024, 1024). See my egg1 (top) and egg2 (bottom) reconstructions below:

![](https://i.imgur.com/GnLaYFA.png)
![](https://i.imgur.com/A44dtqX.png)

**A:** **I loaded in the wrong .txrm file.** I loaded gruppe 1_tomo-A_Drift.txrm instead of gruppe 1_tomo-A.txrm. The correct egg1 reconstruction is below:

![](https://i.imgur.com/9ncVXAY.png)

**A:** (Jakob) Good to see you solved it. Indeed, **the drift file contains separate additional information**. I hadn't tried loading and reconstructing the drift data before, so it was interesting to see that :)

Follow-up Q: Is there a way to see the file directory of the extra data on the JupyterHub cloud? I'm trying to load in the crystals data set, but when using `ls` in the terminal I can't find the files? Thanks :)

**A:** (Jakob) You are right, the crystals data is not there. I thought we had added it, but will make sure it is sorted. You can open a terminal from the Jupyter server, then type `bash`, navigate with `cd` to the directory, and use `ls -l` to view file names, sizes etc.

The crystals folder is empty... Thanks! Forgot about `-l`. Will try some of the other data sets then :)

**A:** (Jakob) Great! Let us know how you get on! And feel free to demonstrate in the session this afternoon ;) PS the sophiabeads data is Nikon data like the crystals data, so if you get sophiabeads working, then crystals "should" work too...
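If you prefer to stay inside Python rather than a terminal, a rough equivalent of `ls -l` for checking file names and sizes is easy to write (the `list_files` helper and the example path are just illustrations, not part of the exercises):

```python
import os

def list_files(path="."):
    """Print the type, size (bytes) and name of each entry, similar to `ls -l`."""
    for entry in sorted(os.scandir(path), key=lambda e: e.name):
        kind = "dir " if entry.is_dir() else "file"
        print(f"{kind} {entry.stat().st_size:>12} {entry.name}")

list_files(".")  # e.g. list_files("/mnt/materials/SIRF/Fully3D/CIL") on the STFC cloud
```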
### Q: I'm confused about the "vertical" and "horizontal" slices. It seems the wrong way round?

(This is also about CIL, with the egg data (yeah!))

**A:** (Ander) You can see the egg-holder at the bottom of the vertical axis, so it seems correct to me. I believe this is a micro-CT scan, which means the source and detector are static (unlike a medical scan) and the axis of rotation is vertical, from the floor up. The egg is mounted on a rotating stage, and it rotates. In medical CT, the patient tends to be lying down, and thus the axis of rotation is horizontal. Also, we don't rotate the patient, but the machine!

Follow-up Q: ...yes, I agree the egg has been scanned "standing up" (on an industrial CT scanner with a vertical rotation axis). But to me that means a "horizontal" slice should be parallel to the floor?

**A:** (Ander) A vertical slice is a slice that has been cut along the vertical axis. Vertical slice 512 means we took the vertical axis and cut it at 512. So a vertical slice is parallel to the floor. Just a terminology mismatch, I see!

Thank you, I was wondering if it was just terminology, or if the data was somehow transformed.

### Q: I tried to use TIGRE instead of ASTRA...

...it turned out **TIGRE is not installed on the cloud**. But if I were to use it, are the names the same as when using ASTRA? I.e., would I just replace any calls to astra with tigre? I tried to look this up in the documentation, but couldn't find it (though I might have got lost).

**A:** To use TIGRE you can import `ProjectionOperator` and `FBP` from `cil.plugins.tigre`, the same as with ASTRA. The arguments to the call vary slightly: as TIGRE is GPU-only, there is no `device` argument. Our documentation is a work in progress, so thank you for letting us know!
Yes, basically it would be as simple as a different import statement:

```
# Use the ASTRA engine
from cil.plugins.astra import ProjectionOperator as aProjector
from cil.plugins.astra import FBP as aFBP

# Use the TIGRE engine
from cil.plugins.tigre import ProjectionOperator as tProjector
from cil.plugins.tigre import FBP as tFBP
```

As mentioned above, both of the ASTRA classes accept a `device` parameter which can be `cpu` or `gpu`, while TIGRE supports GPU only.

### Q: Can SIRF load data formats other than Siemens? E.g. ROOT, ECAT7, DICOM, ...

In particular, I am interested in ROOT files output from GATE.

**A:** Check out [this question](https://hackmd.io/kZx9QmOSRUWQEUtGprZujw?both#Q-What-image-formats-are-supported-by-CIL-Is-there-any-common-image-format-for-the-three-platforms-PET-MRI-and-CT) with additional details on data formats. The [STIR-GATE-Connection](https://github.com/UCL/STIR-GATE-Connection) project would be of some help. It uses STIR directly, but the final output would be usable in SIRF.

### Q: TIGRE results vs ASTRA

I have tried to use the TIGRE plugin instead of the ASTRA plugin on Windows. I have followed the suggestions on HackMD, but the results are different. The top is the ASTRA result, the bottom the TIGRE result.

![](https://i.imgur.com/UYUBXZ3.png)
![](https://i.imgur.com/R3QZwOJ.png)

**A:** (Gemma) This isn't a difference between TIGRE and ASTRA. There was a bug in the Zeiss reader regarding rotation angles that was fixed in v21.1.0 [here](https://github.com/TomographicImaging/CIL/blob/master/CHANGELOG.md). The STFC cloud has CIL v21.2.0 installed, and I believe you are comparing with your local CIL v21.0.0, hence the rotation-direction difference you see on this Zeiss dataset. I have now released the binaries for Windows v21.2.0, so if you update the version of CIL in your conda environment the behaviour should be consistent.
However, you can also reverse the angles yourself:

```
# flip the rotation direction by negating the angles (in place)
data.geometry.angles[:] = -1.0*data.geometry.angles[:]
```

![](https://i.imgur.com/q3XriTS.png)
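Note that the `[:]` in that snippet matters: it mutates the existing angle array held by the geometry object in place, rather than rebinding the name to a new array. A small NumPy illustration of the difference (plain arrays only, no CIL required):

```python
import numpy as np

geometry_angles = np.linspace(0, 180, 5)  # [0, 45, 90, 135, 180]
view = geometry_angles  # another reference, like the array held inside a geometry

# In-place slice assignment: the array object itself is mutated,
# so every reference to it (e.g. the geometry) sees the change
geometry_angles[:] = -1.0 * geometry_angles
print(view[1])  # → -45.0

# Rebinding instead creates a new array; other references are untouched
geometry_angles = -1.0 * geometry_angles
print(view[1])  # still -45.0: `view` was not rebound
```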