# SCC Training plan
## General ideas
- Homework assignments
- Prepare for next topic
- Repeat exercises on a different system (e.g. on LUMI during training, on Mahti at home)
- A poll about prior knowledge before the first training session
- Linux and shell scripting skills
- Use of build systems
- Use of parallel computers
- ...
## 1. Kick-off
- Team introductions, 30 min
- About the competition, 15 min
- Benchmarks and applications
- Short coffee break
- Practicalities, 45 min
- Hardware
- Travel funding
- Communication tools (GitHub, instant messaging)
- Training schedule
- Summary of survey results?
- Opinions about homework assignments / flipped teaching
- CSC Summer School participation
- Log in to LUMI?
- Homework:
- Introduction to Linux
- Elements of supercomputing
- Team name
- Team proposal
## 2. Basics of parallel computing and working in the CSC environment
- Start with a simple hands-on (see the batch script sketch at the end of this section)
- Log in to the supercomputer (LUMI?)
- Clone the exercise repository with `git clone`
- Run `hostname` via Slurm
- Run `rocm-smi` via Slurm
- Lecture about supercomputers / parallel computing concepts
- Node, CPU, CPU core, GPU, interconnect
- Process / MPI task, thread, "GPU thread"?
- Building code with a "non-optimal" Makefile
- Testing scalability
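
A minimal sketch of what the first hands-on batch job could look like, assuming a LUMI-style Slurm setup; the account, partition, and time limit below are placeholders to be replaced with the real project values:

```bash
#!/bin/bash
# first_job.sh - minimal batch script for the first hands-on.
# Account and partition are placeholders, not the real project settings.
#SBATCH --job-name=first_job
#SBATCH --account=project_XXXXXXXXX
#SBATCH --partition=dev-g
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=1
#SBATCH --gpus-per-node=1
#SBATCH --time=00:05:00

# Which node did the job land on?
srun hostname

# What do the GPUs on that node look like?
srun rocm-smi
```

Submit with `sbatch first_job.sh` and follow the queue with `squeue --me`; increasing `--nodes` and `--ntasks-per-node` in the same script is a natural starting point for the scalability exercise.
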
## 3. HPL and HPCG
- Review/quiz of supercomputing concepts
- Environment modules (`module avail`, `module load`)
- Short lecture (~30 min) on the general idea of the different benchmarks and on containers
- Hands-on (~60 min): 3-person teams work with a tutor on a particular benchmark
- Compiling from source and with a ready-to-use container (see the sketch at the end of this section)
- Teams present their experiences (~10 min / team)
- Homework: a "light" trial (e.g. using existing binaries) of the benchmarks other than the one covered during the session
- Preparation: HPL (Jussi), HPCG (Leopekka)
- Grenoble cluster or LUMI
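
A rough sketch of the two build paths for the benchmark hands-on, using HPL as the example; the module names, the HPL arch name, and the container image are assumptions, not the actual LUMI or Grenoble setup:

```bash
# Path 1: compile HPL from source with environment modules.
# Module names are assumptions; check `module avail` on the target system.
module load PrgEnv-gnu cray-libsci cray-mpich
wget https://netlib.org/benchmark/hpl/hpl-2.3.tar.gz   # check netlib.org for the current release
tar xf hpl-2.3.tar.gz
cd hpl-2.3
# Copy one of the setup/Make.* templates to Make.LUMI, point it at the
# local compilers, MPI, and BLAS, then build:
make arch=LUMI

# Path 2: skip the build and run a ready-to-use container instead.
# The image name is a placeholder for whatever image the tutors provide.
singularity exec hpl.sif xhpl
```
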
## 4. HPL and HPCG cont.
## 5. MLPerf etc.
- Review of installation and benchmarking scripts
- HPCG with GPU-aware MPI (see the sketch at the end of this section)
- CMake modifications
- Effects, demo of rocprof
- Report from ISC
- Introduction to MLPerf
- Brief discussion about the reproducibility challenge
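
A minimal sketch for the GPU-aware MPI and rocprof demo; `MPICH_GPU_SUPPORT_ENABLED` is the Cray MPICH switch used on LUMI, while the partition, account, and binary name are placeholders:

```bash
#!/bin/bash
#SBATCH --job-name=hpcg-gpu
#SBATCH --account=project_XXXXXXXXX
#SBATCH --partition=standard-g
#SBATCH --nodes=2
#SBATCH --ntasks-per-node=8
#SBATCH --gpus-per-node=8
#SBATCH --time=00:30:00

# Let Cray MPICH pass GPU buffers directly between ranks
export MPICH_GPU_SUPPORT_ENABLED=1

# Full benchmark run (binary name is a placeholder)
srun ./xhpcg

# Re-run a single rank under rocprof to collect per-kernel statistics
srun --nodes=1 --ntasks=1 rocprof --stats ./xhpcg
```
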
## 6. Applications
- MPAS and 3DMHD
Compile both programs and run a small test case with each. Again have two sub-teams; both work on both codes.
- 3DMHD (Jussi)
- Clone the private repo (see the workflow sketch at the end of this section)
- Try to build and run without help
- Spend at most 5-10 min before asking for help
- Commit and push any changes
- MPAS (Miro/Leopekka)
- Use the pre-installed PIO
- Theory (30 min)
- Compiling and running (~1 h)
- Demo of StarPU / Chameleon?
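
A rough outline of the expected workflow for the 3DMHD part, assuming the private repository address is handed out in the session; the URL, build command, and job script name below are placeholders:

```bash
# Clone the private repo (URL is a placeholder given out in the session)
git clone git@github.com:<org>/3dmhd.git
cd 3dmhd

# Try to build and run without help first (max 5-10 min before asking)
make
sbatch job.sh        # job script name is a placeholder

# Commit and push whatever changes were needed to build and run
git add -A
git commit -m "Changes needed to build and run on LUMI"
git push
```
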
Start dividing team members into sub-teams. Ask the students for their preferences.
- Slide set explaining the different possible roles (Leopekka)