NAISS projects overview
These are the NAISS course projects:
| HPC cluster | Course project |
|---|---|
| COSMOS | |
| Dardel | |
| Kebnekaise | |
| Pelle | |
| Tetralith | |
| Alvis | |
Storage spaces for this workshop:
| HPC cluster | Course storage |
|---|---|
| Alvis | |
| Bianca | None. Use |
| COSMOS | |
| Dardel | |
| Kebnekaise | |
| LUMI | None. Use |
| Pelle | |
| Tetralith | |
Reservations
Include the reservation in your Slurm batch scripts with `#SBATCH --reservation=<reservation-name>` at most centers. (On UPPMAX the reservation is "magnetic" and so follows the project ID without you having to add the reservation name.)
NOTE: since only one or a few nodes are reserved, you should NOT use the reservations for long jobs, as this would block them for everyone else. They are meant for short test jobs.
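As a minimal sketch, a short test job using a reservation could look like the batch script below. The project ID, reservation name, and Python script are placeholders: replace them with the values for your cluster (see the per-center list that follows). On UPPMAX you can omit the `--reservation` line, since the reservation is magnetic and applied automatically.

```bash
#!/bin/bash
#SBATCH --account=naiss202X-XX-XXX     # placeholder: the course project ID for your cluster
#SBATCH --reservation=hpc-python-mon   # placeholder: pick the reservation for the current day
#SBATCH --time=00:10:00                # keep jobs on the reserved nodes short
#SBATCH --ntasks=1

# Quick test only, e.g. check that Python is available
python -V
```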
- UPPMAX: the reservation is magnetic and follows the project ID automatically (see the note above).
- HPC2N
  - `hpc-python-fri` for one AMD Zen4 CPU on Friday
  - `hpc-python-mon` for one AMD Zen4 CPU on Monday
  - `hpc-python-tue` for two L40s GPUs on Tuesday
  - The reservation is magnetic, so it will be used automatically.
- LUNARC
  - `hpc-python-dayN` for up to 2 CPU nodes per day, where N=1 for Thursday, 2 for Friday, 3 for Monday, and 4 for Tuesday
  - `hpc-python-day4-gpu` for the GPU and ML sessions on Tuesday afternoon
  - Note: for On-Demand apps, click the gear icon next to "Resource" in the GfxLauncher popup to see additional options, which should include a box to include a reservation. Ticking that box will reveal a dropdown menu with the list of reservations associated with the project.
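For the GPU sessions, a sketch of a batch script that combines a GPU reservation with a GPU request is shown below. The project ID and Python script are placeholders, and the exact GPU flags, GPU types, and partitions differ between clusters, so check the local documentation; `--gpus=1` is the generic Slurm request used here as an assumption.

```bash
#!/bin/bash
#SBATCH --account=naiss202X-XX-XXX          # placeholder: course project ID
#SBATCH --reservation=hpc-python-day4-gpu   # Tuesday GPU reservation on LUNARC (or hpc-python-tue on HPC2N)
#SBATCH --gpus=1                            # generic Slurm GPU request; exact flags/partitions vary per cluster
#SBATCH --time=00:15:00                     # short test job only

# Placeholder script, e.g. one that checks GPU availability from Python
python gpu_check.py
```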