NAISS projects overview
These are the NAISS course projects:
| HPC cluster | Course project |
|---|---|
| COSMOS | |
| Dardel | |
| Kebnekaise | |
| Pelle | |
| Tetralith | |
| Alvis | |
[1] This project is in the proposal stage
Storage spaces for this workshop:
| HPC cluster | Course project |
|---|---|
| Alvis | |
| Bianca | None. Use |
| COSMOS | |
| Dardel | |
| Kebnekaise | |
| LUMI | None. Use |
| Pelle | |
| Tetralith | |
[1] This project is in the proposal stage
Reservations
Include the reservation in Slurm scripts with `#SBATCH --reservation=<reservation-name>` at most centers. (On UPPMAX the reservation is "magnetic", so it follows the project ID automatically and you do not have to add the reservation name.)
NOTE: as only one or a few nodes are reserved, do NOT use the reservations for long jobs, since that would block them for everyone else. They are meant for short test jobs.
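As a sketch, a minimal test-job script using a reservation could look like the following. The account is a placeholder, and the reservation name shown is one of the course reservations; substitute the values for your center:

```shell
#!/bin/bash
# Minimal Slurm test job using a course reservation (sketch).
# <project-id> is a placeholder: use your actual course project ID.
#SBATCH --account=<project-id>
#SBATCH --reservation=hpc-python-wed   # omit on UPPMAX, where the reservation is magnetic
#SBATCH --time=00:05:00                # keep test jobs short: the reservation is shared
#SBATCH --ntasks=1

echo "Running on $(hostname)"
```

Submit with `sbatch`, as usual; the reservation only changes which nodes the job may be scheduled on.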
- UPPMAX
- HPC2N
  - `hpc-python-wed` for one AMD Zen4 CPU on Wednesday
  - `hpc-python-thu` for one AMD Zen4 CPU on Thursday
  - `hpc-python-fri` for two L40s GPUs on Friday

  These reservations are magnetic, so they will be used automatically.
- LUNARC
  - `python_april26` for the GPU and ML sessions on Friday. NOTE: it is in the gpua40i partition; the nodes have Intel processors (32 cores) and A40 cards.

  Note: for On-Demand apps, click the gear icon next to "Resource" in the GfxLauncher popup to see additional options, which should include a box to include a reservation. Ticking that box will reveal a dropdown menu with the list of reservations associated with the project.
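For batch (non-On-Demand) use, a GPU job targeting this reservation might be sketched as follows. The account is a placeholder, and the `--gres=gpu:1` line is an assumption about the GPU request syntax; check the LUNARC documentation for the exact form:

```shell
#!/bin/bash
# Sketch of a GPU job for the Friday session on COSMOS at LUNARC.
# <project-id> is a placeholder; the --gres syntax is an assumption.
#SBATCH --account=<project-id>
#SBATCH --reservation=python_april26
#SBATCH --partition=gpua40i     # the reservation is in this partition
#SBATCH --gres=gpu:1            # request one A40 card
#SBATCH --time=00:10:00

nvidia-smi
```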