HPC Systems
General Remarks, Partners
- HPC Systems at UIBK - Description of Services
- HPC Systems of the ZID - Statement of Service Levels for Systems
- Research Area Scientific Computing
How To Obtain An Account, Getting Help
- Applying for HPC accounts, user obligations
- List of Representatives (Research Area Scientific Computing)
- HPC Accounts for Non-Staff Members of the University
Regulations for persons who are members of the University but are not employed (e.g. students, cooperation partners) and need an HPC account.
- ZID Ticket System
If you need help with HPC services, please create a new ticket in the queue "HPC".
Local HPC Resources Operated by the ZID
- LEO5:
Distributed memory InfiniBand CPU and GPU cluster (2023)
- LEO4:
Distributed memory InfiniBand cluster of the ZID (IT Services - 2018)
- LEO3E:
Distributed memory InfiniBand cluster of the Research Area Scientific Computing (2015)
- LCC3:
The Linux Compute Cluster of the ZID (IT Services) - for teaching purposes (2023)
- VISLAB 1669:
Visual Interaction Lab 1669
- User Instructions by the Research Area Scientific Computing
See also
HPC Systems Jointly Operated with Austrian Universities
- VSC: Vienna Scientific Cluster
HPC cooperation of major Austrian universities
Distributed memory InfiniBand cluster (VSC3: 2015)
Supranational Computing Facilities
- PRACE: Partnership for Advanced Computing in Europe
Top level of the European HPC infrastructure
Systems for very high computing demands
- AURELEO: Austrian users at the LEONARDO supercomputer
Austrian participation in the LEONARDO pre-exascale supercomputer
Access administered by the VSC consortium
Older Systems
LEO3 and MACH2 are out of service. Information is given here for historical reference.
- LEO3:
Distributed memory InfiniBand cluster of the Research Area Scientific Computing (established 2011, decommissioned May 2022)
- MACH2: Altix UV 3000
Shared memory machine with 20 TB of memory and 1728 cores, operated by Johannes Kepler Universität (JKU) Linz - specialized for highly parallel jobs with large memory demands (2018 - 2023)
HPC Specific Software Documentation
- General Purpose GPU Processing On The UIBK Leo Clusters
Information on using GPU nodes in the UIBK HPC Leo clusters.
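As a minimal sketch of what a GPU batch job might look like (assuming the cluster uses Slurm with GPU generic resources; the partition, module, and resource names below are assumptions, not taken from the linked document - always check the cluster-specific instructions):

```shell
#!/bin/bash
# Hedged sketch of a single-GPU Slurm batch job.
#SBATCH --job-name=gpu-test
#SBATCH --ntasks=1
#SBATCH --gres=gpu:1          # request one GPU (resource name is an assumption)

module load cuda              # hypothetical module name
nvidia-smi                    # show the GPU(s) assigned to this job
```

Submitted with `sbatch`, such a script runs on a GPU node rather than on the login node.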
- Matlab
MATLAB is a high-level language and interactive environment for algorithm development, data visualization, data analysis, and numeric computation. This document describes methods and strategies for using MATLAB efficiently on the HPC systems of the University of Innsbruck.
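One common efficiency measure on HPC systems is running MATLAB non-interactively in a batch job instead of starting the GUI on a login node. A minimal sketch, assuming MATLAB is provided as an environment module (the module name is an assumption):

```shell
# Run a MATLAB expression non-interactively and exit.
module load matlab            # hypothetical module name
matlab -batch "disp(2+2)"    # -batch runs code without the desktop GUI
```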
- Monitoring Processes Using The Jobtop Utility
Monitoring the processes belonging to a job is key to optimizing your workloads for an HPC cluster. This document describes how to use the locally developed jobtop facility, which runs a specially configured top command on all cluster nodes that run processes of a given job.
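Since jobtop is a locally developed tool, its exact syntax is given in the linked document; a plausible invocation (hypothetical - the argument form is an assumption) might be:

```shell
# Hypothetical usage: show top output for every node running job 123456.
# Consult the linked jobtop document for the actual syntax.
jobtop 123456
```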
- Qiskit
- Setting Up Your Windows PC With Putty and Xming
This document describes how to set up the software needed on a Windows desktop or notebook for an efficient user experience with central Linux servers. Covered items: Putty terminal emulator, Xming X11 server, settings for Putty and Xterm terminal emulators.
- Singularity: User Defined Software Environments
Singularity is an environment for running user-defined software stacks such as Docker containers on HPC clusters.
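A typical workflow with the standard Singularity CLI looks like the sketch below (the image is chosen for illustration; cluster-specific details such as module loading are covered in the linked document):

```shell
# Pull a Docker image and convert it to a Singularity image file (.sif).
singularity pull docker://python:3.11-slim

# Run a command inside the container; the home directory is
# bind-mounted by default, so your files remain visible.
singularity exec python_3.11-slim.sif python3 --version
```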
- Totalview Debugger
The TotalView Debugger is a graphical tool for debugging sequential and parallel (MPI, OpenMP, POSIX threads, etc.) programs.
- Using Anaconda for Python and R
Anaconda is a comprehensive, curated, high-quality, high-performance distribution of Python, R, and many associated packages for Linux, Windows, and macOS, intended for use by scientists.
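A minimal sketch of creating an isolated environment with the standard conda commands (the module and environment names are assumptions; see the linked document for the setup on the Innsbruck systems):

```shell
module load Anaconda3                      # hypothetical module name

# Create and use an isolated environment with a pinned Python version.
conda create -n myenv python=3.11 numpy    # "myenv" is an example name
conda activate myenv
python -c "import numpy; print(numpy.__version__)"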
- smbnetfs