DAIM Guide

From HPC
Revision as of 22:27, 26 July 2023 by Chris.collins


Page currently under construction

Introduction

Viper, the University of Hull’s High Performance Computing facility, is used by research staff and students in many disciplines across the University and is a potentially significant tool for anyone whose research has a computational element. Viper is a ‘cluster’ of approximately 200 computers and features more than 6,000 compute cores, high-memory systems and GPU technology, together with dedicated high-performance storage and a fast interconnect between systems, to meet the needs of the most computationally demanding tasks. Some notes about Viper:

  • Viper runs Linux, though to make use of Viper you do not need to be a Linux expert – being familiar with just a few commands is enough to get started.
  • Viper runs a scheduler, a piece of software which manages access to the computers across the cluster, monitoring what is running and how resources such as memory, CPU cores and GPU cards are being used.
  • Users run ‘jobs’, some of which are interactive (like Jupyter Notebook sessions), but many of which are submitted to run automatically without user interaction. Jobs usually have a batch script: a text file containing the recipe for the job, for example what resources are needed and what the job should do.
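As an illustration, a batch script might look like the sketch below. This assumes the scheduler is SLURM and that a partition named "gpu" exists; neither is confirmed here, so check the Viper documentation for the actual directives and partition names.

```shell
#!/bin/bash
# Minimal batch script sketch (assumes the SLURM scheduler; the
# directive values and the "gpu" partition name are illustrative).
#SBATCH --job-name=example-job     # name shown in the queue
#SBATCH --ntasks=1                 # a single task
#SBATCH --mem=8G                   # memory requested for the job
#SBATCH --time=01:00:00            # maximum runtime (1 hour)
#SBATCH --partition=gpu            # hypothetical partition name
#SBATCH --gres=gpu:1               # request one GPU

# The "recipe": the commands the job should run.
echo "Job started on $(hostname)"
```

The #SBATCH lines are comments to the shell but are read by the scheduler, which uses them to decide when and where the job can run.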

Prerequisites

Available Resource

Important: While Viper has a range of GPU resources available for DAIM student use, including Nvidia A100 and Nvidia A40 based systems, these resources are in very high demand. At times many people may be competing for GPU resource, so long pend (queuing) times are likely.

Viper is a shared resource, so to help alleviate this for all users, we suggest:

  • Where possible, run only final GPU runs on Viper, and keep code development on other systems (e.g. your own computer, a DAIM lab computer or the Viper CPU nodes)
  • Don't request longer runtimes than you need, and if you have resource allocated to you but are not going to work for a period of time, please cancel your allocation
  • Please remember that other users also require GPU access; don't queue up multiple Jupyter sessions
Resource Type          GPU RAM *   System RAM    Count   Accessible as      Notes
Nvidia A40             48GB        128GB         2       Viper SSH tunnel   Very high demand
Nvidia A40             48GB        128GB         2       Web portal         Very high demand
Nvidia A100 MIG 20GB   20GB        64GB          6       Web portal         MIG version; high demand
Nvidia A100 MIG 10GB   10GB        32GB          7       Web portal         MIG version; medium demand
CPU only               -           128GB / 1TB   10+     Viper SSH tunnel   Lower demand, but low performance for ML; use for development and testing
CPU only               -           128GB / 1TB   10+     Web portal         Lower demand, but low performance for ML; use for development and testing

* The DAIM lab machines are each equipped with Nvidia GeForce RTX 3070 GPU cards, with 5888 CUDA cores and 8GB of GPU memory - all GPU instances on Viper have at least an equivalent amount of GPU memory available.

OnDemand Web Portal Access (pilot)

We are currently testing a web-based route to Jupyter access via our pilot Viper OnDemand web portal - see http://hpc.mediawiki.hull.ac.uk/General/OOD

This service is in testing; please raise any issues via the University Support Portal

If you connect to the web portal, you will see a number of DAIM-prefixed apps, which can be used to connect:

Create Python Virtual Environment

  • DAIM Create Environment - this will create a base Python virtual environment. You can change the name of the environment by editing the text in the Environment Name box. You can then select this environment's kernel when working in DAIM session notebooks.
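Under the hood this is broadly equivalent to creating a standard Python virtual environment and registering it as a Jupyter kernel, along these lines (a sketch only: the app's exact steps and paths are assumptions, and "daim-env" is just an example name):

```shell
# Create a virtual environment (the name and location are examples).
python3 -m venv "$HOME/daim-env"

# Activate it for the current shell session.
source "$HOME/daim-env/bin/activate"

# To make the environment selectable as a notebook kernel you would
# then install and register ipykernel (requires network access):
#   pip install ipykernel
#   python -m ipykernel install --user --name daim-env
```

Knowing this equivalence is useful if you later need to add packages to the environment: activate it from a terminal and use pip as normal.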

Web Portal Nvidia A40 GPU access

  • DAIM Jupyter - this will launch a JupyterLab session in your browser tab
  • DAIM Jupyter Desktop - similarly, this will launch a Jupyter session, but with the web browser running on Viper OnDemand itself. This means you should be able to disconnect and reconnect with cells remaining active.

Web Portal Nvidia A100 GPU MIG (Multi-Instance GPU) access

  • TEST DAIM Jupyter Desktop MIG - this will launch a Jupyter session, but with the web browser running on Viper OnDemand itself, which means you should be able to disconnect and reconnect with cells remaining active. You can choose between a 10GB and a 20GB VRAM instance; please only select what is required.
  • TEST DAIM Jupyter MIG - this will launch a JupyterLab session in your browser tab. You can choose between a 10GB and a 20GB VRAM instance; please only select what is required.

Web Portal Nvidia A100 GPU MIG (Multi-Instance GPU) automated run

  • TEST DAIM nbconvert Job - this will submit a task that runs jupyter-nbconvert to execute the specified notebook automatically when resource is available, without you needing to interact with it. Specify a notebook using the "select path" button and click launch. The task will run automatically when resource is available and will be listed as "Running" in your session list; note that there is very little output until the notebook execution is complete. Once complete, nbconvert will produce an output notebook with nbconvert in the name (e.g. the input mynotebook.ipynb will produce mynotebook.nbconvert.ipynb). You can choose between a 10GB and a 20GB VRAM instance; please only select what is required.
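For reference, the underlying jupyter-nbconvert invocation looks roughly like the following (a sketch: the exact flags the app passes are not documented here):

```shell
# Execute a notebook non-interactively and write the result to a new
# notebook file. Given mynotebook.ipynb as input, this produces
# mynotebook.nbconvert.ipynb alongside it.
jupyter nbconvert --to notebook --execute mynotebook.ipynb
```

This is a configuration-style fragment: it needs Jupyter installed and a real notebook file, so it is shown for orientation rather than as something to run verbatim.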

Please note:

  • VPN access is still required to connect to the Viper OnDemand web portal.
  • The DAIM Jupyter apps are in testing and there may be issues which have yet to be resolved.
  • The web portal is also testing GPU 'sharding', which is a way of sharing GPU resource among multiple jobs. Please consider other users when trying to access resource.

Alternative (previous) method: Jupyter Notebook via SSH tunnel on Viper

A guide with step-by-step instructions for DAIM students to make use of Viper is available: Guide to Jupyter Notebooks on Viper
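For context, the SSH tunnel approach typically involves forwarding a local port to the Jupyter port on Viper, along these lines (the hostname, username and port numbers here are placeholders, not confirmed values; follow the linked guide for the actual details):

```shell
# Forward local port 8888 to port 8888 on the remote system
# (placeholder hostname and username; -N means no remote command).
ssh -N -L 8888:localhost:8888 username@viper.hull.ac.uk
# Then browse to http://localhost:8888 on your own machine.
```

This fragment requires cluster credentials and VPN access, so it is illustrative only.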