News and Events

Wendian Update – April 17, 2023

Cyberinfrastructure & Advanced Research Computing (CIARC) Team Services 

We are a team of research computing experts here to help you become a more effective and efficient researcher.  Maybe you inherited your job script and never looked back, or you play it safe when requesting memory.  We can review your job script with you and identify opportunities to request resources more efficiently.  If you need to submit a large batch of jobs and are unsure how best to orchestrate the workflow, or you want to get started with parallel computing in your research, we can consult with you and your research team to work out your needs. Perhaps you are applying for a grant and thinking about adding computational simulations; we can help with your proof of concept and provide guidance for your budget.

Check out our newly improved TDX portal where you can now schedule a meeting with a CIARC team member for live support.   

Checking HPC Usage 

If you would like to check your usage for a given month, we have provided some convenient commands on Wendian. 

To check usage as a user, use the command getUtilizationByUser: 

janedoe@wendian002:[~]: getUtilizationByUser
janedoe -- Cluster/Account/User Utilization 2023-04-01T00:00:00 - 2023-04-12T11:59:59 (993600 secs) 
"Account","User","Amount","Used" 
"hpcgroup","janedoe - Jane Doe",$1.23,0 
 

To check usage as a PI for all your users, use the command getUtilizationByPI: 

pi@wendian002:[~]: getUtilizationByPI 
pi -- Cluster/Account/User Utilization 2023-04-01T00:00:00 - 2023-04-12T11:59:59 (993600 secs) 
"Account"|"User"|"Amount" 
"hpcgroup","janedoe - Jane Doe",$1.23,0 
"hpcgroup","johnsmith - John Smith",$1000.00,0 
 

Checking Job Efficiency 

If you are interested in checking how efficient your job was after it has finished running, we have a tool installed called reportseff that lets you quickly check the percent utilization of the CPU and memory you requested. You can check specific job IDs, or run it in a job directory containing Slurm output files to summarize every job found there.

Please refer to the GitHub page for more information: https://github.com/troycomi/reportseff 
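For example, the most common invocations look like the following (a sketch based on the reportseff documentation; the job ID 1234567, the janedoe account, and the my_job_dir directory are placeholders):

janedoe@wendian002:[~]: reportseff 1234567            # efficiency report for a single job ID
janedoe@wendian002:[~]: reportseff --user janedoe     # efficiency report for your recent jobs
janedoe@wendian002:[~]: cd my_job_dir; reportseff     # report on jobs with Slurm output files in the current directory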

For more, go to our rates website and be sure to check out our blog for the most up-to-date information. 

URGENT: Wendian Critical Request & Reminders

Hello Wendian users,

Critical request: The Wendian scratch partition has reached 88% capacity. Please remove all unnecessary files from Wendian. If the filesystem reaches 95% capacity, we will purge data older than 180 days, per policy.
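If you are unsure where your space is going, standard tools such as du and find can help locate large or stale data (a sketch; replace /scratch/janedoe with your own scratch directory):

janedoe@wendian002:[~]: du -sh /scratch/janedoe/*                  # size of each top-level item in your scratch space
janedoe@wendian002:[~]: find /scratch/janedoe -type f -mtime +180  # list files not modified in the last 180 days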

Announcements and Reminders

Implementation of Monthly Billing for HPC

This is a reminder that Saturday, April 1st, at 12:00 am will mark the end of preemption and the beginning of monthly billing cycles.  If you need more information on the charge model, please see: http://ciarc.mines.edu/hpc-business-model.

Quality of Service (QoS) Changes

Quality of Service (QoS) is a parameter used in Slurm to alter priority access to the job scheduler. Historically, Wendian had two main QoS options: full and normal. The full QoS allowed jobs to be submitted to the entire available pool of CPU nodes, while the normal QoS used a smaller pool without preemption. Moving forward, please use the normal QoS. The full QoS will remain through the month of April and behave identically to normal.  At the end of April, the full QoS will be removed; direct your scripts to the normal QoS now to avoid job errors in May. To do this, please add the following to your Slurm scripts:

#SBATCH -q normal
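For context, a complete minimal job script with the directive in place might look like the following (a sketch; the job name, resource requests, and executable are placeholders, not Mines-specific values):

#!/bin/bash
#SBATCH --job-name=example
#SBATCH -q normal              # the normal QoS; full is retired at the end of April
#SBATCH --ntasks=1
#SBATCH --mem-per-cpu=4G       # request only the memory your job actually needs
#SBATCH --time=01:00:00

srun ./my_program              # placeholder executable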

Pricing on HPC

Below is the current table of rates for the new charge model. Though we charge the same price for low- and high-memory compute nodes, we will monitor usage and reach out to users who are inefficient with their memory consumption. Your jobs will be routed to the appropriate node based on your memory request.  Please request only the memory that your job requires.

These values will be kept up to date on the CIARC website: https://ciarc.mines.edu/hpc-storage-rates

Node Type      Rate per hour [USD]   CPU core   Memory per CPU core [GB]   GPU
CPU            $0.02                 1          5 or 10*                   NA
GPU enabled    $0.12**               6          48                         1xV100

*There are two types of CPU nodes on Wendian: (1) a “low” memory node with 192 GB and (2) a “high” memory node with 384 GB. Jobs will be routed to one of these node types depending on the requested resources.

**For GPU jobs, the V100 node has 4 GPU cards. For each GPU card you request, you must also pay for 6 CPU cores and 48 GB of memory, since this is ¼ of the available compute resources on the GPU node.
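To illustrate how requests map to charges, the sketch below shows typical Slurm resource directives with the cost they would imply under the table above (assuming the rates are per CPU core-hour and per GPU card-hour; the specific values are placeholders, not Mines-specific recommendations):

# CPU job: 4 cores x 10 hours x $0.02 = $0.80
#SBATCH --ntasks=4
#SBATCH --mem-per-cpu=5G       # stays within the 5 GB/core "low" memory tier
#SBATCH --time=10:00:00

# GPU job: 1 card x 10 hours x $0.12 = $1.20 (includes the bundled 6 cores and 48 GB)
#SBATCH --gres=gpu:1
#SBATCH --time=10:00:00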

Consultations for improved compute efficiency

We understand that a charge model means that individuals will want to run jobs as efficiently as possible. If you would like to reach out for a consultation on how best to utilize HPC resources, please use the following Help Center ticket request:

https://helpcenter.mines.edu/TDClient/1946/Portal/Requests/ServiceDet?ID=30287

Wendian Scratch Filesystem

/scratch is a short-term shared filesystem for storing data currently needed for active research projects; it is subject to purge on a six-month (180-day) cycle. There are no limits (within reason) on the amount of data. Wendian is currently reaching 88% of its 1 petabyte (1000 TB) storage capacity. Once the filesystem reaches the critical threshold of 95% capacity, access to Wendian will have to cease until the issue is resolved. This policy remains in place: https://wpfiles.mines.edu/wp-content/uploads/ciarc/docs/pages/policies.html

Classroom Use on Wendian

As a reminder, Wendian HPC usage in classes will not be affected by the new model and ITS will request an annual budget to cover classroom costs. If you are interested in using HPC for your class, you can submit a request here: https://helpcenter.mines.edu/TDClient/1946/Portal/Requests/ServiceDet?ID=38002

If you have further questions, please submit a ticket here.

Best,

HPC@Mines Staff

Modified HPC Office Hours for Week of 7/25

Office Hours for HPC users on Wendian and Mio will be modified for the week of July 25. Computational scientist Nicholas Danes will open his virtual doors via Zoom at the following times for this week only:

  • Wednesday, July 27, 12-1 pm
  • Thursday, July 28, 10-11 am

Please join Nicholas on Zoom with the following link:

Nicholas Danes (he/him) – Mines is inviting you to a scheduled Zoom meeting.
Topic: HPC Office Hours
Time: This is a recurring meeting; meet anytime

Join from PC, Mac, Linux, iOS or Android: https://mines.zoom.us/j/4179773375

Or iPhone one-tap:  13462487799,4179773375# or 16699006833,4179773375#

Or Telephone:
Dial: +1 346 248 7799 (US Toll) or +1 669 900 6833 (US Toll)
Meeting ID: 417 977 3375
International numbers available: https://mines.zoom.us/u/aolEkoRay

Or an H.323/SIP room system:
H.323: 162.255.37.11 (US West) or 162.255.36.11 (US East)
Meeting ID: 417 977 3375

SIP: 4179773375@zoomcrc.com

Note: The initial few weeks of office hours are considered a test phase to gauge community interest; if deemed successful, we will post our regular HPC office hours on our website at https://ciarc.mines.edu/.

If these office hours are not convenient for you, please reach out to ndanes@mines.edu directly to schedule a separate time for hands-on help with HPC resources.

Thank you!

Regards,

HPC@Mines

HTCondor Week 2022 – Virtual Registration closes May 23!

HTCondor is an open-source high-throughput computing (HTC) software suite designed for automating large batch workloads and managing other compute resources. HTCondor Week is a workshop series that provides in-depth tutorials and talks for learning more about HTCondor, HTC, and how they are used.

You can learn more about the Workshop series here: https://agenda.hep.wisc.edu/event/1733/

Registration is required and closes May 23. There is an in-person component, but virtual attendance via Zoom is also available.

HPC Virtual Office Hours

We’re exploring ways to expand facilitation for HPC users on Wendian and Mio, and are beginning with virtual office hours. Computational scientist Nicholas Danes will open his virtual doors via Zoom, twice per week, at the following times, beginning May 04, 2022:

Wednesdays, 12-1 pm
Thursdays, 12-1 pm.

Please join Nicholas on Zoom with the following link:

Nicholas Danes (he/him) – Mines is inviting you to a scheduled Zoom meeting.
Topic: HPC Office Hours
Time: This is a recurring meeting; meet anytime

Join from PC, Mac, Linux, iOS or Android: https://mines.zoom.us/j/4179773375

Or iPhone one-tap:  13462487799,4179773375# or 16699006833,4179773375#

Or Telephone:
Dial: +1 346 248 7799 (US Toll) or +1 669 900 6833 (US Toll)
Meeting ID: 417 977 3375
International numbers available: https://mines.zoom.us/u/aolEkoRay

Or an H.323/SIP room system:
H.323: 162.255.37.11 (US West) or 162.255.36.11 (US East)
Meeting ID: 417 977 3375

SIP: 4179773375@zoomcrc.com

Note: The initial few weeks of office hours are considered a test phase to gauge community interest; if deemed successful, we will post our regular HPC office hours on our website at https://ciarc.mines.edu/.

If these office hours are not convenient for you, please reach out to ndanes@mines.edu directly to schedule a separate time for hands-on help with HPC resources.

Thank you!

Regards,

HPC@Mines

PEARC22 – Early Registration Ends May 10, 2022!

The Association for Computing Machinery’s (ACM) annual conference, Practice and Experience in Advanced Research Computing 2022 (PEARC22), takes place July 10-14, 2022, in person in Boston, MA. Although submission deadlines have passed, it is still a great opportunity for researchers and students to get involved in one of the largest research computing conferences in the United States. Visit the PEARC22 homepage for more information on registration.

PEARC18 @ Pittsburgh, PA: July 22 – July 26, 2018

PEARC18 is for everyone who works to realize the promise of advanced computing as the enabler of seamless creativity, the theme of this year’s conference. Scientists and engineers, scholars and planners, artists and makers, students and teachers all depend on the efficiency, security, reliability and sustainability of increasingly complex and powerful digital infrastructure systems. If your work addresses these challenges in any way, PEARC18 is the forum to share, learn and inspire progress.