Requesting Storage on SeaWulf

On the SeaWulf cluster, data storage is a finite resource. Because user demand for storage often changes, we allot space as needed. Requests for storage beyond the default amounts are submitted through the OSTicket system at: https://iacs.supportsystem.com

Audience: Faculty, Postdocs, Researchers, Staff and Students

This KB Article References: High Performance Computing
Last Updated: March 20, 2018

Summary

Location                        Size          Backed up?   Shareable?   Cleared?
/gpfs/home/<netid>              20 GB         Yes          No           Never
/gpfs/scratch/<netid>           20 TB         No           No           After 30 days
/gpfs/projects/<your_group>*    Up to 5 TB*   No           Yes          Per request*

*Project spaces are created upon request. The size of the directory and the duration of data storage are defined at creation.


Home Directory

Each user is given 20 GB of disk space for their home directory, which is accessible only by them. Permission changes to this directory and the files within it are automatically reverted. The home directory for user jsmith is /gpfs/home/jsmith. This space is backed up.
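A quick way to see how much of the 20 GB quota is in use is the standard `du` utility. This is a minimal sketch using only common POSIX/GNU tools; SeaWulf may also provide its own quota-reporting command, which is not shown here:

```shell
# Report total disk usage of the home directory
# (-s summarize into one line, -h human-readable sizes).
du -sh "$HOME"

# Break usage down by top-level subdirectory, largest first,
# to spot what is worth cleaning up.
du -sh "$HOME"/* 2>/dev/null | sort -rh | head -n 10
```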


Scratch Directory

In addition to the home directory, each user has access to a scratch space. This file space is intended for jobs that produce a large amount of intermediary data; it is not intended for long-term storage. For this reason, files older than 30 days are automatically deleted. The scratch space for user jsmith is /gpfs/scratch/jsmith. The limit is currently set at 20 TB per user. This space is NOT backed up.
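Files at risk from the 30-day purge can be previewed with `find`, whose `-mtime +30` test matches files last modified more than 30 days ago. The sketch below runs against a throwaway temporary directory so it is safe to try anywhere; the commented-out path at the end shows where you would point it on SeaWulf:

```shell
# Build a throwaway directory with one backdated file and one fresh file.
demo=$(mktemp -d)
touch -d "40 days ago" "$demo/old_results.dat"   # GNU touch: set mtime in the past
touch "$demo/fresh_results.dat"

# -mtime +30 matches files last modified more than 30 days ago;
# only old_results.dat is listed.
find "$demo" -type f -mtime +30

# On SeaWulf, point find at your scratch space instead:
#   find /gpfs/scratch/<netid> -type f -mtime +30
rm -rf "$demo"
```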


Project space

Groups can request a collaborative storage space that can be shared by all members of a project. Project space for the “Smith Project”, for example, would be found in /gpfs/projects/smith. The default project space allocation is 100 GB. This space is NOT backed up.

Groups can request up to 5 TB of space through the OSTicket system by including the following in the request description:

  • The size of the data, or space requested
  • How it will be used/processed (e.g. how often it will be accessed, bandwidth requirements, etc.)
  • The duration of storage (Due to the cost of the high-performance enterprise storage system used on the cluster, we discourage use of it as an archive.)
  • Confirmation that the user understands that data located in the project space is NOT backed up, and that backing up the data and ensuring its integrity is the sole responsibility of the user.

Requests for more than 5 TB require more detail and, if granted, will likely be satisfied only for a specific period of time.
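Since project data is not backed up, keeping a copy elsewhere is the group's responsibility. A minimal sketch of bundling a project directory into a dated archive with `tar` follows; the stand-in directory is created on the fly so the commands run anywhere, and on SeaWulf you would substitute your actual /gpfs/projects path:

```shell
# Stand-in for a project directory (hypothetical; substitute your real
# /gpfs/projects/<your_group> path on SeaWulf).
base=$(mktemp -d)
mkdir -p "$base/smith"
echo "simulation output" > "$base/smith/results.txt"

# Bundle the directory into a dated, compressed archive.
# -C changes into the parent first so the archive stores relative paths.
archive="$base/smith_$(date +%Y%m%d).tar.gz"
tar -czf "$archive" -C "$base" smith

# Always verify the archive's contents before removing the originals.
tar -tzf "$archive"
rm -rf "$base"
```

When the copy needs to remain browsable rather than bundled, an `rsync -a` mirror of the project directory to external storage is an alternative.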



Getting Help

The Division of Information Technology provides support on all of our services. If you require assistance please submit a support ticket through the IT Service Management system.


For More Information Contact

IACS Support System