Storage & Quotas Guide
Understanding the storage architecture, quota limits, and best practices for data management on the HPC cluster.
Storage Architecture
The cluster uses a tiered storage system designed for different use cases:
Home Directories
Path: /home/username
- Quota: 50 GB per user
- Backed up: Yes (daily snapshots, weekly full backup)
- Speed: Standard NFS
- Use for: Configuration files, scripts, small important datasets
Home directories are for important data only. Large datasets should go to project storage.
Project Directories
Path: /projects/project_name
- Quota: Configured per project (typically 500 GB - 5 TB)
- Backed up: Yes (daily snapshots)
- Speed: High-performance NFS
- Use for: Shared datasets, project results, collaboration
Project directories are shared among all project members. Contact your PI to request project storage.
Scratch Space
Path: /scratch or $SCRATCH
- Quota: 1 TB per user (temporary)
- Backed up: No
- Speed: High-speed SSD/NVMe
- Use for: Temporary files, job working directories, large intermediate files
Scratch is NOT backed up! Files older than 30 days are automatically deleted. Move important results to project storage.
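Before the purge runs, you can list which of your scratch files are at risk. A minimal sketch (it falls back to the current directory if $SCRATCH is unset, purely for illustration):

```shell
# List files not modified in the last 30 days; these are the files
# the automatic purge will delete next
find "${SCRATCH:-.}" -type f -mtime +30 -print

# Copy anything worth keeping to project storage before it expires, e.g.:
#   rsync -av "$SCRATCH/results/" /projects/your_project/results/
```

Running this periodically (or from a cron job on the login node, if permitted) helps avoid losing results to the purge.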
Checking Your Quota
# Check home directory quota
quota -s
# Check disk usage in human-readable format
du -sh ~
du -sh /projects/your_project
# Find largest files and directories
du -h --max-depth=1 ~ | sort -hr | head -20
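Note that quota reports usage against your personal limit, while df reports the capacity of the underlying filesystem, which can be the tighter constraint on shared storage:

```shell
# Show size, used, and available space for the filesystem behind your home
df -h "$HOME"
```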
QNAP NAS Integration
The cluster is integrated with QNAP NAS storage for project data:
- Project directories are hosted on QNAP NAS with RAID protection
- Automatic snapshots every 4 hours (retained for 7 days)
- Nightly replication to secondary NAS for disaster recovery
- Kerberos authentication ensures secure access
Best Practices
Data Management Tips
- Always use scratch for temporary job files
- Compress large datasets when not in use (tar -czf)
- Clean up old files regularly
- Use project storage for shared data, not home directories
- Don't store sensitive data without encryption
- Version control your scripts (git) instead of multiple copies
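The compression tip above can be sketched as follows; my_dataset is a stand-in directory name, and the archive is verified before the originals are removed:

```shell
# Demo setup: a small dataset directory (stand-in for your real data)
mkdir -p my_dataset && echo "subject,score" > my_dataset/results.csv

# Pack the directory into one compressed archive
tar -czf my_dataset.tar.gz my_dataset/

# Verify the archive lists cleanly before removing the originals
tar -tzf my_dataset.tar.gz > /dev/null && rm -rf my_dataset/

# Unpack again when the data is needed
tar -xzf my_dataset.tar.gz
```

A single .tar.gz also counts as one file against any file-count limits and transfers faster than many small files.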
Transferring Data
From Your Local Machine
# Using scp
scp -r local_directory username@hpc.psychologia.uj.edu.pl:/projects/your_project/
# Using rsync (recommended for large transfers)
rsync -avz --progress local_directory/ username@hpc.psychologia.uj.edu.pl:/projects/your_project/
# Using sftp
sftp username@hpc.psychologia.uj.edu.pl
sftp> put -r local_directory
Between Cluster and External Storage
# From cluster to external server
scp results.tar.gz external_server:/path/
# Using rclone for cloud storage (if configured)
rclone copy /projects/your_project/results remote:backup
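For large transfers it is worth verifying integrity on the destination. A sketch using sha256sum (the echo line only creates a demo file standing in for a real archive):

```shell
# Demo file standing in for a real results archive
echo "demo results" > results.tar.gz

# Record a checksum before the transfer
sha256sum results.tar.gz > results.tar.gz.sha256

# After transferring both files, run this on the destination;
# prints "results.tar.gz: OK" when the copy is intact
sha256sum -c results.tar.gz.sha256
```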
Requesting More Storage
If you need additional storage:
- Contact your PI or project administrator
- Provide justification and estimated duration
- Storage requests are reviewed based on:
  - Project requirements
  - Current utilization
  - Available capacity
Data Retention Policy
| Storage Type | Retention | After Account Expiry |
| --- | --- | --- |
| Home Directory | Duration of employment + 6 months | Archived for 1 year, then deleted |
| Project Storage | Duration of project | Migrated to PI storage or archived |
| Scratch | 30 days maximum | Immediately deleted |
Additional Resources