...
The NHR systems in Berlin ("Lise") and Göttingen ("Emmy") are each equipped with several file systems. Their properties and intended use are described here.
Disk quotas based on group ownership are enforced on each site's global (non-local) file systems.
...
- On an Emmy frontend node (glogin.hlrn.de or glogin[1-9].hlrn.de), generate a new SSH key (also documented at the SCC).
- Add the SSH key at the GWDG Website -> My Account -> Security.
- From an Emmy frontend node (glogin9.hlrn.de has access to both the Emmy and Grete scratches, while glogin.hlrn.de and glogin[1-8].hlrn.de only have access to the Emmy scratch; all have access to `$HOME`), transfer the files using `rsync` (see the SCC documentation and the rsync man page) to/from the SCC transfer node transfer-scc.gwdg.de. Some examples are given below.

Copy a single file `FOO` from the SCC `$HOME` into your current directory on Emmy:

```bash
rsync GWDGUSERNAME@transfer-scc.gwdg.de:/usr/users/GWDGUSERNAME/FOO .
```

Copy a single file `FOO` in your current directory on Emmy to `$HOME` on the SCC:

```bash
rsync FOO GWDGUSERNAME@transfer-scc.gwdg.de:/usr/users/GWDGUSERNAME/
```

Copy a directory in your SCC `/scratch` to your current directory on Emmy:

```bash
rsync -r GWDGUSERNAME@transfer-scc.gwdg.de:/scratch/projects/workshops/forest/synthetic_trees .
```

Copy a directory in your current directory on Emmy to `/scratch` on the SCC:

```bash
rsync -r synthetic_trees GWDGUSERNAME@transfer-scc.gwdg.de:/scratch/projects/workshops/forest/
```
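The key-generation step at the top of this list can be sketched as follows (the key file name and comment are placeholders chosen for this illustration; ed25519 is one commonly accepted key type):

```bash
# Create ~/.ssh if it does not exist yet, then generate an ed25519 key pair.
mkdir -p "$HOME/.ssh"
# -f: key file (placeholder name); -N "": empty passphrase for illustration
# only -- protect keys that grant account access with a real passphrase.
ssh-keygen -q -t ed25519 -f "$HOME/.ssh/id_ed25519_scc" -N "" -C "emmy-scc-transfer"

# The *public* key is what you add on the GWDG website:
cat "$HOME/.ssh/id_ed25519_scc.pub"
```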
If you have terabytes of data that need to be transferred, please contact us so that we can provide a custom solution.