Campus network connectivity is provided and maintained by OIT. Currently, there are dual paths between campus and the Internet, providing redundancy and load balancing. Each link provides 10 Gbps connectivity, and two 100 Gbps connections provide access to Internet2 and ESnet. More information about these connections can be found on the OIT web site.

Head nodes are all connected to the campus network with a 10 Gbps connection. Internal networking varies by cluster. All clusters use a 1 Gbps private Ethernet network for local communication, while the NFS servers are connected using InfiniBand. A high-performance, low-latency InfiniBand network is also available for MPI parallel communication.

All cluster private networks connect to the central GPFS storage over InfiniBand. This is also a private network, with fiber connections between the data center and Lewis Library, where some machines are currently housed. This private network is also used for /projects backups.

Globus is an infrastructure for transferring large amounts of data between Princeton and any remote system that also participates in the Globus system. Research Computing supports Globus data transfer to and from the GPFS-based /projects storage and the scratch disk space connected to the Research Computing systems. For information about using Globus at Princeton, see the Globus Data Transfer at Princeton web page.
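For readers who prefer the command line over the Globus web interface, a transfer can also be initiated with the Globus CLI. The session below is a minimal sketch: the endpoint IDs and paths are placeholders (actual Princeton endpoint UUIDs are listed on the Globus Data Transfer at Princeton page), and it assumes the `globus-cli` package is installed.

```shell
# Authenticate with Globus (opens a browser for login)
globus login

# Look up endpoint UUIDs by name; "Princeton" is an illustrative search term
globus endpoint search "Princeton"

# Start an asynchronous transfer from a remote endpoint into /projects.
# Both UUIDs and paths below are placeholders, not real endpoints.
globus transfer \
  "$REMOTE_ENDPOINT_UUID:/data/results.tar" \
  "$PRINCETON_ENDPOINT_UUID:/projects/mylab/results.tar" \
  --label "results transfer"
```

Transfers submitted this way run server-side: once the task is accepted, Globus manages retries and verification, so the submitting machine does not need to stay connected.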