
Equipment & Facilities

High-performance computing and big-data analytics

Research Computing at the University of Florida (UF) operates the HiPerGator supercomputer, a cluster-based system of multi-core servers with a combined capacity of about 21,000 cores. In November 2015, 30,000 new Intel cores were added, bringing the total to 51,000 cores. The servers are part of an integrated InfiniBand fabric, and the clusters share over 5 petabytes (PB) of distributed storage via the Lustre parallel file system. In addition, Research Computing houses about 2 PB of storage for the High Energy Physics collaboration of the Compact Muon Solenoid (CMS) experiment. The system includes over 100 NVIDIA GPU accelerators and 24 Intel Xeon Phi accelerators, available for experimental and production research as well as for training and teaching.
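A typical way to use such a multi-core cluster is through a message-passing program. The following is a minimal sketch in Python, assuming an MPI stack and the mpi4py package are available on the system (module names and launch commands are site-specific and not specified here):

    # Minimal MPI "hello world"; every rank reports, and rank 0 gathers the results.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD        # communicator spanning all launched ranks
    rank = comm.Get_rank()       # this process's index within the job
    size = comm.Get_size()       # total number of ranks in the job

    print(f"Rank {rank} of {size} reporting in")

    # Gather all rank numbers on rank 0 to confirm collective communication works.
    ranks = comm.gather(rank, root=0)
    if rank == 0:
        print(f"Collected ranks: {ranks}")

Such a script would typically be launched inside a batch job with a command along the lines of mpiexec -n 4 python hello_mpi.py, with the exact scheduler directives depending on the cluster's configuration.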

 

Florida cyberinfrastructure

Universities in the state of Florida joined forces in the Sunshine State Education & Research Computing Alliance (SSERCA) to build a robust cyberinfrastructure for sharing expertise and resources (http://sserca.org). The current members are Florida Atlantic University (FAU), Florida International University (FIU), Florida State University (FSU), University of Central Florida (UCF), University of Florida (UF), University of Miami (UM), and University of South Florida (USF). The affiliate institutions are Florida Agricultural and Mechanical University (FAMU), University of North Florida (UNF), University of West Florida (UWF), Florida Polytechnic University (FPU), Florida Institute of Technology (FIT), Nova Southeastern University, and New College of Florida.

The Florida Lambda Rail (FLR) provides the underlying fiber-optic network and network connectivity between these institutions and many others. The FLR backbone completed its upgrade to 100 Gbps in June 2015. The University of Florida is connected to this backbone at the full speed of 100 Gbps and has been connected at that rate to the Internet2 backbone since January 2013.

 

Restricted data storage and analytics

Research projects may involve storing and processing restricted data, including intellectual property (IP), protected health information (PHI), and Controlled Unclassified Information (CUI), regulated by the Health Insurance Portability and Accountability Act (HIPAA), the International Traffic in Arms Regulations (ITAR), the Export Administration Regulations (EAR), and the Family Educational Rights and Privacy Act (FERPA). For such projects, Research Computing supports two environments:

  1. Research Shield (https://shield.ufl.edu/) meets the NIST 800-53 rev. 4 “moderate” rating for contracts that require FISMA compliance and has been operating since June 2015, and
  2. GatorVault (http://www.rc.ufl.edu/resources/hardware/gatorvault/) is approved for PHI, FERPA, IP, and ITAR/EAR restrictions and started operating in December 2015.

 

Network infrastructure

The Research Computing systems are located in the University of Florida data center. The machine room is connected to other campus resources by the 200 gigabit per second Campus Research Network (CRN), now commonly called a Science DMZ. The CRN was created with an NSF Major Research Instrumentation award in 2004 and has been maintained by the University since the end of that award. The CRN connects the HPC systems to the Florida Lambda Rail, from which the National Lambda Rail and Internet2 are accessible. The University of Florida was the first institution (April 2013) to meet all the requirements to become an Internet2 Innovation Platform, which entails the use of software-defined networking (SDN), the implementation of a Science DMZ, and a connection at 100 Gb/s to the Internet2 backbone. An NSF CC-NIE award in 2012 funded the 100 Gb/s switch, and an NSF MRI grant awarded in 2012 funded the upgrade of the CRN (Science DMZ) to 200 Gb/s. The upgrade has been operational since the winter of 2013.

At the department level, five CISE servers - one Solaris SPARC, two Ubuntu 16.04 Linux, and two Windows Server 2016 - are available via SSH, VNC, or remote desktop to all departmental users, who can run jobs and log in from remote locations. Faculty members also have access to two dedicated servers reserved for their own use.
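For scripted remote access, a command can be executed on one of these servers over SSH. A minimal sketch using the Paramiko library follows; the hostname and username are hypothetical placeholders, not actual CISE server names:

    import paramiko

    # Hypothetical hostname and username for illustration only.
    HOST = "cise-login.example.ufl.edu"
    USER = "student"

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(HOST, username=USER)   # authenticates via SSH keys or agent

    # Run a command remotely and print its output.
    stdin, stdout, stderr = client.exec_command("uname -a")
    print(stdout.read().decode())
    client.close()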

All faculty offices are equipped with a Windows or Linux workstation. Standard software installations include Ubuntu 16.04 or Windows 10, Java, jGRASP, many Microsoft packages from the Microsoft Developer Network Academic Alliance, Mozilla Firefox, and Google Chrome.

Database servers and software include MySQL, PostgreSQL, and Oracle. Wireless access is available throughout the CSE Building and the UF campus, including student dorms, cafeterias, and other public areas.
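As an illustration of how these database servers are used from code, the following is a minimal sketch of a PostgreSQL query with the psycopg2 driver; the host, database, and user names are hypothetical:

    import psycopg2

    # Hypothetical connection parameters for illustration only.
    conn = psycopg2.connect(host="db.cise.example.edu",
                            dbname="research",
                            user="student")
    with conn, conn.cursor() as cur:   # commits on success, rolls back on error
        cur.execute("SELECT version();")
        print(cur.fetchone()[0])
    conn.close()                       # the with-block does not close the connection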

The classrooms in the CSE building are all equipped with multimedia support and computers housed in locked kiosks. These resources all have access to the University’s wireless network. The UF College of Engineering requires all students to own an adequately provisioned laptop computer, ensuring easy access to all resources in the classrooms.

The bulk of CISE's disk storage comes from a Sun 7410 with 66 TB of raw disk space; an additional 300 TB is provided by other servers.

There are about 35 servers running a mix of Red Hat Enterprise Linux 6 and 7 and Solaris 10, providing services such as:

  • Web hosting
  • Database hosting—MySQL, PostgreSQL, Oracle
  • Kerberos / LDAP authentication
  • DNS
  • DHCP
  • Backups via Tivoli Storage Manager and disk-based rsync copies (see the sketch after this list)
  • Samba
  • NFS
  • Security-related services
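The disk-based backups mentioned above amount to periodically mirroring a source tree to dated backup storage. A minimal sketch, assuming rsync is installed and using hypothetical paths, might look like:

    import subprocess
    from datetime import date

    # Hypothetical paths; actual source trees and backup targets are site-specific.
    SRC = "/home/"
    DEST = f"/backup/home-{date.today().isoformat()}"

    # -a preserves permissions, ownership, and timestamps; --delete mirrors removals.
    result = subprocess.run(["rsync", "-a", "--delete", SRC, DEST],
                            capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"rsync failed: {result.stderr}")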
 

CISE's web services are hosted on a Dell R720 with 128 GB of memory and 32 Intel Xeon CPU cores at 2.6 GHz; they serve department content, user content, and various departmental web applications.

CISE has about 220 Linux PCs running Ubuntu 16.04 LTS and 110 Windows 10 PCs. They serve as lab machines and workstations for students, teaching assistants, research assistants, and faculty. Of these, 34 Windows PCs and 90 Linux PCs are in public labs intended for general student use as well as for lab sections of graduate and undergraduate classes.

CISE has a shared computer cluster consisting of a head node with 16 AMD Opteron cores at 2.6 GHz, 32 GB of memory, and 40 TB of storage, plus 15 worker nodes with dual Opterons and 16 GB of memory, running Red Hat Enterprise Linux 6.x.

All servers in the CISE department are connected to the switching backbone with 1 Gb/s Ethernet. Inter-switch uplinks are transitioning from EtherChannel bundles of six or more 1 Gb/s ports to 10 Gb/s links. The newest servers have 10 Gb/s connections to a private 10 Gb/s network for network storage. CISE's Cisco hardware includes a Catalyst 6513, one Catalyst 6509E, and three Catalyst 4506s providing routing and switching capabilities to the more than 600 devices and 80 networks in the department. CISE's external connection is a 1 Gb/s fiber link to the University of Florida’s core network.

A quota-based printing system allows the department to offer free printing to all students and teaching and research assistants.