Reimagining the Internet on a Global Scale | University of Kentucky College of Engineering

Reimagining the Internet on a Global Scale

December 07, 2020

A new $3 million grant from the National Science Foundation (NSF) will expand FABRIC, a project to build the nation’s largest cyberinfrastructure testbed, to four distinguished scientific institutions in Asia and Europe. Jim Griffioen, a professor in the Department of Computer Science, is a co-principal investigator on the NSF project.

By Lindsey Piercy


Science is fast outgrowing the capabilities of today’s internet infrastructure. Capitalizing on artificial intelligence, advanced computation and big data requires robust, interconnected computers, storage networks and software. Additionally, uneven progress in science cyberinfrastructure has led to bottlenecks that slow the process of discovery.

Launched in 2019 with a $20 million grant from the NSF, FABRIC is working to build a platform where computer scientists can test new ways to compute, move and store data.

“Unlike the internet, the new FABRIC network incorporates compute and storage capabilities that transform the way cloud, edge and network applications are designed — resulting in faster, more responsive, efficient, intelligent and secure applications capable of supporting today’s big data requirements,” Jim Griffioen, professor in the UK Department of Computer Science and co-principal investigator on the NSF project, said.

Using the additional funding, FABRIC Across Borders (FAB) will link the infrastructure to areas in Japan, the Netherlands, Switzerland and the United Kingdom.

Griffioen, along with computer science professors Zongming Fei and Ken Calvert, will lead efforts to develop new network services aimed at making it easier for researchers to access and transport massive data sets — such as those used by the international high energy physics community — across the extended FABRIC network. 

“The world has changed a great deal since the basic internet transport protocols were designed. Speeds and storage capacities have improved by factors of at least one million, while the data sets that drive discovery have grown even more,” Calvert explained. “FABRIC/FAB deployment will provide a high-performance substrate to support improvements in efficiency and performance for scientists in multiple domains.”

FAB, a project led by the University of Illinois, will draw on expertise from researchers at various institutions. Over the next three years, the team will collaborate with international partners to place FABRIC nodes at the University of Tokyo; CERN, the European Organization for Nuclear Research in Geneva, Switzerland; the University of Bristol in the U.K.; and the University of Amsterdam.

“FAB allows collaborative international science projects to experiment with ways to do their science more efficiently. Sending large quantities of data long distances is complicated when your science depends on real-time processing, so you don’t miss once-in-a-lifetime events,” Anita Nikolich, FAB principal investigator and director of technology innovation at the University of Illinois, explained. “Being able to put FABRIC nodes in physically distant places allows us to experiment with the infrastructure to support new capabilities and also bring disparate communities together.”

The project is driven by science needs in fields that are pushing the limits of what today’s internet can support. As new scientific instruments are due to come online in the next few years — generating ever larger data sets and demanding ever more powerful computation — FAB gives researchers a testbed to explore and anticipate how all that data will be handled and shared among collaborators spanning continents. 

“FAB will offer a rich set of network-resident capabilities to develop new models for data delivery from the Large Hadron Collider (LHC) at CERN to physicists worldwide,” Rob Gardner, member of FAB’s core team, said. “As we prepare for the high luminosity LHC, the FAB international testbed will provide a network R&D infrastructure we’ve never had before — allowing us to consider novel analysis systems that will propel discoveries at the high energy frontier of particle physics.”

“FABRIC will tremendously help the ATLAS experiment in prototyping and testing at scale some of the innovative ideas we have to meet the high throughput and big data challenges ATLAS will face during the high luminosity LHC era,” ATLAS computing coordinators Alessandro Di Girolamo, a staff scientist in CERN’s IT department, and Zach Marshall, an ATLAS physicist from Lawrence Berkeley National Laboratory, added. “The ATLAS physics community will be excited to test new ways of doing analysis, better exploiting the distributed computing infrastructure we run all around the world.”

To ensure the project meets the needs of the scientists it's designed to serve, FAB will be built around five areas: computer science, physics, smart cities, space and weather.

FAB will connect with existing U.S. and international cyberinfrastructure testbeds and bring programmable networking hardware, storage, computers and software into one interconnected system. All software associated with FAB will be open source and posted in a publicly available repository.

Research reported in this publication was supported by the National Science Foundation under Award Numbers 2029235 and 1935966. The opinions, findings, and conclusions or recommendations expressed are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.