The Summit supercomputer (Photo: Carlos Jones, ORNL)

Supercomputer at Riken Wako Campus, Japan

 

 

There is a movement in Japan and the United States to connect supercomputers to study the novel coronavirus, which is raging through the world. Their high-speed calculations and supercomputing capacity can be put to good use in the development of drugs to treat infections and control the spread of the virus.

 

A project that harnesses computers owned by private individuals as part of this large-scale computing effort is also underway, as machines around the world take up their positions to battle the pandemic.

 

 


Leapfrog Operation

 

In April, Japanese research institute Riken announced that it would “leapfrog” the operation of its next-generation supercomputer, “Fugaku,” currently under development in Kobe, and begin using it on a trial basis to study the novel coronavirus.  

 

Fugaku is the successor to K, Riken's national flagship supercomputer, which was retired in 2019. When complete, Fugaku will be the most powerful supercomputer in Japan, with roughly 100 times the computing capacity of K.

 

Still under development, Fugaku was originally scheduled to begin operation in 2021. Although the machine currently has only 10-20% of the capacity it will have upon completion, Riken made the unprecedented decision to move the launch schedule forward in an effort to help stem the damage being wreaked by the escalating novel coronavirus pandemic.

 

Kyoto University professor Yasushi Okuno is using Fugaku to search for coronavirus treatment options among approximately 2,000 existing medicines. By simulating the molecular dynamics of proteins, Okuno is investigating how candidate drugs act against the virus.

 

Fugaku will also be used to study how coronavirus infections spread, and to simulate the impact that measures such as lockdowns have on socioeconomic activity.

 

“One of the most important missions of Fugaku is to protect the well-being of citizens by using its massive computing power,” commented Satoshi Matsuoka, director of the Riken Center for Computational Science (R-CCS), expressing Riken’s determination to put its supercomputing power to work as quickly as possible in the national crisis.

 

Aside from Fugaku, efforts to make use of other supercomputers have also begun. Twelve institutions, including national universities and national research institutes, have announced that they will make their supercomputers available free of charge by May for research on the novel coronavirus. Japan’s leading supercomputers include the “AI Bridging Cloud Infrastructure (ABCI)” at the National Institute of Advanced Industrial Science and Technology (AIST), and “TSUBAME 3.0” at the Tokyo Institute of Technology. The combined scale of these machines could reach 114 quadrillion operations per second (a quadrillion is a 1 followed by 15 zeros).
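To put the unit in perspective, one quadrillion operations per second is one petaFLOPS, so the combined figure works out to 114 petaFLOPS. A quick sanity check of the arithmetic, using only the numbers quoted above:

```python
# Unit arithmetic for the combined Japanese machines cited above.
QUADRILLION = 10 ** 15  # a 1 followed by 15 zeros; 1 quadrillion FLOPS = 1 petaFLOPS

combined = 114 * QUADRILLION  # the twelve institutions' machines combined
petaflops = combined / 10 ** 15

print(f"{combined:.2e} operations per second = {petaflops:.0f} petaFLOPS")
# → 1.14e+17 operations per second = 114 petaFLOPS
```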

 

An AIST representative said the institute is preparing for increased demands for its equipment: “We estimate that the demand for computing will increase, along with expectations for computational scientific methods and artificial intelligence (AI) technology.”

 

 


Industry Leaders Come Together

 

Meanwhile, in the United States, 33 organizations, ranging from private businesses to universities and research institutes, have formed a consortium to apply supercomputers to the study of the novel coronavirus.

 

The U.S. Department of Energy, which operates “Summit,” the world’s fastest supercomputer, and IBM, the company that developed it, have taken the lead. Big names in high-performance computing, including Microsoft, Google, the Massachusetts Institute of Technology (MIT), and the National Aeronautics and Space Administration (NASA), have joined this group of all-stars gathered to address the crisis.

 

In total, the supercomputers bring together a processing capability of approximately 420 quadrillion operations per second. A range of studies is underway, including analyses of the structure and evolution of the novel coronavirus, an investigation into whether or not viral activity will decrease in hot and humid environments, and simulations on how to properly distribute limited medical resources such as ventilators.

 

Another project gaining momentum uses “distributed computing” technology to analyze the structure of proteins in the novel coronavirus. The technique links computers around the world so that they share the processing load. Because it relies on a mechanism that pools the spare capacity of internet-connected computers, whether in homes or companies, into what behaves like a single huge supercomputer, anyone who installs the specialized software can participate.
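Folding@home’s actual client-server protocol is far more involved, but the basic pattern described above, a coordinator splitting a large job into independent “work units” that idle machines process and return, can be sketched in a few lines. Everything here is illustrative; none of these names come from Folding@home’s software, and a local thread pool stands in for volunteer machines scattered across the internet:

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_work_unit(unit_id: int) -> tuple[int, int]:
    """Stand-in for one volunteer machine's share of a larger computation."""
    # A toy calculation; a real work unit would run a slice of a
    # protein-folding simulation. Each unit covers its own data range.
    start, stop = unit_id * 1000, (unit_id + 1) * 1000
    return unit_id, sum(i * i for i in range(start, stop))

def run_project(num_units: int) -> int:
    """The 'coordinator': hand out work units, then combine the results."""
    with ThreadPoolExecutor() as pool:
        results = dict(pool.map(simulate_work_unit, range(num_units)))
    return sum(results.values())  # aggregate the partial results

total = run_project(num_units=8)
print(f"combined result from 8 work units: {total}")
```

The key property is that the work units are independent, so volunteers can join or drop out at any time, and the coordinator only has to aggregate whatever results come back.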

 

The project, dubbed Folding@home, was launched in 2000 by researchers at Stanford University. To date, the project has produced results in studies on cancer, the Ebola virus, and Alzheimer’s disease.

 

When the initiative on the novel coronavirus pandemic was announced in late February, participation skyrocketed. In just a month and a half, a massive amount of processing capacity came together, enough to perform 2.4 quintillion operations per second (2.4 exaFLOPS).

 

Every year in June and November, rankings of the world’s top 500 supercomputers are released. The total processing capacity assembled for this project reportedly surpasses the combined capacity of all 500 machines on that list. In Japanese terms, the figure is equivalent to roughly 240 of Japan’s K computers.
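The “roughly 240 K computers” figure follows from simple unit arithmetic, assuming K’s widely reported benchmark speed of about 10 petaFLOPS. A back-of-the-envelope sketch, not an official comparison:

```python
EXA = 10 ** 18   # quintillion: a 1 followed by 18 zeros
PETA = 10 ** 15  # quadrillion: a 1 followed by 15 zeros

folding_at_home = 2.4 * EXA  # 2.4 exaFLOPS, the figure cited above
k_computer = 10 * PETA       # K's speed, roughly 10 petaFLOPS (assumed here)

ratio = folding_at_home / k_computer
print(f"Folding@home ≈ {ratio:.0f} K computers")
# → Folding@home ≈ 240 K computers
```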

 

The spread of the novel coronavirus has proven once again that the world is connected. Now, the power of those connections is being demonstrated in ways never seen before in the fight against this unprecedented global threat.

 

Large-scale computing aimed at developing treatments, drug discovery, and infection control could bring about a win for the human race.

 


 

Author: Maki Matsuda, Science and Technology News Department, The Sankei Shimbun

 
