The U.S. Department of Energy has allocated $5 million to three projects, including a collaboration between Argonne National Laboratory and Sandia National Laboratories, to enhance computational tools and better prepare for biological threats, whether caused by humans or nature.
The two laboratories will collaborate to combine Sandia's algorithms with Argonne's high-performance models, which focus on disease transmission, control and spread, according to an April 5 news release.
“Our whole point in doing this type of work is to make the process routine, more akin to weather forecasting or other domains where a large computational infrastructure is dedicated to continuously adjusting the models automatically as new data is obtained,” Jonathan Ozik, a principal computational scientist at Argonne National Laboratory, said in the release.
Ozik went on to describe his goals for the models, the release reported.
“We want models that mimic reality,” Ozik continued, according to the release. “By calibrating them, we will be able to trust that outcomes from computational experiments carried out with the models have a good chance of meaningfully reflecting reality.”
“Getting access to the data, landing on the right parameters and evolving the best fit of information that is timely and relevant to decision makers are among the biggest problems we face,” Charles “Chick” Macal, chief scientist for Argonne’s Decision and Infrastructure Sciences division and its social, behavioral and decision science group leader, said in the release. “A large component of the work that we do as part of the biopreparedness project is to develop high performance computing workflows to improve the computational techniques, to make them more efficient.”
The project’s investigators also described their roles, the release reported.
“We are the guys who search out those parameters,” Sandia’s Jaideep Ray, principal investigator of the project, said in the release. “If the forecasts are right, then we know we have the right set of parameters.”
“We can then provide short-term forecasts and the ability to run longer-term scenarios that answer specific stakeholder questions,” Ozik added in the release.