A new data visualization and analysis toolkit has been created to help speed particle accelerator research and design by enabling in situ visualization and analysis of accelerator simulations at scale.
Particle accelerators are on the verge of transformational breakthroughs, and advances in computing power and techniques are a big part of the reason. Long valued for their role in scientific discovery and in medical and industrial applications, particle accelerators take up a great deal of space and carry hefty price tags.
The Large Hadron Collider at CERN, straddling the border of Switzerland and France, for instance – the world’s largest and most powerful particle accelerator – has a circumference of about 17 miles and cost some $10 billion to build. Even smaller accelerators, such as those used in medical centers for proton therapy, need large spaces to accommodate the hardware, radiation shielding and power supplies. Such treatment facilities can fill a city block and cost hundreds of millions of dollars to build.
To help bring these tools into everyday use, researchers have turned to advanced supercomputers and visualization tools such as the Cori and Edison systems at Lawrence Berkeley National Laboratory. “To take full advantage of the societal benefits of particle accelerators, game-changing improvements are needed in the size and cost of accelerators, and plasma-based particle accelerators stand apart in their potential for such improvements,” says Jean-Luc Vay, a senior scientist in Berkeley Lab’s Accelerator Technology and Applied Physics (ATAP) Division.
Vay leads a particle accelerator modeling project as part of the NESAP program at NERSC and is principal investigator on one of the new exascale computing projects sponsored by the U.S. Department of Energy. “Turning this from a promising technology into a mainstream scientific tool depends critically on large-scale, high-performance, high-fidelity modeling of complex processes that develop over a wide range of space and time scales,” he says.
Vay and his team of computer scientists, physicists and mathematicians are working toward that goal by developing software tools that can simulate, visualize and analyze the increasingly large datasets produced in particle accelerator research.
Accelerator design is a promising opportunity to help lead the way toward exascale applications. “We have spent years preparing for this opportunity,” says ATAP Division Director Wim Leemans, pointing to the already widespread use of modeling in accelerator design and the close collaboration between computing and physics experts.
“One of the key factors in our research is the move to exascale and how data visualization is changing as a result,” explains Burlen Loring, a computer systems engineer and member of the team, which also includes David Grote, Oliver Rübel, Stepan Bulanov, Wes Bethel and Rémi Lehe. “With exascale systems, traditional visualization becomes prohibitive as the simulations get larger and the machines get more powerful – storing all the data doesn’t work, and the I/O bandwidth and file systems aren’t keeping up with the compute capacity.”
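To make the in situ idea concrete, the Python sketch below shows the general pattern; it is purely illustrative and is not the team’s actual software. The particle push, the energy histogram and the file names are all invented for the example. The structural point is that the analysis routine is called on the simulation’s in-memory arrays at chosen intervals and writes out only small, reduced products, instead of dumping every particle to disk for later post-processing.

import os
import numpy as np


def advance_particles(positions, momenta, dt):
    # Placeholder "physics" step: a simple drift, standing in for a real field solver.
    positions += momenta * dt
    return positions, momenta


def in_situ_analysis(step, momenta, out_dir="diagnostics"):
    # Reduce the full particle data to a tiny artifact while it is still in memory:
    # a 64-bin histogram is a few kilobytes, versus gigabytes for a full particle dump.
    os.makedirs(out_dir, exist_ok=True)
    energies = np.linalg.norm(momenta, axis=1)   # stand-in for particle energy
    hist, edges = np.histogram(energies, bins=64)
    np.savez(os.path.join(out_dir, f"energy_hist_{step:06d}.npz"), hist=hist, edges=edges)


def run(n_steps=200, n_particles=1_000_000, analysis_interval=50, dt=1e-3):
    rng = np.random.default_rng(0)
    positions = rng.normal(size=(n_particles, 3))
    momenta = rng.normal(size=(n_particles, 3))
    for step in range(n_steps):
        positions, momenta = advance_particles(positions, momenta, dt)
        if step % analysis_interval == 0:
            # Analysis runs inside the simulation loop, not on files written to disk afterward.
            in_situ_analysis(step, momenta)


if __name__ == "__main__":
    run()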