EI2GYB > ASTRO 18.11.25 15:19l 116 Lines 6308 Bytes #55 (0) @ WW
BID : 47706_EI2GYB
Read: GAST
Subj: The simulated Milky Way: 100 billion stars using 7 million CPU cores
Path: DB0FFL<OE2XZR<OE6XPE<UA6ADV<I0OJJ<IR0AAB<VE3CGR<GB7OSW<GB7BED<EI2GYB
Sent: 251118/1403Z @:EI2GYB.DGL.IRL.EURO #:47706 LinBPQ6.0.25
_ _
(_\ /_)
)) ((
.-"""""""-.
_ _ _ _ /^\/ _. _. \/^\
/ \ ___| |_ _ __ ___ | \ | | _____ _____ \( /__\ /__\ )/
/ _ \ / __| __| '__/ _ \ | \| |/ _ \ \ /\ / / __| \, \o_/_\o_/ ,/
/ ___ \\__ \ |_| | | (_) | | |\ | __/\ V V /\__ \ \ (_) /
/_/ \_\___/\__|_| \___/ |_| \_|\___| \_/\_/ |___/ `-.'==='.-'
__) - (__
+------------------------------------------------------------------------------+
The simulated Milky Way: 100 billion stars using 7 million CPU cores
Researchers have successfully performed the world's first Milky Way simulation
that accurately represents more than 100 billion individual stars over the
course of 10,000 years. This feat was accomplished by combining artificial
intelligence (AI) with numerical simulations. Not only does the simulation
represent 100 times more individual stars than previous state-of-the-art
models, but it was produced more than 100 times faster.
Published in Proceedings of the International Conference for High Performance
Computing, Networking, Storage and Analysis, the study represents a
breakthrough at the intersection of astrophysics, high-performance computing,
and AI. Beyond astrophysics, this new methodology can be used to model other
phenomena such as climate change and weather patterns.
Challenges of simulating the Milky Way
Astrophysicists have been trying to create a simulation of the Milky Way galaxy
down to its individual stars, which could be used to test theories of galactic
formation, structure, and stellar evolution against real observations. Accurate
models of galaxy evolution are difficult because they must consider gravity,
fluid dynamics, supernova explosions, and element synthesis, each of which
occurs on vastly different scales of space and time.
Until now, scientists have not been able to model large galaxies like the Milky
Way while also maintaining a high star-level resolution. Current
state-of-the-art simulations have an upper mass limit of about one billion
suns, while the Milky Way has more than 100 billion stars. This means that the
smallest "particle" in the model is actually a cluster of stars with a
combined mass of about 100 suns.
What happens to individual stars is averaged out, and only large-scale events
can be accurately simulated.
The underlying problem is the number of years between each step in the
simulation: fast changes at the level of individual stars, like the evolution
of supernovae, can only be captured if the time between each snapshot of the
galaxy is short enough.
Computational limits and the need for innovation
But processing smaller timesteps takes more time and more computational
resources. Even setting aside the current state-of-the-art mass limit, if the
best conventional physical simulation to date tried to simulate the Milky Way
down to the individual star, it would need 315 hours for every 1 million years
of simulation time.
At that rate, simulating even 1 billion years of galaxy evolution would take
more than 36 years of real time. But adding more and more supercomputer cores
is not a viable solution. Not only do they use an incredible amount of energy,
but more cores will not necessarily speed up the process because efficiency
decreases.
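The 36-year figure follows directly from the stated rate; a quick sanity check of the arithmetic (using only the numbers given in the article):

```python
# Back-of-the-envelope check of the article's figures: a conventional
# star-by-star simulation costs 315 compute-hours per 1 million simulated
# years, and the target is 1 billion years (1,000 million).
HOURS_PER_MYR = 315
TARGET_MYR = 1_000

total_hours = HOURS_PER_MYR * TARGET_MYR       # 315,000 hours
total_years = total_hours / 24 / 365.25        # wall-clock years

print(f"{total_hours:,} hours = {total_years:.1f} years of wall-clock time")
```

This reproduces the "more than 36 years" claim (about 35.9 years of continuous computation).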
In response to this challenge, Keiya Hirashima at the RIKEN Center for
Interdisciplinary Theoretical and Mathematical Sciences (iTHEMS) in Japan, with
colleagues from The University of Tokyo and Universitat de Barcelona in Spain,
developed a new approach that combines a deep learning surrogate model with
physical simulations.
The surrogate model was trained on high-resolution simulations of a supernova
and learned to predict how the surrounding gas expands in the 100,000 years
after a supernova explosion, without using resources from the rest of the
model. This AI shortcut enabled the simulation to simultaneously model the
overall dynamics of the galaxy as well as fine-scale phenomena such as
supernova explosions.
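The pattern the article describes can be sketched in a few lines. This is an illustrative toy, not the authors' code: the global simulation advances with large timesteps, and wherever a supernova fires, a pre-trained surrogate predicts the local gas state 100,000 years later instead of sub-stepping through the explosion. The `surrogate_supernova` function here is a hypothetical stand-in; the real surrogate is a deep network trained on high-resolution supernova simulations.

```python
def surrogate_supernova(gas_density, energy):
    """Stand-in for the trained model: returns the post-explosion gas
    density 100,000 years later (toy response: the blast dilutes gas)."""
    return gas_density / (1.0 + energy)

def step_galaxy(cells, dt_myr=0.1):
    """Advance every cell one coarse timestep; cells with an active
    supernova are routed through the surrogate (the 'AI shortcut')
    instead of being integrated with tiny timesteps."""
    out = []
    for density, energy, has_supernova in cells:
        if has_supernova:
            density = surrogate_supernova(density, energy)
            energy = 0.0  # blast energy has been deposited
        out.append((density, energy, False))
    return out

# two cells: one quiescent, one with a supernova going off
cells = [(1.0, 0.0, False), (5.0, 3.0, True)]
print(step_galaxy(cells))
```

The design point is that the expensive fine-scale physics is paid for once, during surrogate training, and then amortized across every supernova in the production run.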
To verify the simulation's performance, the team validated the output against
large-scale tests run on RIKEN's supercomputer Fugaku and The University of
Tokyo's Miyabi Supercomputer System.
Breakthrough results and broader implications
Not only does the method allow individual star resolution in large galaxies
with over 100 billion stars, but simulating 1 million years only took 2.78
hours. This means that the desired 1 billion years could be simulated in a mere
115 days, not 36 years.
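The same arithmetic as before, with the new rate, confirms both the 115-day figure and the article's "more than 100 times faster" claim:

```python
# AI-assisted run: 2.78 compute-hours per 1 million simulated years,
# versus 315 hours for the conventional approach.
HOURS_PER_MYR_AI = 2.78
HOURS_PER_MYR_CONVENTIONAL = 315
TARGET_MYR = 1_000  # 1 billion years

total_days = HOURS_PER_MYR_AI * TARGET_MYR / 24       # 2,780 h, ~115.8 days
speedup = HOURS_PER_MYR_CONVENTIONAL / HOURS_PER_MYR_AI

print(f"{total_days:.1f} days, roughly {speedup:.0f}x faster")
```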
Beyond astrophysics, this approach could transform other multi-scale
simulations, such as those in weather, ocean, and climate science, in which
models must link both small-scale and large-scale processes.
"I believe that integrating AI with high-performance computing marks a
fundamental shift in how we tackle multi-scale, multi-physics problems across
the computational sciences," says Hirashima.
"This achievement also shows that AI-accelerated simulations can move beyond
pattern recognition to become a genuine tool for scientific discovery, helping
us trace how the elements that formed life itself emerged within our galaxy."
+------------------------------------------------------------------------------+
================================================================================
= ____ __ ____ ___ _ _ ____ ____ ____ ____ =
= ( __)( )(___ \ / __)( \/ )( _ \ ( _ \( _ \/ ___) =
= ) _) )( / __/( (_ \ ) / ) _ ( ) _ ( ) _ (\___ \ =
= (____)(__)(____) \___/(__/ (____/ (____/(____/(____/ =
= Serving The Irish Packet Radio Network Since 2006 =
= Packet: EI2GYB@EI2GYB.DGL.IRL.EURO / EI2GYB@WINLINK.ORG =
= Email/PayPal: EI2GYB@GMAIL.COM =
================================================================================