OpenBCM V1.07b12 (Linux)

Packet Radio Mailbox

DB0FFL

[Box DB0FFL]

 Login: GAST





  
EI2GYB > ASTRO    23.11.25 14:34l 51 Lines 4897 Bytes #60 (0) @ WW
BID : 47855_EI2GYB
Read: GAST
Subj: AI Cracks Galaxy Simulation
Path: DB0FFL<OE2XZR<DB0FHN<DB0RKB<DK0WUE<ZL2BAU<GB7BED<EI2GYB
Sent: 251123/1324Z 47855@EI2GYB.DGL.IRL.EURO LinBPQ6.0.25

                                                               _       _
                                                              (_\     /_)
                                                                ))   ((
                                                              .-"""""""-.     
     _        _               _   _                       /^\/  _.   _.  \/^\    
    / \   ___| |_ _ __ ___   | \ | | _____      _____     \(   /__\ /__\   )/        
   / _ \ / __| __| '__/ _ \  |  \| |/ _ \ \ /\ / / __|     \,  \o_/_\o_/  ,/     
  / ___ \\__ \ |_| | | (_) | | |\  |  __/\ V  V /\__ \       \    (_)    /
 /_/   \_\___/\__|_|  \___/  |_| \_|\___| \_/\_/ |___/        `-.'==='.-'
                                                               __) - (__    
+------------------------------------------------------------------------------+
AI Cracks Galaxy Simulation

The Milky Way contains more than 100 billion stars, each following its own evolutionary path through birth, life, and sometimes violent death. For decades, astrophysicists have dreamed of creating a complete simulation of our Galaxy: a digital twin that could test theories about how galaxies form and evolve. That dream has always crashed against an impossible computational wall.

Until now.

Researchers led by Keiya Hirashima at RIKEN's Center for Interdisciplinary Theoretical and Mathematical Sciences have achieved what seemed beyond reach: a simulation representing every single one of those 100 billion stars over 10,000 years of galactic time. The breakthrough came from an unexpected marriage of artificial intelligence and traditional physics simulations, presented at this year's Supercomputing Conference.

The problem wasn't merely one of scale, though the numbers are staggering. Previous state-of-the-art galaxy simulations could handle roughly one billion solar masses, meaning their smallest "particle" represented a cluster of about 100 stars. Individual stellar events got averaged away, lost in the noise. Capturing what happens to single stars requires tiny time steps through the simulation, short enough to catch rapid changes like supernova explosions.

Barred spiral galaxy known as NGC 1300 viewed nearly face-on. It's thought the Milky Way is a barred spiral like this (Credit : NASA, ESA, and The Hubble Heritage)

But smaller time steps demand far more computing power. Using conventional methods to simulate the Milky Way at individual star resolution would require 315 hours of supercomputer time for every million years of galactic evolution. Modelling even one billion years would consume 36 years of real time. Adding more processor cores doesn't solve the problem either: beyond a certain point, efficiency plummets while energy consumption skyrockets.
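Those figures are easy to sanity-check with back-of-the-envelope arithmetic; a quick sketch in Python (the constants are the ones quoted above):

```python
# Check the quoted scaling: 315 supercomputer-hours per million years
# of galactic evolution, extrapolated to one billion years.

HOURS_PER_MYR = 315          # cost at individual-star resolution
MYR_PER_GYR = 1_000          # million years in a billion years
HOURS_PER_YEAR = 24 * 365    # wall-clock hours in a year

total_hours = HOURS_PER_MYR * MYR_PER_GYR    # 315,000 hours for 1 Gyr
total_years = total_hours / HOURS_PER_YEAR   # ~36 years of real time

print(f"{total_hours:,} hours = about {total_years:.0f} years")
```

The 315,000 hours come out to just under 36 years, matching the article's figure.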

Hirashima's team found their solution in a deep-learning surrogate model. They trained an AI on high-resolution simulations of supernovae, teaching it to predict how gas expands during the 100,000 years following an explosion. This AI shortcut handles the rapid small-scale physics without dragging down the rest of the model, allowing the simulation to simultaneously track both galaxy-wide dynamics and individual stellar catastrophes.
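The general surrogate idea can be sketched in a few lines. This is a toy illustration only, not the RIKEN team's actual code: an expensive fine-timestep solver is run offline to build training data, and a cheap learned predictor then stands in for it inside the main loop. All names here (fine_solver, Surrogate) are hypothetical, and a nearest-neighbour lookup plays the role of the trained neural network.

```python
# Toy sketch of a surrogate model replacing an expensive solver.
# Names and dynamics are invented for illustration.
import numpy as np

def fine_solver(state, dt=1.0, substeps=1000):
    """Expensive reference physics: many tiny Euler substeps (stand-in)."""
    for _ in range(substeps):
        state = state + (dt / substeps) * np.tanh(-state)  # toy dynamics
    return state

class Surrogate:
    """Learned stand-in: maps the pre-event state straight to the outcome."""
    def __init__(self, inputs, outputs):
        self.inputs, self.outputs = inputs, outputs

    def predict(self, state):
        # nearest-neighbour regression as the simplest possible "model"
        i = np.abs(self.inputs - state).argmin()
        return self.outputs[i]

# "Training": run the expensive solver offline on sampled inputs.
xs = np.linspace(-2.0, 2.0, 201)
ys = np.array([fine_solver(x) for x in xs])
model = Surrogate(xs, ys)

# In the main simulation loop, the surrogate replaces the costly substeps.
approx = model.predict(0.37)   # one cheap lookup
exact = fine_solver(0.37)      # thousands of substeps
print(abs(approx - exact))     # small error, large speedup
```

The design trade-off is the same one the article describes: the surrogate is only trained for one class of event (here, one toy equation; in the paper, post-supernova gas expansion), so the rest of the simulation keeps its full physics while only the fast, localized events are shortcut.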

The AI simulation has modelled all the stars in our Galaxy. The stars of the Milky Way are pictured here above a dark site with little light pollution (Credit : Steve Jurvetson)

The performance gains are remarkable. What would have taken 36 years now requires just 115 days. The team verified their results against large-scale tests on RIKEN's Fugaku supercomputer and The University of Tokyo's Miyabi system, confirming the AI-enhanced simulation produces accurate results at unprecedented scale.

This approach could transform how we model any system involving vastly different scales of space and time. Climate science, weather prediction, and ocean dynamics all face similar challenges, needing to link processes that range from molecular to planetary scales.

Source : The simulated Milky Way: 100 billion stars using 7 million CPU cores



+------------------------------------------------------------------------------+


================================================================================
=            ____  __  ____   ___  _  _  ____    ____  ____  ____              =
=           (  __)(  )(___ \ / __)( \/ )(  _ \  (  _ \(  _ \/ ___)             =
=            ) _)  )(  / __/( (_ \ )  /  ) _ (   ) _ ( ) _ (\___ \             =
=           (____)(__)(____) \___/(__/  (____/  (____/(____/(____/             =
=              Serving The Irish Packet Radio Network Since 2006               =
=            Packet: EI2GYB@EI2GYB.DGL.IRL.EURO / EI2GYB@WINLINK.ORG           =
=                      Email/PayPal: EI2GYB@GMAIL.COM                          =
================================================================================


