A while ago we had a whole series about the LHC, ATLAS and particle physics in general. However, despite all we know and the explanatory power of the standard model, there is also a variety of open questions and currently unexplained phenomena. These include dark matter, dark energy, the neutrino mass, CP violation, the hierarchy problem, and of course the unification of the standard model with gravitation. In this episode, CERN’s Valerie Domcke explains what it’s all about.
In this episode we take a look at newer generations of fission reactors, those that are currently being developed or researched. Our guest is Jacopo Buongiorno of MIT. We discuss some of the high-level goals of these new reactors, such as increased safety and efficiency, and then look at a few of the interesting new designs and how they realize these goals. We also briefly cover some of the policy arguments around keeping fission in the mix for combating climate change.
In this episode we look at how supercomputers are used to help with managing the pandemic. It’s a double-header with two guests. We start with Cineca’s Andrew Emerson. As part of the EXSCALATE 4 COV EU-funded research project, he works on virtual screening of existing drugs regarding their potential efficacy against SARS-CoV-2. In part two we talk with Dan Jacobson of the Oak Ridge National Laboratory. He and his team used big data analysis to understand how the virus “works”, and they figured out some very interesting mechanisms and pathways.
Humanity has always been exposed to potentially catastrophic natural risks that might endanger its continued existence. Asteroid impacts or supervolcano eruptions come to mind. But since roughly the invention of the atomic bomb, humanity has been able to wipe itself out, adding self-made existential risks to the natural ones. Oxford philosopher Toby Ord argues in his book The Precipice that these self-made risks are much more likely than the natural ones. In this episode we explore this idea with him, and also discuss what we should do about this realization.
To conclude our detailed look at the ATLAS experiment, this episode looks at the computing infrastructure. We start out with the trigger systems that decide, very quickly, whether the data from a particular collision is worth keeping. We then discuss the reconstruction of the event, the simulation needed to understand the background, as well as the LHC Grid used to distribute data and computation over the whole planet. Our guest is CERN’s Frank Berghaus.