"For the students by the students"
Welcome to the 4th annual student symposium in Physics at the Niels Bohr Institute. This is an initiative to celebrate and showcase the great work done by the master's students at NBI across its many fields.
Important dates
ABSTRACT SUBMISSION DEADLINE: Monday the 24th of February.
REGISTRATION DEADLINE: Tuesday the 11th of March @ 23:59.
The James Webb Space Telescope provides unprecedented access to galaxies at a broad range of redshifts, observed in a variety of filters corresponding to different wavelengths of observation. Because of the extraordinary number of accessible and distinct filters, we are able to see how the inferred physical information changes drastically depending on which part of a galaxy's emission is captured at a given wavelength. This introduces wavelength as an additional free parameter in analyses of the notable galaxy size-mass relation, which describes the correlation between the mass and structure of a galaxy over time and provides insight into the potential environmental effects driving mass evolution. The objective of this work is to further the understanding of galaxy evolution by expanding the size-wavelength relation into a generalized relationship for any redshift and at any wavelength of observation.
The IceCube Neutrino Observatory, located at the South Pole, is a cutting-edge telescope designed to detect neutrinos, including high-energy (HE) cosmic neutrinos. Strong evidence links these HE neutrinos to astrophysical sources such as active galactic nuclei (AGNs), blazars, and tidal disruption events (TDEs). Current reconstruction algorithms achieve an angular resolution of approximately 0.5 degrees for HE neutrino events. To enhance the identification of HE neutrino sources, this work explores the use of machine learning to improve event reconstruction methods, aiming to refine angular resolution and bolster future astrophysical discoveries.
This presentation will provide an overview of the IceCube detector and its data, highlighting the potential of machine learning in neutrino physics while also addressing the challenges of applying these techniques to real-world data. A particular focus will be placed on the transformer architecture and its application to event reconstruction. The presentation will conclude with preliminary results and insights from ongoing work.
This project aims to calibrate the Digital Optical Modules (DOMs) in the IceCube Neutrino Observatory using minimum ionizing muons. By analyzing simulated muon events in Monte Carlo data, we hope to determine the relative efficiency with which the DOMs have been simulated. DOM efficiency is currently the third largest systematic uncertainty in IceCube, directly impacting event reconstruction and flux measurements. Reducing this uncertainty improves the accuracy of neutrino detection and enhances the detector’s sensitivity. The results will contribute to more precise astrophysical neutrino measurements and support future IceCube upgrades.
(Presentation and/or progress poster)
I am developing software to decipher the complex UV-optical spectra of the gaseous region surrounding accreting supermassive black holes (SMBHs), so that we may further advance our understanding of central SMBHs. Emission from these regions is cryptic at best: owing to the highly energetic environment, the physical gas components that contribute to the UV-optical spectrum move at high speeds, causing severe emission-line blending. It is therefore necessary to investigate how these complex spectra may be decomposed using techniques with both physical and statistical grounds, while also ensuring that the software is fast enough to compete with its well-established counterparts. Although the resulting code has many moving parts, relying on even more statistical tests, the poster/talk will describe the overall approach to tackling a central problem that, after 60 years, has not been properly addressed.
Oxygen trapped in ice cores provides information about past climate. This research has largely been concerned with singly substituted oxygen, where one of the two oxygen atoms is replaced by either O-17 or O-18. Much rarer and harder to measure is clumped oxygen, where both atoms are rare isotopes. Clumped oxygen has the potential to provide information about the upper atmosphere in Earth's past.
As part of the Green2Ice project, two new mass spectrometers have been acquired. One of these is set to become the third in the world capable of measuring clumped oxygen from ice cores. Provided that isobaric interference is taken care of, we observe very stable measurements under most conditions, which bodes well for the ice core measurements to follow.
In 1915, Einstein formulated the general theory of relativity, which describes how massive objects bend and curve the fabric of spacetime. One of the main predictions of general relativity is the existence of black holes, dense regions of spacetime from which not even light can escape. Several detections and observations confirm the existence of black holes, including the trajectories of stars at the center of the Milky Way, the imaging by the Event Horizon Telescope, and the direct detection of gravitational waves released by the merger of binary black holes. These gravitational-wave detections were made possible by simulations conducted in numerical relativity. However, it is now known that about 95 percent of the energy content of the universe consists of dark energy and dark matter, and in my thesis I study the possible effects dark matter can have on the emission of gravitational waves. These effects are explored by solving the Regge-Wheeler equation in a Schwarzschild background with a dark matter halo. The results show that the dominant effect occurs when the compactness of the halo is around $\mathcal{C}=0.1$.
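For reference, the vacuum Regge-Wheeler equation for axial perturbations of a Schwarzschild black hole of mass $M$ (in units $G=c=1$), written in terms of the tortoise coordinate $r_*$, is
$$
\frac{d^{2}\psi_{\ell}}{dr_{*}^{2}} + \left[\omega^{2} - V_{\ell}(r)\right]\psi_{\ell} = 0,
\qquad
V_{\ell}(r) = \left(1-\frac{2M}{r}\right)\left[\frac{\ell(\ell+1)}{r^{2}} - \frac{6M}{r^{3}}\right],
\qquad
r_{*} = r + 2M\ln\!\left(\frac{r}{2M}-1\right).
$$
The thesis considers how a surrounding dark matter halo modifies this vacuum setup and, in turn, the emitted gravitational-wave signal.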
Open source simulation frameworks are highly valuable in experimental particle physics, enabling collaborative data simulations shared among different research groups. This thesis extends Prometheus, an open source framework created to provide unified simulation capabilities for all active and proposed gigaton neutrino telescopes: IceCube at the South Pole, KM3NeT in the Mediterranean Sea, Baikal-GVD in Lake Baikal, TRIDENT in the South China Sea, and P-ONE in the Pacific Ocean. The work of this thesis improves upon the initial implementation of Prometheus and adds simulation capability to lower-energy neutrino events.
Specific contributions of this thesis include: implementation of low-energy neutrino event generation, implementation of multi-PMT (photomultiplier tube) optical modules, improvements on photon light yield from muons, and more. This modified Prometheus framework can generate simulated datasets to support machine learning algorithms for tasks such as neutrino energy and angle reconstruction, aiding in physics analysis of active and proposed gigaton neutrino telescopes.
Due to the co-existence of different stable states (multi-stability), various climate components, such as the Amazon forest, the West African monsoon or the Atlantic meridional overturning circulation (AMOC), may undergo catastrophic regime shifts at varying levels of global warming. As a result, our uncertainty in future greenhouse gas emissions renders possible a variety of storylines with drastically different climatic conditions. Assessing the relative likelihood of each storyline by ensemble simulations with realistic climate models is computationally extremely expensive. This could be alleviated with machine learning (ML) models trained on simulation data. But a fundamental challenge is that future regime shifts likely correspond to dynamical regimes that have not been observed in the training data, whether generated from observations or state-of-the-art climate models. Thus, ML predictions of long-term climate change may be unreliable if they only capture the physics of previously observed climate states.
This may be overcome by data-efficient and physics-informed ML methods, such as 'Next-Generation Reservoir Computing' (NG-RC), which has shown promise in this task for simpler dynamical systems. We test this approach on extensive model output previously generated from the global ocean model 'Veros', which features various co-existing dynamical regimes that may be attained under future climate change, including a collapsed ocean circulation. We use system identification methods, such as regularized orthogonal least squares on Veros simulation data, to design the functional forms of the nodes in the output layer. The customized NG-RC model is then trained on various subsets of the simulation data and validated on initial conditions belonging to unobserved dynamical regimes. The resulting autonomous system shows strong predictive ability, even across several chaotic timescales, and shows promise for predicting previously unseen stable dynamical states.
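Since NG-RC is the core method here, the following is a minimal, hedged sketch of the generic polynomial-feature variant: a constant term, time-delayed state copies, and their unique quadratic products, ridge-regressed to predict one-step increments and then run autonomously. The customized output-layer design via regularized orthogonal least squares described above is not reproduced; the function names, delay depth and ridge strength are illustrative assumptions.

```python
import numpy as np

def ngrc_features(X, k=2):
    """NG-RC feature vectors from a (T, d) trajectory:
    constant + k time-delayed state copies + their unique quadratic products."""
    T, d = X.shape
    lin = np.hstack([X[k - 1 - j: T - j] for j in range(k)])      # (T-k+1, k*d)
    iu = np.triu_indices(lin.shape[1])
    quad = (lin[:, :, None] * lin[:, None, :])[:, iu[0], iu[1]]   # unique products
    return np.hstack([np.ones((lin.shape[0], 1)), lin, quad])

def fit_ngrc(X, k=2, ridge=1e-6):
    """Ridge-regress next-step increments onto the NG-RC features."""
    Phi = ngrc_features(X, k)[:-1]            # features at times k-1 .. T-2
    dX = X[k:] - X[k - 1:-1]                  # increments to be predicted
    return np.linalg.solve(Phi.T @ Phi + ridge * np.eye(Phi.shape[1]), Phi.T @ dX)

def run_autonomously(X_init, W, k=2, steps=1000):
    """Feed predictions back in to generate a free-running trajectory."""
    traj = list(X_init[-k:])
    for _ in range(steps):
        phi = ngrc_features(np.array(traj[-k:]), k)   # single feature row
        traj.append(traj[-1] + (phi @ W)[0])
    return np.array(traj)
```

In this setting, X would be a (time, variables) array of simulation output, and validation would start run_autonomously from initial conditions in regimes withheld from training.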
The IceCube Neutrino Observatory at the South Pole is designed to detect neutrinos originating from astrophysical sources such as Active Galactic Nuclei (AGN), supernovae (SN), and the Sun. The detector consists of 86 strings, each equipped with multiple Digital Optical Modules (DOMs) containing photomultiplier tubes (PMTs). These PMTs detect Cherenkov light produced by neutrino interactions in the ice.
The data from numerous detected events can be used to train neural network-based machine learning models such as Transformers or Graph Neural Networks. As different neutrino flavours exhibit distinct signal morphologies, geometric information plays a crucial role in providing context to the model. This project employs a Transformer, which is known for its effectiveness in analysing large-scale data, to investigate whether the model can identify patterns in simulated high-energy IceCube events.
However, Transformers are computationally demanding due to their self-attention mechanism, which scales quadratically with the input sequence length. To address this, the project adopts a data reduction approach to enhance computational efficiency.
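To illustrate the quadratic cost mentioned above, here is a minimal numpy sketch of single-head scaled dot-product self-attention over n pulses; the (n, n) score matrix is what makes memory and compute scale quadratically with sequence length. This is purely illustrative and not the project's actual architecture or data pipeline.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.

    X: (n, d) per-pulse features (e.g. DOM position, time, charge).
    The (n, n) score matrix below is the source of the quadratic scaling,
    which motivates reducing the number of pulses per event."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])             # (n, n) matrix
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)      # row-wise softmax
    return weights @ V
```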
Additionally, the dataset contains not only neutrinos but also other particles such as muons, which can mimic neutrino interactions. Therefore, data cleaning and proper event selection are essential steps in the workflow.
Collapsing massive stars are rich in neutrinos.
Neutrinos are elementary particles with exciting features. The goal is to predict their evolution when travelling through very dense matter and the flavor composition we should observe on Earth.
The study of the intergalactic medium (IGM) and its interaction with light from distant galaxies provides crucial insights into the evolution of the universe. In this project, I will apply a technique called "photometric IGM tomography" to reconstruct a map of the large-scale structure of the IGM in the early Universe, using a publicly available photometric catalogue, COSMOS2020, together with data from the Hyper Suprime-Cam (HSC) on the Subaru telescope.
With this, I develop a spectral energy distribution (SED) fitting tool, using Markov Chain Monte Carlo (MCMC), to measure the IGM transmission along the sightlines to a sample of photometric background galaxies, with Lyman $\alpha$ as a tracer of the IGM transmission, which I will use to make a map of the IGM at the Epoch of Reionization (EoR).
This will first be done with a simple power-law model for the SEDs of the background galaxies, characterized by two parameters: the UV magnitude and the UV slope. This model will then be improved upon with a more sophisticated SED model based on BPASS (Binary Population and Spectral Synthesis), which accounts for the contributions of binary stars and various stellar populations to the SED and is characterized by parameters such as the star-formation rate, dust attenuation, metallicity, and cluster ages.
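As an illustration of the first, power-law stage of this fit, the following is a hedged sketch using emcee: a power-law continuum with a single IGM transmission factor applied blueward of rest-frame Lyman $\alpha$, fit to broadband fluxes. The parameterization via a flux normalization (rather than UV magnitude), the example band wavelengths and fluxes, and the use of effective wavelengths instead of full filter curves are all assumptions of the sketch, not the project's actual setup.

```python
import numpy as np
import emcee

# Hypothetical example inputs: band effective wavelengths (observed, Angstrom),
# measured fluxes and errors (arbitrary units), and a known source redshift.
lam_obs = np.array([4800., 6200., 7700., 8900., 10500.])
flux_obs = np.array([0.8, 1.5, 1.4, 1.3, 1.2])
flux_err = np.array([0.1, 0.1, 0.1, 0.1, 0.1])
z = 3.0

def model_flux(theta, lam_obs, z):
    """Power-law SED f0*(lam_rest/1500A)^beta, with a single IGM
    transmission factor T applied blueward of rest-frame Lyman-alpha."""
    f0, beta, T = theta
    lam_rest = lam_obs / (1. + z)
    f = f0 * (lam_rest / 1500.) ** beta
    f[lam_rest < 1215.67] *= T
    return f

def log_prob(theta, lam_obs, flux_obs, flux_err, z):
    f0, beta, T = theta
    if f0 <= 0 or not (-4. < beta < 1.) or not (0. <= T <= 1.):
        return -np.inf                                   # flat priors
    resid = (flux_obs - model_flux(theta, lam_obs, z)) / flux_err
    return -0.5 * np.sum(resid ** 2)

ndim, nwalkers = 3, 32
p0 = np.array([1.0, -2.0, 0.5]) + 1e-3 * np.random.randn(nwalkers, ndim)
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob,
                                args=(lam_obs, flux_obs, flux_err, z))
sampler.run_mcmc(p0, 3000, progress=True)
samples = sampler.get_chain(discard=500, flat=True)      # posterior draws
```

The BPASS-based stage could swap model_flux for template photometry while keeping the same likelihood and sampler.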
In quantum information science, encoding information in atomic states and reading it through light is essential for building quantum networks. Enhancing light-matter interaction is crucial for improving the efficiency of such processes, and optical cavities have emerged as a powerful tool in this regard. This project experimentally investigates the feasibility of measuring a collective state of a lattice of ultracold cesium atoms using cavity-enhanced off-resonant dispersive interactions.
Active Galactic Nuclei (AGNs) are among the most luminous objects in the universe, powered by accretion onto supermassive black holes. The broad-line region (BLR), located within 5–30 light days of the black hole, produces strong emission lines that probe the ionization and kinematics of AGN gas. This project utilizes the Cloudy photoionization code to model BLR emission, incorporating new spectral energy distributions (SEDs) and additional ionic species beyond Hα, Hβ, C IV, and C III]. By varying the hydrogen density and ionizing photon flux, we explore how these parameters shape line intensities, equivalent widths, and the transmitted continuum. The results will help constrain BLR physical conditions in typical and atypical AGNs, providing new insights into AGN structure and evolution.
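As a sketch of how such a parameter grid could be assembled, the snippet below writes one Cloudy input file per (density, flux) pair. The commands used (table agn, hden, phi(h), stop column density, iterate to convergence) are standard Cloudy inputs, but the SED, grid ranges, stopping criteria, and especially the save/output commands used in the actual project are not reproduced here and would need to follow the Cloudy documentation.

```python
import itertools
import numpy as np

# Hypothetical grid ranges (log cm^-3 and log photons cm^-2 s^-1);
# the real grid and SED choices belong to the project, not this sketch.
log_hden = np.arange(8.0, 12.5, 0.5)
log_phi = np.arange(17.0, 21.5, 0.5)

for lh, lp in itertools.product(log_hden, log_phi):
    commands = [
        "table agn",                  # built-in AGN ionizing continuum
        f"hden {lh:.1f}",             # log hydrogen density
        f"phi(h) {lp:.1f}",           # log hydrogen-ionizing photon flux
        "stop column density 23",     # truncate the cloud
        "iterate to convergence",
    ]
    # Save/output commands (line intensities, transmitted continuum)
    # are omitted; see the Cloudy documentation for their syntax.
    with open(f"blr_h{lh:.1f}_p{lp:.1f}.in", "w") as f:
        f.write("\n".join(commands) + "\n")
```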
The Schwarzschild orbit-superposition method, developed for modeling galaxy dynamics, uses a combination of stellar orbits to represent the gravitational potential of a galaxy's center. This technique enables us to estimate the mass of the supermassive black hole (SMBH) residing in the galactic nucleus. Measuring the SMBH mass is crucial for understanding the growth and evolution of supermassive black holes and their role in galaxy formation. In this part of my thesis, I am working on using integral field unit (IFU) kinematics to construct a Schwarzschild model of NGC 3783's center and estimate the mass of its SMBH. NGC 3783 is a barred spiral galaxy, and the presence of this bar structure complicates the morphology and kinematics. Excluding bar structures from dynamical models can lead to biased SMBH mass estimates, so we must adjust the Schwarzschild method to account for the bar, for which I will use AGAMA, the action-based galaxy modelling architecture developed by E. Vasiliev. This work is currently in progress.
Extreme astrophysical environments may produce axion-like particles (ALPs), whose interactions could leave observable imprints on astronomical spectra. If they exist, ALPs may be a viable candidate for dark matter. In this project, we search for axion-induced spectral features in gamma-ray observations of active galactic nuclei (AGNs) that are aligned with galaxy clusters along the line of sight. We assess the impact of AGN variability on spectral signatures and perform a stacked analysis of 29 AGN-cluster pairs. Our goal is to place competitive constraints on ALP parameters. While our results are still preliminary, they appear promising.
This master’s thesis investigates the role of supernova (SN) feedback in driving turbulence within the interstellar medium (ISM) of galaxies, a key factor in galaxy evolution. While SN feedback is considered to be one of the primary drivers of ISM turbulence, its efficiency across star-forming galaxies remains uncertain. This study focuses on dwarf galaxies in the local Universe, aiming to determine the efficiency of SN energy input needed to sustain turbulence and how this varies with given galaxy properties. Given the low star formation rates and slow energy dissipation in dwarf galaxies, the research will provide crucial insights into SN-driven turbulence. The methodology involves hierarchical Bayesian modeling of archival observational data from the LITTLE THINGS survey and GALEX, comparing results with previous studies of spiral galaxies. The findings are expected to refine theoretical models and improve sub-grid simulations of SN feedback in galaxy evolution.
Current studies of star and planet formation suggest that many molecules form on the surfaces of icy grains in molecular clouds, from where they are then accreted onto the emerging protoplanetary disk and in some cases sublimate due to heating by the young star. However, it remains unclear what role the surrounding environment and/or its evolution plays in regulating the chemistry.
The main goal of this work is to extract the physical and chemical parameters of the cold dust and gas around protostars, further exploring the role of the environment in the formation of complex organic molecules. This is achieved using recently obtained spectroscopic data from ALMA's Atacama Compact Array (ACA), provided by the COMPASS ALMA Large Program, for a set of protostars towards which complex organic molecules are seen. Online catalogues are used to identify the species present in the observations, and comparisons with synthetic spectra help determine physical parameters such as the excitation temperature and the upper-state column density. In addition, zeroth- and first-moment maps are produced to map the intensity and velocity range of each species, identifying those with interesting emission features, such as the unusually extended methanol emission.
Moving forward, we can implement the same methods described above to other sources for further comparison. Eventually, the same regions will also be observed with JWST targeting the ices, aiming for a revelation of the complex interplay between the gas and ice species.
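For reference, the zeroth and first moments mentioned above are simple integrals over the spectral axis. Below is a minimal numpy sketch assuming a regularly gridded, single-line data cube; a real analysis would mask noisy channels and use dedicated radio-astronomy tooling.

```python
import numpy as np

def moment_maps(cube, velocities):
    """Zeroth- and first-moment maps from a spectral-line cube.

    cube:       (n_chan, ny, nx) intensity array for one line.
    velocities: (n_chan,) channel velocities (e.g. km/s).

    Moment 0 is the velocity-integrated intensity per pixel; moment 1 is
    the intensity-weighted mean velocity per pixel."""
    dv = np.gradient(velocities)                       # channel widths
    mom0 = np.sum(cube * dv[:, None, None], axis=0)
    with np.errstate(invalid="ignore", divide="ignore"):
        mom1 = np.sum(cube * velocities[:, None, None] * dv[:, None, None],
                      axis=0) / mom0                   # NaN where mom0 == 0
    return mom0, mom1
```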
As the Earth travels across the Milky Way, it passes through the galactic halo of dark matter particles. Occasionally a dark matter particle could interact with the contents of the Earth, scattering to a lower energy, which can lead to it becoming gravitationally trapped inside the Earth. If these dark matter particles self-annihilate or decay, one possible final-state product is neutrinos, which would lead to a flux of neutrinos at the surface of the Earth coming from dark matter, thus enabling indirect dark matter detection. The work focuses on the specific case of super-heavy dark matter in the mass range 1e7 GeV to 1e11 GeV, and explores the possibility of detecting ultra-high-energy neutrinos in the planned IceCube-Gen2 detector, in the hope that data from the next 10--15 years can either discover dark matter or set new limits on it.
Precipitation in Denmark arises from different mechanisms across seasons. In winter, large-scale dynamics driven by the North Atlantic storm track dominate, while summer precipitation involves both large-scale systems and local convective processes. Current climate models predict changes in these patterns, but since 2000, Denmark has experienced precipitation exceeding model predictions, suggesting a gap in our understanding.
The North Atlantic Oscillation (NAO), characterized by pressure differences between Iceland and the Azores, represents the dominant mode of atmospheric variability in the North Atlantic. While traditional analysis shows its strong influence on European precipitation patterns, particularly in winter, its signal appears muted in the normalized CMIP6 data used for this study.
Using Feed-Forward Neural Networks analyzed with Layerwise Relevance Propagation (LRP), my research investigates which atmospheric patterns drive precipitation anomalies. The analysis reveals distinct seasonal behaviours in the model's predictions. The LRP analysis shows that local pressure patterns are important predictors across seasons. However, contrary to traditional understanding, the NAO pressure patterns are not prominently used by the neural network in either winter or summer predictions. For summer precipitation, the model additionally identifies connections to local temperature variations, suggesting different driving mechanisms between seasons. These findings highlight the potential and current limitations of using explainable machine learning to understand precipitation patterns.
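As an illustration of the attribution method named above, here is a minimal numpy sketch of the LRP-epsilon rule for a dense ReLU network, with random weights standing in for a trained model. The network architecture, input features, and LRP variant used in the thesis may differ; everything specific in the sketch is an assumption.

```python
import numpy as np

def lrp_epsilon(weights, biases, x, eps=1e-6):
    """LRP-epsilon relevance propagation through a ReLU feed-forward network.

    weights: list of (d_in, d_out) arrays; biases: list of (d_out,) arrays.
    x: one input sample (e.g. flattened atmospheric fields).
    Returns one relevance score per input feature."""
    # Forward pass, keeping each layer's input activation and pre-activation.
    activations, zs = [x], []
    for i, (W, b) in enumerate(zip(weights, biases)):
        z = activations[-1] @ W + b
        zs.append(z)
        activations.append(np.maximum(z, 0.0) if i < len(weights) - 1 else z)

    # Backward pass: redistribute relevance from the output to the inputs.
    relevance = activations[-1]
    for W, a, z in zip(reversed(weights), reversed(activations[:-1]), reversed(zs)):
        s = relevance / (z + eps * np.sign(z) + (z == 0))   # guard against z == 0
        relevance = a * (s @ W.T)                           # epsilon rule
    return relevance

# Example with random weights standing in for a trained two-hidden-layer model.
rng = np.random.default_rng(0)
shapes = [(100, 64), (64, 32), (32, 1)]
Ws = [rng.normal(scale=0.1, size=s) for s in shapes]
bs = [np.zeros(s[1]) for s in shapes]
R = lrp_epsilon(Ws, bs, rng.normal(size=100))               # shape (100,)
```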
My thesis demonstrates the potential and limitations of explainable ML methods in climate science while highlighting gaps in our understanding of precipitation drivers in a changing climate. These findings suggest that explainable machine learning needs to be explored further before it can be relied upon for precipitation analysis.