I work in theoretical cosmology, and I am interested in Dark Energy (DE) and Modified Gravity (MG) phenomenology with the aim of
addressing some of the pressing theoretical, methodological and observational problems raised by current cosmological data.
In particular, my research focuses on combining theoretical, numerical and observational tools to build a consistent framework for performing model-dependent and model-independent tests of gravity over the full range of cosmological scales, from large linear scales down to small non-linear ones.
Scroll down to find more information about my research interests and publications.
A Universe described by General Relativity (GR) and filled with ordinary matter is naturally expected to undergo decelerated expansion. One of the most remarkable results of contemporary observational cosmology is the evidence that this is not the case. This cosmic acceleration is today one of the few pieces of evidence for physical phenomena beyond what we already know.
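The expectation of deceleration follows directly from the acceleration equation of GR: for a universe filled with components of total energy density $\rho$ and pressure $p$,

```latex
\frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\,\left(\rho + 3p\right)
```

so ordinary matter, with $\rho > 0$ and $p \geq 0$, always gives $\ddot{a} < 0$. Sustained acceleration requires either a component with equation of state $w = p/\rho < -1/3$, such as a cosmological constant, or a modification of GR itself.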
How can we build working models of this phenomenon? How can we test them? Which cosmological probes can we use to distinguish between different candidate models?
Observations of our universe, in the era of precision cosmology, give us access to physical phenomena on energy and distance scales that cannot be reached in any other way. This wealth of data can be used to test fundamental theories, in particular gravitational ones, providing information complementary to solar system tests.
How can we optimally use the data of the next generation of cosmological probes, like Euclid, LSST, WFIRST and SKA, to find or constrain deviations from General Relativity? How much information about gravity can we extract from these probes?
INSPIRE LINK to my publications
Numerical investigation of the phase space of different dark energy models for the first-order system. Initial conditions are evolved both into the past (blue lines) and into the future (green lines). The red line corresponds to the ΛCDM trajectory.
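A phase-space study of this kind amounts to integrating the dynamical system forward and backward from a grid of initial conditions. As a minimal sketch (not necessarily the models in the figure), here is the standard first-order system for quintessence with an exponential potential in a matter background, written in the usual kinetic/potential variables x and y with N = ln(a):

```python
import numpy as np
from scipy.integrate import solve_ivp

SQRT6 = np.sqrt(6.0)

def rhs(N, state, lam):
    """First-order phase-space system for exponential-potential quintessence
    in a matter background (Copeland-Liddle-Wands variables).
    x: kinetic energy variable, y: potential energy variable, N = ln(a)."""
    x, y = state
    common = 1.5 * (1.0 + x**2 - y**2)
    dx = -3.0 * x + 0.5 * SQRT6 * lam * y**2 + x * common
    dy = -0.5 * SQRT6 * lam * x * y + y * common
    return [dx, dy]

def evolve(x0, y0, lam=1.0, N_span=(0.0, 30.0)):
    """Evolve one set of initial conditions; swap the entries of N_span
    to integrate into the past instead of the future."""
    return solve_ivp(rhs, N_span, [x0, y0], args=(lam,),
                     rtol=1e-10, atol=1e-12, dense_output=True)

# Forward evolution: for lam**2 < 3 trajectories approach the
# scalar-field dominated attractor x = lam/sqrt(6), y = sqrt(1 - lam**2/6).
sol = evolve(0.0, 1e-5, lam=1.0)
xf, yf = sol.y[:, -1]
```

Sampling many (x0, y0) pairs and plotting the resulting trajectories in the (x, y) plane reproduces the kind of phase-space portrait shown in the figure, with attractors and the ΛCDM-like trajectory standing out.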
The comparison of the functions μ(z, k) and γ(z, k) for the BZ approximation to f(R) models, with B0 = 2 and s = 4, with those computed with EFTCAMB evolving the full set of Einstein-Boltzmann equations for f(R) models that reproduce a ΛCDM expansion history and have B0 = 2. In all plots, the solid line represents the physical horizon while the dashed line represents the Compton wavelength of the scalaron.
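For reference, the BZ (Bertschinger-Zukin) approximation parametrizes the effective gravitational coupling μ and the gravitational slip γ with simple rational functions of k²aˢ. A sketch of the commonly quoted f(R) forms, assuming the frequently used convention λ² = B0/2 with k in units of H0/c (conventions vary between papers, so treat the normalization as illustrative):

```python
import numpy as np

def mu_bz(a, k, B0=2.0, s=4.0):
    """BZ effective gravitational coupling mu(a, k) for f(R)-like models.
    k is in units of H0/c; lam2 = B0/2 is an assumed convention."""
    lam2 = B0 / 2.0
    u = lam2 * k**2 * a**s
    return (1.0 + (4.0 / 3.0) * u) / (1.0 + u)

def gamma_bz(a, k, B0=2.0, s=4.0):
    """BZ gravitational slip gamma(a, k) = Phi/Psi for f(R)-like models."""
    lam2 = B0 / 2.0
    u = lam2 * k**2 * a**s
    return (1.0 + (2.0 / 3.0) * u) / (1.0 + (4.0 / 3.0) * u)
```

On scales well above the scalaron Compton wavelength (u ≪ 1) both functions reduce to the GR values μ = γ = 1, while deep inside it (u ≫ 1) they approach the scalar-tensor limits μ → 4/3 and γ → 1/2, which is the transition the figure compares against the full Einstein-Boltzmann solution.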
The marginalized joint likelihood of the speed of sound of gravitational waves (GWs), the tensor-to-scalar ratio, the scalar spectral index and its running. Different colors correspond to different instrumental specifications used in the forecast and different models, as shown in the legend. The two different shades indicate the 68% and the 95% confidence regions.
The marginalized joint likelihood of the present-day value of Log B0, the sum of neutrino masses, and the amplitude of the (linear) power spectrum on the scale of 8 Mpc/h. Different colors correspond to the different codes used and, hence, a different modeling of f(R), as shown in the legend. The darker and lighter shades correspond respectively to the 68% C.L. and the 95% C.L. The solid line indicates the best constrained direction in parameter space while the dashed line indicates the worst constrained one.
EFTCAMB is a patch of the public Einstein-Boltzmann solver CAMB, which implements the Effective Field Theory approach to cosmic acceleration. The code can be used to investigate the effect of different EFT operators on linear perturbations as well as to study perturbations in any specific DE/MG model that can be cast into the EFT framework. To interface EFTCAMB with cosmological data sets, we equipped it with a modified version of CosmoMC, namely EFTCosmoMC, creating a bridge between the EFT parametrization of the dynamics of perturbations and observations.
COSMOFISH is a forecasting tool to study what future cosmology will look like. It uses EFTCAMB and MGCAMB to ensure maximum coverage of cosmological models and will serve two important purposes. First, it will allow model testing to be optimised, by forecasting the expected constraints on several models and parametrizations so as to select the ones that are best constrained by the data. Second, it will allow the design and optimization of experimental probes that aim at testing gravitational theories. It will be publicly released in the coming months.
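The core of such a forecast is the Fisher matrix: given the derivatives of the observables with respect to the parameters and the data covariance, the expected marginalized constraints follow from the inverse Fisher matrix. A minimal sketch of the idea (a toy linear model, not COSMOFISH's actual interface):

```python
import numpy as np

def fisher_matrix(jacobian, cov):
    """Gaussian Fisher matrix F = J^T C^{-1} J, where
    jacobian[i, a] = d(observable_i)/d(parameter_a) and cov is the
    covariance of the observables."""
    cinv = np.linalg.inv(cov)
    return jacobian.T @ cinv @ jacobian

def marginalized_errors(F):
    """Forecast 1-sigma marginalized errors: sqrt of diag(F^{-1})."""
    return np.sqrt(np.diag(np.linalg.inv(F)))

# Toy example: two observables depending linearly on two parameters.
J = np.array([[1.0, 0.5],
              [0.0, 1.0]])
C = np.diag([0.1**2, 0.2**2])
F = fisher_matrix(J, C)
errs = marginalized_errors(F)
```

The eigenvectors of F with the largest and smallest eigenvalues give the best and worst constrained directions in parameter space, which is exactly the information indicated by the solid and dashed lines in the figure above.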
CRYPSIS is a set of tools to perform finite-element simulations of screening mechanisms in highly non-linear environments. The additional interaction that is introduced to source cosmic acceleration is strongly constrained by solar system tests. Viable models of DE/MG therefore develop a screening mechanism that hides the presence of this force on such scales. Generally these mechanisms rely on non-linear interactions or self-interactions between the scalar field sourcing the fifth force, gravity and matter. These non-linearities make it impossible to understand analytically the behaviour of screening mechanisms in realistic situations, and the CRYPSIS code will fill this gap numerically. Among other applications, it will allow the design of laboratory experiments that test for additional forces by looking at objects for which these screening mechanisms are less effective. In the long run this will also help in making the connection with large-scale modelling, completing the range of scales over which we can perform cosmological tests of gravity. I am developing the CRYPSIS code as part of a Master in High Performance Computing and it will be publicly released in fall 2016.
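The density dependence at the heart of these mechanisms can be illustrated with a toy chameleon-type model, one of the screening mechanisms referred to above. In a chameleon theory the scalar sits at the minimum of a density-dependent effective potential, and in dense environments its effective mass grows, Yukawa-suppressing the fifth force. A sketch in purely illustrative units (the potential, n = 1, and all parameter values are assumptions for the example):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def v_eff(phi, rho, M=1.0, beta=1.0):
    """Toy chameleon effective potential (n = 1, dimensionless units):
    a runaway self-interaction plus a linear coupling to matter density."""
    return M**5 / phi + beta * rho * phi

def phi_min(rho, M=1.0, beta=1.0):
    """Field value at the minimum of the effective potential;
    analytically phi_min = sqrt(M**5 / (beta * rho))."""
    res = minimize_scalar(v_eff, bounds=(1e-8, 1e8),
                          args=(rho, M, beta), method='bounded')
    return res.x

def mass_squared(rho, M=1.0, beta=1.0):
    """Effective mass m^2 = V_eff''(phi_min) = 2 M**5 / phi_min**3:
    larger density -> smaller phi_min -> heavier field -> shorter range."""
    return 2.0 * M**5 / phi_min(rho, M, beta)**3
```

This already shows why dense solar-system environments are screened while cosmological densities are not; capturing the same behaviour for realistic, non-symmetric matter distributions is precisely where non-linear finite-element simulations become necessary.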