

Technological Themes

To accompany and support scientific work in the fields covered by the LabEx P2IO, technological developments are necessary. These needs are extremely varied:

- Very high field superconducting magnets
- Extreme beam control
- New high critical temperature superconducting materials
- Improved detector response and detection thresholds
- High Performance Computing
- Development and management of common IT infrastructure

This list is not exhaustive, but it gives an idea of the wide range of applications within the LabEx.

 
 

Building the next generation of high-intensity particle accelerators, high-luminosity colliders, advanced light sources or laser-plasma accelerators requires innovative conceptual and technological developments, in particular regarding very high field superconducting magnets, high-gradient accelerating cavities, laser-plasma acceleration and the control of extreme beams. This includes, for instance, mastering new high critical temperature superconducting materials, innovating in cryogenics to limit the use of helium, and developing high-reliability, high-performance RF systems. The use of high-power lasers acting on matter directly or through the excitation of plasma waves is an ambitious and promising technology that could lead to compact acceleration systems.
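To illustrate why plasma-based acceleration could lead to compact systems, the short sketch below evaluates the standard cold, non-relativistic wave-breaking field E0 = m_e c ω_p / e for a given plasma density and compares it with a typical RF-cavity gradient. The density value and the RF reference gradient are illustrative assumptions, not parameters of any P2IO project.

```python
import math

# Physical constants (SI units)
E_CHARGE = 1.602176634e-19      # elementary charge [C]
M_ELECTRON = 9.1093837015e-31   # electron mass [kg]
EPSILON_0 = 8.8541878128e-12    # vacuum permittivity [F/m]
C_LIGHT = 299792458.0           # speed of light [m/s]

def wave_breaking_field(n_e_per_cm3: float) -> float:
    """Cold, non-relativistic wave-breaking field E0 = m_e * c * omega_p / e
    for a plasma of electron density n_e (given in cm^-3), in V/m."""
    n_e = n_e_per_cm3 * 1e6  # convert cm^-3 -> m^-3
    omega_p = math.sqrt(n_e * E_CHARGE**2 / (EPSILON_0 * M_ELECTRON))  # plasma frequency [rad/s]
    return M_ELECTRON * C_LIGHT * omega_p / E_CHARGE

if __name__ == "__main__":
    n_e = 1e18           # illustrative plasma density [cm^-3]
    rf_gradient = 100e6  # assumed typical RF cavity gradient [V/m]
    e0 = wave_breaking_field(n_e)
    print(f"Plasma gradient ~ {e0 / 1e9:.0f} GV/m for n_e = {n_e:.0e} cm^-3")
    print(f"Roughly {e0 / rf_gradient:.0f}x a {rf_gradient / 1e6:.0f} MV/m RF cavity")
```

For the assumed density of 1e18 cm^-3 this gives a gradient of roughly 100 GV/m, about three orders of magnitude above conventional RF cavities, which is the basis of the compactness argument.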

P2IO is involved in several projects located at Paris-Saclay: LUNEX-5, a fifth-generation free-electron laser source; APOLLON, a multi-PW laser for laser-plasma acceleration; PERLE, an accelerator complex based on an energy recovery linac (ERL) for various applications; and SONATE, a compact high-intensity neutron source. Many of these projects are carried out in collaboration with teams from the PALM LabEx or the PHOM department.

The upgrade or creation of dedicated technological platforms will be necessary to develop these technologies, build prototypes and characterize them.

The refinement of detection techniques and the development of specific detector systems by P2IO laboratories will address the needs of experimental teams leading key projects in our domains. These include searches for dark matter, research on the nature of neutrinos, precision tests of the Standard Model, studies of cataclysmic phenomena in the Universe through measurements of high-energy gamma rays with the future CTA observatory, as well as experiments at high-energy colliders at CERN or at the future EIC, or at novel facilities such as PERLE. The key challenges will be improving the response of detectors in high counting-rate and high-luminosity environments, as well as the dynamic range and efficiency of detection systems often operated in harsh conditions (radiation, low temperature, etc.). Possible developments include the exploration of new detection materials, the improvement of pulse-shape discrimination techniques, and R&D on the associated electronics, with the aim of providing viable detector technology for next-generation experimental facilities.
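To give a concrete flavour of the pulse-shape discrimination techniques mentioned above, the sketch below implements one common variant, the charge-comparison method, on digitized detector waveforms: the ratio of the charge integrated in the pulse tail to the total charge differs between particle types with different scintillation time constants. The window lengths, threshold and file name are illustrative assumptions, not values from any P2IO detector.

```python
import numpy as np

def psd_charge_ratio(waveform: np.ndarray, baseline_samples: int = 20,
                     tail_start: int = 15, window: int = 200) -> float:
    """Charge-comparison pulse-shape discrimination parameter.

    waveform         : digitized pulse, one sample per ADC clock tick
    baseline_samples : leading samples used to estimate the baseline
    tail_start       : offset after the pulse peak where the tail begins (samples)
    window           : total integration length after the peak (samples)
    Returns the tail-to-total charge ratio; larger values indicate slower pulses.
    """
    baseline = waveform[:baseline_samples].mean()
    pulse = waveform - baseline               # baseline-subtracted pulse
    peak = int(np.argmax(pulse))              # assumes positive-going pulses
    total = pulse[peak:peak + window].sum()
    tail = pulse[peak + tail_start:peak + window].sum()
    return tail / total if total > 0 else 0.0

# Illustrative use on a batch of pulses with an assumed cut value:
# waveforms = np.load("pulses.npy")           # hypothetical input file
# ratios = np.array([psd_charge_ratio(w) for w in waveforms])
# is_slow_pulse = ratios > 0.25               # illustrative discrimination threshold
```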

The success of this program requires close collaboration between the teams involved from the earliest stages, covering mechanical integration, electronics and data processing. In parallel, these developments will be applied on technical platforms such as SCALP, ALTO, ANDROMEDE or Virtual Data, as well as through collaboration agreements with industry.

P2IO has supported the development and management of common computing infrastructures: shared-access computer rooms and data centers, and innovative software solutions for distributed file storage and backup. On the physics application side, support was also given to R&D on advanced architectures and algorithms exploiting parallelism for data selection and analysis tasks. In the next phase, P2IO will foster and support emerging efforts in the laboratories to address the upcoming computing challenges of data processing and simulation in our scientific community:

  • The intensive computing challenge: processing and simulating large physics datasets, ranging from complex detector events to telescope images, require massive computing resources and advanced architectures. Astrophysics simulation, in particular, will benefit from the upcoming exascale supercomputing infrastructures. Efficient parallelism and suitable architectures are two key aspects for matching rapidly evolving processors to physics workflows in an automatic and sustainable way, so as to produce software stacks and data pipelines intended to live for several decades (a minimal parallel-processing sketch follows this list).
  • The big data challenge: on the I/O side, the recent explosion of the data volume recorded or simulated by experiments introduces new data storage and data locality challenges. New tools inspired by "big data" approaches need to be developed. On the algorithmic side, Artificial Intelligence approaches have proven very efficient for big-data problem solving and could play an increasing role in our experimental and theoretical physics paradigms. Deep machine learning methods are a promising avenue in this respect and are the foundation of many R&D efforts in our community (a toy classifier sketch is given after this list).
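As a minimal illustration of the event-level parallelism mentioned in the first point, the sketch below distributes a simple per-event selection over CPU cores with Python's multiprocessing module. The event format, the selection cut and the toy dataset are illustrative assumptions, not part of any P2IO workflow.

```python
import numpy as np
from multiprocessing import Pool

def select_event(event: np.ndarray) -> bool:
    """Toy per-event selection: keep events whose summed signal
    exceeds an illustrative threshold (arbitrary units)."""
    return event.sum() > 50.0

def process_chunk(chunk: np.ndarray) -> int:
    """Apply the selection to a chunk of events and count how many pass."""
    return sum(select_event(ev) for ev in chunk)

if __name__ == "__main__":
    rng = np.random.default_rng(seed=0)
    events = rng.exponential(scale=10.0, size=(100_000, 8))  # toy dataset: 100k events x 8 channels
    chunks = np.array_split(events, 8)                       # one chunk per worker process
    with Pool(processes=8) as pool:
        passed = sum(pool.map(process_chunk, chunks))
    print(f"{passed} / {len(events)} events pass the selection")
```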
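For the second point, the sketch below shows what a deep-learning-based event selection can look like: a small fully connected network trained to separate "signal-like" from "background-like" events on toy data. The network size, the toy features and the training settings are illustrative assumptions, not a description of any specific P2IO analysis.

```python
import torch
from torch import nn

# Toy dataset: 10 input features per event, signal shifted relative to background.
torch.manual_seed(0)
n_events, n_features = 20_000, 10
signal = torch.randn(n_events // 2, n_features) + 0.5
background = torch.randn(n_events // 2, n_features) - 0.5
x = torch.cat([signal, background])
y = torch.cat([torch.ones(n_events // 2), torch.zeros(n_events // 2)])

# Small fully connected classifier (illustrative architecture).
model = nn.Sequential(
    nn.Linear(n_features, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Minimal training loop over the full toy dataset.
for epoch in range(20):
    optimizer.zero_grad()
    logits = model(x).squeeze(1)
    loss = loss_fn(logits, y)
    loss.backward()
    optimizer.step()

# Evaluate the fraction of correctly classified toy events.
with torch.no_grad():
    accuracy = ((model(x).squeeze(1) > 0).float() == y).float().mean()
print(f"training accuracy on toy data: {accuracy:.3f}")
```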

P2IO is involved in this research, benefiting from close links with the data science community, in particular the UPSaclay Center for Data Science, an interdisciplinary network of data scientists.

 
