

Final results 2021 CORE Call

The FNR is pleased to communicate the final results of the 2021 CORE Call. Out of 157 proposals submitted in the 2021 CORE Call, a total of 51 research projects have been retained for funding, representing a financial commitment of €31.4 million.

CORE is the central funding programme of the FNR. Its prime objective is to strengthen the scientific quality of Luxembourg’s public research in the country’s research priorities adopted by the Government on 20 December 2019.

To identify the most promising and excellent projects, the FNR submits project proposals for assessment by independent international experts. Of the 157 eligible project proposals submitted, 51 have been retained for funding.

14 of the 51 projects are CORE Junior projects (CORE Junior PIs marked with * below). In the biomedical field, 5 projects relating to the National Cancer Plan are jointly funded by the FNR and the Fondation Cancer.

The FNR’s CORE programme is one of the major vehicles for implementing the national research priorities. Funded research projects have a duration of 2-3 years and will be implemented in Luxembourg’s research institutions.

Find out more about CORE – 2022 deadline 21 April

The retained projects of the CORE 2021 Call are grouped in the areas below.

Subcategory: Trusted data-driven economy and critical systems – 8 projects

Principal Investigator

Jacques Klein

Project title

Pre And Post Processing For Comprehensive And Practical Android App Static Analysis (REPROCESS)

Host institution

University of Luxembourg

FNR Committed

€597,000

Abstract

Users can today download a wide variety of mobile apps, ranging from simple toy games to sophisticated business-critical apps. They rely on these apps daily to perform diverse tasks, some of them related to sensitive information such as their finances or health. Ensuring high-quality, reliable, and secure apps is thus key.

In this research project, named REPROCESS, we will contribute to developing tool-supported approaches along two main research directions:

(1) First, by pre-processing apps before analysis, REPROCESS aims to extend the scope of existing static analyzers. In particular, the first main goal of REPROCESS is to make existing Android static analyzers “binary aware”, so that they do not focus only on Dalvik bytecode (as most current static analyzers do). To that end, we aim to provide a unified representation of Android code (e.g., JIMPLE, LLVM, or a new one). Both native code and Dalvik bytecode will be translated into this unified representation.
By achieving this first objective, REPROCESS will contribute to reducing the number of false-negative results.

(2) Second, by post-processing the results yielded by static analyzers, REPROCESS aims at reducing the number of false alarms.

In particular, by considering contextual information such as app category or app description, REPROCESS will “learn” whether a reported alarm makes sense in the context of the given app.

By achieving this second objective, i.e., by reducing the number of false alarms, REPROCESS will contribute to making existing static analyzers more practical. Indeed, when an alarm is reported, the analyst often needs to inspect it manually. The manual effort required to check too many false alarms jeopardizes the practicality of Android static analysis.
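
As a toy illustration of the post-processing idea, contextual filtering could look like the sketch below. This is our illustration, not the project’s method: the category table and field names are invented, and REPROCESS plans a learned model rather than fixed rules.

```python
# Hypothetical rule-based stand-in for a learned contextual alarm filter:
# an alarm is kept only if the flagged behaviour is implausible for the
# app's category. The table below is invented for illustration.
EXPECTED_BEHAVIOURS = {
    "NAVIGATION": {"ACCESS_FINE_LOCATION"},   # location use is expected here
    "MESSAGING": {"READ_CONTACTS"},
    "PUZZLE_GAME": set(),                     # no sensitive behaviour expected
}

def filter_alarms(alarms, app_category):
    """Drop alarms whose behaviour is plausible given the app's context."""
    expected = EXPECTED_BEHAVIOURS.get(app_category, set())
    return [a for a in alarms if a["behaviour"] not in expected]

alarms = [{"behaviour": "ACCESS_FINE_LOCATION", "site": "MainActivity.onCreate"}]
print(filter_alarms(alarms, "NAVIGATION"))   # [] -> likely a false alarm
print(filter_alarms(alarms, "PUZZLE_GAME"))  # kept -> worth an analyst's time
```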

Principal Investigator

Marcus Völp

CORE Bilateral: DFG

Project title

Resilient And Secure Activity Control For Flexible Time-triggered Systems (ReSAC)

Host institution

University of Luxembourg

FNR Committed

€610,000

Abstract

The Time-Triggered (TT) paradigm of activations has been shown to be very well suited for closed safety-critical embedded systems with a priori known configurations and strict timing constraints, as in the avionics, railway, automotive or space domains. Its success has been documented both scientifically and in industrial applications, notably by the company TTTech, whose TTP and TTA technologies are both among the candidates for the on-board communication systems of the lunar gateway. This successful commercial adoption is based on two main points:

(1) the application of the TT paradigm with a focus on simplicity and efficiency, providing a package of features such as strong real-time guarantees, reliability, and safety; and
(2) a variety of algorithms provides further guarantees (e.g., membership, transparent redundancy, rapid mode change, etc.) on top of the globally consistent, sparse time base that TT systems establish.

Unfortunately, as hardware platforms become more powerful and communication links more versatile, executing applications and transmitting traffic with different characteristics and criticalities, not all known beforehand or changing over time, the strictness, limited flexibility and resource overprovisioning of TT systems prohibit their efficient application and reduce their scope to niches.

A number of methods have been introduced to add some flexibility to TT systems, typically by relaxing individual strict TT properties while sacrificing some of the guarantees the TT paradigm conveys. For example, it has been proposed to relax task-to-slot assignments or slot boundaries, or even to give up on time synchronization (under the assumption of drift- and jitter-limited clocks). However, in these proposals, untying any of these elements has led to losing the entire bundle of properties and guarantees TT achieves, not only for the application that motivated the relaxation, but for all co-existing applications.

In this project, we take a more principled approach by sacrificing time in favor of a generalized but reliable activation, to systematically investigate the relation between the activation properties assumed and the guarantees obtained. Our ultimate goal, to which this project contributes, is to obtain, without time and the strictness of TT operation, what time-triggered systems achieve for the highly safety-critical application fragment, while smoothly integrating other application characteristics, efficiently and on the basis of a solid understanding of the time-dependence of the guarantees they require. We will provide for various bundles to be configured, meeting the various demands and criticalities of applications and systems; the intent is not to replace existing TT solutions, but to provide a wider range of solutions and tradeoffs to be selected and simultaneously deployed in today’s and future cyber-physical and dependable systems.
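
For readers unfamiliar with the TT paradigm, a minimal sketch of its core idea follows (ours, not the project’s code): activities are bound offline to slots of a cyclic schedule, so activation times are fixed a priori rather than decided by runtime events.

```python
import time

# Minimal cyclic time-triggered dispatcher (illustrative only): the slot
# layout is computed offline; at runtime the dispatcher merely follows it.
SLOT_MS = 10
SCHEDULE = ["read_sensors", "control_law", "actuate", "idle"]  # one TT round

def dispatch(rounds: int = 2) -> None:
    for tick in range(rounds * len(SCHEDULE)):
        slot = tick % len(SCHEDULE)
        print(f"slot {slot}: {SCHEDULE[slot]}")
        time.sleep(SLOT_MS / 1000)  # real TT systems derive slot boundaries
                                    # from a global sparse time base, not
                                    # from drifting local sleeps

dispatch()
```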

Principal Investigator

Peter A. Ryan

CORE Bilateral: DFG

Project title

Real-world Implementation And Human-centered Design Of Pake Technologies (ImPAKT)

Host institution

University of Luxembourg

FNR Committed

€560,000

Abstract

The Internet has become the foundation of modern society. On a daily basis, we consume services such as e-mail, e-banking, e-government, e-commerce, social media, and cloud storage. These services require mechanisms to securely establish the identity of the user and to communicate securely with them. Accidental or even malicious mis-authentication runs the risk of security threats ranging from economic loss and information leaks to identity theft.

Despite being declared “dead” on a regular basis, password-based authentication is still the predominant form of user authentication, either stand-alone or as part of multi-factor authentication schemes when higher security guarantees are required. This is despite the fact that password-based authentication suffers from several security problems. A number of tools have been developed and deployed to address these problems: password strength meters to help users choose stronger passwords, proactive checks against leaked passwords to prevent harm from password re-use and leaked credentials, risk-based authentication to detect malicious sign-ins, and more. However, phishing attacks still present the most significant threat, even in the presence of state-of-the-art security mechanisms. Password-Authenticated Key Exchange (PAKE) protocols offer strong protection against phishing attacks and bring several other advantages, such as the establishment of forward- and backward-secure cryptographic session keys. However, their use in practice has been very limited.
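
To make the PAKE concept concrete, below is a toy SPAKE2-style exchange. This is our sketch with deliberately tiny, insecure parameters; real deployments use elliptic-curve groups and derive the session key by hashing the full transcript.

```python
import random

# Toy SPAKE2-like PAKE. Parameters are tiny and INSECURE, illustration only.
p, q, g = 23, 11, 2                   # g generates a subgroup of order q in Z_p*
M, N = pow(g, 3, p), pow(g, 5, p)     # fixed public protocol constants
pw = 7                                # shared low-entropy password, mapped to Z_q

x = random.randrange(1, q)            # client's ephemeral secret
X = pow(g, x, p) * pow(M, pw, p) % p  # password-masked value sent to server

y = random.randrange(1, q)            # server's ephemeral secret
Y = pow(g, y, p) * pow(N, pw, p) % p  # password-masked value sent to client

# Each side strips the peer's mask and exponentiates: both reach g^(x*y),
# while a phisher who does not know pw cannot unmask X or Y offline.
K_client = pow(Y * pow(pow(N, pw, p), -1, p) % p, x, p)
K_server = pow(X * pow(pow(M, pw, p), -1, p) % p, y, p)
assert K_client == K_server
```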

The main goal of ImPAKT is to improve the state of the art of password-based authentication on the Internet. This will be done by taking a holistic view of PAKE protocols and their implementations in practice. The project will take extensive input from practitioners via surveys and interviews to better understand the exact nature and impact of blockers in practice, as well as the importance of alternative (non-phishing) security goals for PAKE protocols, such as service providers never seeing plaintext passwords. We will synthesize this into a new set of technical requirements for PAKE protocols (such as enabling strength estimates, handling typos, or risk assessment) and provide cryptographic security models for these settings. The project will study in detail, through a series of user studies, how users interact with PAKE protocols and how protocols, as well as interfaces, need to be designed to facilitate use and prevent phishing attacks. Finally, the project will construct novel protocols to support the new requirements and usability constraints, provide efficient, ready-to-use open-source implementations, and validate their usability.

Principal Investigator

Sjouke Mauw

Project title

Give Control Back To Users: Personalised Privacy-preserving Data Aggregation From Heterogeneous Social Graphs (HETERS)

Host institution

University of Luxembourg

FNR Committed

€638,000

Abstract

Heterogeneous social graphs (HSGs) have been widely used to analyse social media data to support decision making. Compared to simple social graphs, which only model the relations between users, HSGs capture the heterogeneous nature of social networks in terms of data subjects and the relations between them. The richer information encoded in HSGs leads to overwhelmingly better results than those obtained on simple social graphs. At the same time, it also poses a greater risk of privacy breaches. Due to the potential economic and reputational loss, social network operators only publish a limited amount of HSG data for researchers and third-party data analysts.

In this project, we propose an alternative, decentralised solution that lets data analysts collect HSG data directly from volunteers while guaranteeing the volunteers’ privacy. Specifically, users privately calculate and share data about their local views of HSGs. Data analysts aggregate these responses into the information of interest. To the best of our knowledge, no existing work in the literature achieves this goal. Moreover, we will take into account the fact that, in real-life scenarios, users may have different privacy requirements, e.g., due to varying trust in data collectors. We design methods for users to perturb their local data according to their own personalised privacy requirements. In this manner, we give control back to users over their data by letting them determine the level of privacy protection. In addition to precise privacy preservation, our methods can also ensure better utility for the aggregated data when only a small number of users require high-level protection.

To achieve our purpose, we will first extend the notion of local differential privacy to quantify users’ personalised privacy requirements over different types of sensitive information, i.e., vertices and edges. Once the privacy properties have been defined, we will design corresponding privacy-preserving methods for two widely studied data aggregation tasks: query answering and graph synthesis. Query answering is used to aggregate statistics of some structural properties of HSGs while graph synthesis allows data analysts to conduct flexible analysis on synthetic HSGs with similar properties to the original graphs. Last but not least, we will develop a comprehensive evaluation framework to evaluate the effectiveness of our methods and define new measures to quantitatively assess the utility of the aggregated data.
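
A minimal sketch of the personalised local-perturbation idea applied to query answering follows (our illustration, not the project’s method): each user flips their local adjacency bits via randomized response under their own privacy budget ε, and the aggregator debiases the sum.

```python
import math, random

def perturb_bit(bit: int, eps: float) -> int:
    """Randomized response: report the true adjacency bit with prob. p(eps)."""
    p = math.exp(eps) / (1.0 + math.exp(eps))
    return bit if random.random() < p else 1 - bit

def unbiased_edge_count(reports, epsilons):
    """Debias perturbed bits; users with small eps contribute noisier terms."""
    total = 0.0
    for r, eps in zip(reports, epsilons):
        p = math.exp(eps) / (1.0 + math.exp(eps))
        total += (r - (1.0 - p)) / (2.0 * p - 1.0)  # E[term] = true bit
    return total

true_bits = [1, 0, 1, 1, 0] * 200
epsilons = [random.choice([0.5, 2.0]) for _ in true_bits]  # personalised budgets
reports = [perturb_bit(b, e) for b, e in zip(true_bits, epsilons)]
print(sum(true_bits), round(unbiased_edge_count(reports, epsilons)))
```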

Principal Investigator

Marjan Skrobot*

Project title

Transition Of Low-entropy Authentication Ciphersuites Into A Post-quantum World (FuturePass)

Host institution

University of Luxembourg

FNR Committed

€780,000

Abstract

This project aims at facilitating the transition of low-entropy authentication mechanisms into a post-quantum world. We will look at two widely deployed protocol suites that use low-entropy secrets to establish a secure channel: one underlying electronic travel documents and the other securing Wi-Fi networks. Unfortunately, the current authentication approaches used in these protocol suites are vulnerable to quantum attacks. Although quantum computing technology mature enough to affect these protocols is probably years away, it is important to prepare for the future transition to quantum-safe cryptography. There are several reasons for such a pre-emptive approach: First, these systems are widely deployed, have long service lives (adult passports are typically valid for 10 years, and Wi-Fi devices might run for even longer periods), and changes in their specifications for a new generation of devices are slow due to the many parties involved. Second, based on the experience of previous cryptographic migrations – such as those from SHA1 to SHA2 and from RSA towards ECC – one can conclude that it will take years, or even decades, to complete a transition of such scale. Third, cryptographic algorithms and protocols that run on low-cost embedded devices are often not easily upgradable. This is true for both e-travel documents and Wi-Fi-supporting devices.

As a step in the desired direction, we plan to provide a modular, Post-Quantum Password-based Authenticated Key Exchange (PQ-PAKE) protocol suite (including corresponding “hybrid” construction) to advance long-term security of e-travel documents and Wi-Fi protocol suites in terms of quantum resistance and cryptographic agility. We aim to achieve this by first providing a modular formal security analysis of existing protocols within e-travel documents and Wi-Fi protocols, building on top of the relevant body of knowledge in classical and PQ-PAKE, e-travel documents and Wi-Fi security. Then, once the exact properties for secure composition are identified, we will integrate a suitable PQ-PAKE suite into existing systems.
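
The “hybrid” construction mentioned above typically combines a classical and a post-quantum shared secret, so the session key stays safe as long as either component is unbroken. A minimal sketch follows (ours; the two input secrets stand in for, e.g., an ECDH output and an ML-KEM shared secret, and the transcript string is hypothetical).

```python
import hashlib, hmac

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """HKDF-Extract (RFC 5869) with SHA-256."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    """HKDF-Expand (RFC 5869) with SHA-256."""
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

def hybrid_session_key(classical_ss: bytes, pq_ss: bytes,
                       transcript: bytes) -> bytes:
    """Key is secure as long as EITHER input secret resists the attacker."""
    return hkdf_expand(hkdf_extract(b"", classical_ss + pq_ss), transcript)

key = hybrid_session_key(b"\x01" * 32, b"\x02" * 32, b"client||server||params")
print(key.hex())
```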

Also, an in-depth analysis of typical side-channel attacks and other deployment issues that occur in classical PAKEs will be conducted for PQ-PAKEs. Note that PQ algorithms introduce higher computation, memory, and communication costs compared to contemporary algorithms. We will need to take these aspects into account for the final PQ-PAKE selection. Our results should help standardization bodies that are already interested in these research questions make informed decisions when choosing between protocol candidates.

Principal Investigator

Valérie Maquil

Project title

Enhancing Remote Collaboration Across Interactive Surfaces With Awareness Cues (ReSurf)

Host institution

Luxembourg Institute of Science and Technology (LIST)

FNR Committed

€548,000

Abstract

In the 21st century, we face highly complex societal and intellectual challenges that can only be solved when professionals with distinct abilities and resources join their efforts and collaborate. Interactive wall-sized displays provide large benefits for information visualization and visual data analysis. Besides allowing for a better presentation of large amounts of data, they support (collocated) collaboration, as multiple users can access and view content at the same time and easily follow each other’s actions.

However, in many situations, such as during the COVID-19 pandemic or due to geographical barriers, face-to-face collaboration is not feasible and needs to be replaced by remote collaboration. Conventional tools used to support remote collaboration, such as audio-video links, strongly limit non-verbal awareness information, which leads to difficulties in communication and requires additional effort to stay engaged. This lack of awareness is all the more relevant in the context of decision-making at interactive wall displays since, in front of a large screen, collaborators naturally make use of a large number of non-digital body movements and hand gestures. In an attempt to better mediate awareness information and facilitate communication, previous work suggests adding visual cues, such as pointers, sketches, or annotations, onto the common workspace or the live video stream. Such cues have been proposed for smaller workspaces (tabletops or physical tasks) and have only seldom been investigated in the context of remote collaboration across two or more wall displays.

In ReSurf, we will combine design-based research, user-centred design, and user studies with the aim of exploring and studying how mutual awareness can be enhanced for collaborative decision-making in a distributed wall-display setup. We will make use of different wall displays available at LIST (VisWall, 360° Immersive Arena, DemoWall) and first conduct a user study to find out how awareness information is shared in a well-functioning, collocated decision-making context. In an iterative approach, and by progressively integrating results from focus groups and user studies, we will design audio-visual awareness cues that make use of body movements (proxemics, postures, and hand gestures) and eye gaze to support remote collaboration. A series of user studies will allow us to learn about the role and effectiveness of different types of cues. In ReSurf, we thus address the question of how person-oriented awareness cues need to be designed in order to enhance remote collaboration across two physically distributed wall displays.

Through our approach, ReSurf will generate scientific knowledge on the optimal design of awareness support for two or more remotely connected wall displays, and contribute to a next generation of mixed-presence decision-making tools, where people can collaborate smoothly, and enjoy an experience that is as close as possible to collocated collaboration.

Principal Investigator

Sarra Habchi*

Project title

Spotting The Root Causes Of Flaky Tests (SpotFlakes)

Host institution

University of Luxembourg

FNR Committed

€743,000

Abstract

Regression testing is the process of evaluating new code changes to verify that they integrate well into the codebase without breaking existing features. In large software development ecosystems, this process can be hindered by the prevalence of test flakiness. Flaky tests are tests that manifest non-deterministic behaviour, i.e., pass and fail intermittently for the same code changes. These tests cripple continuous integration with false alerts that developers have to investigate manually. As flaky tests become frequent in a codebase, developers can lose trust in their test suites and become more inclined to ignore their outcomes. Hence, real faults can be ignored and buggy products shipped to the user. Both researchers and industrial experts have proposed strategies and tools to detect and isolate flaky tests. However, flaky tests are rarely fixed because developers struggle to understand and debug the root causes of flakiness. Studies involving practitioners confirmed the difficulty of debugging flaky tests and highlighted it as the main challenge for flakiness mitigation. The objective of SpotFlakes is to build a comprehensive suite of solutions that assist developers in debugging flaky tests. To cover the full debugging process, our solutions address three main tasks:

– Classification: We propose a static technique that classifies flaky tests with minimal overhead and costs.

– Reproduction: We envision a technique that reruns flaky tests in variability-aware environments to reproduce passing and failing executions efficiently.

– Root cause localisation: We propose a technique that localises the root causes of flakiness at different granularity levels (e.g., code, modules, and services).

The envisioned techniques can operate separately to assist developers in mitigating flaky tests. Ideally, their combination leads to full automation of the flakiness debugging process.
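
For context, the brute-force baseline that the project’s classification and reproduction techniques aim to improve upon is naive rerunning, sketched here (our illustration; the test command shown is hypothetical).

```python
import subprocess

def rerun_classify(test_cmd: list[str], runs: int = 10) -> str:
    """Naive flakiness check: rerun the same test on the same revision."""
    outcomes = {subprocess.run(test_cmd, capture_output=True).returncode == 0
                for _ in range(runs)}
    if outcomes == {True}:
        return "stable pass"
    if outcomes == {False}:
        return "stable fail"
    return "flaky"  # both pass and fail observed for identical code

print(rerun_classify(["pytest", "tests/test_checkout.py"]))
```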

Principal Investigator

Ezekiel Soremekun*

Project title

Ground-truth Based Program Debugging (GTDebug)

Host institution

University of Luxembourg

FNR Committed

€634,000

Abstract

Context: When a program fails, developers are saddled with the task of finding the fault locations, diagnosing the root cause, and fixing the bug. To alleviate this task, researchers have proposed automated techniques to support developers during debugging activities. In the last two decades, hundreds of automated fault localization (AFL) and automated program repair (APR) techniques have been proposed to support developers during bug diagnosis and bug fixing, respectively.

Problem: Despite the availability of many debugging techniques, they are hardly adopted in industry because they do not account for the human factors in debugging practice. Current debugging techniques are not grounded in ground-truth (GT) debugging information, such as the actual fault locations, bug diagnoses, and repair patches that developers require. Often, these techniques are evaluated without ground-truth information about the actual needs of developers. Consequently, there is a gap between the capability of the proposed debuggers and the actual needs of developers in practice.

As an example, researchers often make assumptions concerning the ground-truth, since software repositories or benchmarks providing ground-truth debugging information are scarce. In the absence of ground truth, researchers simplify the evaluation of AFL techniques by considering the patch locations as substitute fault locations or assume perfect bug understanding, i.e., localizing the first fault location is sufficient even if there are several faulty locations. In this project, we posit that the lack of ground truth inhibits the practical evaluation of debuggers and subsequently leads to their poor performance and low adoption in practice.
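
As a small example of why this matters, a top-k fault-localization score (a common AFL metric; our sketch, with invented data) can change entirely depending on whether true fault locations or mere patch locations are used as the reference.

```python
def top_k_hit_rate(reports, references, k=10):
    """Fraction of bugs whose reference locations appear in the top-k report."""
    hits = sum(any(loc in ref for loc in rep[:k])
               for rep, ref in zip(reports, references))
    return hits / len(references)

# Invented example data: two bugs, two ranked AFL reports.
reports = [["Foo.java:42", "Bar.java:7"], ["Baz.java:3", "Foo.java:10"]]
gt_faults = [{"Bar.java:7"}, {"Qux.java:99"}]   # ground-truth fault locations
patch_locs = [{"Bar.java:7"}, {"Baz.java:3"}]   # patch locations as a proxy
print(top_k_hit_rate(reports, gt_faults))   # 0.5 against ground truth
print(top_k_hit_rate(reports, patch_locs))  # 1.0 against the proxy
```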

Main Idea: The goal of this project is to help bridge the gap between debugging techniques and debugging practice by providing GT-based evaluation techniques that enable researchers to practically assess the utility and performance of debuggers, both in the wild and in the lab. In particular, we propose a GT-based evaluation method that enables researchers to practically evaluate debuggers with actual developers in real-world large-scale software development environments. We also plan to gather GT debugging information and GT experimental factors from professional developers to enable the practical evaluation of debuggers in the lab and inform the development of useful debugging techniques.

Approach: To enable the evaluation of debuggers in the wild, we propose a GT-based evaluation method that applies controlled experimentation to assess the performance and utility of debuggers in real-world software development settings. To collect GT debugging information, we plan to conduct a large-scale human study, observing professional developers while they debug real faults. To elicit ground-truth experimental factors, we will evaluate the empirical effectiveness of debugging techniques with and without commonly used experimental factors and their practical alternatives. Next, we will develop novel evaluation techniques that merge our GT evaluation method, GT debugging information, and GT experimental factors. Finally, we plan to systematically design novel debugging techniques that are effective in supporting developers in practice using GT debugging information.

Implications/Conclusions: Overall, the results of this project would help bridge the gap between debugging aids and debugging practice, by enabling researchers to evaluate debugging tools with more realistic and practical ground-truth debugging information, experimental factors, professional developers, and real software development environments. Furthermore, our proposed GT-based benchmarks and techniques will improve the state of the art in program debugging and encourage industrial adoption of debugging tools.

Subcategory: Integrative materials science and technology – 8 projects

Principal Investigator

Eddwi Hasdeo*

Project title

Navier-stokes Flows In Quantum Materials (NavSQM)

Host institution

University of Luxembourg

FNR Committed

€370,000

Abstract

Understanding the physics of many interacting particles is a formidable task. Surprisingly, such complex systems follow a simple hydrodynamic picture when the momentum-conserving interparticle scattering rate dominates the other scattering processes that relax momentum (e.g., scattering with impurities or phonons). In the hydrodynamic regime, electrons no longer move individually but rather collectively, similar to a viscous fluid. Hydrodynamic transport has recently been observed in clean graphene samples.

However, electron hydrodynamics in graphene mostly shows phenomena similar to those observed in classical fluids. One reason for this resemblance to classical physics is that graphene is a topologically trivial material. Quantum materials, on the other hand, possess a special internal structure of wave functions, known as the Berry curvature, that drives electrons in a nontrivial way. These quantum materials exist in the form of topological insulators, topological (Dirac and Weyl) semimetals, and layered heterostructures. Electron hydrodynamics in quantum materials is expected to show richer features beyond classical-fluid-like ones. In this project, I will explore new hydrodynamic phenomena in quantum materials. Firstly, I propose that the Berry curvature will introduce quantum effects into the Navier-Stokes equation. Spin-orbit coupling might separate charge and spin degrees of freedom, so it is interesting to study their hydrodynamics. Secondly, I will investigate the intriguing situation at the charge neutrality point (CNP) of semimetals. At the CNP, electrons and holes move in the same (opposite) direction under an applied temperature gradient (electric field). These carriers scatter strongly and are known to exhibit Planckian dissipation. The combination of Planckian dissipation with the topology of Weyl semimetals and gapped bilayer graphene is expected to exhibit interesting hydrodynamic features. Thirdly, hydrodynamic transport in three-dimensional (3D) materials has been observed and originates from the electron-phonon interaction. I will focus on 3D Weyl semimetals, with their interesting topology, and investigate how electron-phonon drag viscosity affects the chiral anomaly. Finally, I will investigate hydrodynamic transport in strongly correlated heterostructures. This work package will incorporate dissipative umklapp scattering, which has mostly been neglected in previous studies, and reformulate the quantum kinetic equation to suit strongly correlated systems. This research project will significantly advance our transport models and give insights into designing future electronic devices.
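
For orientation, the classical equation the project starts from can be written schematically as a Navier-Stokes-type momentum balance for the electron fluid (a standard form from the graphene-hydrodynamics literature, not quoted from the proposal):

$$ mn\left(\partial_t + \mathbf{u}\cdot\nabla\right)\mathbf{u} = -\nabla P - en\,\mathbf{E} + \eta\,\nabla^{2}\mathbf{u} - \frac{mn}{\tau_{\mathrm{mr}}}\,\mathbf{u}, $$

where u is the drift velocity, η the viscosity and τ_mr the momentum-relaxation time; the project’s first objective amounts to adding Berry-curvature corrections to equations of this type.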

Principal Investigator

Jorge Iniguez Gonzalez

Project title

Predictive Mesoscale Dynamics Of Ferroelectrics And Related Materials (Ferrodynamics)

Host institution

Luxembourg Institute of Science and Technology (LIST)

FNR Committed

€712,000

Abstract

In this project we will develop methods for predictive simulations of dynamic mesoscale phenomena in ferroelectrics and related materials. To do this, we will construct continuum phase-field models with parameters fitted to reproduce the temperature-dependent results of predictive calculations based on accurate atomistic “second-principles” potentials. In particular, we will compute the (relaxational, inertial) rates controlling the phase-field dynamics so as to replicate the response to fast perturbations (e.g., short electric pulses) obtained from the atomistic simulations. We will thus introduce a new generation of continuum models, free from unchecked approximations and not relying on experimental information. Further, the new methods will be general, in principle applicable to other materials and fields in physics and engineering. Hence, I truly think this project will open the door to a new generation of predictive continuum simulations.
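
Schematically (a standard phase-field form, not quoted from the proposal), the dynamics to be parametrized is an inertial-relaxational evolution of the polarization field P driven by a free-energy functional F:

$$ \mu\,\partial_t^{2}\mathbf{P} + \gamma\,\partial_t\mathbf{P} = -\,\frac{\delta F[\mathbf{P}]}{\delta \mathbf{P}}, $$

where the inertial (μ) and relaxational (γ) coefficients are precisely the rates the project plans to fit to second-principles atomistic simulations.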

We will demonstrate the new methods with applications to two hot problems, which constitute a great motivation for the intended methodological developments. On one hand, we will investigate ferroelectric switching in thin films of the room-temperature multiferroic BiFeO3, whereby the experimentally observed – and very peculiar – two-step switching path enables the full reversal of the magnetization of the material. Our new phase-field methods will allow us to, first, understand the elastic and electric constraints inherent to the ferroelectric multidomain structure of these films and how such constraints condition the switching path; and second, identify ways to optimize the switching (i.e., by reducing the voltage that needs to be applied) while maintaining its desirable features. This problem – which I am already investigating, albeit with ultimately insufficient theoretical tools – is central to the development of a new generation of low-power memories recently proposed by Intel. By providing invaluable physical insight, our new phase-field methods may unblock progress towards a technological breakthrough!

On the other hand, we will use the new methods to study the terahertz (THz) and sub-terahertz resonances of ferroelectric multidomain structures, which have just been observed experimentally in model PbTiO3/SrTiO3 ferroelectric/dielectric superlattices. This is of great interest, as the potential exploitation of the THz range of the electromagnetic spectrum – for ultra-broadband short-range communications – is a focus of the telecoms industry. Interestingly, the THz dynamics of periodic arrays of ferroelectric domains lies between the slower domain-wall relaxations and the faster polar phonon vibrations, and is most likely (strongly) coupled to both. To make things more exciting, it should be possible to tune the system’s THz response and potentially match it to technological needs, either statically (e.g., by suitably choosing the thickness and composition of the layers in the PbTiO3/SrTiO3 superlattice) or dynamically (taking advantage of the ultra-reactivity of these materials to electric and mechanical perturbations). Our new phase-field methods will allow us, for the first time ever, to run predictive dynamic simulations of such complex mesoscopic states, and thus tackle this all-important problem.

Hence, by delivering one key methodological breakthrough, and two timely and important applications, Ferrodynamics will introduce and demonstrate a new generation of much-needed methods for predictive simulations of mesoscale dynamics. Given the considerable excitement in the field of nano-ferroelectrics (e.g., with the recent discovery of electric skyrmions, which indeed form mesoscale lattices) and the potential broad implications of the intended developments (i.e., providing a path towards a new generation of continuum simulation methods), I am convinced this is a timely and pertinent project that is likely to have a very strong and long-lasting impact.

Principal Investigator

Santhana Eswara

Project title

Advanced Imaging Modalities Using Stationary And Scanning Transmitted Helium Ions For Materials Research (AIMSTHIM2)

Host institution

Luxembourg Institute of Science and Technology (LIST)

FNR Committed

€662,000

Abstract

The interaction of energetic ions with solids triggers a multitude of fundamental physical and chemical processes at the atomic scale. Analysis of the energy lost by ions as they travel through materials provides insights into the physical and chemical nature of those materials. The newly emerging possibility to image the energy loss of transmitted ions with nanometer-scale lateral resolution offers unprecedented opportunities to access nanoscale variations in materials properties, which are otherwise partially or completely inaccessible by other techniques.

A key technological breakthrough in the context of ion microscopy was the development of high-brightness Gas Field Ion Sources (GFIS). With GFIS, it is possible to focus He+ ions to a probe size down to 0.5 nm. Indeed, high-resolution secondary electron (SE) imaging by rastering a sub-50 keV He+ ion probe over the surface of the sample is now routinely performed on commercially available Helium Ion Microscopes (HIM). The same source technology also makes it possible to produce Ne+ ions, which are typically used for nanofabrication and for Secondary Ion Mass Spectrometry (SIMS) imaging with < 15 nm lateral resolution. While the high-resolution SE and SIMS imaging capabilities using He+ and Ne+ ions have been spectacularly demonstrated, the possibilities for advanced imaging methods using sub-50 keV He+ ions in transmission mode have not yet been fully explored. The potential of using the HIM in transmission mode to obtain complementary insights about materials has already been recognized, but progress has been largely limited by the unavailability of a suitable experimental setup in commercial HIMs.

Among the main advantages of using transmitted He+ ions instead of transmitted electrons for microscopy and nanoanalysis are that (i) mechanisms related to charge exchange can be studied; (ii) the smaller de Broglie wavelength of ions in comparison to electrons of the same energy means that diffraction contrast in transmitted ion imaging is not significant, which allows direct quantitative analysis of image intensities; and (iii) by virtue of their positive charge polarity, He+ ion channeling trajectories inside solids drastically differ from electron channeling, thereby providing a complementary particle probe to study materials.
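
Point (ii) is easy to verify numerically; the back-of-the-envelope sketch below (ours, non-relativistic, which is accurate to about a percent at 20 keV) compares the two de Broglie wavelengths.

```python
import math

h = 6.626e-34                     # Planck constant, J*s
eV = 1.602e-19                    # joules per electronvolt
m_e, m_He = 9.109e-31, 6.646e-27  # electron and helium-4 masses, kg
E = 20e3 * eV                     # 20 keV beam energy

for name, m in [("electron", m_e), ("He+ ion", m_He)]:
    lam = h / math.sqrt(2 * m * E)  # non-relativistic de Broglie wavelength
    print(f"{name}: {lam * 1e12:.2f} pm")
# electron: ~8.7 pm; He+ ion: ~0.10 pm -> ion diffraction contrast negligible
```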

Within the framework of an EU Horizon 2020 project (npSCOPE, 2017-2020), we developed a dedicated prototype instrument in which a commercial GFIS column is integrated to perform experiments using transmitted He+ and He0; a cryo-stage was also developed for the npSCOPE prototype. Furthermore, within the framework of an FNR CORE project (STHIM, 2017-2020), we developed the Galileo prototype for Transmission He+ Ion Microscopy (THIM) and Scanning THIM (STHIM), using a stationary and a rastered beam of 20 keV He+ ions, respectively. These two powerful and complementary prototypes give us at the Luxembourg Institute of Science and Technology (LIST) experimental capabilities that are unique worldwide for performing advanced imaging modes using transmitted He+ ions.

The present project proposal aims to achieve (i) detailed investigations of variations in ion energy-loss characteristics with nanoscale lateral resolution; (ii) correlation of crystal-orientation-dependent ion energy loss with channeling and non-channeling conditions in imaging mode, with the aim of quantitatively mapping the 3D electronic density within crystals; (iii) investigation of the temperature dependence (from ambient to cryogenic temperature) of energy-loss characteristics in imaging mode; and finally (iv) exploration of the potential of channeled He+ ions for non-destructive mapping of strain fields and the identification of interstitial atom types and ppm-level impurities. These experiments have the potential to unveil hitherto unexplored fundamental nanoscale mechanisms related to energy-exchange and charge-exchange processes occurring during the interaction of slow keV ions with solids, with unparalleled lateral resolution.

Principal Investigator

Massimiliano Esposito

Project title

Theory Of Chemical Complexity (ChemComplex)

Host institution

University of Luxembourg

FNR Committed

€720,000

Abstract

The goal of this proposal is to make use of state-of-the-art methods from nonequilibrium statistical physics to make progress in developing a theory of chemical complexity able to quantify the accuracy, precision, speed, energetic cost and computational power of complex operations performed by open chemical reaction networks (CRNs). Concretely, we will: 1) develop an information thermodynamic framework describing how synthetic molecular motors transfer free energy from chemical to mechanical degrees of freedom and use it to suggest strategies to optimize their performance; 2) extend the thermodynamic theory of open CRNs to situations where energy can be supplied by light and/or by electrodes and use it to study energy transduction and storage in batteries and artificial photosynthetic reaction centres; 3) propose a circuit theory that greatly simplifies the description of the dynamics, thermodynamics and fluctuations of complex CRNs, in analogy to the circuit theory used to describe electrical circuits in terms of circuit components characterized by I-V curves and Kirchhoff’s laws; 4) use these findings to study the dissipation-accuracy-speed trade-offs at work in various schemes of chemical computing. In doing so we will push the boundaries of knowledge on CRNs.
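
A cornerstone of this framework (a standard result in the stochastic thermodynamics of CRNs, stated here for orientation rather than quoted from the proposal) is the non-negative entropy production rate written in terms of forward and backward reaction fluxes:

$$ \dot{\Sigma} \;=\; k_{\mathrm{B}} \sum_{\rho} \left(J_{+\rho} - J_{-\rho}\right)\ln\frac{J_{+\rho}}{J_{-\rho}} \;\geq\; 0, $$

which quantifies the dissipation against which the accuracy-speed trade-offs of objective 4 must be weighed.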

Principal Investigator

Sebastjan Glinsek

Project title

Flash-lamp Assisted Growth Of Piezoelectric Oxides (FLASHPOX)

Host institution

Luxembourg Institute of Science and Technology (LIST)

FNR Committed

€708,000

Abstract

Glass has been in use for more than 5000 years and is everywhere around us. It has an amazing span of functionalities, from visual aesthetics to electronic applications. Electronic oxides, on the other hand, are one of the most fascinating classes of materials. Their integration on glass underpins touch screens, which have become ubiquitous. However, there is an important class of electronic oxides, piezoelectrics, whose potential on glass has not yet been fully revealed. Piezoelectrics can transform electrical energy into mechanical energy and vice versa. Intense research on their integration on silicon-based electronics – especially of lead zirconate titanate (PZT), the flagship piezoelectric – has resulted in widespread applications like actuators in inkjet printers and autofocus lenses in mobile phones.

Two main challenges impede the integration of thin-film piezoelectrics on glass: (1) processing temperatures above 650 °C, which are beyond the usable range of commercial glasses; and (2) the incompatibility of standard deposition technologies, such as vacuum- and spin-coating-based methods, with large glass sheets.

In the FLASHPOX project, we will address these two challenges by growing lead zirconate titanate on glass using flash-lamp annealing, a method that enables heating of thin-film materials on temperature-sensitive substrates (challenge #1). The light sensitivity of the precursors will be enhanced through chemical modification. As the deposition technique, we will use inkjet printing, a digital additive manufacturing technique that enables printing on large sheets (challenge #2). The project will enable the growth of piezoelectric thin films below 500 °C and will fully unleash their potential in glass-based electronics.

Principal Investigator

Constance Toulouse

Project title

Inducing Negative Strain States By Implanting Helium In Matter (INSIGHT)

Host institution

University of Luxembourg

FNR Committed

€385,000

Abstract

The objective of this project is to take advantage of a technique we developed recently to induce a new type of strain by implanting helium. Because helium is a noble gas, the atoms implant interstitially (without forming bonds), inflating the crystal’s unit cell from within and inducing an effective negative pressure. Our technique uses a Helium Ion Microscope (HIM), whose side effect is the implantation of ions, and takes advantage of the nanoscale resolution of the microscope to impose the strain locally, even allowing the strain state of the implanted material to be patterned spatially.

We have obtained results showing the structural and strain effects of implantation. Our goal is now to use the technique to locally tune the properties of functional materials.
We chose two model materials presenting potential for such a study: (i) the room-temperature model multiferroic BiFeO3, in which we intend to modify the ferroelectric domains and polarization through implantation, as well as the magnetic spin arrangements, and observe the effects on the magnetoelectric coupling between these properties; and (ii) the antiferroelectric material PbZrO3, in which we envision inducing ferroelectricity locally under implantation.

Principal Investigator

Ariane Soret

Project title

Thermodynamics Of Non Linear Quantum Optics (ThermoQO)

Host institution

University of Luxembourg

FNR Committed

€359,000

Abstract

Light is a versatile tool, used extensively in science, from the non-destructive exploration of biological tissues to nanotechnologies and quantum simulation. In all these applications, the ability to control the fluctuations which arise from the interaction with the environment is crucial. This is particularly relevant in the context of rapidly developing nanotechnologies, where fluctuations become significant compared to the average values. Despite the outstanding progress made in recent years in the ability to control light in complex media, the race towards smaller scales in nanodevices is opening new challenges, showing the limits of traditional approaches and the need for a refined theoretical framework to understand energy transfers between light and matter at small scales and far from equilibrium.

A powerful theory, stochastic thermodynamics, has emerged in the past two decades to study energy transfers in complex systems far from equilibrium. It is usually applied to complex electronic, colloidal and biological systems. This theory aims at obtaining universal (i.e., independent of the microscopic details) laws describing energy fluctuations and transfers in complex systems, such as bounds on the fluctuations of devices designed to produce a current with a certain precision.
The field of quantum thermodynamics aims at extending this framework to the quantum realm and at exploring the role of quantum effects in thermodynamics. Until recently, most applications have focused on simple or non-interacting quantum systems, and the quantum thermodynamics of interacting photons has not yet been examined.

The goal of this project is to use ideas from stochastic thermodynamics to build a quantum thermodynamics theory of light, both without and with photon-photon interactions. To reach this goal, the project is broken down into three objectives. First, we will investigate the thermodynamics of an electromagnetic field interacting with thermal baths at very different temperatures (far from equilibrium). A spectacular feature of equilibrium (and near-equilibrium) quantum electrodynamics is the emergence of Casimir forces, exerted by the electromagnetic field on macroscopic objects. Casimir forces play an important role in nanotechnologies, but are still poorly understood far from equilibrium. Our work will improve the understanding of these forces. Second, we will study the thermodynamics of photons interacting with quantum objects (e.g., qubits, 3-level systems). This will give estimates of the energetic cost of quantum computation, and will also be relevant for quantum heat engines. Finally, we will turn to the macroscopic limit and focus on the emergence of nonlinearities (photon-photon interactions). We will build a framework to study the thermodynamics of the rich phenomena induced by the nonlinear nature of the system, namely metastable states of light and dynamical phase transitions. In turn, this will give new insight into how to enhance the performance of devices built from a large number of quantum machines. We will also use this framework to derive universal laws on the cost associated with transitions between multistable states, similar to the Landauer principle in information theory. This will give quantitative tools to assess the stability of a multistable state, and will be extremely useful for the expanding field of quantum simulation using nonlinear quantum optics.
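
The Landauer principle invoked above sets the benchmark that such universal laws generalize: erasing one bit of information at temperature T dissipates at least

$$ \langle W_{\mathrm{diss}} \rangle \;\geq\; k_{\mathrm{B}} T \ln 2, $$

and the project aims at analogous bounds for transitions between multistable optical states.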

To sum up, this project will improve our understanding of quantum electrodynamics and nonlinear optics, provide new insight into the thermodynamics of nonlinear systems, and produce quantitative tools to improve the efficiency of quantum engines, nanodevices, and quantum computing or quantum simulation setups.
The combination of the complementary backgrounds of my host and myself is ideal for realizing this outstanding goal of formulating a quantum thermodynamics theory of nonlinear optics. This achievement would push my career to the forefront of current research.

Principal Investigator

Jan Lagerwall

Project title

Bio-sensing The Frugal Way With Liquid Crystal Spheres (BIOFLICS)

Host institution

University of Luxembourg

FNR Committed

€566,000

Abstract

The strongly interdisciplinary project BIOFLICS intends to explore the power of liquid crystal self-assembly (and the functional materials it gives rise to) to develop low-cost, simple tests to reliably detect respiratory pathogens. Our aim is to develop tests that are simple enough to be conducted by patients themselves in their homes, yet more reliable than current Rapid Antigen Tests and capable of providing a quantitative output. With the intent to eventually make these tests available to everyone, everywhere, we adopt a ‘frugal technology’ approach, in which we minimize the need for advanced technology and expensive materials and commit to sharing the research results and technology in an ‘open source’ format, thus enabling local production of many of the components. The three core objectives of BIOFLICS reflect the key innovative steps, which are (1) the replacement of nasopharyngeal/throat swabbing by a simple and much less invasive face mask sampling approach, using water-soluble or swellable filters of non-woven fibers as a medium; (2) the replacement of the low-contrast and low-information-content color read-out of current Rapid Antigen Tests with the circularly polarized, narrow-wavelength-band selective reflections of polymerized cholesteric liquid crystal-derived spherical reflectors, enabling a low detection limit and a quantitative analysis using a standard mobile phone and accessories costing no more than a few euros; and finally (3) the exploitation of the desire of liquid crystals in spherical shell topology to expel order-disturbing foreign species into topological defects, with the consequent change in their optical appearance, to reveal bacterial endotoxins. This broad scope, focused on the target of delivering simple yet reliable pathogen detection in an entirely novel way, is possible thanks to an interdisciplinary collaboration between the University of Luxembourg (UL; a soft matter physics group and a biochemistry group) and the Luxembourg Institute of Science & Technology (LIST; experts in nanomaterials and surface functionalization). We expect our unconventional approach to biosensing to generate high-impact publications and conference presentations of large interest to the scientific community, while, if successful, making its most important contribution through its enormous societal impact: improving global health by enabling frugal health screening solutions that can be implemented anywhere in the world, even in remote areas without electricity or other advanced infrastructure.

Subcategory: Future computer and communication systems – 2 projects

Principal Investigator

Pascal Bouvry

Project title

A Concurrent Model Of Computation For Trustworthy Gpu Programming (COMOC)

Host institution

University of Luxembourg

FNR Committed

€387,000

Abstract

Today, central processing units (CPUs) are reaching a plateau in terms of efficiency, with only a 4% increase over the period 2015-2018 (SPECint benchmarks). Over the same period, the number of cores in graphics processing units (GPUs) has increased by 60%, reaching 6912 cores on the latest Nvidia A100 card. Although the hardware has become massively parallel and could lead to large improvements in efficiency, GPUs remain difficult to program for two reasons.

Firstly, the shared memory model of GPUs makes it hard to write correct and deterministic programs.
Secondly, the single instruction, multiple data (SIMD) architecture of GPUs makes general-purpose programming a daunting task.

Indeed, programs are still written first in the mental frame of sequential computation, then transformed to fit the GPU. Our project «A Concurrent Model of Computation for Trustworthy GPU Programming» (COMOC) tackles the question of how to apply the power of GPUs to programming general-purpose, correct systems.

To this aim, we propose the design, implementation and evaluation of a novel concurrent model of computation, based on lattice theory, suitable for execution on GPUs.
Grounding our approach in a mathematical theory offers programmers a sound model of parallel computation that exploits the full power of GPUs.

Lattice theory is crucial, as it allows concurrent operations over the same data structure in such a way that the combination of their results correctly contributes to the solution.
We will show that applying this model to the massive parallelism of the GPU leads to efficient and deterministic computations.
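
A minimal sketch of the lattice idea (ours, not the project’s design): domains form a lattice ordered by inclusion, and every propagator only tightens them, so any execution order, sequential or parallel, reaches the same fixpoint.

```python
import random

# Interval domains form a lattice under inclusion; each propagator only
# tightens a domain (moves down the lattice), so the common fixpoint is
# deterministic whatever order -- or parallel schedule -- is used.
doms = {"x": (0, 10), "y": (3, 8)}

def prop_le(a, b):                  # constraint a <= b
    (al, ah), (bl, bh) = doms[a], doms[b]
    doms[a] = (al, min(ah, bh))     # a cannot exceed b's upper bound
    doms[b] = (max(bl, al), bh)     # b cannot be below a's lower bound

def prop_x_ge_5():                  # constraint x >= 5
    lo, hi = doms["x"]
    doms["x"] = (max(lo, 5), hi)

props = [lambda: prop_le("x", "y"), prop_x_ge_5]
for _ in range(10):                 # crude fixpoint loop
    random.shuffle(props)           # order does not affect the result
    for p in props:
        p()
print(doms)                         # always {'x': (5, 8), 'y': (5, 8)}
```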

In order to illustrate, evaluate and refine our concurrent model of computation, we focus on constraint reasoning, a field of artificial intelligence (AI) which encompasses all exact and heuristics techniques for solving combinatorial problems.

Two objectives of COMOC are to implement an exact parallel constraint solver based on constraint programming, and a heuristic parallel constraint solver based on genetic algorithms, both within the developed model of computation.

Constraint reasoning is a large research area with applications spanning multiple fields such as operational research (scheduling and transportation problems), healthcare (cancer treatments, fair kidney exchange) and sustainability (smart buildings, energy efficient cloud data center).

Because constraint solvers are general-purpose, designing a correct and efficient parallel constraint solver can improve any of the mentioned applications.

Our overarching goal is to demonstrate that the power of GPUs can be leveraged for various AI applications given the right model of computation.

Principal Investigator

Eva Lagunas

Project title

Leveraging Artificial Intelligence To Empower The Next Generation Of Satellite Communications (SmartSpace)

Host institution

University of Luxembourg

FNR Committed

€606,000

Abstract

Research in Artificial Intelligence (AI) has been active for several decades, but lately, with the exponential increase in the amount of available data, new scenarios and use cases have emerged and have particularly influenced the field of wireless communications. In fact, AI has recently received significant attention as a key enabler for future 5G and beyond wireless networks. The satellite world is also progressing towards having all the ingredients that make AI suitable for particular satellite-related use cases. Similar to terrestrial communications systems, satellites are moving towards a centrally controlled network composed of distributed low-complexity gateways (GWs) implementing the functionalities from baseband processing to packet processing, while the centralized unit gathers the intelligence of the network, analysing, processing and interpreting the monitoring signals collected from the Network Operation Control (NOC) and the Satellite Operation Control (SOC), and reacting accordingly.
While AI for terrestrial wireless communications has recently received a lot of attention from the academic community, its application to satellite communications is in its infancy. Note that herein we focus on the PHY/MAC layer and the Radio Resource Management part of the system, excluding Space Operations, Assembly, Integration & Test (AIT) and Earth Observation (EO), where Machine Learning (ML) has been well studied for anomaly detection and image processing, respectively. The SmartSpace project would like to jump on the AI bandwagon and investigate what AI can bring to satellite communications with regard to the following aspects: (i) algorithm acceleration, or how to address the complex resource optimization problems typically encountered in satellite communications; (ii) estimation of unknown or inaccurate system models, or how to complement and improve procedures that rely on sometimes inaccurate channel models; and (iii) network load prediction, or how to correlate human patterns to predict satellite data traffic, which in turn can be used to better distribute satellite resources where needed.

Furthermore, SmartSpace will not only study the direct applicability of ML techniques to satellite communications problems but will also bring new contributions to the table by exploiting the team’s background in communications and signal processing and the industrial advice of a worldwide satellite operator such as SES.

Last but not least, AI/ML has been, and sometimes still is, perceived negatively by the general public (e.g., “AI is going to steal people’s jobs”), and industry is rather sceptical or does not really understand the technology. Uncertainty and scepticism may negatively impact the success of this technology. Therefore, SmartSpace will make an effort to collaborate with the FNR and the Government of Luxembourg to fight disinformation concerning ML and AI, for both the general public and the space industry.

Subcategory: Space telecommunications, earth observation and space resources – 2 projects

Principal Investigator

Juan Merlano Duncan

Project title

Ground-based Distributed Beamforming Harmonization For The Integration Of Satellite And Terrestrial Networks (ARMMONY)

Host institution

University of Luxembourg

FNR Committed

€641,000

Abstract

Satellite communications are expected to play a fundamental role in beyond-5G and 6G networks. This is motivated by the vast service coverage capabilities and the reduced vulnerability of spaceborne vehicles to physical attacks and natural disasters. In particular, satellite networks will foster the roll-out of 5G service in un-served areas that cannot be covered by terrestrial 5G networks (isolated/remote areas, onboard aircraft, or vessels) and underserved areas (e.g., sub-urban/rural areas).

Satellites will also be instrumental in upgrading the performance of limited terrestrial networks cost-effectively, reinforcing 5G service reliability, and enabling 5G network scalability by providing data delivery towards the network edges or even user terminals. This new generation of network technologies will open prospects for the new digital economy and business models, enhancing human development, including in disadvantaged and isolated areas.

On the other hand, the trend in 5G mobile communications goes toward distributed and decentralized architectures, the use of small cells, and even the implementation of cell-free topologies where a large number of small antennas distributed over a wide area serve a set of mobile users. This trend might seem to contradict the idea of using spaceborne links to provide connectivity to terrestrial networks, since satellite reception usually requires large and highly directive antennas (parabolic reflectors). The complications are worse for satellites flying in non-geostationary orbits, in which case the antenna needs to be steered to point at the moving satellite, significantly increasing the cost of the system. One preliminary solution is the use of active antenna arrays; however, simply scaling current active antenna architectures could result in unacceptable mass, volume, and power consumption. To solve these challenges, we propose the use of collaborative beamforming among the set of distributed terrestrial antennas for the reception of both the satellite and the terrestrial signals.
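
The payoff of coherent combining can be sketched in a few lines (our simulation; the phase offsets are assumed perfectly estimated, which is precisely the synchronization challenge the project addresses).

```python
import numpy as np

rng = np.random.default_rng(0)
n_ant, n_samp = 16, 4000
s = np.exp(1j * 2 * np.pi * 0.01 * np.arange(n_samp))   # narrowband signal
phases = rng.uniform(0, 2 * np.pi, n_ant)               # per-antenna offsets
noise = (rng.standard_normal((n_ant, n_samp)) +
         1j * rng.standard_normal((n_ant, n_samp))) / np.sqrt(2)
x = np.exp(1j * phases)[:, None] * s + noise             # received signals

# Conjugate (matched) weights align the copies before averaging; the
# residual noise power drops by ~1/n_ant (an array gain of ~12 dB here).
y = (np.exp(-1j * phases)[:, None] * x).mean(axis=0)
print(f"residual noise power: {np.var(y - s):.3f} (~1/{n_ant})")
```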

The objective of this project is to identify and analyze the technical challenges associated with these new architectures, such as network synchronization and the requirement of a low-latency, high-rate fronthaul owing to the use of distributed signal processing, and subsequently to carry out prototyping and experimental demonstration of the concept in a laboratory environment.

Principal Investigator

Laurent Pfister

Project title

Behaviour Of O And H Stable Isotopes Of The Water Molecule In Lunar Regolith (LUNAQUA)

Host institution

Luxembourg Institute of Science and Technology (LIST)

FNR Committed

€964,000

Abstract

While historical lunar exploration missions had concluded that the Moon was a ‘bone-dry’ planetary body, this paradigm has recently been revised with the discovery of water on the Moon, potentially extending beyond so-called ‘cold traps’ in polar regions. Prior to any In-Situ Resource Utilization (ISRU) of lunar water resources, manifold scientific and technological challenges first need to be ironed out. With the existing knowledge, the calculation of water ice abundance and related O-H isotope signatures remains highly uncertain – eventually hindering the assessment of potential lunar water resources and the interpretation of scientific data. Here, we propose to leverage our recent development work on an innovative instrumental prototype geared towards carrying out water ice sublimation experiments under higher vacuum and lower temperatures than reached by previously developed setups. The prototype targets the collection of the sublimated and remaining water ice fractions at high recovery yields and with no extra instrumental isotope fractionation that might impact subsequent isotope analyses. Our development work has blazed a trail that we now intend to follow to go beyond the current understanding of water ice sublimation and related isotope fractionation.

First, we will test the conjecture that the instrument’s recovery yield, pressure and temperature performances meet the requirements not to cause instrumental isotope fractionation beyond that expected from the sublimation experiments. Second, we will assess water ice and icy regolith sample preparation protocols, relying on different simulant materials representing the two main lunar regolith compositions expected (a) at the lunar poles (highlands) and (b) at lower latitudes (mare). Third, we will use the prototype to test our hypothesis on sublimation-induced mass loss and related water ice isotope fractionation. Fourth, we will consolidate the theoretical sublimation isotope fractionation model for simulating the isotope fractionation during ice sublimation in low-pressure, low-temperature systems.

Ultimately, this model and the sublimation experiments shall help to anticipate (i) the sublimation rate and the amount of water ice expected to be lost during the sample processing chain of the LUNA 27 mission’s payload (PROSPECT) and (ii) the isotope fractionation factor and the isotope fractionation expected during sample processing. This shall allow the measured water abundance and isotope signature to be corrected.
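For readers unfamiliar with the underlying theory: sublimation-driven isotope fractionation is commonly described by a Rayleigh distillation law, in which the remaining ice becomes progressively enriched in the heavy isotope as mass is lost. The following minimal sketch is purely illustrative (the fractionation factor and initial composition are invented, and the project's actual model is far more detailed), but it shows the expected direction of the effect:

# Rayleigh distillation: R_ice(f) = R0 * f**(alpha - 1), where f is the
# fraction of ice remaining and alpha the vapour/solid fractionation factor.
import numpy as np

alpha = 0.985                 # hypothetical effective 18O/16O factor (< 1)
delta0 = -200.0               # hypothetical initial delta-18O of the ice (permil)
R_std = 2.0052e-3             # VSMOW 18O/16O reference ratio

f = np.linspace(1.0, 0.1, 10)                 # fraction of ice remaining
R0 = (delta0 / 1000.0 + 1.0) * R_std
delta_ice = ((R0 * f ** (alpha - 1.0)) / R_std - 1.0) * 1000.0

for fi, di in zip(f, delta_ice):
    print(f"f = {fi:4.2f}   delta18O(ice) = {di:7.1f} permil")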

Subcategory: Autonomous and intelligent systems and robotics for earth and space – 1 project

Principal Investigator

Djamila Aouada

Project title

Enabling Learning And Inferring Compact Deep Neural Network Topologies On Edge Devices (ELITE)

Host institution

University of Luxembourg (SnT)

FNR Committed

€749,000

Abstract

Over the last decade, private and public European sectors have been investing increasingly in space, from scientific to commercial missions. There is an increased interest in the space sector in improving the computational capabilities of the edge devices utilized in these missions. ELITE aims to empower these devices with compact Deep Neural Network (DNN) architectures that have fast and accurate inference capabilities. Current state-of-the-art methods are still unable to bring to edge devices the potential that DNNs have demonstrated in terrestrial applications. Exploiting the power of DNNs on space missions’ edge devices can enable more performant functionalities such as smart debris detection and tracking, self-navigation for collision avoidance, automated satellite docking, and effective remote sensing capabilities.

DNNs can progressively represent more complex decision functions and extract richer semantic information from input images. However, complex DNNs do not fit the edge devices with limited memory and computational resources that are commonly used in space missions. Furthermore, while it is essential to reduce the space-to-earth communication overhead, these missions typically downlink a large amount of data to the ground stations for extra processing, even if the information has no scientific relevance. Hence, in order to reduce the heavy data processing and transfer operations, fast on-board data analysis and autonomous operations are required. Compact DNNs with fast and accurate inference capabilities shall thus have a tangible impact on on-board data processing, especially in nanosatellites with limited lifetimes.

To this end, the primary goal of the project “ELITE: Enabling Learning and Inferring compact deep neural network Topologies on Edge devices” is to investigate new ways to build compact DNNs from scratch by: 1) using efficient latent representations and their factors of variation, and 2) exploiting Neural Architecture Search (NAS) based techniques for minimal deep architectural design. The final objective is to construct compact DNN models suitable for edge devices with limited computational capabilities in space missions.
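To make the notion of a “compact DNN” concrete, the sketch below shows one standard compression ingredient, magnitude pruning, using PyTorch. This is a generic illustration and not the project's method, which instead targets latent representations and NAS-based design; the toy network and the 80% pruning ratio are assumptions:

# Magnitude pruning: zero out the smallest weights of a layer, shrinking the
# effective model so it fits memory-constrained edge hardware.
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(                 # toy network standing in for a real DNN
    nn.Linear(64, 128), nn.ReLU(),
    nn.Linear(128, 10),
)

prune.l1_unstructured(model[0], name="weight", amount=0.8)  # drop 80% of weights
prune.remove(model[0], "weight")       # make the pruning permanent

sparsity = (model[0].weight == 0).float().mean().item()
print(f"layer sparsity: {sparsity:.0%}")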

ELITE will be carried out by a complementary Consortium of three partners: the Computer Vision, Imaging and Machine Intelligence (CVI2) Research Group of SnT at the University of Luxembourg (UL) as the Coordinator, and two non-contracting partners, the Melbourne Space Laboratory (MSL) at the University of Melbourne (UoM) and Lift Me Off (LMO), a space company developing the necessary subsystems & components in the fields of Propulsion and Space Situational Awareness (SSA).

Subcategory: Fintech/RegTech and transformative applications of distributed ledger technologies – 1 project

Principal Investigator

Gilbert Fridgen

Project title

Privacy-preserving Tokenisation Of Artworks (PABLO)

Host institution

University of Luxembourg (SnT)

FNR Committed

€817,000

Abstract

Digital information can be exchanged in a peer-to-peer manner via the internet, but the digital transfer of private property rights typically requires intermediaries such as banks and asset exchanges. The Bitcoin network disrupted this and famously allows participants to exchange fungible tokens – currency units named bitcoins – in a peer-to-peer manner. The Ethereum network took this one step further and allowed participants to exchange both fungible and non-fungible tokens. If non-fungible tokens are used to exchange private property rights for unique assets, this process is commonly referred to as “tokenisation”.
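The fungible/non-fungible distinction can be captured in a few lines of code. The following self-contained sketch is purely illustrative (no real blockchain, smart-contract standard, or project design is implied): it contrasts interchangeable currency balances with a registry in which each token identifier stands for exactly one unique asset.

# Toy ledger: fungible balances vs. a non-fungible token (NFT) registry.
from dataclasses import dataclass, field

@dataclass
class ToyLedger:
    balances: dict = field(default_factory=dict)   # fungible: address -> amount
    nft_owner: dict = field(default_factory=dict)  # non-fungible: token_id -> address

    def transfer(self, sender, receiver, amount):
        assert self.balances.get(sender, 0) >= amount, "insufficient funds"
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

    def transfer_nft(self, sender, receiver, token_id):
        assert self.nft_owner.get(token_id) == sender, "not the owner"
        self.nft_owner[token_id] = receiver        # a unique right changes hands

ledger = ToyLedger(balances={"alice": 10}, nft_owner={"artwork-42": "alice"})
ledger.transfer("alice", "bob", 3)                 # any 3 units are as good as any other
ledger.transfer_nft("alice", "bob", "artwork-42")  # this token represents one asset

Tokenisation in the sense used above corresponds to minting such a unique token whose ownership record is meant to track the legal ownership of the underlying artwork.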

Investors are attracted by a range of tokenisation opportunities, from rare and precious artefacts to real estate or unique digital artworks. The art and collectibles market is especially important for Luxembourg, as is the growing ArtTech industry, which provides fintech solutions for art investors and managers of private collections. Some major tokenisation challenges are, however, yet to be addressed. Digital property rights must be authentic, legally valid, and exchangeable across tokenisation platforms so that investors are not locked to any particular platform; and tokenisation must satisfy both art investors’ demands for privacy and regulators’ demands for transparency and auditability.

In response to these challenges and the shortcomings of existing tokenisation approaches, we plan to design and implement a solution that intersects decentralised finance and digital identity management. We will use tokenisation to improve the liquidity and accessibility of alternative asset markets. For privacy’s sake, we will use zero-knowledge proofs and state-of-the-art cryptographic methods. For transparency’s sake, we plan to identify and authenticate both assets and network participants using an emerging standard called Verifiable Credentials – a novel approach that does not lock assets and investors to any particular tokenisation platform and makes real-world trust frameworks accessible to blockchains. Our research responds to the interests of Luxembourg’s art and collectibles investors, but its implications extend much further. A successful European Central Bank digital currency, for example, must comply with regulation, and it should offer European citizens cash-like privacy features.

Subcategory: Fundamental tools and data-driven modelling and simulation – 1 project

Principal Investigator

Pascal Bouvry

Project title

Cloud-based Computational Decision By Leveraging Artificial Ultra Intelligence (CBD)

Host institution

University of Luxembourg

FNR Committed

€648,000

Abstract

We make decisions every day. Some are driven by habit and intuition, while others are crucial and result from careful deliberation. This research focuses on the critical and strategic decisions of an organisation, made with careful reflection, because these decisions require an enormous amount of resources and time. Their consequences have many profound impacts on organisational development and are often irreversible. Therefore, both senior executives and researchers need to understand how and why these decisions are made, for what purpose, in what context and with what objectives. Moreover, they want to know how to leverage the collected datasets for the wisdom of making strategic decisions in organisations.

With the recent advances in machine learning (ML), artificial intelligence (AI) and other computational technologies, the process of decision-making analysis becomes much more predictable, reliable, transparent, reconfigurable, modular, and cost-effective. Although many organisations have adopted ML and AI in their strategic decision-making practices, there is still a research gap in terms of decision-making frames, objectives, and feedback on decision outcomes or the learning process. Most previous studies focused on the rationality of adopting different types of optimising techniques or algorithms for a particular kind of application. However, we do not make every decision according to deliberation or rationality; many decisions depend heavily on our value frames, intuition and emotion.

The research question is, “how can we construct novel knowledge representation in a strategic decision space to reflect multiple frames, build a loss function based on the representation and then select the right optimizer for the loss function?”

The objectives of this project are to: 1) establish a knowledge framework reflecting multiple frames for strategic decision-making; 2) build the most effective loss function for the strategic decision-making representation; 3) discover the right optimizer for the loss function; 4) split the large training dataset into multiple smaller batches and send them to the cloud for learning (see the sketch below); and 5) write a prototype program or web-based interface that allows Luxembourg SMEs to run a trial.
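A minimal sketch of objectives 2–4 might look as follows (the data are invented, and the project's actual knowledge representation, loss construction and optimizer selection are open research questions): define a loss over a decision representation, split the data into batches, and fit with a chosen optimizer.

# Mini-batch gradient descent on a squared-error loss over hypothetical
# decision features; each batch could be dispatched to a cloud worker.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 5))                     # hypothetical features
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(size=10_000)

w = np.zeros(5)                                      # decision-model weights
batch_size, lr = 256, 0.1
batches = np.array_split(rng.permutation(len(X)), len(X) // batch_size)

for idx in batches:
    Xb, yb = X[idx], y[idx]
    grad = 2 * Xb.T @ (Xb @ w - yb) / len(idx)       # gradient of the loss
    w -= lr * grad                                   # plain SGD as the "optimizer"

print("fitted weights:", np.round(w, 2))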

The project will advance the knowledge representation of strategic decision-making processes, helping the majority of Luxembourg organizations and SMEs to respond quickly to the rapidly changing business environment of the 4th industrial revolution.

The expected outputs of this project include 6 high-quality papers and a prototype ML program that allows any ordinary Luxembourg SME to run a strategic decision-making process via a web portal.

Subcategory: Social migration and social cohesion / cultural identities, cultural heritage and nationhood – 8 projects

Principal Investigator

Frederic Docquier

Project title

Cross-border Mobility, Housing Market Developments, And Inequalities (CRHOUSINQ)

Host institution

Luxembourg Institute of Socio-Economic Research (LISER)

FNR Committed

€683,000

Abstract

This project aims to uncover the interdependencies between economic concentration, labor mobility, housing market developments and inequality. It focuses on the economy of Luxembourg and the Greater Region, characterized by high economic growth, booming housing prices and dynamic labor mobility (through both immigration and cross-border commuting), features that can also be observed in other regional poles of growth around the world. The project aims to shed light on the sustainability, drivers and distributional effects of the core-periphery developments observed in the Greater Region. This requires collecting and merging data from several sources, as well as developing state-of-the-art empirical and theoretical methods to understand causation links and interactions between variables. The project proceeds in three steps. A first work-package focuses on the determinants of labor mobility and consists of two tasks: (i) collecting and harmonizing data on cross-border mobility, housing prices and economic opportunities in the Greater Region between 2005 and 2020; (ii) developing an empirical strategy to understand workers’ joint decision to commute or to change residence. A second work-package examines the welfare and distributional impacts of labor mobility and the implied housing market developments by: (i) empirically assessing the role of massive labor inflows on the domestic labor market, and (ii) estimating the effects of economic growth and labor mobility on real estate prices. Finally, the last work-package explores the interdependencies between growth, labor mobility and housing market developments in a general equilibrium setting. It consists of two tasks: (i) quantifying the welfare and distributional implications of the Luxembourg “growth miracle” using a macroeconomic model accounting for various interactions; (ii) using the model to anticipate future trends and investigate the consequences of policy reforms as well as changes in the sociodemographic and economic environments in the Greater Region.

Principal Investigator

Michel Beine

CORE bilateral: SNF

Project title

Modeling Migration Intentions Using Advanced Discrete Choice Models (MIGDCM)

Host institution

University of Luxembourg

FNR Committed

€480,000

Abstract

This research program proposes extensions of the Cross-Nested Logit (CNL) approach to the modelling of location choices. The CNL better captures the stochastic structure of these choices through overlapping nests of alternative locations, and accounts for deviations from the property of independence of irrelevant alternatives. The research program extends the recent work of Beine, Bierlaire and Docquier (2021). This may be seen as pioneering work addressing important issues such as substitution across alternative destinations; it nevertheless leaves many issues unaddressed. In this research program, we tackle the most pressing issues in the use of the CNL. We will first evaluate the transferability of the CNL to contexts other than the particular case of India. We will develop the technique of sampling of alternatives to include a comprehensive set of destinations within the CNL. We will improve the specifications, in particular by modelling individuals’ perceptions through latent variables, and we will evaluate the relevance of using preparation plans for migration, rather than pure intentions, in capturing self-selection factors of mobility. The program will run for 36 months and will feature a close collaboration between two teams, one located at the University of Luxembourg and one located in Lausanne at the Ecole Polytechnique Fédérale.
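For readers unfamiliar with the model, the CNL choice probability has a closed form: each alternative i is allocated to (possibly several) nests m with weights α_im, and P(i) = Σ_m P(m)·P(i|m). The sketch below evaluates these probabilities for three made-up destinations and two overlapping nests (all utilities, allocations and nest scales are invented; the project's specifications are far richer):

# Cross-Nested Logit probabilities for a toy 3-alternative, 2-nest example.
import numpy as np

V = np.array([0.5, 0.2, -0.1])          # deterministic utilities
alpha = np.array([[1.0, 0.0],           # allocation of alternatives to nests
                  [0.5, 0.5],           # (alternative 2 belongs to both)
                  [0.0, 1.0]])
mu, mu_m = 1.0, np.array([2.0, 1.5])    # global and nest-specific scales

y = alpha * np.exp(V)[:, None]          # alpha_im * exp(V_i)
inner = (y ** mu_m).sum(axis=0)         # sum_j (alpha_jm e^{V_j})^{mu_m}
P_nest = inner ** (mu / mu_m) / (inner ** (mu / mu_m)).sum()   # P(m)
P_in_nest = (y ** mu_m) / inner                                # P(i | m)
P = (P_in_nest * P_nest).sum(axis=1)                           # P(i)

print(np.round(P, 3), "sum =", P.sum())  # probabilities sum to 1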

Principal Investigator

Konstantinos Tatsiramos

Project title

Social Origins And Intergenerational Persistence Of Socioeconomic Outcomes (ORIGINS)

Host institution

University of Luxembourg

FNR Committed

€690,000

Abstract

Growing economic inequality and low social mobility raise important concerns about social cohesion, and require tracing individuals back to their social origins to address the inequality of opportunity that can affect their potential to succeed in life. ORIGINS will expand and deepen our knowledge of the causes of inequality and its persistence by studying the influence of social origins on individual socioeconomic trajectories. Unpacking the origins of existing inequalities, and the way in which they are passed on from parents to children, is key to understanding their consequences for social cohesion in the long run.

ORIGINS focuses on three key stages of the family life cycle along which parental choices may exert a long-run influence on the well-being of their offspring: (i) parental marital choices, (ii) offspring childbearing and childrearing, and (iii) offspring entry into the labour market. ORIGINS will provide the first evidence on the role of parental mating choices and of parental labour market networks for children’s life-cycle earnings; and it will offer a unifying framework for the relative impact of genetic endowments and of the environments provided by parents on several child outcomes, such as education, earnings and wealth, offering new insights on an old question with the aim of reconciling previously inconclusive evidence.

ORIGINS has two distinctive features: (i) it deploys high-quality register data from Denmark that enable linking parents to their children across their entire lives on several socioeconomic outcomes such as education, income and wealth; and (ii) it develops an innovative econometric framework to apply to these data, based on a model of intrafamily labour earnings dynamics. Although Denmark is a welfare state with generous support for childcare and education, it is well documented that family background still matters for individual success. Detecting the mechanisms of intergenerational mobility in the context of Denmark can therefore offer important insights for countries with higher inequality and lower social mobility, including Luxembourg, which has one of the lowest degrees of social mobility across generations in Europe.

Gaining a firm understanding of the extent to which different aspects of social origins matter for the observed intergenerational persistence and socio-economic inequalities can inform policymakers designing public policies to support disadvantaged children, and address low social mobility and existing inequalities.

Principal Investigator

Adrian Nieto Castro

Project title

The Implications Of Temperature For Social Interactions, Work Organization And Well-being (TEMPORG)

Host institution

Luxembourg Institute of Socio-Economic Research (LISER)

FNR Committed

€469,000

Abstract

Extreme temperatures have become more frequent over recent decades, with the trend expected to continue into the future. Prior evidence has shown that extreme temperatures alter the way in which individuals allocate their time between leisure and work, which can have important implications for the labor market and well-being. A much less explored phenomenon is whether extreme temperatures have an effect on how individuals arrange their time within the leisure and work dimensions. In the leisure domain, it is important to understand the effect of temperature on social interactions, as these are an important determinant of employment and well-being. In the work domain, it is important to understand the effect of temperature on worktime arrangements, as these can influence productivity and career advancement.

This project is divided into three work packages (WPs), and provides evidence on the causal impacts of extreme temperatures on (i) social interactions and (ii) worktime arrangements, as well as (iii) whether these effects have long-term implications for well-being.

WP1 examines the causal impact of extreme temperatures on social interactions, by studying whether extreme temperatures alter the amount of time individuals spend alone and with family and friends. The analysis relies on daily individual data from the American Time Use Survey (ATUS) linked with county-daily data on U.S. weather conditions obtained from the National Oceanic and Atmospheric Administration (NOAA), which acts as the source of identification. In WP1, we also offer an explanation of possible temperature effects on joint time use, using a theoretical model that provides testable implications, which we investigate using ATUS data.

WP2 examines the causal effect of extreme temperatures on individuals’ worktime organization across three main dimensions: (i) place of work, (ii) work schedules, and (iii) number and length of breaks taken during work. The analysis, which explores heterogeneity by industry and level of education, is based on a sample of individuals who work on the time-diary completion date taken from ATUS data, which we combine with variation in temperature at the U.S. county level on the diary completion date and the month prior to it.

WP3 provides novel evidence on the short-, medium-, and long-term causal effects of extreme temperatures on flexible types of employment, life satisfaction and mental health. The analysis uses two large panel datasets, from the U.S. Panel Study of Income Dynamics and the German Socio-Economic Panel, that follow individuals on an annual basis over a 30-year period. We combine these two panels with data from the NOAA and the German Weather Service on yearly temperature distributions across U.S. and German counties, respectively. Estimating the analysis separately for each country allows us to study whether the dynamic effects of extreme temperatures on type of employment and well-being differ between the U.S. and European contexts.
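The linkage underlying WP1 and WP2 is conceptually a county-by-date merge. The sketch below illustrates it with hypothetical file names and column labels (the actual extracts, variables and coding decisions are the project's own):

# Merge individual ATUS diary days with county-daily NOAA temperatures.
import pandas as pd

atus = pd.read_csv("atus_diaries.csv", parse_dates=["diary_date"])   # person-day rows
noaa = pd.read_csv("noaa_county_daily.csv", parse_dates=["date"])    # county-day rows

merged = atus.merge(
    noaa.rename(columns={"date": "diary_date"}),
    on=["county_fips", "diary_date"],
    how="left", validate="many_to_one",    # many diaries share one county-day
)

merged["extreme_heat"] = merged["tmax_f"] >= 90   # flag hot diary days (toy cutoff)
print(merged[["county_fips", "diary_date", "tmax_f", "extreme_heat"]].head())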

The scientific contribution of the project is to provide novel evidence on the causal effects of extreme temperatures on social interactions and worktime organization as well as on their dynamic implications for type of employment and well-being. This will contribute to the design of more efficient public policies addressing the potential detrimental effects of extreme temperatures on the examined outcomes, thus leading to better functioning labor markets and increased welfare. The results of the project may also assist policy-makers by providing insights for future discussions on climate change using a perspective that has not previously been explored. The contributions of the project are relevant for any advanced economy, given that global warming is an international phenomenon.

Principal Investigator

Morgan Raux

Project title

Digitalization, Change In Skills And Firms’ Hiring Difficulties (DIGISKILLS)

Host institution

University of Luxembourg

FNR Committed

€409,000

Abstract

The digitalization of economic activities is one of the major transformations of developed economies over the last decade. Digitalization changes the task content of occupations and modifies the portfolio of skills required on labor markets. This gives rise to hiring difficulties that now represent one of the top challenges cited by employers.

Digitalization affects labor market tightness in a specific way. The adoption of new digital technologies increases the number of skills required on labor markets without completely removing the older ones. Thereby, labor demand for the corresponding occupations grows faster than labor supply, explaining part of the hiring difficulties observed on labor markets. This project aims at documenting this mechanism and at investigating its labor market implications for workers and firms.

In this project, we will take advantage of new digital tools to study the labor market effects of digitalization. In particular, we will use large online job posting databases collected with web-scraping algorithms to measure labor market tightness. Our contribution will then consist in developing an instrumental variable approach to assess the causal effects of labor market tightness on workers and firms. We will build on this identification strategy to document the labor market impacts of hiring difficulties, which have mostly been overlooked in the academic literature. We will first use this approach to investigate the causal impact of recruitment challenges on wage inequalities in France. We will also exploit this setting to study its effect on firms’ productivity and innovation in Luxembourg. In addition, we will explore the mitigation strategies adopted by firms to cope with recruitment challenges; in particular, we will compare the results associated with firms’ training, outsourcing, and the recruitment of foreign-born workers. Comparing the French and the Luxembourgish contexts will add to our understanding, since the labor market structures of the two countries are very different. Finally, we will document the causes driving the positive relationship between labor market tightness and the demand for foreign skilled workers in the United States.
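The instrumental-variable logic can be previewed with simulated data. In the sketch below (entirely hypothetical; the project's instruments and datasets are of course different), an instrument Z shifts labor market tightness T, which in turn affects a wage outcome y, while an unobserved confounder biases naive OLS:

# Two-stage least squares (2SLS) on simulated data.
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
Z = rng.normal(size=n)                        # instrument (e.g. a demand shifter)
u = rng.normal(size=n)                        # unobserved confounder
T = 0.8 * Z + u + rng.normal(size=n)          # endogenous tightness
y = 1.5 * T + 2.0 * u + rng.normal(size=n)    # true causal effect of T is 1.5

Zmat = np.column_stack([np.ones(n), Z])
T_hat = Zmat @ np.linalg.lstsq(Zmat, T, rcond=None)[0]           # stage 1
beta = np.linalg.lstsq(np.column_stack([np.ones(n), T_hat]), y,
                       rcond=None)[0]                            # stage 2

ols = np.linalg.lstsq(np.column_stack([np.ones(n), T]), y, rcond=None)[0]
print(f"OLS slope (biased): {ols[1]:.2f}   2SLS slope: {beta[1]:.2f}   (true: 1.5)")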

Principal Investigator

Benoit Majerus

Project title

Making Shell Companies Visible. Digital History As A Tool To Unveil Global Networks And Local Infrastructures (LETTERBOX)

Host institution

University of Luxembourg

FNR Committed

€610,000

Abstract

Letterboxes, representing shell companies, are an emblematic symbol of a form of international capitalism that has come under increasing scrutiny in recent years, and they serve as a focal point for international criticism of the Luxembourgish financial system. LETTERBOX is an innovative project that combines recent questions in financial history with cutting-edge digital history. The banking crisis of 2007/2008, the ensuing increase in public debt, and the scandalisation of financial practices (Luxleaks, Panama Leaks, Paradise Papers, OpenLux) have put the question of tax optimisation at the forefront of societal discussions. In these narratives, Luxembourg, through its flexible legislation on companies, plays a pivotal role. Thanks to a unique digitised corpus – Annexe du Mémorial et Mémorial C (1929-2016) – LETTERBOX will situate Luxembourg in a global financial geography and unveil the local ecology of lawyers, notaries and accountancy firms in a historical perspective. This can, however, only be achieved with computational means that automatically annotate and extract persons, places and organisations as a basis for analysis and exploration, thereby unveiling these hidden infrastructures. LETTERBOX will build on the existing know-how of the C2DH in the extraction of entities from large-scale corpora and will refine existing, as well as develop new, analytical approaches for analysing the data through specific interfaces. The project will make extensive use of co-design principles in order to develop interfaces that are tailor-made for cross-referencing heterogeneous structured, semi-structured and unstructured data.
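The entity-extraction step described above is typically bootstrapped with an off-the-shelf named-entity recognizer before any corpus-specific refinement. A minimal sketch using the open-source spaCy library follows (the model name, the French-language choice, and the example sentence are assumptions; the project's own pipeline and models will differ):

# Extract persons, places and organisations from a constructed example.
# The model must be installed first:  python -m spacy download fr_core_news_sm
import spacy

nlp = spacy.load("fr_core_news_sm")
text = ("La société ACME Holding S.A., constituée à Luxembourg par "
        "Maître Jean Dupont, notaire de résidence à Esch-sur-Alzette.")

for ent in nlp(text).ents:
    print(f"{ent.text:<25} {ent.label_}")   # e.g. PER / LOC / ORG labels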

Principal Investigator

Andreas Fickers

CORE bilateral: FWO

Project title

Bureaucracy By Design? Eu Office Interiors As An Interface Between Architectural “Hardware” And Managerial “Software”, 1951-2002 (BUREU)

Host institution

University of Luxembourg (C2DH)

FNR Committed

€634,000

Abstract

One of the most noteworthy secondary effects of the ongoing COVID-19 pandemic has been the sudden emergence of large-scale teleworking. It was striking to see how sanitary considerations could – within days – radically transform the way in which large parts of the European population performed their work. As such, teleworking in pandemic times has raised fundamental questions about the essence of “the office” as both a social and a material phenomenon.

By taking a historical approach, BUREU will contribute to our understanding of the roots of current society’s complex relationship with “the office”. Concretely, we will explore the office buildings of the European Union in Luxembourg, Brussels, and Strasbourg. The EU and its legal predecessors have been unique in history: singular in the way an additional, previously non-existent political layer affecting the lives of all Europeans was given shape; singular in the way a transnational administrative framework was created; and singular in the way far-reaching plans for political integration and piecemeal organizational expansion were to be reconciled in office architecture. By investigating this latter dimension, with a specific focus on EU office interiors (1950s – early 2000s), BUREU will not only add to research on the histories of office architecture and cultures of office work, but also to research on the history of European integration.

We will develop an innovative analytical framework which combines historiographical approaches on (a) the office as a “technical-organizational complex” and (b) the role of material infrastructures as driving forces behind European integration processes. Using a set of metaphors from the field of informatics, we contend that EU office interiors were “interfaces” where the material dimension of architecture (“hardware”) and the immaterial managerial programme (“software”) met. We hypothesize that managerial programmes (a) functioned as normative ideological frameworks for the creation of “efficiency”, and (b) had an impact on interior design choices, as well as on the administrative work performed in the office buildings. For a selection of five EU institutions (Parliament, Commission, Council, Court, and European Investment Bank), we will analyse which design choices were made regarding the office interiors, and whether these were perceived as successful. This way, we will be able to answer the question of whether the EU’s political aim of operational efficiency also found a counterpart in the organization’s office interiors.

In order to understand how EU office buildings were (re)designed with the aim of enabling operational efficiency, and how the impact of office architecture on the EU’s administrative functioning was perceived, we will look at the epistemic communities involved in these processes. We will take into account the discourses and activities of (a) architectural clients, (b) various professional groups vying for influence and power in the sphere of office development and/or design (architects, managerial specialists, etc.), and (c) a crucial category of users (the so-called “Heads of unit”, who had the discretionary power to take decisions on spatially-related organizational matters). Our empirical component will be based on archival sources and oral history.

By combining the specific expertise of the PI (digital history and the history of technology), the co-PI (history of architectural expert networks and political European history), and the postdoctoral researcher (history of government office buildings), we ensure that the conceptual challenges arising from our methodological approach can be met.

Principal Investigator

Michel Erpelding

Project title

Forgotten Memories Of Supranational Adjudication (FoMeSA)

Host institution

University of Luxembourg

FNR Committed

€279,000

Abstract

Since the end of the Cold War, international courts and tribunals with jurisdiction over complaints by private persons against sovereign states have played an increasingly important role. This is especially true in Europe, which is often said to have pioneered this practice in the 1950s. It is true that the post-WWII period gave rise to the European Court of Justice (ECJ) in Luxembourg and the European Court of Human Rights (ECtHR) in Strasbourg. It is also true that the first generation of European integration lawyers heavily insisted on the novelty of post-WWII European supranational courts and tribunals, thus shaping an institutional narrative that is now firmly established.

Empirical evidence shows that this narrative needs to be largely qualified. Indeed, both within the colonial context and in interwar Europe, several treaty regimes had already established internationally composed judicial or quasi-judicial institutions endowed with jurisdiction over claims by private persons and groups regarding individual rights against sovereign states. In other words, a form of supranational judicial practice had already developed well before the 1950s. Despite their pioneering role and their direct links with the architects of post-WWII courts and tribunals, these early international courts have today largely disappeared from collective memories.

FoMeSA aims to assess the continuities and discontinuities between these institutions and present-day European and international courts. By reviving the forgotten memories of the international and European legal community, by offering new historical comparisons and inviting the public to explore ways of re-integrating them into institutional memories, it will also contribute to present-day debates on the role and legitimacy of international courts and tribunals.

Subcategory: Climate change: energy efficiency and smart energy management, resilient eco- and agrosystems – 5 projects

Principal Investigator

Yves Le Traon

Project title

Lightweight Collaborative Nanogrid Controllers For Global Greenhouse Gas Emission Reduction (LightGridSEED)

Host institution

University of Luxembourg

FNR Committed

€767,000

Abstract

Buildings account for a significant part of global energy demand and greenhouse gas emission (GHGE). Renewable energies and associated technologies are key to long-term climate change mitigation, as they provide new ways to produce, store and control how energy is used. However, it is not only a matter of using renewable technologies, but also of optimizing the charging/discharging strategy of local storage units (e.g., during off-peak times, it could be sensible to draw power from the electrical grid rather than from the battery to power the loads, as it may result in lower GHGE and costs in the medium run). A number of optimization models have been proposed in the literature, and, although they may differ in terms of required infrastructure and targeted fitness goals, they are often faced with two major drawbacks:

– Computational complexity: charging optimization models are known to be NP-hard, and require Machine Learning to forecast future parameters like anticipated energy demands. Due to the large computational needs, most of the approaches rely on Cloud computing (using powerful computational servers), but this model suffers from several weaknesses such as limited bandwidth resources and latency, along with privacy concerns (e.g., inhabitants’ energy consumption profiles are sensitive data);

– Sub-optimal system level performance: most of the state-of-the-art optimization models are designed based on individual goals/constraints, leading to optimal charging actions at the household (nanogrid) level, but not necessarily at the global level (e.g., considering a shared goal at the microgrid or grid level such as minimizing electricity providers’ procurement costs, or complying with CO2 regulations). This leads to the dilemma of sacrificing some of the individuals’ profits for the collective social welfare, which can be formalized as a distributed charging optimization problem.

To overcome these limitations, the LightGridSEED project, standing for “Lightweight collaborative nanogrid controllers for global Greenhouse gaS Emission rEDuction”, is committed to investigating and designing a new type of controller at the household (nanogrid) level empowered with four key abilities: (a1) Self-adaptive: it can self-adapt to any changes occurring in its environment; (a2) Eco-friendly: it can act in a sustainable way; (a3) Sociable: it can interact and collaborate with other controllers to achieve a global (shared) goal; (a4) Robust: it can overcome uncertainties and disturbances through coalition and collaborative efforts. To achieve these abilities, two research questions will be addressed:

RQ1: How can edge (fog) computing be efficiently integrated into future nanogrid controllers to make them highly self-adaptive (a1)?
RQ2: How can blockchain technology and distributed optimization theory be combined in an eco-friendly manner (a2) to empower future nanogrid controllers with high social (a3) and collaborative-robustness (a4) abilities?

From a scientific viewpoint, LightGridSEED extends the state of the art in two respects. First, it investigates novel combinations of model compression techniques and offloading/transfer learning strategies using Deep Learning (DL), where the key challenge is to devise an edge computing architecture that achieves the best performance of DL training and inference under the multiple constraints of networking, computing power, communication and energy consumption. Second, it investigates how to mitigate the impacts that the blockchains used in prosumer energy markets could have on overall system performance (from a CO2, computational-delay and financial viewpoint). The theories and algorithms developed in the project are experimentally tested and validated based on (i) secondary (scientific) data to allow for scientific benchmarking; and (ii) real-life data coming from renewable houses located in Biekerech/Saeul, in collaboration with Energipark s.a.
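The kind of charging/discharging optimization discussed above can be illustrated with a deliberately simplified linear program (all prices, loads and battery parameters below are invented, and a GHGE-intensity signal could replace the price vector; this is not the project's controller):

# Schedule a household battery over 24 hours to minimize grid-purchase cost.
import numpy as np
from scipy.optimize import linprog

T = 24
price = 0.2 + 0.1 * np.sin(np.arange(T) / T * 2 * np.pi)   # EUR/kWh (toy)
load = 1.0 + 0.5 * np.cos(np.arange(T) / T * 2 * np.pi)    # kWh per hour (toy)
cap, c_max, soc0 = 5.0, 2.0, 2.5                           # battery parameters

# Decision variables: c_t = charging power (negative = discharging).
# Grid purchase each hour is load_t + c_t, so minimizing price @ c suffices.
L = np.tril(np.ones((T, T)))             # cumulative sum -> state of charge
A_ub = np.vstack([L, -L, -np.eye(T)])    # SoC <= cap, SoC >= 0, grid >= 0
b_ub = np.concatenate([np.full(T, cap - soc0), np.full(T, soc0), load])
res = linprog(price, A_ub=A_ub, b_ub=b_ub,
              bounds=[(-c_max, c_max)] * T, method="highs")

print("optimal daily cost:", round(price @ (load + res.x), 2), "EUR")

The project's setting is considerably harder: forecasts are uncertain, the objective is shared across many nanogrids, and the computation must run on constrained edge hardware rather than a cloud server.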

Principal Investigator

Markus Hesse

Project title

European Financial Centres In Transition (FINCITY)

Host institution

University of Luxembourg

FNR Committed

€848,000

Abstract

FINCITY is a multidisciplinary research project that focusses on how major global events coincide with broader processes of economic restructuring and financialisation in Luxembourg, Frankfurt, and Dublin, three of Europe’s most significant financial centres. More specifically, it aims to understand how these cities and other major European financial centres have been restructured in response to Brexit, which has expelled a large number of financial services firms from the UK, and COVID-19, which has redistributed population away from major urban centres toward smaller agglomerations (McCarthy and Smith, 2020) and fundamentally altered the ‘labourscape’ in favour of telework (Belzunegui-Eraso and Erro-Garcés, 2020).

The project is inspired by the dual nature of current conditions in continental European financial centres. On one hand, Brexit acts as a centripetal force, attracting large numbers of firms and employees to Luxembourg, Frankfurt, and Dublin (among other cities). To date, 7,600 financial sector jobs and €1.5 trillion in assets have relocated from London (Jones, 2021). According to The Financial Times, Luxembourg “has emerged as one of the biggest winners from the shift out of the UK: 72 companies, nearly all of them in financial services announced plans to relocate their EU operations from London” (Stafford, 2020). Similar figures support the migration of firms and their employees to the Dutch and German financial hubs. On the other hand, COVID-19 acts as a centrifugal force, with strong evidence suggesting that the cumulative impacts of telecommuting, firm restructuring, long-distance commuting, and firm decentralisation are likely to cause de-agglomeration. Within Europe, a recent EU report found that while only 15% of employees had teleworked before the pandemic, 25% of jobs were ‘teleworkable’ (Fana et al., 2020). Of the EU member states, Luxembourg has the highest proportion of jobs fit for telework (Fana et al., 2020). Similar patterns have emerged in the United States, where cities such as Austin, Texas, and Boise, Idaho, have absorbed a large number of Silicon Valley firms and workers, many of whom may never return to the office. In Europe, further decentralisation is possible if banks’ back-office staff are permanently dislocated from central offices.

To explore the socio-spatial impacts of the current state of affairs, the project first builds an updated profile of each financial centre by investigating how the corporate geography of banks and financial services firms has changed over the past five years. Firm-level data will be compiled from various proprietary databases with a view to understanding how Brexit and COVID-19 have reoriented the character and composition of advanced producer services within each city. Second, the reorientation of each city’s services agglomeration will be related to its spatial impact through a detailed investigation of local property markets as key indicators. The nature of commercial property has changed considerably with the pivot to teleworking, larger floorplates (allowing for distancing) and the requirements of global firms whose footprint extends far beyond the walls of their offices. Residential property has also been brought into sharper relief: with a greater preference for working from home, proximity to urban centres is potentially less important than space.

Based on a combination of empirical data, extensive stakeholder interviews and focus group meetings, we interrogate which changes may play out in each market’s property sector, and how these relate to both global and industry trends. Finally, given the importance of regulation, the project concludes by investigating how policymakers have responded to these multiple crises and their significance for urban policies.

Principal Investigator

Ottavia Cima*

Project title

Sustainable And Inclusive Urban Food Systems In Luxembourg And Switzerland (SUSINU)

Host institution

University of Luxembourg

FNR Committed

€429,000

Abstract

Food systems play a complex role in the current environmental crisis: they are simultaneously victims of climate change, one of its causes, and they can represent one of the possible solutions to tackle it. If cities want to develop sustainably, then, they have to start thinking seriously about what they eat. A growing number of citizen-led initiatives in cities, and a much more limited number of initiatives by city administrations, are attempting to transform urban food systems to make them more sustainable. However, their impact and scope remain limited. Furthermore, consumers’ demands for more sustainable food risk adding a further, also moral, burden on food producers who already struggle between market imperatives and changing environmental conditions. If those initiatives fail to integrate the different needs and perspectives of food producers, they in fact risk digging even deeper divides within food systems, leaving many food producers behind and stifling their willingness to embrace sustainable solutions. The goal of the SUSINU project is to conceive and design alternative food systems as a way to tackle the current environmental crisis. Specifically, SUSINU aims to contribute to the development of urban food systems that are 1) actively promoted by urban policymakers; 2) more sustainable, in that they can adapt to and at the same time mitigate climate change; and 3) more inclusive in that they integrate the perspectives of different typologies of food producers. In the context of two cities in Luxembourg and in Switzerland, SUSINU assesses current urban food policies, analyses the role and experiences of food producers in them, and together with policymakers and key stakeholders, collaboratively develops innovative alternatives for urban food policies and systems. The project adopts a relational perspective on cities and food systems, applying a mix of qualitative methods within a transformative research approach. The project thus enriches current academic and public debates on sustainable food systems, especially in three regards: 1) it innovatively reflects on how to translate a relational understanding of cities into effective, sustainable and inclusive urban food policies; 2) it actively mobilises urban policymakers in these reflections, thereby challenging their common neglect of food issues; and 3) it rebalances the mainly consumer-oriented focus of alternative food initiatives by focusing on food producers and by actively integrating their perspectives into the design of urban food policies.

Principal Investigator

Paul Kilgarriff*

Project title

Impact Of Teleworking On Urban Structures (TELE-SIM)

Host institution

Luxembourg Institute of Socio-economic Research (LISER)

FNR Committed

€554,000

Abstract

The pandemic has revealed preferences for greater levels of teleworking (working from home). With many workers and employers in favour of more teleworking post-pandemic, this will have consequences for how cities organise themselves. The impact of teleworking on urban structures in the post-pandemic city therefore requires further empirical testing for Europe. Teleworking decreases commuting costs, making longer commutes more affordable. Lower commuting costs increase the household budget for housing, enabling people to move to larger, less expensive houses in the suburbs. These factors will influence the population density profile. Teleworking may cause greater levels of dispersion and sprawl, with cities becoming less compact, challenging the sustainability of cities.

This project contributes in several innovative ways. Firstly, we contribute to the teleworking literature by adopting a nomothetic approach common in urban analytics; we simultaneously examine the internal urban structure under several scenarios for a large sample of European cities (400+). Secondly, we contribute to the scaling literature by providing the first examination of house price profiles and their gradients with a focus on the internal structure of cities. Thirdly, we challenge the current definitions of cities, which use a functional urban area approach and commuting thresholds. With less commuting, these definitions no longer seem appropriate; teleworkers have greater residential choice, expanding the fringe distance and city extent.

Several important research questions are addressed. How are residential choice and commuting distance affected by teleworking? Will house prices decrease in the centre and increase in the periphery? Will teleworking have a differential impact on density profiles across the city size distribution (small cities versus big cities) and across regions? To address these questions, this project will examine the internal structure of cities using a monocentric approach from city science and urban analytics, GIS methods, web-scraping techniques, hedonic models and data simulations. Commuting costs and house prices are used as tools to simulate changes in the density gradient and profile. Different teleworking scenarios, reflecting the teleworking potential of each city (share of jobs in retail, professional services, etc.), are analysed for their effects on the density profiles of cities. Using a large sample has the additional advantage of providing statistical significance and generalisable results. From our results, we can say with a greater level of certainty what the impact of teleworking on density is for European cities. One of the major outputs of the project will be a dashboard that disseminates the results, research and data to both expert and non-expert audiences, supporting open-source reproducible research.
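The monocentric intuition behind the project can be previewed with the classic negative-exponential density profile D(x) = D0 · exp(-g · x): cheaper commuting flattens the gradient g and pushes population outward. The parameter values in the sketch below are purely illustrative:

# Compare a steep (pre-pandemic) and a flat (teleworking) density gradient.
import numpy as np

x = np.linspace(0, 30, 31)            # distance from the city centre, km

def density(d0, g):
    return d0 * np.exp(-g * x)        # D(x) = D0 * exp(-g * x)

def share_beyond_15km(d):
    return d[x > 15].sum() / d.sum()

before = density(10_000, 0.15)        # assumed pre-pandemic profile
after = density(7_000, 0.08)          # assumed teleworking profile (flatter)

print(f"share of residents beyond 15 km: "
      f"{share_beyond_15km(before):.0%} -> {share_beyond_15km(after):.0%}")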

Principal Investigator

Florin Capitanescu

Project title

Transmission System Security Enhancement Through Distribution System Flexibility (TESTIFY)

Host institution

Luxembourg Institute of Science & Technology (LIST)

FNR Committed

€777,000

Abstract

The transition towards affordable, secure and clean energy is a key, pervasive societal need and challenge, which requires bridging several knowledge gaps. To bridge some of these gaps, this project proposes a new approach to managing the overall electrical grid securely in the short term, making optimal use of the energy flexibility of various distributed energy resources (DER), such as renewable energy sources (e.g. wind, solar), which are present in active distribution systems (ADSs) managed by distribution system operators (DSOs). These DER are typically not visible to the transmission system operator (TSO), which is a major loss of opportunity, because they can be effective additional control means to enhance the secure operation of the transmission grid and reduce its operating cost. Currently, the necessary knowledge and methodologies to make use of this flexibility for maintaining security are under development.

The project addresses this timely research gap by answering three fundamental questions in an integrated way:
Q1: What enhancement of the probabilistic security metric is needed to factor in ADS flexibility?
Q2: How should a DSO quantify the flexibility of an ADS (meeting the ADS constraints) at the interface with the TSO, in terms of an active/reactive power capability chart over time?
Q3: How can the TSO's security management tool, i.e. the security-constrained optimal power flow (SCOPF), be extended to incorporate, in a scalable way, the enhanced security metric and ADS flexibility at TSO-DSO interfaces in a look-ahead (multi-period) short-term timeframe?

A specific objective corresponds to each research question, as follows:
O1: Develop an enhanced risk-based security metric accounting for ADS flexibility.
O2: Design a methodology to compute the active/reactive power chart of an ADS at the interface with the TSO.
O3: Develop a tractable methodology for short-term, multi-period SCOPF-based security management embedding the enhanced security metric and the active/reactive power charts at TSO-DSO interfaces (a toy illustration of security-constrained dispatch follows below).
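To convey the flavour of security-constrained optimisation in a few lines, the sketch below solves a deliberately tiny dispatch problem: a cheap remote generator is limited not by the normal line rating but by a post-contingency (N-1 style) rating, so part of the load shifts to an expensive local generator. All numbers are invented, and a real SCOPF handles full network models, contingency lists and multi-period coupling:

# Security-constrained dispatch on a 2-bus toy system.
from scipy.optimize import linprog

load = 100.0                 # MW at the load bus
cost = [20.0, 50.0]          # EUR/MWh: remote (g1) vs. local (g2) generation
line_normal = 80.0           # MW rating of the double-circuit corridor
line_n1 = 45.0               # MW rating if one of the two circuits trips

# Variables [g1, g2]; the corridor flow equals g1 in this toy system, and it
# must respect the post-contingency rating, not just the normal one.
res = linprog(cost,
              A_ub=[[1.0, 0.0]], b_ub=[min(line_normal, line_n1)],
              A_eq=[[1.0, 1.0]], b_eq=[load],
              bounds=[(0, 120), (0, 120)], method="highs")

g1, g2 = res.x
print(f"dispatch: remote {g1:.0f} MW, local {g2:.0f} MW, "
      f"cost {res.fun:.0f} EUR/h (limited by the N-1 rating)")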

The new proposed security management approach raises scientific and computational challenges, such as the intrinsic complexity of the problem, in terms of the comprehensiveness and diversity of the relevant features to incorporate (and thereby of the suitable problem formulation), and the large problem sizes, unmanageable by state-of-the-art solvers.
The project is unconventional in two fundamental aspects: (i) it proposes two new (nested) frameworks: the holistic security management approach (not addressed so far), including an enhanced security risk metric, and ADS flexibility provision at TSO-DSO interfaces (a timely topic) to support security (partially explored); and (ii) it explores suitable methodologies (trading off accuracy and speed) with various original features to solve these problems.

The project tackles challenging, major research and industry needs, which are often neglected in academic research: the integration of security in operation and the complex nature of doing so in a holistic manner, including TSO and DSOs. The most important impact on academia is to generate new knowledge in terms of security criteria and the associated look-ahead optimization routines, which could change the manner in which power systems are operated.

The impact on industry lies in (i) bringing further evidence that computationally tractable and sufficiently accurate models can address security management in a holistic way, (ii) proving quantitatively that a departure from the deterministic N-1 criterion is necessary in the context of large shares of RES, and (iii) assessing the project's benefits on real-world data provided by CREOS, the TSO/DSO in Luxembourg, in addition to abundant open-source data sets from the literature. The impact on industry will be analyzed with CREOS, ENTSO-E and the Croatian DSO.

The project will allow accommodating more renewable energy in the power system in a cost-effective manner while maintaining system security.

Subcategory: Economic green sustainable finance / circular and shared economy – 3 projects

Principal Investigator

Thomas Gibon

CORE Bilateral: DFG

Project title

Integrated Modelling Of Material Efficiency And Environmental Impacts Of Building Materials Cycles (IMMEC)

Host institution

Luxembourg Institute of Science and Technology (LIST)

FNR Committed

€477,000

Abstract

In Europe, as well as around the globe, increasing resource efficiency to enable sustainable economies is a pressing issue. The transition towards a more circular economy is seen as a key element in achieving a high degree of resource efficiency. Material cycles, where waste is minimized and turned into resources that can be used in production processes, increase overall resource productivity. Environmental impacts often decrease with increasing circularity, but there are also tipping points, where more recycling does not lead to better environmental performance. In addition, conventional models often consider infinite recycling loops with no loss of quality, which has recently been shown not to be representative of real-world processes (where collection, sorting, quality, purity etc. are almost always overestimated). Additionally, implementing circularity measures in long-lived products potentially leads to the treatment and/or reintroduction of legacy contaminants.

Based on this context and the research gaps identified in the state of the art, the consortium, composed of researchers from LIST and the University of Kassel, aims at answering the following research question: what are the potential sustainability issues arising, over time, with the increasing reuse, recycling, and reprocessing of materials, in particular in the case of plastics and the hazardous substances they may contain, in buildings and infrastructure (B&I)? The objectives are threefold: 1. To develop a consistent assessment framework for considering material flow dynamics and life cycle impacts of long-living anthropogenic resource systems; 2. To (simultaneously) model substance, goods, and product cycles in order to quantitatively and qualitatively account for recycling loops and issues of downcycling, with uncertainty considerations, in relation to past and current substance regulations, allowing for a policy analysis on legacy contaminants; 3. To apply the developed model to material use in the B&I sector in two case studies on plastics, first from a material perspective (PVC) and then from a product perspective (tubes and pipes), to illustrate its use as a circular economy decision-support tool. The proposed research will enable assessments of circular economy strategies integrating material efficiency and environmental impacts from a systemic perspective. The model will allow for a comprehensive evaluation of re-use, recycling and other waste utilization options, which is essential to raise recycling levels (as required by EU policy directives) while improving environmental performance, whereby efforts for (and impacts of) enhanced recycling must not offset the gains of secondary production. The team at the University of Kassel will lead tasks related to system dynamics modelling, as well as data collection. LIST will coordinate the project, enrich the model with environmental impact assessment modules, and develop the software. The outcomes of the project will be of relevance to the scientific community (ensured by the scientific advisors), the stakeholders of the B&I sector (guaranteed by their involvement in the project from the outset and several participatory workshops), as well as policymakers and the public.
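The dynamic material-flow bookkeeping at the heart of objective 2 can be sketched in a few lines: inflows of a material enter a long-lived stock and leave it again according to a lifetime distribution, so today's waste (and any legacy contaminant it carries) reflects decades-old inputs. All inflows and lifetimes below are invented for illustration:

# Toy dynamic material flow analysis (MFA) for a building material.
import numpy as np

years = np.arange(1960, 2051)
inflow = np.interp(years, [1960, 2000, 2050], [10, 100, 120])  # kt/yr (toy)

ages = np.arange(0, 91)                       # product lifetime distribution:
pdf = np.exp(-0.5 * ((ages - 40) / 10) ** 2)  # roughly normal, mean 40 yr
pdf /= pdf.sum()

outflow = np.convolve(inflow, pdf)[: len(years)]   # waste = delayed past inflows
stock = np.cumsum(inflow - outflow)                # material still in use

i = np.searchsorted(years, 2020)
print(f"2020: inflow {inflow[i]:.0f} kt, outflow {outflow[i]:.0f} kt, "
      f"stock {stock[i]:.0f} kt")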

Principal Investigator

Roman Kräussl

Project title

Sustainable Finance And The Efficient Allocation Of Capital (GREEN)

Host institution

University of Luxembourg

FNR Committed

€495,000

Abstract

Environmental, Social and Governance (ESG) metrics overwhelmingly rely on company-disclosed policies that are then used to assess companies’ exposure to financially relevant ESG-related risks and opportunities. The absence of widely accepted standards that define a company’s ESG profile further exacerbates the bias resulting from self-reported data. Our research project’s overarching objective is to promote the understanding of the role of public and private sources of financing in the efficient allocation of capital aimed at the transition of the world economy to a sustainable path. The goal will be reached through the following fivefold contribution:

(1) Identify the fundamental sustainability impact of a company through the products and services it offers along the dimensions of the 17 UN Sustainable Development Goals (SDGs);

(2) Evaluate the extent of the divergence between the self-reported sustainability practices of companies and estimates of their fundamental sustainability, and investigate the implications of that disagreement for the misallocation of capital to sustainable projects and companies (a toy illustration follows after this list);

(3) Assess the sustainability footprint of ESG-labelled Exchange Traded Funds (ETFs) through the companies they hold, investigate the dimensions of ESG score metrics that are material for investor flows and quantify the potential for capital misallocation due to mismatch with funds’ stated objectives or ESG labels;

(4) Provide a framework for the assessment of the sustainability alignment of infrastructure assets and the role of ESG considerations in the flow of institutional investor capital to the asset class;

(5) Provide a modelling framework to address the implications of company sustainability scores and their divergence on corporate bond values and the pricing of credit risk.
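Contribution (2) is, at its core, a measurement exercise: compare two scores per company and quantify their disagreement. The sketch below illustrates one simple way to do so with invented scores (real inputs would come from ratings providers and the SDG-based assessment of contribution (1)):

# Quantify the divergence between self-reported and fundamental ESG scores.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
fundamental = rng.normal(size=200)                        # SDG-based estimate (toy)
self_reported = 0.5 * fundamental + rng.normal(size=200)  # noisy, partly detached

rho, _ = spearmanr(self_reported, fundamental)            # rank agreement

def ranks(v):
    return np.argsort(np.argsort(v))                      # 0..n-1 ranks

gap = np.mean(np.abs(ranks(self_reported) - ranks(fundamental))) / len(fundamental)
print(f"rank correlation: {rho:.2f}; mean normalized rank gap: {gap:.2f}")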

Principal Investigator

Christos Soukoulis

Project title

Exploration Of The Potential Of Deep Eutectic Solvent – Physical Processing Driven Strategies In The Production Of Functional Nanocellulose Pickering Particles (DEEPCELL)

Host institution

Luxembourg Institute of Science and Technology (LIST)

FNR Committed

€557,000

Abstract

Nanocelluloses are nanosized materials of crystalline or fibrillar form that exhibit interesting intrinsic properties such as high surface area, crystallinity, wettability and mechanical strength. Most commonly, nanocellulose is produced through the hydrolysis of cellulose in acidic or enzymatic media. Nanocelluloses are deployed as active or inert fillers in functional nanomaterials and nanoscaffolds for a broad range of applications. Owing to their nanoscale size and good wettability, nanocelluloses can also be applied as solid interface-stabilising agents known as Pickering particles.

Green-chemistry-inspired solvents such as deep eutectic solvents (DES) have gained a lot of ground against organic solvents, as they are inexpensive, have low vapour pressure, and are non-flammable, chemically and electrochemically stable, recyclable, biodegradable, water-neutral and toxicologically well characterised. Moreover, DES can be synthesised naturally from sustainable bioresources (e.g. primary metabolites of plants and microorganisms), in which case they are referred to as natural deep eutectic solvents (NADES). The transformation of recalcitrant biomass resources, e.g. wood, paper, agri-food side streams or food waste, into added-value products via DES-based biorefinery strategies is an emerging field of research and innovation.

The DEEPCELL project aims at exploring the potential of combined NADES – physical processing (microwave, ultrasound and micro-fluidisation) strategies for the green transformation of microcrystalline or microfibrillar forms of cellulose-rich substrates (e.g. obtained from industrial waste side streams such as wood, paper or agri-food waste) into Pickering particles for nutraceutical, cosmetic or drug delivery applications. An optimisation plan based on a Design of Experiments – Artificial Neural Networks (DoE-ANNs) approach will be adopted to define the optimum conditions for producing cellulose nanocrystals and nanofibrils with desirable intrinsic properties, i.e. morphological aspects, wettability and cytotoxicity. A Life Cycle Analysis (LCA)-driven jack-knifing strategy will be implemented to identify nanocellulose candidates with a minimal ecological footprint. The LCA-screened cellulose nanocrystals and nanofibrils will be fully characterised in terms of their chemical structure and their molecular, interfacial, and self-assembly NADES-endowed properties. For the first time, nanocelluloses obtained by NADES/physical processing will be tested for their Pickering-particle interface-stabilising efficiency in o/w emulsions for nutraceutical/drug delivery purposes. The interface-stabilising role of the nanocellulose will be tested under accelerated storage conditions and simulated in-vitro digestion conditions. The biocompatibility, e.g. cytotoxicity, proteomic response, cell barrier integrity, mucoadhesion, and the ability of nanocellulose to modulate the uptake of a model drug compound (squalene), will be tested using a triculture (Caco-2, HT-29 and Raji B cells) model of the human intestinal epithelium. Also, the ability of the nanocellulose particles to affect the species diversity of a synthetic model of the gut microbial ecosystem will be investigated. Conventionally produced cellulose nanocrystals and nanofibrils, e.g. from concentrated acid and cellulase-assisted hydrolysis, will be employed as reference Pickering-particle material.

DEEPCELL aspires to provide proof-of-principle evidence of the feasibility of NADES combined with soft processing unit operations for the production of sustainable, ecologically resilient, biocompatible and biologically active Pickering particles.
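As a generic sketch of the DoE-ANN idea mentioned above: fit a small neural-network surrogate on a designed set of processing conditions, then query it for promising settings. The factors, ranges and response below are hypothetical placeholders, not project parameters.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# HYPOTHETICAL 3-factor design: NADES mass ratio, ultrasound time (min),
# number of microfluidiser passes. A real DoE would use a structured
# design (e.g. central composite), not random points.
X = rng.uniform([0.1, 5, 1], [0.9, 60, 10], size=(30, 3))

# Placeholder response, e.g. a measured crystallinity index per run.
y = 0.5 * X[:, 0] + 0.01 * X[:, 1] - 0.02 * X[:, 2] + rng.normal(0, 0.02, 30)

# Small ANN surrogate mapping processing conditions to the response.
ann = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                   random_state=0).fit(X, y)

# Query the surrogate on a dense random grid for promising conditions.
grid = rng.uniform([0.1, 5, 1], [0.9, 60, 10], size=(5000, 3))
best = grid[np.argmax(ann.predict(grid))]
print("suggested (ratio, minutes, passes):", best.round(2))
```

In the project itself the optimisation is multi-objective (morphology, wettability, cytotoxicity), so the single response above stands in for what would be several measured properties.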

Subcategory: Precision medicine, including environmental, lifestyle and socio-economic factors – 8 projects

Principal Investigator

Silvia Bolognin

Project title

Autophagy Modulation To Decrease Astrocyte Senescence In LRRK2-G2019S Models (AstrAging)

Host institution

University of Luxembourg

FNR Committed

€602,000

Abstract

Parkinson’s disease (PD) is the second most prevalent neurodegenerative disease in the aging population. Mutations in the gene encoding the leucine-rich repeat kinase 2 (LRRK2) protein, especially G2019S, have been identified as a cause of a familial form of the pathology and a risk factor for the development of sporadic PD. The cause of the disease is still obscure, but aging has been unanimously recognized as the major risk factor associated with its development. Aging is a physiological condition, but it is likely that changes in the cellular hallmarks of healthy aging, together with other contributing factors, predispose to the development of PD.

In this project, we aim to study an aspect of cellular aging, called senescence, in one of the most abundant cell types of the brain, the astrocytes. We hypothesize that defects in autophagy and mitophagy in astrocytes determine the acquisition of a senescent phenotype, especially in a LRRK2-G2019S background. We will study how autophagy and mitophagy dynamics change in young and aged astrocytes derived from induced pluripotent stem cells. We will then pharmacologically modulate mitophagy and autophagy to test whether this reduces or prevents the occurrence of senescence in LRRK2-G2019S astrocytes. To evaluate how mitophagy and autophagy modulation can affect astrocytes in a more complex 3D system, we will use brain organoids. We postulate that autophagy-induced astrocyte senescence can be toxic to surrounding neurons but reversible upon treatment. We will characterize the LRRK2-G2019S organoids for the potential rescue of astrocyte senescence, and of the resulting neuronal degeneration, following treatment with autophagy/mitophagy activators.

The obtained data will allow us to clearly define the role of astrocytic senescence in PD and to establish whether its modulation can represent a therapeutic option for PD.

Principal Investigator

Giuseppe Arena*

Project title

PINK1-related Molecular Mechanisms To Dissect The Connection Between Type 2 Diabetes And Insulin Resistance In Parkinson’s Disease (PINK1-DiaPDs)

Host institution

University of Luxembourg

FNR Committed

€673,000

Abstract

Growing evidence indicates that patients with Type 2 Diabetes (T2D) have an increased risk of developing Parkinson’s disease (PD). These two age-related chronic diseases share similar alterations in essential biological processes and molecular networks, suggesting common mechanisms underlying their pathogenesis. In this light, it is not surprising that common anti-diabetic drugs such as Metformin and Exenatide have been highlighted as potential PD therapeutics.

Mutations in the PARK6 gene encoding the mitochondrial kinase PINK1 are the second most frequent cause of autosomal recessive early-onset PD. Functional studies on disease-causing PINK1 mutations have contributed significantly to understanding the cellular dysfunctions leading to neurodegeneration, unravelling common molecular mechanisms also implicated in the more frequent sporadic PD (i.e. impaired mitochondrial function and quality control). Based on the compelling evidence implicating PINK1 also in T2D, we decided to use complementary PINK1-deficient cellular models (PINK1-mutant iPSC-derived dopaminergic neurons and PINK1-silenced pancreatic β-cells) as prototypes to decipher the cellular alterations leading to neurodegeneration in PD and β-cell failure in T2D, seeking to clarify the functional interdependencies between the two diseases.

Our preliminary data suggest that impaired expression of the transcription factor NR4A3 could represent a missing link between PINK1 deficiency and insulin dysmetabolism, potentially coupling insufficient insulin secretion from the β-cells with neuronal insulin resistance and neurodegeneration. Nevertheless, the upstream signals regulating NR4A3 activity, as well as the panel of genes controlled by this transcription factor in both dopaminergic neurons and pancreatic β-cells, remain largely uncharacterized to date. Thus, the specific objectives of the PINK1-DiaPDs project are: (i) to characterize the molecular and metabolic alterations linking PINK1 deficiency to decreased NR4A3 expression in both iPSC-derived neurons and pancreatic β-cells; (ii) to identify and validate the molecular effectors (i.e. NR4A3 target genes) linking an impaired PINK1-NR4A3 axis to neuronal dysfunction and pancreatic β-cell failure; (iii) to rescue the phenotypic alterations observed in PINK1-mutant neurons and in PINK1-depleted pancreatic β-cells by re-activating (genetically or pharmacologically) the expression of NR4A3 or of newly discovered downstream targets.

Findings obtained in PINK1-DiaPDs will provide new insight into the mechanistic and pathophysiological interplay between PD and T2D, allowing the identification of novel potential therapeutic targets and thus contributing to patient-based biomedical research in both research fields.

Principal Investigator

Clément Thomas

Project funded together with Fondation Cancer

Project title

Novel Synaptic Filopodium-like Protrusions (SFPs) Protect Tumor Cells Against Cytotoxic Lymphocyte-mediated Killing (SYNAPODIA)

Host institution

Luxembourg Institute of Health (LIH)

FNR Committed

€426,000

Abstract

Cytotoxic lymphocytes (CLs) are key anti-tumor immune effector cells. They physically interact with prospective target cells through a highly specialized cell-to-cell interface termed the immunological synapse. This interface is indispensable for the recognition of cancer cells and the activation of CL cytolytic functions. Recently, we established that cancer cell-intrinsic resistance to CL-mediated killing correlates with a fast and massive accumulation of actin filaments at the immunological synapse, or “actin response”. Remarkably, inhibition of the actin response is sufficient to restore cancer cell susceptibility to CL-mediated killing in vitro and a potent anti-tumor immune response in vivo.

Conversely, enhancing the capacity of cancer cells to mount an actin response translates into tumor immune evasion, both in vitro and in vivo. Super-resolution microscopy and ultrastructural investigations revealed that the actin response consists of many actin-rich filopodial protrusions that project into the synaptic cleft and are heavily decorated with the immune checkpoint molecule PD-L1. In the SYNAPODIA project, we aim at characterizing the molecular identity and functions of these newly discovered synaptic protrusions. Based on our preliminary data, we propose that they critically contribute to immune checkpoint activation by driving PD-L1 polarization toward conjugated CLs.

In addition, we aim at understanding how this synaptic evasion mechanism alters the tumor immune landscape in vivo. Finally, we will evaluate the therapeutic potential of targeting the actin response and synaptic filopodia in combination with immune checkpoint blockade therapy.

Principal Investigator

Leslie Ogorzaly

Project title

Environmental-based Epidemiology For Preparedness And Early Detection Of Viral Epidemics (VIRALERT)

Host institution

Luxembourg Institute of Science and Technology (LIST)

FNR Committed

€846,000

Abstract

The risk of viral epidemics and pandemics has undeniably increased in recent decades due to global changes affecting inter-connectivity, climate and the human-animal interface. Notably, several epidemics or pandemics have marked the last 20 years, such as those caused by SARS-CoV-1, MERS-CoV, several influenza viruses, Ebola, Zika and lastly SARS-CoV-2. All these viruses originated in animals, before favorable environmental conditions promoted a species jump and further human-to-human transmission. While many viruses may circulate silently in the population until breaking out, traditional infectious disease surveillance systems detect outbreaks on the basis of clinical symptoms and thus fail to identify the critical places and times of viral disease emergence in a timely manner.

Innovative surveillance and early-warning tools are thus urgently needed to more efficiently control and prevent the spread of viral threats. In this context, the main objective of the VIRALERT project is to set up the initial phase of an epidemic preparedness program at the national scale, through the implementation of enhanced environmental surveillance allowing early detection of viral epidemics. The originality of the proposed system lies in the integration of two complementary environmental compartments characteristic of human and animal populations, by combining the monitoring of wastewater and surface water.

While wastewater-based epidemiology has long proven its feasibility and usefulness for enteric virus surveillance, mounting evidence suggests a much wider applicability for providing essential public health data on non-enteric viruses: wastewater monitoring not only correlated well with SARS-CoV-2 transmission in the population but, interestingly, also detected increases ahead of population screening data. The current pandemic has also altered health-seeking behavior and promoted telemedicine, likely durably affecting traditional sentinel surveillance. Non-pharmaceutical interventions, such as social distancing, limited interactions and enhanced hygiene, implemented to slow down the COVID-19 pandemic, have disturbed the spread of endemic viruses, resulting in lower herd immunity, larger pools of susceptible individuals and more severe epidemics in the near future. As a complement, freshwater-based epidemiology will be explored as a pioneering tool to reflect the viral diversity of major animal reservoirs and to support epidemic prevention.

We therefore propose to take environmental surveillance to the next level by (i) broadening the spectrum of viral species monitored beyond human enteric viruses, (ii) monitoring the post-COVID-19 effect on endemic viral populations as an early-warning system for future epidemics and (iii) extending surveillance to more reservoirs in an integrated manner to cover human and animal populations. To that purpose, viral populations in wastewater and the aquatic environment will be characterized using three complementary molecular techniques, covering the spectrum from a highly targeted approach yielding little information on virus diversity (RT-qPCR), through a targeted approach providing sequence data, to unbiased metagenomics for viral communities.

Focus will be placed on virus families with high epidemic or pandemic potential, i.e. Coronaviridae, Orthomyxoviridae, Flaviviridae, Caliciviridae, Picornaviridae and Hepeviridae. The results emerging from this state-of-the-art project will decrease disease burden and improve public health and the well-being of the Luxembourg population by paving the way to a strategic, cost-effective national prevention resource for viral disease surveillance and possibly beyond.

Principal Investigator

Dirk Brenner

Project title

Characterization Of Key Metabolic Circuits In Th17 Cells And Their Influence On Th17 Cell Mediated Pathogenic And Protective Functions (Th17-ImmunoMet)

Host institution

Luxembourg Institute of Health (LIH)

FNR Committed

€762,000

Abstract

A balanced immune system is important for organismal survival because it allows reactivity against pathogens while preventing self-destructive immune reactions driving autoimmunity or chronic inflammation. A devastating disease whose treatment would benefit greatly from new therapies able to dial back self-reactive immune responses is the autoimmune disease multiple sclerosis (MS), which affects the central nervous system (CNS).

MS is the most common cause of irreversible disability and subsequent inability to work, especially in young adults. A crucial step in the development of autoimmune diseases is the generation of inflammatory T cells, which often arise when inflammatory and tolerogenic T cell responses are not balanced. A key subset of inflammatory T cells in MS is the T helper (Th) 17 cell population, which has been detected in the blood and CNS of MS patients. However, although Th17 cells first came to light in the context of autoimmunity, it is now clear that they can also play a role in protective immune responses against extracellular pathogens. For example, Th17 cells are important for defense against certain gastrointestinal bacterial infections. An innovative concept for the treatment of diseases characterized by excessive inflammation, be it induced by autoimmunity or pathogen attack, is the manipulation of metabolism in Th17 inflammatory T cells. This approach may contribute to the development of new therapies for inflammatory diseases that constitute a great unmet medical need.

Our group is intensively involved in the elucidation and study of metabolic pathways in T cells. Crucially, metabolism is differentially controlled in different Th cell subsets. Recently, we have shown that the antioxidant glutathione (GSH) regulates the metabolism of conventional T cells and regulatory T cells in an almost opposing manner. Our most recent data suggest that GSH has another divergent role in regulating the metabolism of Th17 cells and thus influences their function. Our proposal presents experiments designed to elucidate how GSH operates in Th17 cells and how it might be manipulated.

Another aspect of the proposed research project addresses glucose usage in Th17 cells and its impact on inflammation. In our preliminary work we have shown that glucose flux into the TCA cycle influences the activity of Th17 cells. We now plan to investigate these metabolic pathways in Th17 cells with the aim of discovering novel subset-specific pathways and targets that may offer new therapeutic options for the treatment of inflammatory diseases. To this end, we will exploit various mouse models of human diseases characterized by excessive inflammation, including bacterial infection and the induction of MS-like disease. We will characterize Th17 cell metabolism in vitro and in vivo and align our findings with the corresponding perturbations in our various disease models. Armed with this knowledge, we will target Th17-specific metabolic and/or signaling pathways associated with protective or damaging Th17 responses and evaluate the consequences. Successful completion of our project may result in novel treatment strategies for inflammatory diseases and will unravel novel principles of metabolic regulation in this important T cell subset.

Principal Investigator

Melanie Grusdat-Pozdeev*

Project co-funded with Fondation Cancer

Project title

Identification Of Clinically Relevant Compounds For The Enhancement Of CD8 T Cell Metabolism And Function (CD8-library)

Host institution

University of Luxembourg

FNR Committed

€603,000

Abstract

Cancer is a major cause of death in Europe and imposes a high financial burden on the public health sector. Immunotherapy has produced remarkable outcomes in patients, but response rates remain limited in solid tumors. With its nutritional restrictions, hypoxia, and suppressive environment, the tumor microenvironment (TME) represents a major barrier to efficient T cell responses. A hallmark of cancer is the hijacking of the metabolic processes of highly proliferating cells. The metabolic similarities between CD8 T cells and cancer cells raise the need for pharmaceutical interventions that specifically enhance CD8 T cell metabolism, but not cancer cell metabolism.

With this proposal we aim to identify compounds targeting T cell metabolism that can increase the anti-tumor response. To this end, we will use the Ludwig metabolic library. This library contains 240 commercially available compounds, a significant portion of which are FDA-approved. In brief, we will mimic the TME in vitro using glutamine deprivation and perform high-throughput sequencing in order to identify and characterize metabolic pathways that boost T cell metabolism and effector function. We will also test the effect of the identified compounds of interest on the outcome of an in vivo murine melanoma model, B16F10. This model will allow us to assess the in vivo relevance of the identified compounds, as well as their suitability for systemic treatment. We will rule out compounds that boost cancer cell metabolism in addition to CD8 T cell metabolism. Moreover, we will test whether pre-treatment with the identified compounds enhances the activity of adoptively transferred anti-cancer T cells. Finally, we will test whether our results can be validated using human CD8 T cells.
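The screen's exclusion logic (keep compounds that boost CD8 T cell metabolism without boosting the tumor) amounts to a simple filter over the readouts; the compound names and effect sizes below are invented for illustration.

```python
import pandas as pd

# INVENTED screen readouts: fold-change in metabolic activity relative
# to vehicle control, for CD8 T cells and for B16F10 tumor cells.
screen = pd.DataFrame({
    "compound":     ["cmpd_01", "cmpd_02", "cmpd_03", "cmpd_04"],
    "tcell_effect": [1.8, 1.4, 0.9, 2.1],   # >1 boosts CD8 T cells
    "tumor_effect": [1.0, 1.6, 1.1, 0.9],   # >1 boosts cancer cells
})

# Keep hits that enhance T cell metabolism while leaving the tumor
# unchanged or impaired -- the exclusion criterion described above.
hits = screen[(screen["tcell_effect"] > 1.5) & (screen["tumor_effect"] <= 1.0)]
print(hits)   # -> cmpd_01 and cmpd_04
```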

This unique setup promises a fast translation of the compounds into the clinic and future applications for the treatment of cancer patients. In conclusion, we aim to identify compounds that can boost the efficacy of immunotherapy by enhancing CD8 T cell responses without boosting cancer cells.

Principal Investigator

Antje Biesemeier

Project title

Analysis Of Neuromelanins And Metals Involved In Neurodegeneration Of Parkinson’s Disease Using Novel High-Resolution SIMS Methods (PANSIMS)

Host institution

Luxembourg Institute of Science and Technology (LIST)

FNR Committed

€774,000

Abstract

Accumulation and mismanagement of heavy metals and undegradable material in specialized neuromelanin (NM)-containing organelles of dopaminergic neurons are known to play a key role in Parkinson’s disease (PD). Previous work yielded a quantitative understanding of the ultrastructure and metal distribution in NM organelles of the human substantia nigra (SN), relevant to understanding the underlying pathological mechanisms. Another pigmented catecholaminergic region targeted early and heavily in PD is the locus ceruleus (LC). However, basic cellular research, especially on the role of iron and NM, is lacking for this small noradrenergic area in the brainstem, which can hardly be investigated with routine biochemical analyses. Knowing the structure and composition of NM organelles in LC neurons would provide information on the neurodegenerative process of these neurons and suggest novel perspectives for diagnosis and therapeutic strategies, e.g. through follow-up experiments on model systems and human tissues.

To this end, we will investigate LC neurons of healthy and PD-affected subjects with existing correlative immunohistochemistry and quantitative analytical electron microscopy, as well as imaging secondary ion mass spectrometry (SIMS) methods targeting elemental and molecular fingerprints of NM organelles and metal accumulations. In parallel, we will improve the sensitivity and resolution of these methods by using the npSCOPE, a new high-resolution prototype cryo-SIMS instrument platform allowing the simultaneous investigation of ultrastructural features and elemental composition within one instrument and in close-to-native tissue. Thus, a multimodal imaging approach will be developed that allows the investigation of individual LC specimens with a special focus on the composition of NM organelles, the accumulation of metals, and other key markers for PD, including water-soluble molecules and lipids. Correlating clinical/histological data with elemental/molecular information on LC specimens will give valuable insight into the mechanisms of aging and PD.

In particular, detecting possible accumulations of iron or other metals in the LC, which are not yet known to occur, would open perspectives for future therapy developments, e.g. involving iron-chelating disease-modifying agents. Modern magnetic resonance imaging (MRI) can also take advantage of NM- and Fe-related contrast mechanisms; NM generation, redistribution and, finally, removal can thus aid in the diagnosis of PD. The data gained from our in situ analyses can therefore be used in the future to improve and further develop the resolution and contrast of such NM-specific and iron-specific MRI procedures for diagnosis, assessing disease progression and response to therapy.

This multifaceted investigation will benefit greatly from the interdisciplinary team, consisting of Luxembourgish clinical and research neuropathologists from the Laboratoire National de Santé (LNS), the Luxembourg Institute of Health (LIH) and the Luxembourg Centre for Systems Biomedicine (LCSB, Prof. Michel Mittelbronn), as well as (neuro)melanin experts from the Institute of Biomedical Technologies at the National Research Council (ITB-CNR, Prof. Luigi Zecca) in Italy and the Luxembourg Institute of Science and Technology (LIST, Dr. Antje Biesemeier). In addition, LIST’s SIMS instrument developers will provide novel equipment for improved orientation and correlative chemical imaging with optimized sensitivity and lateral resolution (< 20 nm for SIMS). Together, they will work on a better molecular characterization of the NM pigment in ageing and its role in PD. In the long run, this approach may also prove useful for analysing other (metal-associated) neurodegenerative disorders such as Alzheimer’s disease.

Principal Investigator

Gunnar Dittmar

Project co-funded with Fondation Cancer

Project title

HIF-α Non-canonical Regulation By PTMs And New Interaction Partners (HifReg)

Host institution

Luxembourg Institute of Health (LIH)

FNR Committed

€751,000

Abstract

All human cells need oxygen for energy production by oxidative phosphorylation. If oxygen levels are low, cells switch from oxidative phosphorylation to glycolysis. The Hif1α subunit of the transcription factor HIF-1 is essential for this regulatory switch. The switch is regulated by the oxygen level-dependent formation of hydroxyproline on the Hif1α subunit, which in turn is recognized by the VHL E3 ligase, leading to Hif1α’s degradation by the ubiquitin-proteasome system. In recent years, other pathways for the regulation of Hif1α have emerged, including other degradation pathways and the modulation of translation, leading to higher expression of Hif1α.

As one of the master regulators of metabolic switching between hypoxic and normoxic conditions, Hif1α is important for predicting disease progression. Hif1α has been implicated in shifting the metabolic programs in the inner parts of solid tumors, such as gliomas, lung tumors, and colon carcinomas, thereby modulating the tumor’s ability to survive chemotherapeutic treatment.

Ubiquitination is one of the major cellular signaling pathways. Unlike other signaling pathways, ubiquitin can transmit different signals depending on whether the target protein is modified by a single ubiquitin moiety or by a ubiquitin chain, consisting of several ubiquitins covalently connected to each other. Ubiquitin modification is catalyzed by a multi-enzyme cascade and can be negatively regulated by deubiquitinating enzymes, which cut the ubiquitin modification from the target or disassemble the ubiquitin chains.

Using a newly developed screening technology, Prisma, we identified a significant number of new interaction partners of Hif1α, many of which belong to the ubiquitin signaling system and the ubiquitin-proteasome degradation system. The screen identified three E3 ligases, which have not been reported as interactors before, connecting Hif1α degradation to new degradation pathways. The characterization of these new degradation pathways for Hif1α will allow targeting Hif1α by different means.

A second class of proteins related to ubiquitin signaling comprises the deubiquitinases, several of which were also identified as Hif1α interactors. Nine different hydrolases were identified in the screen, pointing to a complex negative regulation of Hif1α. Deubiquitinases are single-molecule enzymes, and recent inhibitor development allows the direct targeting of several of these enzymes, enabling the modulation of ubiquitin signaling.

The Prisma screening technology allows the identification of the interaction site within Hif1α and can quantify the interactome changes induced by small post-translational modifications like phosphorylations, acetylations, or methylations. The Prisma screen with Hif1α revealed several phosphorylation sites, which are necessary for the recruitment of deubiquitinases to a specific site in Hif1α. The proposed project will study the impact of these PTMs on the function and regulation of Hif1α stability and the impact on metabolic switching.

Post-translational modifications are often understudied in the context of disease, as the detection of PTMs in disease material is difficult and can only be done using protein-based technologies. We will investigate the presence of the different known Hif1α modifications using targeted proteomics on the Luxembourgish colon carcinoma cohort. Here we will focus our efforts on measuring the PTMs important for ubiquitin signaling and on characterizing their presence in the normoxic and hypoxic parts of the tumor samples.

Subcategory: Complex biomedical systems – data and models – 1 project

Principal Investigator

Anna Golebiewska

Project co-funded with Fondation Cancer

Project title

Deconvolution Of Heterogeneity In The Glioblastoma Cellular Ecosystem For Understanding Treatment Resistance And Improving Patient Stratification (DIOMEDES)

Host institution

Luxembourg Institute of Health (LIH)

FNR Committed

€510,000

Abstract

This interdisciplinary proposal aims at investigating treatment resistance mechanisms in glioblastoma (GBM). GBM is the most aggressive and incurable brain tumor, with a dismal prognosis of 12-15 months. Recent studies, including ours, have shown that GBM cells display very strong intrinsic plasticity and adapt reversibly to dynamic microenvironmental conditions, forming a very dynamic ecosystem. The role of GBM plasticity in creating resistant states upon treatment remains poorly understood. We hypothesize that high plasticity allows GBM cells to adapt towards drug-resistant states upon treatment. We posit that treatment can simultaneously modulate cells within the tumor microenvironment (TME), leading to an overall resistant GBM ecosystem.

This proposal is based on our complementary expertise in GBM biology and computational methods for the deconvolution of complex biological systems. By combining state-of-the-art preclinical models, transcriptomic analyses at the single-cell and bulk level, and powerful deconvolution methods, we aim to unravel the mechanisms that shape treatment resistance in GBM. First, we will assess transcriptomic changes upon treatment in our GBM patient-derived orthotopic xenografts (PDOXs). We will investigate transcriptomic adaptation at the single-cell level within tumor cells and the subpopulations forming the tumor microenvironment. By comparing longitudinal models derived from patients prior to and after treatment, we will reveal long-term stable changes. Direct treatment of PDOXs will allow investigation of transcriptomic transitions towards resistant states at the moment of treatment. We will next investigate the presence of treatment-tolerant and treatment-resistant states in GBM patient tumors.

Advanced systems biology methods tailored to single-cell transcriptomic datasets will reveal signatures of treatment-resistant states and their regulators. We will further apply our consensus Independent Component Analysis (consICA), a reference-free deconvolution method, to assess treatment resistance signatures across a wide range of published and unpublished single-cell and bulk transcriptomic datasets. consICA will further allow linking independent molecular signals from tumor and TME subpopulations to clinical outcomes such as patient treatment history and survival. The identified treatment resistance signatures will be validated at the protein level. Finally, we will explore potential drugs targeting treatment-resistant states and test their efficacy in 3D organoids ex vivo.
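The project's consICA pipeline is its own method, but the underlying idea of reference-free deconvolution can be illustrated with a plain FastICA decomposition of a bulk expression matrix; the matrix shapes and simulated data below are invented for the example.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)

# SIMULATED bulk transcriptomes: 200 genes x 50 tumor samples, built
# from 3 latent "subpopulation" gene programs and mixing proportions.
S_true = rng.exponential(1.0, size=(200, 3))    # gene programs
A_true = rng.dirichlet(np.ones(3), size=50).T   # per-sample proportions
X = S_true @ A_true + rng.normal(0, 0.05, (200, 50))

# Reference-free deconvolution: recover independent components whose
# gene weights can later be matched to tumor vs. TME signatures.
ica = FastICA(n_components=3, random_state=0)
S_est = ica.fit_transform(X)   # (genes x components) source estimates
A_est = ica.mixing_            # (samples x components) mixing weights

# A consensus scheme would repeat the decomposition over resampled or
# re-seeded runs and keep only components that reproduce stably.
print(S_est.shape, A_est.shape)   # (200, 3) (50, 3)
```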

Our analyses will elucidate potential therapeutic targets for innovative combinatory treatment strategies. Assessing tumor composition prior to and after treatment may further reveal predictive biomarkers of response at the level of individual genes and biological processes, and may lead to improved stratification of patients for personalized therapies. Our scripts will be shared via a web interface accessible to researchers with minimal computational skills. Our methodology will thus be available for wider use, e.g. the analysis of transcriptomic datasets obtained during clinical trials to identify biomarkers of responders and non-responders to targeted therapies.

Subcategory: Understanding, preventing, and treating the health-disease transition – 1 project

Principal Investigator

Johannes Meiser

Project co-funded with Fondation Cancer

Project title

Understanding The Flexibility Of 1C Metabolism And Its Role During Metastatic Dissemination Of Cancer Cells (1cFlex)

Host institution

Luxembourg Institute of Health (LIH)

FNR Committed

€632,000

Abstract

After cardiovascular diseases, cancer is the second leading cause of death. Especially when tumours start to metastasise, life expectancy decreases sharply and clinical treatment options are limited. Hence, a thorough understanding of when, how and why cancer cells become metastatic is urgently needed in order to improve clinical intervention strategies. Based on strong preliminary data, we have identified cellular metabolism as an essential component and driver of metastasis formation.

More specifically, we have uncovered (i) that serine catabolism with formate overflow promotes invasion and metastasis and (ii) that antifolates reduce only the growth properties, but not the motility potential, of cancer cells. We have demonstrated that such sustained motility potential depends on serine metabolism. In 1cFlex, we are going to uncover in detail how serine synthesis and one-carbon metabolism contribute to chemoresistance and metastasis formation. Using novel tool compounds, a panel of genetic cell models and in vivo mouse models, we aim to identify the best strategy to target this pathway in order to reduce cancer cell motility and metastatic dissemination.

In summary, we aim to exploit a metabolic bottleneck that can be targeted to prevent disease development and most importantly, decrease metastasis formation.

Subcategory: Equality of educational opportunity – 1 project

Principal Investigator

Audrey Bousselin

Project title

Investigating The Consequences Of Public Investments In Early Childhood Education And Care (ECC) For Child Development (ChilDev)

Host institution

Luxembourg Institute of Socio-Economic Research (LISER)

FNR Committed

€508,000

Abstract

In most OECD countries, an increasing number of children aged 0-2 are spending a greater amount of their time in formal early childcare (ECC). ECC is generally recognized as an effective means of building skills that are important for the development of treated children as well as for income and educational opportunities. The scientific evidence on the subject is, however, inconclusive, because children with different characteristics and backgrounds may benefit differently from ECC attendance.

The goal of this project is to provide innovative causal evidence about the effect of different margins of ECC attendance (in terms of coverage, affordability and quality) on the cognitive and non-cognitive development of the treated children. The project focuses specifically on (but is not limited to) the group of children with a disadvantaged family background, for whom positive returns to ECC attendance can be interpreted as improving efficiency in human capital accumulation as well as equalizing educational and income opportunities.

Evaluating the causal effects of ECC expansion is a difficult exercise, insofar as children are not randomly allocated to formal ECC facilities offering homogeneous levels of care quality. Rather, parents select into care, depending for instance on the alternative means of care available at home, and choose the ECC facility quality level that best matches their characteristics, (female) labor supply and income. In this project, we rely on the introduction of a generous ECC voucher in Luxembourg in 2009 to generate quasi-experimental variation in ECC assignment across the population of children born in the country around 2007-2012. For identification, we combine features of the reform (which improved affordability for low-income families and simultaneously raised equilibrium ECC supply) with a rich and innovative data infrastructure, gathering information from educational and fiscal registers for the universe of the relevant population alongside survey data for a matched sample.

We contribute to the literature along three lines. First, we exploit geographic variability in the expansion of equilibrium ECC supply to estimate (within a DiD setting; a minimal sketch follows the third contribution below) the effect of newly created slots on the development of treated children once in primary school. Second, we exploit discontinuities in the voucher scheme around several remuneration thresholds to estimate (within an RDD design) the effects of rising ECC affordability on children's performance along the distribution of family income.

Third, we develop a structural equilibrium model (rationalizing the empirical matches between children and ECC providers) that endogenizes parental sorting, in order to analyze whether returns to ECC attendance differ along the lines of ECC center quality and family characteristics.
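A minimal version of the first contribution's DiD estimate, written as a two-way interaction regression with standard errors clustered by municipality; the dataset, variable names and effect size are fabricated stand-ins for the registry data described above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# FABRICATED child-level data: municipalities differ in how strongly
# the 2009 voucher reform expanded local ECC supply.
n = 4000
df = pd.DataFrame({
    "post_reform":    rng.integers(0, 2, n),   # cohort born after expansion
    "high_expansion": rng.integers(0, 2, n),   # municipality added many slots
    "municipality":   rng.integers(0, 50, n),
})
true_effect = 0.15                             # hypothetical DiD effect
df["test_score"] = (0.2 * df["high_expansion"] + 0.1 * df["post_reform"]
                    + true_effect * df["high_expansion"] * df["post_reform"]
                    + rng.normal(0, 1, n))

# The interaction coefficient is the effect of the new slots on
# primary-school outcomes, under the parallel-trends assumption.
model = smf.ols("test_score ~ high_expansion * post_reform", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["municipality"]})
print(model.params["high_expansion:post_reform"])
```

The RDD estimates in the second contribution would instead compare children just below and just above each remuneration threshold of the voucher scheme.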

Subcategory: Learning in a multilingual and diverse society – 1 project

Principal Investigator

Aliette Lochy

Project title

The Reading Brain: Understanding How Neural Representations For Words Emerge And How They Are Shaped By Teaching Methods (READINGBRAIN)

Host institution

University of Luxembourg

FNR Committed

€600,000

Abstract

Reading is a complex cognitive and cultural ability, which needs to be acquired via instruction. It involves orthographic, phonological, and semantic representations of words. After learning to decode letters into sounds, the child must develop automatic and fast word recognition, i.e. words are recognized “at a glance” and are considered to be stored in, and retrieved from, an orthographic lexicon. Little is known about what characterizes the emergence of such lexical representations at the neural level in typical and dyslexic children or adults, in natural settings or when they learn new words. Here, using a highly sensitive paradigm that combines frequency-tagging and EEG recordings (Fast Periodic Visual Stimulation, FPVS), as well as behavioral tests, we aim at understanding how words are represented in the brain.

We will focus on 10-year-old children, an age at which reading fluency is reached with high variability, and on adults, both typical and dyslexic. FPVS makes it possible to assess automatic and implicit discrimination between categories of stimuli. This powerful approach will be applied to answer several research questions without requiring any linguistic task that may influence the brain networks activated when seeing words:

(1) What drives and characterizes the emergence of lexical responses for known words, both in children and in adults, typical and dyslexic?
(2) Do irregular and regular words entail similar brain responses?
(3) Are some teaching methods more efficient than others for learning irregular words?

First, we expect automatic neural responses to words among pseudowords to emerge by the age of 10 and to be related to behavioral reading fluency and vocabulary, both in children and in adults. Further, comparing dyslexics to age-matched and younger children will help characterize this neurodevelopmental disorder at the brain level as reflecting hypoactivation, maturational delay or compensatory mechanisms. Second, in opaque languages like French, some words are difficult to read because they contain irregularities in the letter-to-sound mappings. Studying whether irregular words entail specific neural responses, both in children and in adults, typical and dyslexic, will help understand the nature of the representations underlying reading. Finally, by focusing on learning new words, we will examine the impact of pre-existing vocabulary knowledge (meaning and spoken form), as well as teaching method, on the novel neural representations created, both in children and in adults.
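The frequency-tagging logic of FPVS can be made concrete with a toy spectrum analysis: if word stimuli are embedded periodically in a fast stimulus stream (say every fifth item of a 10 Hz stream, i.e. at 2 Hz), a lexical discrimination response appears as spectral power at exactly 2 Hz and its harmonics. The signal below is synthetic, and the sampling rate and frequencies are illustrative assumptions, not the project's parameters.

```python
import numpy as np

fs = 500                       # Hz, sampling rate (illustrative)
t = np.arange(0, 60, 1 / fs)   # 60 s of one synthetic EEG channel
rng = np.random.default_rng(3)

# Noise + a 10 Hz base response to all stimuli + a 2 Hz "oddball"
# response marking word-vs-pseudoword discrimination.
eeg = (rng.normal(0, 1.0, t.size)
       + 0.8 * np.sin(2 * np.pi * 10 * t)
       + 0.3 * np.sin(2 * np.pi * 2 * t))

spectrum = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def snr_at(f0, half_width=20):
    """Peak amplitude at f0 divided by the mean of neighbouring bins."""
    i = int(np.argmin(np.abs(freqs - f0)))
    neighbours = np.r_[spectrum[i - half_width:i - 1],
                       spectrum[i + 2:i + half_width + 1]]
    return spectrum[i] / neighbours.mean()

print(f"SNR at 2 Hz (lexical response): {snr_at(2.0):.1f}")
print(f"SNR at 10 Hz (base response):   {snr_at(10.0):.1f}")
```

In the actual paradigm this signal-to-noise ratio, computed on recorded EEG rather than synthetic data, is what indexes automatic word discrimination without any explicit task.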

Our project will have a wide pedagogic and societal impact. Our results and their consequences for teaching will be disseminated not only to the scientific community but also to teachers and education actors.
