PUBLICATIONS

Privkit: A Toolkit of Privacy-Preserving Mechanisms for Heterogeneous Data Types
With the massive data collection from different devices, ranging from mobile devices to all sorts of IoT devices, protecting the privacy of users is a fundamental concern. To prevent unwanted disclosures, several Privacy-Preserving Mechanisms (PPMs) have been proposed. Nevertheless, due to the lack of a standardized and universal privacy definition, configuring and evaluating PPMs is quite challenging, requiring knowledge that the average user does not have. In this paper, we propose a privacy toolkit, Privkit, to systematize this process and facilitate automated configuration of PPMs. Privkit enables the assessment of privacy-preserving mechanisms with different configurations, while quantifying the privacy and utility levels achieved over various types of data. Privkit is open source and can be extended with new data types, corresponding PPMs, as well as privacy and utility assessment metrics and privacy attacks over such data. The toolkit is available as a Python package with several state-of-the-art PPMs already implemented, and is also accessible through a Web application. Privkit constitutes a unified toolkit that eases the dissemination of new privacy-preserving methods and facilitates the reproducibility of research results through a repository of Jupyter Notebooks.
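As a rough illustration of what such a toolkit automates, the sketch below applies a generic Laplace-noise PPM under several configurations and scores each with a simple utility metric. The function names and the MSE utility metric are illustrative choices, not Privkit’s actual API.

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_ppm(values, epsilon, sensitivity=1.0):
    """Perturb each value with Laplace noise calibrated to epsilon (a generic PPM)."""
    scale = sensitivity / epsilon
    return values + rng.laplace(0.0, scale, size=values.shape)

def utility_loss(original, obfuscated):
    """Mean squared error between original and obfuscated data (a simple utility metric)."""
    return float(np.mean((original - obfuscated) ** 2))

data = rng.normal(50.0, 5.0, size=1000)
# Sweep configurations: stronger privacy (smaller epsilon) should cost more utility.
results = {eps: utility_loss(data, laplace_ppm(data, eps)) for eps in (0.1, 1.0, 10.0)}
```

Sweeping configurations and reporting the privacy/utility trade-off per data type is exactly the bookkeeping a toolkit like this takes off the user's hands.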
Performance comparison of NWDAF-based security analytics techniques in 5G/B5G networks
This paper evaluates the performance of NWDAF-based security analytics techniques in 5G/B5G networks, focusing on anomaly detection for network security incidents. Utilizing a 5G testbed, the study examines both statistical methods (Z-Score, MAD, Hampel Filter) and machine learning techniques (Isolation Forest, LOF, One-Class SVM) for the detection of control-plane and data-plane DoS/DDoS attacks. Results indicate that statistical methods outperform ML algorithms on such volume-based attacks and suggest a hybrid approach, combining statistical and ML methods, to enhance anomaly detection and adapt to diverse network conditions for improved 5G security.
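The statistical detectors compared above are simple enough to sketch directly. The following minimal Python versions of the Z-Score and Hampel Filter flag a simulated volumetric burst in a packets-per-second series; the traffic numbers are invented for illustration.

```python
import numpy as np

def zscore_flags(x, thresh=3.0):
    """Flag points more than `thresh` standard deviations from the mean."""
    z = (x - x.mean()) / x.std()
    return np.abs(z) > thresh

def hampel_flags(x, window=5, n_sigmas=3.0):
    """Hampel filter: flag points far from the local median, scaled by the MAD."""
    flags = np.zeros(len(x), dtype=bool)
    k = 1.4826  # makes the MAD consistent with the std. dev. under Gaussian data
    for i in range(len(x)):
        lo, hi = max(0, i - window), min(len(x), i + window + 1)
        med = np.median(x[lo:hi])
        mad = k * np.median(np.abs(x[lo:hi] - med))
        if mad > 0 and abs(x[i] - med) > n_sigmas * mad:
            flags[i] = True
    return flags

rng = np.random.default_rng(1)
pps = rng.normal(1000.0, 20.0, 200)   # packets-per-second under normal load
pps[120:125] = 5000.0                  # simulated volumetric DoS burst
```

The Hampel filter's local-median window makes it robust to slow load drift, which is one reason such methods hold up well against volume-based attacks.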
A Privacy-Aware Remapping Mechanism for Location Data
In an era dominated by Location-Based Services (LBSs), preserving location privacy has emerged as a critical challenge. To address this, Location Privacy-Preserving Mechanisms (LPPMs) were proposed, in which an obfuscated version of the exact user location is reported instead. Complementing noise-based mechanisms, location discretization, the process of transforming continuous location data into discrete representations, enables efficient storage, simplifies the manipulation of the information in a digital system, and reduces computational overhead. Beyond efficiency gains, location discretization can also be performed under privacy requirements, so as to ensure discretization while providing privacy benefits. In this work, we propose a Privacy-Aware Remapping mechanism that improves the privacy level attained by Geo-Indistinguishability through a tailored pre-processing discretization step. The proposed remapping technique reduces the re-identification risk of locations under Geo-Indistinguishability, with limited impact on quality loss.
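For context, the Geo-Indistinguishability baseline that the remapping builds on is the planar Laplace mechanism, which can be sketched in a few lines: the noise angle is uniform and the radius follows a Gamma(2, 1/ε) distribution, together giving the 2-D Laplace distribution. The coordinates-in-metres setup below is an illustrative simplification, not the paper's mechanism.

```python
import numpy as np

rng = np.random.default_rng(2)

def geo_ind_noise(loc_m, epsilon):
    """Planar Laplace mechanism for epsilon-geo-indistinguishability.

    `loc_m` is a location in metres in a local planar projection; the noise
    radius is Gamma(2, 1/epsilon)-distributed and the angle is uniform.
    """
    theta = rng.uniform(0.0, 2.0 * np.pi)
    r = rng.gamma(shape=2.0, scale=1.0 / epsilon)
    return loc_m + r * np.array([np.cos(theta), np.sin(theta)])

true_loc = np.array([0.0, 0.0])
obfuscated = [geo_ind_noise(true_loc, epsilon=0.01) for _ in range(2000)]
mean_err = float(np.mean([np.linalg.norm(p - true_loc) for p in obfuscated]))
# The expected radius is 2/epsilon = 200 m for epsilon = 0.01 per metre.
```

A remapping step would then snap each obfuscated point to a discrete cell chosen to reduce re-identification risk, rather than to the nearest grid point.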
Implementation of a traffic flow path verification system in a data network

This paper focuses on one of the recent concerns that has arisen with network softwarization, specifically traffic attestation in service chaining. The central focus of the paper is the design, development, and evaluation of an implementation of Ordered Proof of Transit (OPoT) as a solution to validate flow paths in the network. This solution uses Shamir’s Secret Sharing (SSS) scheme to add metadata to each packet, updating it at each node or service the packet traverses until it reaches the final destination. This method ensures the validation of services traversed by the packet at the last crossing point, providing an additional layer of security and preventing unauthorized modifications to the flow of data traffic. We report here how a programmable data plane, based on the P4 language, can be used to provide OPoT features dynamically, according to user and network policy requirements. Additionally, we develop a controller to configure the network nodes, execute OPoT, and monitor the system state.
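The Shamir's Secret Sharing primitive underlying OPoT can be sketched as follows. Note that OPoT applies it incrementally, one share per hop carried in packet metadata and accumulated in-band; this standalone sketch shows only the basic split/reconstruct scheme over a prime field.

```python
import random

P = 2**127 - 1  # a Mersenne prime, large enough for the secrets used here

def split(secret, n_shares, threshold, rng=random.Random(42)):
    """Split `secret` into n shares; any `threshold` of them reconstruct it."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is the modular inverse of den (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = split(123456789, n_shares=5, threshold=3)
```

In the OPoT setting, a packet that skipped a node lacks that node's share, so the verification at the last crossing point fails, which is exactly the path-attestation property.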

Towards Privacy-First Security Enablers for 6G Networks: The PRIVATEER Approach
The advent of 6G networks is anticipated to introduce a myriad of new technology enablers, including heterogeneous radio, RAN softwarization, multi-vendor deployments, and AI-driven network management, which are expected to broaden the existing threat landscape, demanding more sophisticated security controls. At the same time, privacy forms a fundamental pillar in the EU development activities for 6G. This decentralized and globally connected environment necessitates robust privacy provisions that encompass all layers of the network stack. In this paper, we present PRIVATEER’s approach for enabling “privacy-first” security enablers for 6G networks. PRIVATEER aims to tackle four major privacy challenges associated with 6G security enablers, i.e., i) processing of infrastructure and network usage data, ii) security-aware orchestration, iii) infrastructure and service attestation and iv) cyber threat intelligence sharing. PRIVATEER addresses the above by introducing several innovations, including decentralised robust security analytics, privacy-aware techniques for network slicing and service orchestration, and distributed infrastructure and service attestation mechanisms.
Adrias: Interference-Aware Memory Orchestration for Disaggregated Cloud Infrastructures
Workload co-location has become the de facto approach for hosting applications in Cloud environments, leading, however, to interference and fragmentation in the shared resources of the system. To this end, hardware disaggregation is introduced as a novel paradigm that allows fine-grained tailoring of cloud resources to the characteristics of the deployed applications. Towards the realization of hardware-disaggregated clouds, novel orchestration frameworks must provide additional knobs to manage the increased scheduling complexity. We present Adrias, a memory orchestration framework for disaggregated cloud systems. Adrias exploits information from low-level performance events and applies deep learning techniques to effectively predict the system state and the performance of arriving workloads on memory-disaggregated systems, thus driving cognitive scheduling between local and remote memory allocation modes. We evaluate Adrias on a state-of-the-art disaggregated testbed and show that it achieves average R^2 scores of 0.99 and 0.942 for system state and application performance prediction, respectively. Moreover, Adrias effectively utilizes disaggregated memory, offloading almost 1/3 of deployed applications with less than 15% performance overhead compared to conventional local memory scheduling, while clearly outperforming naive scheduling approaches (random and round-robin) by providing up to 2x better performance.
Post-Quantum and Blockchain-Based Attestation for Trusted FPGAs in B5G Networks
The advent of 5G and beyond has brought increased-performance networks, facilitating the deployment of services closer to the user. To meet performance requirements, such services require specialized hardware, such as Field Programmable Gate Arrays (FPGAs). However, FPGAs are often deployed in unprotected environments, leaving the user’s applications vulnerable to multiple attacks. With the rise of quantum computing, which threatens the integrity of widely-used cryptographic algorithms, the need for a robust security infrastructure is even more crucial. In this paper, we introduce a hybrid hardware-software solution utilizing remote attestation to securely configure FPGAs, while integrating Post-Quantum Cryptographic (PQC) algorithms for enhanced security. Additionally, to enable trustworthiness across the whole edge computing continuum, our solution integrates a blockchain infrastructure, ensuring the secure storage of any security evidence. We evaluate the proposed secure configuration process under different PQC algorithms in two FPGA families, showcasing only 2% overhead compared to the non-PQC approach.
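A minimal software-only sketch of the attestation idea: compare a fresh measurement of the FPGA bitstream against a golden digest before emitting evidence bound to a verifier nonce. This is hedged heavily, using an HMAC in place of the paper's post-quantum signatures, and hypothetical bitstream contents.

```python
import hashlib
import hmac

def measure(bitstream):
    """Measurement of an FPGA bitstream: a digest of its exact contents."""
    return hashlib.sha3_256(bitstream).hexdigest()

def attest(bitstream, golden_digest, key, nonce):
    """Return a keyed evidence token if the measurement matches the golden value.

    The verifier-supplied nonce prevents replay of old evidence; a real system
    would sign with a (post-quantum) key pair instead of a shared HMAC key.
    """
    if not hmac.compare_digest(measure(bitstream), golden_digest):
        return None  # tampered bitstream: refuse to produce evidence
    return hmac.new(key, nonce + golden_digest.encode(), hashlib.sha3_256).digest()

golden = measure(b"trusted-bitstream-v1")
evidence = attest(b"trusted-bitstream-v1", golden, key=b"k", nonce=b"n1")
tampered = attest(b"evil-bitstream", golden, key=b"k", nonce=b"n1")
```

Anchoring the resulting evidence on a blockchain, as the paper proposes, makes the stored attestation history itself tamper-evident.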
Multi-Partner Project: Secure Hardware Accelerated Data Analytics for 6G Networks: The PRIVATEER Approach
Next-generation 6G networks are designed to meet the requirements of modern applications, including the need for higher bandwidth and ultra-low-latency services. While these networks show significant potential to fulfill these evolving connectivity needs, they also bring new challenges, particularly in the area of security. Meanwhile, ensuring privacy is paramount in 6G network development, demanding robust solutions following “privacy-by-design” principles. To address these challenges, the PRIVATEER project strengthens existing security mechanisms, introducing privacy-centric enablers tailored for 6G networks. This work evaluates key enablers within PRIVATEER, focusing on the development and acceleration of AI-driven anomaly detection models, as well as attestation mechanisms for both hardware accelerators and containerized applications.
On the Difficulty of NOT being Unique: Fingerprinting Users from Wi-Fi Data in Mobile Devices
Network connectivity in complex topologies adds the risk of losing visibility and traceability of traffic flows. Regulatory or security policies can demand mechanisms that demonstrate this traceability of the traffic. In this demo, we showcase a novel P4 implementation deployed in the TeraflowSDN ecosystem. This P4 implementation, called Ordered Proof of Transit (OPoT), provides a solution for validating flow paths in the network. Using Shamir’s Secret Sharing Scheme, the system adds metadata to each packet in the network, updating it at each node or service the packet traverses until it reaches the final destination. This method ensures validation of the services traversed by the packet at the last crossing point, providing an additional layer of security and preventing unauthorized modifications to the flow of data traffic. Several mathematical parameters are required for each packet to execute OPoT. The TeraflowSDN controller is responsible for generating these parameters and configuring each node via its P4 framework. The demo consists of a 10-node linear topology network, virtualized with Mininet and connected to a Teraflow controller. Once both the topology and the controller are deployed, Teraflow generates the mathematical parameters and configures the nodes by setting up a P4 service with the OPoT configuration. Connectivity is then tested between two Mininet hosts at opposite ends of the network to verify the proper functionality of the system. Finally, packets are captured at various points in the network to demonstrate how nodes add and modify OPoT metadata.
throttLL’eM: Predictive GPU Throttling for Energy Efficient LLM Inference Serving
As Large Language Models (LLMs) gain traction, their reliance on power-hungry GPUs imposes ever-increasing energy demands, raising environmental and monetary concerns. Inference dominates LLM workloads, presenting a critical challenge for providers: minimizing energy costs under Service-Level Objectives (SLOs) that ensure optimal user experience. In this paper, we present throttLL’eM, a framework that reduces energy consumption while meeting SLOs through the use of instance and GPU frequency scaling. throttLL’eM features mechanisms that project future Key-Value (KV) cache usage and batch size. Leveraging a Machine-Learning (ML) model that receives these projections as inputs, throttLL’eM manages performance at the iteration level to satisfy SLOs with reduced frequencies and instance sizes. We show that the proposed ML model achieves R2 scores greater than 0.97 and mispredicts performance by less than 1 iteration per second on average. Experimental results on LLM inference traces show that throttLL’eM achieves up to 43.8% lower energy consumption and an energy efficiency improvement of at least 1.71× under SLOs, when compared to NVIDIA’s Triton server. throttLL’eM is publicly available at https://github.com/WilliamBlaskowicz/throttLL-eM.
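The core control decision can be illustrated with a toy sketch: given projected throughput at each GPU clock, pick the lowest frequency that still meets the SLO. In throttLL’eM these projections come from the learned ML model; the frequency/throughput numbers below are hypothetical.

```python
# Hypothetical per-frequency throughput model (iterations/s at each GPU clock, MHz);
# in practice such projections would come from a learned performance model.
perf_model = {1980: 14.0, 1700: 12.5, 1400: 10.8, 1100: 8.9, 800: 6.1}

def pick_frequency(required_iters_per_s, model):
    """Lowest clock (lowest power) whose projected throughput still meets the SLO."""
    feasible = [f for f, tput in model.items() if tput >= required_iters_per_s]
    return min(feasible) if feasible else max(model)  # fall back to the max clock

slo_tput = 10.0  # iterations/s needed to keep token latency inside the SLO
chosen = pick_frequency(slo_tput, perf_model)
```

Running at the lowest SLO-feasible clock is what converts slack in the latency budget into energy savings.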
Optimizing QAM Demodulation with NEON SIMD and Algorithmic Approximation Techniques
In any telecommunication system, it is crucial to have a high-performance receiver to meet the desired requirements. However, with newer protocols demanding high-order constellations, the demodulation process in the receiver becomes a bottleneck. To facilitate the implementation of telecommunication systems on embedded platforms, in this work we explore optimizations to QAM demodulation, applying SIMD operations with the NEON engine along with algorithmic approximation techniques. We implement a NEON-based demodulator using the Approximate LLR algorithm, and we also propose an approximate method for QAM16/QAM64 that focuses on a single quadrature for calculating the required Euclidean distances, along with the respective NEON accelerator. We perform a trade-off analysis between the system’s BER and the execution time of the demodulator and the receiver module for the base and approximate implementations, while also exploring the impact of different bit widths and precision in computations. We demonstrate that our approximate technique can achieve an 18x-37x speedup over the original algorithm without BER deviations on uncoded channels, while the use of LDPC is also examined.
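The Approximate (max-log) LLR idea reduces the per-bit log-sum-exp over the constellation to a difference of two nearest-symbol distances. A minimal sketch for Gray-mapped 4-QAM follows; the paper targets QAM16/QAM64, but the principle is identical.

```python
import numpy as np

# Gray-mapped 4-QAM (QPSK) constellation: 2 bits per symbol.
SYMBOLS = np.array([1 + 1j, -1 + 1j, 1 - 1j, -1 - 1j]) / np.sqrt(2)
BITS = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

def approx_llr(received, noise_var):
    """Max-log (approximate) LLR: replace the log-sum-exp over the constellation
    with the single nearest symbol per bit hypothesis."""
    d2 = np.abs(received - SYMBOLS) ** 2        # squared Euclidean distances
    llrs = []
    for b in range(BITS.shape[1]):
        d0 = d2[BITS[:, b] == 0].min()          # best symbol with bit b = 0
        d1 = d2[BITS[:, b] == 1].min()          # best symbol with bit b = 1
        llrs.append((d1 - d0) / noise_var)      # > 0 means bit b is likely 0
    return np.array(llrs)

llr = approx_llr(received=0.9 + 0.8j, noise_var=0.1)
```

Because each bit needs only two minima over precomputed distances, the inner loop maps naturally onto SIMD lanes such as NEON's.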
User terminals as attackers: An open dataset analysis of DDoS attacks in 5G networks
The 5th Generation (5G) of cellular networks, developed by the 3rd Generation Partnership Project (3GPP), aims to meet the growing demands for data and communication services. A key component of the 5G architecture is the Network Data Analytics Function (NWDAF), which enhances network performance and detects anomalies by analyzing real-time data. This paper focuses on detecting abnormal user behavior, specifically Distributed Denial of Service (DDoS) attacks, using a comprehensive dataset captured in a 5G testbed. We compare the Z-score, a traditional statistical method, with machine learning models, including Decision Trees, Naive Bayes, kNN, and XGBoost. Our results demonstrate the improved performance of machine learning models in detecting anomalies in this context. Furthermore, we study the impact of various network features through Principal Component Analysis (PCA), while also employing the inherent explainability of Decision Trees to highlight the importance of features in distinguishing between benign and malicious traffic. This study provides valuable insights into DDoS detection in 5G networks, and the dataset is made publicly available to facilitate further research.
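As a flavour of the ML side of such a comparison, the sketch below fits scikit-learn's Isolation Forest on synthetic benign traffic features and flags injected DDoS-like samples. The feature choices and magnitudes are invented for illustration and do not come from the released dataset.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(3)
# Hypothetical per-second features: [packets/s, bytes/s, new flows/s]
benign = rng.normal([1000, 8e5, 20], [50, 4e4, 3], size=(500, 3))
ddos = rng.normal([9000, 7e6, 400], [500, 3e5, 40], size=(20, 3))

# Train on benign traffic only; contamination sets the anomaly threshold.
clf = IsolationForest(contamination=0.05, random_state=0).fit(benign)
pred = clf.predict(np.vstack([benign[:10], ddos]))  # +1 = inlier, -1 = anomaly
```

Unlike a Z-score on a single counter, the forest isolates anomalies jointly across all features, which helps with attacks that shift several counters at once.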
Advancing Predictive Security for Consumer Applications in Beyond 5G/6G Networks With Annotated Datasets
The evolution of Beyond 5G (B5G) and 6G networks introduces new opportunities for consumer-centric applications, requiring robust predictive security measures to maintain reliability. A critical component in the B5G landscape is the Network Data Analytics Function (NWDAF), a Network Function (NF) of the 5G Core introduced by the 3rd Generation Partnership Project (3GPP) in Rel. 15, designed to provide data analytics capabilities to the cellular network. This study focuses on detecting Distributed Denial of Service (DDoS) attacks, leveraging a comprehensive dataset collected from a 5G testbed. We evaluate deep learning models, namely Convolutional Neural Networks (CNN), Long Short-Term Memory networks (LSTM), and Multi-Layer Perceptrons (MLP), and compare their performance with eXtreme Gradient Boosting (XGBoost), a machine learning technique based on gradient boosting, and the Z-Score, a statistical method that quantifies how far a data point deviates from the mean. Results demonstrate that XGBoost achieves the highest F1-score of 0.97, precision of 0.96, and recall of 0.98, making it the preferred solution for identifying DDoS attacks from 5G network features, while also offering a computationally efficient solution for real-time applications. To improve interpretability, SHapley Additive exPlanations (SHAP) analysis identifies the network features influencing model decisions. The publicly available dataset used in our study supports further research in anomaly detection and provides valuable insights for future 6G applications, including immersive consumer experiences and autonomous services, while addressing emerging cyber threats.
WiFi-based Location Tracking: A Still Open Door on Laptops
Location privacy is a major concern in the current digital society, due to the sensitive information that can be inferred from location data. This has led smartphones’ Operating Systems (OSs) to strongly tighten access to location information in the last few years. The same tightening has, however, not yet happened for our second most carried-around device: the laptop. In this work, we demonstrate the privacy risks resulting from the fact that major laptop OSs still expose WiFi data to installed software, thus enabling the inference of location information from WiFi Access Points (APs). Using data collected in a real-world experiment, we show that laptops are often carried along with smartphones and that a large fraction of our mobility profile can be inferred from WiFi APs accessed on laptops, concluding that access to WiFi data on laptops needs to be protected.
Delving Into Security and Privacy of Joint Communication and Sensing: A Survey
Joint Communication and Sensing (JCAS) systems are emerging as a core technology for next-generation wireless systems due to their potential to achieve higher spectral efficiency, energy savings, and new services beyond communications. This paper provides a review of the state-of-the-art in JCAS systems, focusing on obtrusive passive sensing capabilities and the inherent security and privacy challenges that arise from the integration of communication and sensing. From this point of view, we discuss existing techniques for mitigating security and privacy issues, as well as important aspects in the design of secure and privacy-aware JCAS systems. Additionally, we discuss future research directions, emphasizing new enabling technologies and their integration into JCAS systems along with their role in privacy and security aspects. We also discuss the required modifications to existing systems and the design of new systems with privacy and security awareness, where the challenging trade-offs between security, privacy, and performance of the JCAS system must be considered.
Towards Asynchronous Peer-to-Peer Federated Learning for Heterogeneous Systems
Federated Learning (FL) enables collaborative model training across distributed, privacy-sensitive data sources. Traditional FL follows a centralized client-server architecture, relying on synchronized updates and uniform participation. However, real-world deployments face challenges such as client heterogeneity, stragglers, non-independent data distributions, and single points of failure due to server centralization. To address these limitations, we propose an asynchronous Peer-to-Peer FL scheme that enhances learning efficiency in heterogeneous environments. Our method employs a gradient-aware aggregation algorithm with a progress-based adaptive fusion weight, mitigating the impact of resource disparities among clients. Experimental results on the CIFAR-10/100 datasets indicate that our scheme achieves 4.8-16.3% and 10.9-37.7% higher accuracy compared to FedAVG and FedSGD, respectively, under a constrained total number of exchanged updates among clients. Furthermore, it effectively handles client heterogeneity through its dynamic fusion weight adjustment.
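A progress-based fusion weight can be sketched in a few lines: when merging with a peer, weight its parameters by its share of the total optimisation progress. This stand-in uses local step counts as the progress proxy and omits the gradient-aware part of the paper's actual algorithm.

```python
import numpy as np

def fuse(own_params, peer_params, own_steps, peer_steps):
    """Progress-based adaptive fusion: weight the peer's model by its share of
    the total optimisation progress (local steps as a crude progress proxy)."""
    alpha = peer_steps / (own_steps + peer_steps)
    return (1 - alpha) * own_params + alpha * peer_params

fast = np.array([1.0, 1.0])
slow = np.array([0.0, 0.0])
merged = fuse(slow, fast, own_steps=100, peer_steps=300)  # the peer did 3x the work
```

Biasing the merge toward the more-progressed peer keeps stragglers from dragging the shared model backwards, without requiring any synchronized round.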
PRIVATEER: Secure FPGA Acceleration for 6G AI Edge Analytics
The progression towards 6G networks promises enhanced performance but introduces significant security and privacy vulnerabilities, particularly at the network edge. This work presents advancements in PRIVATEER, focusing on AI-driven anomaly detection accelerated on FPGAs and robust security countermeasures for FPGA deployments. We detail an Attention-Autoencoder model for detecting DDoS attacks and evaluate its high-performance, energy-efficient FPGA implementation, achieving >8x/>9x latency reduction and >130x/>30x energy savings compared to CPU/GPU baselines, respectively, without accuracy loss. Additionally, we discuss security mechanisms including remote attestation, Physical Unclonable Functions (PUFs), and side-channel mitigation, demonstrating their efficacy with low overhead. These results showcase viable solutions for secure, hardware-accelerated AI analytics in future 6G edge systems.
Exploiting temporal parallelism for LSTM
Recurrent Neural Networks (RNNs) are vital for sequential data processing. Long Short-Term Memory Autoencoders (LSTM-AEs) are particularly effective for unsupervised anomaly detection in time-series data. However, inherent sequential dependencies limit parallel computation. While previous work has explored FPGA-based acceleration for LSTM networks, efforts have typically focused on optimizing a single LSTM layer at a time. We introduce a novel FPGA-based accelerator using a dataflow architecture that exploits temporal parallelism for concurrent multi-layer processing of different timesteps within sequences. Experimental evaluations on four representative LSTM-AE models with varying widths and depths, implemented on a Zynq UltraScale+ MPSoC FPGA, demonstrate significant advantages over CPU (Intel Xeon Gold 5218R) and GPU (NVIDIA V100) implementations. Our accelerator achieves latency speedups of up to 79.6x vs. CPU and 18.2x vs. GPU, alongside energy-per-timestep reductions of up to 1722x vs. CPU and 59.3x vs. GPU. These results, including superior network depth scalability, highlight our approach’s potential for high-performance, real-time, power-efficient LSTM-AE-based anomaly detection on FPGAs.
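The temporal parallelism exploited here follows from the LSTM dependency structure: cell (layer l, timestep t) needs only (l-1, t) and (l, t-1), so all cells on an anti-diagonal l + t = const are mutually independent and can run concurrently. A small sketch (illustrating the scheduling idea only, not the accelerator itself) enumerates these wavefronts:

```python
def wavefronts(n_layers, n_timesteps):
    """Group (layer, timestep) LSTM cell evaluations into waves that can run
    concurrently: cell (l, t) depends on (l-1, t) and (l, t-1), so every
    anti-diagonal l + t = const is dependency-free within itself."""
    waves = []
    for w in range(n_layers + n_timesteps - 1):
        waves.append([(l, w - l) for l in range(n_layers)
                      if 0 <= w - l < n_timesteps])
    return waves

waves = wavefronts(n_layers=3, n_timesteps=4)
```

In a dataflow architecture, each layer becomes a pipeline stage: while layer 2 processes timestep t, layer 1 already works on timestep t+1, which is exactly the wavefront schedule above.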
Application of Federated Learning and xAI in I4.0 - A Case Study
This study explores the applicability of explainable artificial intelligence (xAI) techniques in the analysis of deep learning models for anomaly detection in 5G/6G networks. With the increasing complexity of networks and network traffic, the mission to guarantee the security of access points and devices against attacks and intrusions also grows larger. Models used for these tasks operate like black boxes, making it difficult to understand and interpret their decisions at a human level. To address this challenge, we devised a case study with a real-world dataset and a performant deep learning anomaly detection algorithm, and implemented strategies to generate human-understandable explanations through xAI algorithms. xAI can provide insights into the factors that lead to the detection of anomalies, allowing for greater transparency and reliability in the process. This work is part of the context of intelligent networks and is aligned with initiatives such as the Privateer project, contributing to the evolution of security in 5G/6G infrastructures. The integration of deep learning and xAI facilitates interaction between human operators and automated systems, promoting greater control over decision-making in modern networks.
5G/6G Architecture Evolution for XR and Metaverse: Feasibility Study, Security, and Privacy Challenges for Smart Culture Applications
This paper investigates the evolution of 5G/6G architectures to support demanding Extended Reality (XR) and Metaverse applications, focusing specifically on the “smart culture” domain. We evaluate the capabilities of the 5G Service-Based Architecture (SBA), including Multi-Access Edge Computing (MEC) and network analytics, through a comprehensive feasibility study comparing stringent XR requirements (bitrate, latency, capacity, power, accuracy) against current 5G performance. Our key contribution is the identification of significant performance gaps where 5G struggles to meet the demands of advanced XR, particularly concerning capacity, scalability, and ultra-low latency. Furthermore, we provide a detailed analysis of critical security and privacy challenges inherent in 5G-enabled XR environments, including virtualization vulnerabilities, API security, and sensitive data protection. While 5G provides core capabilities, significant challenges persist, emphasizing the need for continued research and the evolution toward 6G to effectively support immersive experiences in smart culture and the Metaverse.
Compromising Location Privacy Through Wi-Fi RSSI Tracking
The widespread availability of wireless networking, such as Wi-Fi, has led to the pervasiveness of always-connected mobile devices. These devices are equipped with several sensors that allow the collection of large amounts of data, which pose a threat to personal privacy. It is well known that Wi-Fi connectivity information (e.g., BSSID) can be used for inferring user locations. This has led to limitations on access to such data in mobile devices. However, other sources of information about wireless connectivity are available, such as the Received Signal Strength Indicator (RSSI). In this work, we show that RSSI can be used to infer the presence of a user at common locations throughout time. This information can be correlated with other features, such as the hour of the day, to further learn semantic context about such locations with a prediction performance above 90%. Our analysis shows the privacy implications of inferring user locations through Wi-Fi RSSI, but also emphasizes the fingerprinting risk that results from the lack of protection when accessing RSSI measurements.
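The presence-inference step can be sketched as nearest-fingerprint matching over RSSI vectors; the access points, place names, and dBm values below are invented for illustration.

```python
import numpy as np

# Hypothetical RSSI fingerprints (dBm) of 4 visible APs at known places.
FINGERPRINTS = {
    "home":   np.array([-40., -75., -90., -88.]),
    "office": np.array([-85., -42., -60., -91.]),
    "cafe":   np.array([-92., -70., -45., -50.]),
}

def infer_place(rssi_sample):
    """Nearest-fingerprint matching: return the place whose stored RSSI vector
    is closest (Euclidean distance) to the observed one."""
    return min(FINGERPRINTS,
               key=lambda p: np.linalg.norm(FINGERPRINTS[p] - rssi_sample))

place = infer_place(np.array([-43., -77., -88., -86.]))  # noisy "home" reading
```

Correlating the matched place with the hour of day is what lets an observer attach semantics ("home", "office") to otherwise anonymous signal readings.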
Trust Evaluation Techniques for 6G Networks: A Comprehensive Survey with Fuzzy Algorithm Approach
Sixth-generation (6G) networks are poised to support an array of advanced technologies, promising high-quality and secure services. However, ensuring robust security, privacy protection, operational efficiency, and superior service delivery poses significant challenges. In this context, trust emerges as a foundational element that is critical for addressing the multifaceted challenges inherent in 6G networks. This review article comprehensively examines trust concepts, methodologies, and techniques that are vital for establishing and maintaining a secure and reliable 6G ecosystem. Beginning with an overview of the trust problem in 6G networks, this study underscores its pivotal role in navigating the network’s complexities. It proceeds to explore the conceptual frameworks underpinning trust and discusses various trust models tailored to the unique demands of 6G networks. Moreover, this article surveys a range of scholarly works presenting diverse techniques for evaluating trust using fuzzy logic algorithms, which are essential for ensuring the integrity and resilience of 6G networks. Through a meticulous analysis of these techniques, this study elucidates their technical nuances, advantages, and limitations. By offering a comprehensive assessment of trust evaluation methodologies, this review facilitates informed decision making in the design and implementation of secure and trustworthy 6G networks.
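A minimal example of fuzzy-logic trust evaluation, in the spirit of the surveyed techniques; the membership functions, evidence signals, and two-rule base are illustrative choices, not a specific surveyed model.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def trust_score(packet_delivery, response_consistency):
    """Tiny Mamdani-style rule base: fuzzify two evidence signals in [0, 1],
    fire two rules, and defuzzify by a weighted average of the rule outputs."""
    low_d = tri(packet_delivery, -0.5, 0.0, 0.6)
    high_d = tri(packet_delivery, 0.4, 1.0, 1.5)
    low_c = tri(response_consistency, -0.5, 0.0, 0.6)
    high_c = tri(response_consistency, 0.4, 1.0, 1.5)
    w_trust = min(high_d, high_c)    # Rule 1: high delivery AND high consistency -> trusted
    w_untrust = max(low_d, low_c)    # Rule 2: low delivery OR low consistency -> untrusted
    total = w_trust + w_untrust
    return w_trust / total if total else 0.5

good = trust_score(0.95, 0.9)
bad = trust_score(0.2, 0.3)
```

Fuzzy inference is attractive here because trust evidence in 6G settings is inherently graded and noisy, and the rule base stays human-auditable.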
FBBTrust: Decentralized trust modeling in 6G via fuzzy inference, BiLSTM, and blockchain
FBBTrust is a framework designed for 6G networks, focusing on trust management that addresses uncertainty in signals, behavior drift, and tamper-resistant provenance. It utilizes fuzzy inference for projecting quality parameters, a BiLSTM for learning temporal dependencies, and a lightweight blockchain for persistent trust updates. Tested on the CIC-IoT2023 benchmark, FBBTrust demonstrates superior performance over spatio-temporal trust and FL+GRU baselines in terms of accuracy metrics and efficiency, while addressing privacy and scalability challenges. The framework’s innovations include a uniquely tuned fuzzy prior, a BiLSTM predictor with learned weights, and on-chain privacy-preserving telemetry. Overall, it appears promising for enhancing trust in 6G systems, especially in IoT contexts such as vehicular and industrial applications.
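The tamper-resistant provenance component can be illustrated with a minimal hash chain for trust updates: each block commits to its predecessor, so rewriting any past update invalidates the rest of the chain. This is a sketch of the idea, not FBBTrust's actual ledger.

```python
import hashlib
import json

def append_block(chain, trust_update):
    """Append a trust update to a hash chain; each block commits to its
    predecessor, so rewriting history invalidates every later block."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"prev": prev_hash, "update": trust_update}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)
    return chain

def verify(chain):
    """Recompute every hash and check each block's back-link."""
    prev = "0" * 64
    for block in chain:
        body = {"prev": block["prev"], "update": block["update"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != digest:
            return False
        prev = block["hash"]
    return True

chain = []
append_block(chain, {"node": "n1", "trust": 0.8})
append_block(chain, {"node": "n1", "trust": 0.6})
```

A real blockchain adds consensus and replication on top, but the tamper-evidence property already falls out of the chained digests.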
Explainable Reputation Estimation from Web Service Reviews
Star ratings alone are noisy, manipulable, and ignore aspect-level sentiment. We present Scrape2Repute, a compact and reproducible pipeline that: ingests Yelp reviews under policy constraints; cleans and normalises text/metadata; learns a calibrated text sentiment per review; fuses stars and text via a tunable hybrid label; downweights suspicious reviews with unsupervised anomaly scoring; and aggregates evidence into a time-decayed business reputation with uncertainty bounds. The system is explainable (top-k rationales, aspect summaries), runs on commodity hardware, and ships with a CLI/GUI. On the Yelp Open Dataset, we show strong predictive validity for forecasting future ratings and stable behaviour under sensitivity sweeps. We release the implementation and an ethics checklist for compliant use.
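The time-decayed aggregation step can be sketched as an exponentially weighted average whose weights halve every fixed half-life, with a separate per-review downweight for suspicious reviews. The half-life and the review tuples below are illustrative, not the pipeline's tuned values.

```python
import math

def reputation(reviews, now, half_life_days=180.0):
    """Time-decayed reputation: each review's weight halves every
    `half_life_days`; suspicious reviews are additionally downweighted.

    `reviews` is a list of (score in [0, 5], timestamp in days, spam_weight).
    """
    lam = math.log(2) / half_life_days
    num = den = 0.0
    for score, t, spam_weight in reviews:
        w = math.exp(-lam * (now - t)) * spam_weight
        num += w * score
        den += w
    return num / den if den else None

reviews = [(5.0, 0.0, 1.0),     # old glowing review
           (2.0, 350.0, 1.0),   # recent poor review
           (5.0, 360.0, 0.1)]   # recent but flagged-as-suspicious review
rep = reputation(reviews, now=365.0)
```

The decay keeps reputation responsive to recent service quality, while the spam weight keeps a burst of fresh fake reviews from dominating it.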
Comparison of xAI techniques for Deep Learning Algorithms with Timeseries Datasets
This study explores the applicability of explainable artificial intelligence (xAI) techniques in the analysis of deep learning models for anomaly detection in 5G/6G networks. With the increasing complexity of networks and network traffic, the mission to guarantee the security of access points and devices against attacks and intrusions also grows larger. Models used for these tasks operate like black boxes, making it difficult to understand and interpret their decisions at a human level. To address this challenge, we devised a case study with a real-world dataset and a performant deep learning anomaly detection algorithm, and implemented strategies to generate human-understandable explanations through xAI algorithms. xAI can provide insights into the factors that lead to the detection of anomalies, allowing for greater transparency and reliability in the process. This work is part of the context of intelligent networks and is aligned with initiatives such as the Privateer project, contributing to the evolution of security in 5G/6G infrastructures. The integration of deep learning and xAI facilitates interaction between human operators and automated systems, promoting greater control over decision-making in modern networks.
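One model-agnostic strategy in the spirit of such xAI techniques is permutation importance: measure how much a model's accuracy drops when a feature column is shuffled. A self-contained sketch with a toy threshold "model" (not one of the paper's networks) follows.

```python
import numpy as np

rng = np.random.default_rng(4)

def permutation_importance(model, X, y, metric, n_repeats=10):
    """Model-agnostic explanation: a feature's importance is the drop in the
    model's score when that feature's column is randomly shuffled."""
    base = metric(y, model(X))
    drops = []
    for j in range(X.shape[1]):
        vals = []
        for _ in range(n_repeats):
            Xp = X.copy()
            rng.shuffle(Xp[:, j])          # destroy feature j's association with y
            vals.append(base - metric(y, model(Xp)))
        drops.append(float(np.mean(vals)))
    return drops

def accuracy(y, yhat):
    return float(np.mean(y == yhat))

# Toy "model": a threshold on feature 0; feature 1 is pure noise.
X = rng.normal(size=(400, 2))
y = (X[:, 0] > 0).astype(int)
model = lambda X: (X[:, 0] > 0).astype(int)
imp = permutation_importance(model, X, y, accuracy)
```

Because it only queries predictions, the same procedure applies unchanged to a deep anomaly detector over timeseries features, which is what makes it useful for black-box models.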