| Imputation Task | Seen (Training) | Unseen (Test) |
|---|---|---|
| Traditional | | |
| Cross Domain | | |
| Cross Sensor | | |
Can we build a task-agnostic imputation pipeline that is transferable to new sensors without requiring additional training? In this work, we formalize the concept of zero-shot imputation and propose a novel approach that enables the adaptation of pre-trained models to handle data intermittency. This framework, named NeuralPrefix, is a generative neural component built as a continuous dynamical system. It precedes a task model during inference, filling in gaps caused by data intermittency.
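The plugin setup can be sketched as follows. This is a minimal illustration, not the paper's implementation: the linear-interpolation imputer is a hypothetical stand-in (NeuralPrefix itself is a learned continuous dynamical system), and `run_pipeline` / `task_model` are names introduced here for clarity. The key point it shows is that the task model stays frozen and untouched; only the completed sequence it receives changes.

```python
import numpy as np

def impute_gaps(frames, observed):
    """Fill missing frames in a (T, H, W) sequence by linear interpolation
    over time. A simple stand-in for the generative imputer; NeuralPrefix
    learns a continuous dynamical system rather than interpolating."""
    frames = frames.copy()
    t_obs = np.flatnonzero(observed)        # indices of observed frames
    t_all = np.arange(frames.shape[0])
    flat = frames.reshape(frames.shape[0], -1)
    # Interpolate each spatial location independently along the time axis.
    for j in range(flat.shape[1]):
        flat[:, j] = np.interp(t_all, t_obs, flat[t_obs, j])
    return flat.reshape(frames.shape)

def run_pipeline(frames, observed, task_model):
    """Zero-shot imputation plugin: fill the gaps first, then hand the
    completed sequence to the frozen, unmodified task model."""
    return task_model(impute_gaps(frames, observed))
```

Because the imputer is prepended at inference time only, swapping in a different downstream task model requires no retraining of either component.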
Challenges. Learning a generalizable spatiotemporal sensory data imputation model is challenging due to sensor heterogeneity and noise. Additionally, there is an inherent sparsity in the raw data. For example, a Wi-Fi-based motion tracking system captures only a few reflections per person, providing limited data for reconstruction. Unlike vision data, sensory data have lower resolution and lack spatial redundancy, making it harder to recover missing details. The challenge intensifies when operating across domains, as variations in sensor configurations, modalities, and data distributions further complicate imputation.
Opportunities. The raw signals of sensory data, such as Radio Frequency (RF) signals or wearable sensor readings, often appear noisy and highly variable. However, these signals encode smooth underlying dynamics that correspond to human motion or activity. This smoothness stems primarily from the biomechanics of human movement and the inherent continuity of physical processes.
Above is an illustration. The motion frames show the raw sensory data (RF signal) of a hand gesture. In the second pane, we focus on the highest-energy blob to eliminate less significant movements and highlight the main object's motion. The motion follows a continuous trajectory, confirming that despite noise, the dominant movement pattern remains smooth.
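The blob-tracking step described above can be sketched as follows. This is an illustrative reconstruction, not the paper's actual preprocessing: the function names and the synthetic moving-Gaussian test data are assumptions made here; the idea is simply to locate the highest-energy pixel per frame and check that its trajectory is smooth.

```python
import numpy as np

def peak_trajectory(frames):
    """Locate the highest-energy pixel in each (H, W) frame of a
    (T, H, W) sequence; returns a (T, 2) array of (row, col) peaks."""
    return np.array([np.unravel_index(np.argmax(f), f.shape) for f in frames])

def smoothness(traj):
    """Mean per-step displacement of the peak. Small values indicate a
    continuous, smooth dominant motion despite frame-level noise."""
    steps = np.diff(traj.astype(float), axis=0)
    return np.linalg.norm(steps, axis=1).mean()
```

Even with additive noise in every frame, the argmax of the dominant energy blob traces a near-linear path, which is the continuity the figure highlights.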
As human motion is governed by physical and biomechanical constraints, it is inherently consistent despite the apparent complexity and disparity of high-dimensional raw sensor data. We hypothesize that the underlying latent dynamics are often invariant to sensor types and modalities, making them valuable for generalization across different datasets and sensor configurations. In the example below, we show a push gesture captured by two different unpaired sensors. The visual resemblance of the two motions' evolution hints at the common underlying dynamics.
@inproceedings{khamis2025nprefix,
  author       = {Khamis, Abdelwahed and Khalifa, Sara},
  title        = {NeuralPrefix: A Zero-shot Sensory Data Imputation Plugin},
  year         = {2025},
  booktitle    = {2025 IEEE International Conference on Pervasive Computing and Communications (PerCom)},
  organization = {IEEE},
}