Eating Moment Monitoring Using WiFi

Project Objectives

This project aims to provide fine-grained eating behavior monitoring, which helps users
form good eating habits and improve their health. We propose a device-free system that leverages wireless signals extracted from commercial off-the-shelf smartphones to monitor eating behavior. The system captures users' eating activities to determine eating moments, and further identifies food intake gestures (e.g., eating with a fork, knife, spoon, or chopsticks).

Technology Rationale

  1. WiFi Signal. Eating gestures distort the signal patterns of commodity WiFi. In this project, we
    leverage the channel state information (CSI) of the WiFi signal to capture eating gestures. From the
    CSI, we can recover both the gesture (e.g., using a fork or knife) and the duration of eating.
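To make the CSI idea concrete, here is a minimal sketch of the quantity the system works with. The array shapes and the use of random numbers in place of real channel estimates are illustrative assumptions, not the project's actual data format:

```python
import numpy as np

# Hypothetical CSI matrix: rows are received packets over time, columns are
# OFDM subcarriers; each entry is a complex channel estimate (synthetic here).
rng = np.random.default_rng(0)
csi = rng.standard_normal((1000, 30)) + 1j * rng.standard_normal((1000, 30))

# The amplitude |H| per subcarrier over time is the signal pattern that
# eating gestures distort; raw phase is often discarded because it is
# corrupted by carrier frequency offset on commodity hardware.
amplitude = np.abs(csi)      # shape: (packets, subcarriers)
print(amplitude.shape)       # (1000, 30)
```

Each column of `amplitude` is a time series whose fluctuations reflect nearby body motion, which is what the later segmentation and clustering stages operate on.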

  2. Machine Learning for Clustering and Classification. Based on the CSI extracted from the smartphone, a Fuzzy C-means clustering algorithm is developed to separate eating behaviors from other behaviors and obtain the CSI segments of eating movements. We then leverage a suite of machine learning models to perform eating gesture recognition and eating moment estimation.
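The Fuzzy C-means idea can be sketched in a few lines: unlike hard clustering, every sample gets a soft membership in every cluster. This is a generic textbook implementation on toy 1-D data, not the project's actual algorithm or features; the quantile-based initialization is a simplifying assumption:

```python
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, n_iter=100):
    """Minimal Fuzzy C-means sketch: returns cluster centers and the
    soft membership matrix U of shape (c, n_samples)."""
    # Spread initial centers across the data range (simple, deterministic).
    centers = np.quantile(X, np.linspace(0.1, 0.9, c), axis=0)
    for _ in range(n_iter):
        # Distance of every sample to every center.
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
        # Membership update: closer centers receive higher soft membership.
        U = d ** (-2.0 / (m - 1.0))
        U /= U.sum(axis=0)
        # Center update: membership-weighted mean of the samples.
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
    return centers, U

# Toy example: two well-separated 1-D "activity profiles".
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(0.0, 0.1, (50, 1)),
                    rng.normal(5.0, 0.1, (50, 1))])
centers, U = fuzzy_cmeans(X, c=2)
print(np.sort(centers.ravel()))   # approximately [0, 5]
```

The soft memberships are what make profile matching possible: a segment whose membership in the "eating" cluster is high can be kept, while ambiguous segments are down-weighted rather than forced into a hard label.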

Technology Approach

The system takes as input the CSI measurements extracted from personal mobile devices, such as smartphones. To mitigate the impact of ambient wireless interference, a noise removal module is applied to remove outliers. Our system then examines the moving variance and accumulated short-time energy
(STE) of the calibrated CSI data to obtain the CSI segments of various human activities. We use Fuzzy
C-means clustering to detect eating activities through profile matching. For each detected segment, the system extracts representative features from the time and frequency domains, then
performs gesture classification using lightweight classifiers (random forest, Naïve Bayes, KNN, and discriminant analysis).
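The segmentation step above can be sketched as follows. The window length, the synthetic signal, and the threshold rule are illustrative assumptions; the project's actual parameters are not specified here:

```python
import numpy as np

def moving_variance(x, win):
    """Variance over a sliding window; high values indicate motion."""
    return np.array([np.var(x[i:i + win]) for i in range(len(x) - win + 1)])

def short_time_energy(x, win):
    """Accumulated energy of the mean-removed signal per window."""
    d = x - x.mean()
    return np.array([(d[i:i + win] ** 2).sum() for i in range(len(d) - win + 1)])

# Synthetic CSI amplitude stream: static background with one motion burst
# (a stand-in for an eating gesture) at samples 400..600.
rng = np.random.default_rng(0)
sig = rng.normal(0.0, 0.05, 1000)
sig[400:600] += rng.normal(0.0, 1.0, 200)

mv = moving_variance(sig, win=50)
active = mv > 10 * np.median(mv)    # simple adaptive threshold (assumption)
start = int(np.argmax(active))
end = int(len(active) - np.argmax(active[::-1]))
print(start, end)                   # window indices roughly bracketing 400..600
```

STE behaves analogously to the moving variance here; combining both makes the segment boundaries more robust than either statistic alone.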
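The final classification stage can be illustrated with one of the named classifiers, random forest, on simple time- and frequency-domain features. The feature set, the two synthetic "gesture" classes, and their dominant frequencies are invented for this sketch and are not the project's actual features or data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def extract_features(seg, fs=100):
    """A few representative time- and frequency-domain features of a segment."""
    spec = np.abs(np.fft.rfft(seg - seg.mean()))
    dom_freq = np.argmax(spec) * fs / len(seg)   # dominant motion frequency
    return [seg.mean(), seg.std(), seg.max() - seg.min(), dom_freq, spec.sum()]

# Synthetic CSI segments for two hypothetical intake gestures that differ in
# dominant motion frequency (stand-ins for, e.g., spoon vs. fork strokes).
rng = np.random.default_rng(0)
t = np.arange(200) / 100.0
X, y = [], []
for label, f in [(0, 1.0), (1, 3.0)]:
    for _ in range(40):
        seg = np.sin(2 * np.pi * f * t) + rng.normal(0.0, 0.3, t.size)
        X.append(extract_features(seg))
        y.append(label)

Xtr, Xte, ytr, yte = train_test_split(np.array(X), np.array(y),
                                      test_size=0.25, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(Xtr, ytr)
acc = clf.score(Xte, yte)
print(acc)
```

Swapping in Naïve Bayes, KNN, or a discriminant analysis classifier is a one-line change in scikit-learn, which is what makes this classifier suite lightweight enough for on-device use.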

Project Status

This project led to papers in EAI HealthyIoT'19 and IEEE ICCCN'20; notably, our EAI HealthyIoT paper received the Best Paper Award in 2019. The proposed system shows high accuracy in eating behavior recognition and eating moment estimation. Figure 1 shows the confusion matrix for recognizing food intake gestures; the average accuracy exceeds 97.8%. As shown in Figure 2, our system achieves over
80% average accuracy across different training set sizes. Table 1 reports the eating moment estimation results: the largest average estimation error for any gesture is only 1.1 s. These results show that our system can precisely estimate dietary moments.


Zhenzhe Lin, Yucheng Xie, Xiaonan Guo, Chen Wang, Yanzhi Ren, Yingying Chen. Wi-Fi-Enabled Automatic Eating Moment Monitoring Using Smartphones. In Proceedings of the EAI International Conference on IoT Technologies for HealthCare (EAI HealthyIoT), pp. 77-91, 2019. (Best Paper Award)

Zhenzhe Lin, Yucheng Xie, Xiaonan Guo, Yanzhi Ren, Yingying Chen, Chen Wang. WiEat: Fine-grained Device-free Eating Monitoring Leveraging Wi-Fi Signals. In Proceedings of the International
Conference on Computer Communications and Networks (IEEE ICCCN), pp. 1-9, 2020.