Huggingface_HumanActivityRecognition_using_MachineLearning
Author: Muththukumar Ravinthiran
Uploaded: 2026-03-06
Views: 5
Description:
In this project, I developed a Human Activity Recognition (HAR) system using wearable sensor data and deep learning techniques. The aim of the project is to automatically classify human activities based on motion signals collected from wearable sensors.
The dataset consists of five wearable sensors, each providing three-axis acceleration data (X, Y, Z). Using these measurements, the model was trained to recognize six different activities:
• Cycling
• Pushup
• Run
• Squat
• Table Tennis
• Walk
The raw sensor data was first cleaned and preprocessed, including handling missing values and normalizing the features. The continuous time-series data was then converted into fixed-length sequences using a sliding window approach with overlapping frames, allowing the model to capture temporal patterns in human motion.
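The windowing step described above can be sketched as follows. This is a minimal illustration, not the project's exact code; the window size (128 samples), step (50% overlap), and feature count (5 sensors × 3 axes = 15) are assumptions.

```python
import numpy as np

def make_windows(data, window_size=128, step=64):
    """Slice a (timesteps, features) array into overlapping
    fixed-length windows (50% overlap when step = window_size // 2)."""
    windows = []
    for start in range(0, len(data) - window_size + 1, step):
        windows.append(data[start:start + window_size])
    return np.stack(windows)  # shape: (n_windows, window_size, features)

# Illustrative input: 1000 timesteps of 15 features (5 sensors x 3 axes)
raw = np.random.randn(1000, 15)
# z-score normalisation per feature (one common choice of normalisation)
norm = (raw - raw.mean(axis=0)) / raw.std(axis=0)
X = make_windows(norm)
print(X.shape)  # (14, 128, 15)
```

Each window then becomes one training example, so the model sees short motion snippets rather than isolated samples.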
To perform activity classification, I implemented a deep learning model combining Convolutional Neural Network (CNN) and Bidirectional Long Short-Term Memory (BiLSTM) layers. The 1D convolutional layers extract local motion patterns from the sensor signals, while the BiLSTM layer learns longer-range temporal dependencies across each window.
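A CNN-BiLSTM architecture of this kind can be sketched in Keras as below. The filter counts, kernel sizes, and dropout rate are illustrative placeholders, not the exact trained configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_model(window_size=128, n_features=15, n_classes=6):
    """Conv1D front-end for local motion patterns, BiLSTM for
    temporal dependencies, softmax over the six activities."""
    model = models.Sequential([
        layers.Input(shape=(window_size, n_features)),
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(2),
        layers.Conv1D(128, kernel_size=3, activation="relu"),
        layers.MaxPooling1D(2),
        layers.Bidirectional(layers.LSTM(64)),
        layers.Dropout(0.5),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_model()
```

Training then proceeds with `model.fit(X_train, y_train, validation_data=...)` on the windowed sequences.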
The model was evaluated using several performance metrics, including:
• Accuracy
• Confusion Matrix
• ROC-AUC Curves
• Training and Validation Performance
The trained model achieved high classification accuracy, demonstrating strong performance in recognizing different activities.
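The listed metrics map directly onto scikit-learn utilities. The sketch below uses random placeholder labels and scores purely to show the calls; in the project these come from `model.predict()` on the held-out test windows.

```python
import numpy as np
from sklearn.metrics import accuracy_score, confusion_matrix, roc_auc_score

ACTIVITIES = ["Cycling", "Pushup", "Run", "Squat", "Table Tennis", "Walk"]

# Placeholder data; real values come from the trained model's predictions
rng = np.random.default_rng(0)
y_true = rng.integers(0, 6, size=200)
y_prob = rng.dirichlet(np.ones(6), size=200)   # softmax-like class scores
y_pred = y_prob.argmax(axis=1)

acc = accuracy_score(y_true, y_pred)
cm = confusion_matrix(y_true, y_pred)          # rows: true, cols: predicted
auc = roc_auc_score(y_true, y_prob, multi_class="ovr")  # one-vs-rest AUC
print(f"accuracy={acc:.3f}, macro ROC-AUC={auc:.3f}")
```

The confusion matrix shows which activities are mistaken for each other (e.g. Run vs. Walk), while one-vs-rest ROC-AUC summarises per-class separability.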
Sensor Pair Optimization (Optional Task)
In addition to the main model, I performed a sensor pair selection experiment to determine whether similar performance could be achieved using fewer sensors. All possible combinations of two sensors were tested.
The results showed that the sensor pair S2 and S5 provided the best performance with an accuracy of approximately 97.9%, using only six features (X2,Y2,Z2,X5,Y5,Z5). This demonstrates that accurate activity recognition can still be achieved with a reduced sensor configuration.
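The exhaustive pair search can be expressed with `itertools.combinations`: with five sensors there are only C(5, 2) = 10 pairs to test, each contributing six features. The evaluation function below is a mocked placeholder; in the project it would retrain and score the model on the selected columns.

```python
import itertools

# Each sensor contributes three axis columns, e.g. S2 -> X2, Y2, Z2
SENSORS = {f"S{i}": [f"X{i}", f"Y{i}", f"Z{i}"] for i in range(1, 6)}

def evaluate_pair(feature_columns):
    """Stand-in for the real pipeline: select these columns, window,
    retrain, and return validation accuracy. Mocked here."""
    return 0.0  # replaced by actual training and evaluation

results = {}
for a, b in itertools.combinations(SENSORS, 2):      # 10 pairs in total
    features = SENSORS[a] + SENSORS[b]               # six features per pair
    results[(a, b)] = evaluate_pair(features)

best_pair = max(results, key=results.get)
```

Ranking the ten results is what identified S2 + S5 (features X2, Y2, Z2, X5, Y5, Z5) as the strongest reduced configuration.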
Model Deployment and Live Validation
To demonstrate the practical usability of the model, I deployed the trained model on Hugging Face Spaces. The deployed application allows users to upload sensor data and obtain activity predictions directly through a web interface.
In the video, I demonstrate the validation of the deployed model by uploading sample sensor data, and the system successfully predicts the corresponding human activity in real time. This deployment shows how the trained deep learning model can be integrated into an interactive application for real-world use.
Technologies Used
Python
TensorFlow / Keras
Scikit-learn
NumPy & Pandas
Matplotlib
Hugging Face Spaces
This project highlights the potential of wearable sensor-based activity recognition systems for applications such as healthcare monitoring, fitness tracking, and smart environments.
If you found this project interesting, please consider liking the video and subscribing to the channel.