A Novel Approach for Product Recommendation Using Smartphone Sensor Data
DOI: https://doi.org/10.3991/ijim.v16i16.31617
Keywords: human activity, smartphone sensors, preferences, object detection, recommendations
Abstract
Human activity-based studies have become a pervasive research topic in machine learning. Given the many ways daily activity shapes a person's everyday life, we analyze the correlation between human activity and product preferences and propose that daily activity can serve as a signal for product recommendation models. To address this previously unexplored phenomenon, we present a new approach that gives users real-time recommendations by observing their activeness in daily life. Existing product recommendation systems mostly rely on ratings and purchase behavior rather than on the valuable insights hidden in users' daily activities. In contrast, we analyze smartphone GPS sensor data with machine learning algorithms to extract insights about users' daily activeness and propose a model that predicts a buyer's products of interest from how active their daily life is. Building on this model, we introduce a prototype of a real-time recommendation system, aimed primarily at retail shops, that relies on implicit data from smartphone sensors to form product recommendations. For our study, we developed an Android application that collects embedded smartphone sensor data and detects objects to provide product recommendations and product details. Experiments show that our proposed daily activeness-based recommendation system achieves a precision of 66%, which is promising given that it uses no explicit customer feedback.
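To make the pipeline concrete, the following is a minimal sketch (not the authors' code) of how a daily "activeness" score could be derived from GPS samples and mapped to a product category. The haversine distance metric, the kilometre thresholds, and the category names are illustrative assumptions, not values reported in the paper.

```python
# Hypothetical sketch: GPS trace -> daily activeness -> product category.
# Thresholds and categories are assumptions for illustration only.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def daily_activeness(gps_trace):
    """Total distance covered in one day; gps_trace is [(lat, lon), ...]."""
    return sum(haversine_km(*p, *q) for p, q in zip(gps_trace, gps_trace[1:]))

def recommend_category(activeness_km):
    """Hypothetical mapping from activeness level to a product category."""
    if activeness_km > 8.0:   # highly active user
        return "sportswear"
    if activeness_km > 3.0:   # moderately active user
        return "casual wear"
    return "home & leisure"   # mostly sedentary user

# Example: a short trace of three GPS fixes from one day.
trace = [(23.7509, 90.3934), (23.7525, 90.3958), (23.7561, 90.4001)]
print(recommend_category(daily_activeness(trace)))
```

In the paper's prototype the classification is learned by machine learning algorithms rather than fixed thresholds; the sketch above only illustrates the overall implicit-feedback flow.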
License
Copyright (c) 2022 Moontaha Nishat Chowdhury, H M Zabir Haque, Kazi Taqi Tahmid, Fatema-Tuz-Zohora Salma, Nafisa Ahmed
This work is licensed under a Creative Commons Attribution 4.0 International License.