Edge-Enabled Mobile App for Smart Agriculture Using Multi-Sensor Inputs and a Hybrid CNN–Vision Transformer Model

Authors

Njoroge, T. K., Kibuku, R., & Mugoye Sindu, K.
DOI:

https://doi.org/10.3991/ijim.v19i21.55919

Keywords:

Hybrid Deep Learning, Edge Computing, IoT-Integrated Sensing

Abstract

Crop diseases significantly threaten food security, particularly in resource-limited regions, yet existing mobile diagnostic tools often lack robustness and fail to integrate environmental context. To address these limitations, this study developed a hybrid CNN–Vision Transformer model combining EfficientNetV2, MobileNetV2, and Vision Transformers. The model was trained on Kaggle’s PlantVillage dataset and locally collected field images covering 76 disease and condition classes, and multi-sensor inputs, including soil moisture, temperature, and humidity, were integrated to enhance prediction accuracy. The model achieved 99.2% accuracy, an AUC of 0.999998, and a prediction variance of 0.000010, 69% lower than baseline models, indicating strong confidence consistency across predictions; Bayesian testing confirmed its superiority over DenseNet50 and other baselines. The model was optimized with TensorFlow Lite for deployment on resource-constrained devices and integrated into AgriScan, a 30.4 MB edge-enabled Android app. AgriScan supports offline inference, delivers real-time predictions with 0.094 s latency, and reduces misdiagnoses by 92%. Evaluation on 249 unseen local images confirmed 97.97% accuracy, validating the app’s field applicability. Benchmarking revealed 4.8× faster inference and 83% lower energy usage than cloud-based alternatives, and cross-device testing confirmed 100% diagnostic consistency, supporting the model’s generalizability. This framework combines edge AI and sensor fusion into a scalable, cloud-independent diagnostic solution for advancing smart agriculture in low-connectivity environments.

Published

2025-11-07

How to Cite

Njoroge, T. K., Kibuku, R., & Mugoye Sindu, K. (2025). Edge-Enabled Mobile App for Smart Agriculture Using Multi-Sensor Inputs and a Hybrid CNN–Vision Transformer Model. International Journal of Interactive Mobile Technologies (iJIM), 19(21), pp. 145–162. https://doi.org/10.3991/ijim.v19i21.55919

Issue

Vol. 19 No. 21 (2025)

Section

Papers