Classification Methods Performance On Logistic Package State Recognition

Muhammad Auzan(1*), Dzikri Rahadian Fudholi(2), Paulus Josianlie P(3), M Ridho Fuadin(4)

(1) Department of Computer Science and Electronics, FMIPA UGM, Yogyakarta
(2) Department of Computer Science and Electronics, FMIPA UGM, Yogyakarta
(3) Bachelor Program of Electronics and Instrumentation, FMIPA UGM, Yogyakarta
(4) Bachelor Program of Electronics and Instrumentation, FMIPA UGM, Yogyakarta
(*) Corresponding Author


In the distribution sector, logistic packages go through activities such as transport, distribution, storage, packaging, and handling. Even though these processes follow reasonable operational procedures, packages are sometimes mishandled. Mishandling is hard to identify because many packages move simultaneously and not all processes are monitored. In this work, an Inertial Measurement Unit (IMU) is installed inside a package to collect three-axis acceleration and rotation data. The data are then labeled manually into four classes: correct handling, vertical (free) fall, thrown, and rotating fall. Ten classifiers were then trained using cross-validation to classify the logistic package status, and their accuracy scores were evaluated. The classification uses only the accelerometer data to minimize running time. Correct handling is classified well because its data pattern has few variations. However, the thrown, free-fall, and rotating-fall classes give lower results because their patterns resemble each other; in particular, free-fall and thrown events are hard to differentiate. The average accuracy of the ten classifiers is 78.15%, with a mean deviation of 4.31%. The best classifier in this research is the Gaussian Process, with a mean accuracy of 94.4% and a deviation of 3.5%.
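The evaluation procedure described above (several classifiers compared by cross-validated accuracy on labeled accelerometer features) can be sketched as follows. This is an illustrative sketch only, not the authors' code: the feature matrix, labels, and the choice of three example classifiers are hypothetical placeholders standing in for the paper's windowed IMU data and full set of ten classifiers.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Placeholder for windowed three-axis accelerometer features
# (rows = handling events, columns = per-axis summary statistics).
X = rng.normal(size=(200, 6))
# Four classes: 0 = correct handling, 1 = vertical fall,
# 2 = thrown, 3 = rotating fall.
y = rng.integers(0, 4, size=200)

# Three example classifiers stand in for the ten used in the paper.
classifiers = {
    "Gaussian Process": GaussianProcessClassifier(),
    "Random Forest": RandomForestClassifier(),
    "k-NN": KNeighborsClassifier(),
}

for name, clf in classifiers.items():
    # k-fold cross-validation, scored by accuracy as in the abstract.
    scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean accuracy={scores.mean():.3f}, std={scores.std():.3f}")
```

With real IMU features in place of the random placeholders, the per-classifier mean and standard deviation reported by this loop correspond directly to the accuracy and deviation figures quoted in the abstract.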


Classification; Logistic; IoT; IMU; Mishandling





Copyright (c) 2023 IJCCS (Indonesian Journal of Computing and Cybernetics Systems)

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Copyright of:
IJCCS (Indonesian Journal of Computing and Cybernetics Systems)
ISSN 1978-1520 (print); ISSN 2460-7258 (online)
is a scientific journal publishing research results in Computing and Cybernetics Systems.
A publication of IndoCEISS.
Gedung S1 Ruang 416 FMIPA UGM, Sekip Utara, Yogyakarta 55281
Fax: +62274 555133
