Determining the Best CNN Learning Rate for Individual Recognition Based on Gait Analysis
DOI: https://doi.org/10.33633/joins.v8i1.7806

Keywords: gait event detection, learning rate selection, convolutional neural network, trajectory pattern

Abstract
Human body trajectories used for gait analysis are not limited to flat terrain. This affects gait analysis in individual identity recognition research, which is tied to the terrain conditions being traversed. The ankle is the body part that contributes most to the body's trajectory over the traversed terrain through two gait events: Heel-Strike (HS) and Toe-Off (TO). HS and TO exhibit trajectory patterns that differ from one individual to another, so an appropriate learning rate parameter must be determined. Selecting the best learning rate is one of the key steps in producing the best individual identity recognition. In this study, the data used are C3D-format recordings captured with a motion capture device in a walking-straight (WS) scenario performed by six participants. The best learning rate was determined using a convolutional neural network (CNN), with the pretrained ResNet18 and ResNet50 models used for comparison. The experiments show that ResNet18 achieved the best performance, both on the Average Position (AP) measure and in detecting the HS and TO events.
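As an illustration of the learning-rate selection step described in the abstract, the sketch below sweeps a small grid of candidate learning rates over ResNet18 and ResNet50 backbones for binary HS/TO event classification. It is a minimal, hypothetical PyTorch example, not the authors' implementation: the synthetic tensors stand in for the C3D ankle-trajectory inputs, the candidate learning rates, optimizer, and epoch count are assumptions, and the paper's Average Position (AP) metric is not reproduced; validation accuracy alone is used to rank the rates.

```python
# Hedged sketch of a learning-rate sweep for HS/TO gait-event classification
# comparing ResNet18 and ResNet50 backbones. Data and hyperparameters are
# placeholders; the real study uses C3D motion-capture trajectories.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from torchvision import models


def build_model(name: str, num_classes: int = 2) -> nn.Module:
    """Create a ResNet backbone and replace its classifier head.
    Weights are randomly initialized here; the paper compares pretrained nets."""
    if name == "resnet18":
        net = models.resnet18()
    elif name == "resnet50":
        net = models.resnet50()
    else:
        raise ValueError(f"unknown backbone: {name}")
    net.fc = nn.Linear(net.fc.in_features, num_classes)
    return net


def evaluate(net: nn.Module, loader: DataLoader, device: str) -> float:
    """Plain validation accuracy (a stand-in for the paper's AP metric)."""
    net.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            correct += (net(x).argmax(dim=1) == y).sum().item()
            total += y.numel()
    return correct / total


def lr_sweep(backbone, train_loader, val_loader, lrs, epochs=1, device="cpu"):
    """Train one model per candidate learning rate and record its accuracy."""
    results = {}
    for lr in lrs:
        net = build_model(backbone).to(device)
        opt = torch.optim.SGD(net.parameters(), lr=lr, momentum=0.9)
        loss_fn = nn.CrossEntropyLoss()
        for _ in range(epochs):
            net.train()
            for x, y in train_loader:
                x, y = x.to(device), y.to(device)
                opt.zero_grad()
                loss_fn(net(x), y).backward()
                opt.step()
        results[lr] = evaluate(net, val_loader, device)
    return results


if __name__ == "__main__":
    # Synthetic stand-in for trajectory windows rendered as 3x112x112 inputs,
    # labelled 0 = Heel-Strike (HS), 1 = Toe-Off (TO).
    x = torch.randn(40, 3, 112, 112)
    y = torch.randint(0, 2, (40,))
    train = DataLoader(TensorDataset(x[:32], y[:32]), batch_size=8, shuffle=True)
    val = DataLoader(TensorDataset(x[32:], y[32:]), batch_size=8)

    for backbone in ("resnet18", "resnet50"):
        accs = lr_sweep(backbone, train, val, lrs=(1e-2, 1e-3, 1e-4))
        best_lr = max(accs, key=accs.get)
        print(backbone, accs, "best lr:", best_lr)
```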