Providing a body language analysis model using artificial intelligence applications for the blind
Subject Areas: Educational
Soheil Naderi 1, Tahmores Shiri 2*, Afsaneh Mozaffari 3, Mohammad Soltanifar 4
1 - PhD student, Department of Social Communication Sciences, Science and Research Branch, Islamic Azad University, Tehran, Iran
2 - Associate Professor, Department of Social Sciences, Central Tehran Branch, Islamic Azad University, Tehran, Iran. (Corresponding Author)
3 - Associate Professor, Department of Social Communication Sciences, Science and Research Branch, Islamic Azad University, Tehran, Iran
4 - Professor, Department of Social Communication Sciences, Science and Research Branch, Islamic Azad University, Tehran, Iran
Keywords: Artificial intelligence, body language analysis, grounded theory, blind people, assistive technology, auditory feedback
Abstract:
Human communication relies heavily on non-verbal cues, including body movements, facial expressions, and tone of voice. Blind people face fundamental difficulties in understanding these cues because they cannot perceive them visually, which limits the quality of their interpersonal communication and social participation. The aim of the present study was to design and explain a native model for AI-based body language analysis with a needs-based approach for the blind. The research method was qualitative, based on Grounded Theory following the Glaserian approach. The statistical population consisted of 19 IT experts and 13 blind users selected through purposive sampling. Data were collected through semi-structured, in-depth interviews and analyzed through open, axial, and selective coding, with constant comparison continuing until theoretical saturation was reached. The findings showed that the core category, designing an intelligent body language analysis model for the blind, emerged from the data. Key cues identified included emotion intensity, gaze direction, interaction distance, and shoulder movements, all of which play a fundamental role in social interactions. Participants emphasized that the desired technology should be simple, low-cost, portable, and based on real-time audio feedback. Beyond improving the quality of communication for the blind, the model has the potential to be extended to education, employment, and rehabilitation. The results offer theoretical innovation in the field of communication and disability studies and, from an applied perspective, a new outlook for the development of intelligent assistive technologies. The use of artificial intelligence algorithms combined with human-centered design can therefore be an effective solution for reducing the communication gap faced by the blind.
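The abstract describes the intended system only at the level of requirements (simple, low-cost, portable, real-time audio feedback) and does not specify an implementation. As a purely illustrative sketch of what such a pipeline might look like, the Python snippet below uses the MediaPipe pose estimator and an offline text-to-speech engine to turn two of the cues named by the participants, interaction distance and shoulder movement, plus a rough head-orientation proxy for gaze direction, into spoken feedback. All thresholds and cue heuristics here are the editor's assumptions for illustration, emotion intensity is omitted because it would require a separate facial-expression model, and nothing in this sketch is taken from the study itself.

```python
# Illustrative sketch only: a minimal real-time pipeline that estimates a few of the
# nonverbal cues named in the study from a webcam and speaks them aloud.
# Assumptions: mediapipe, opencv-python and pyttsx3 are installed; thresholds and
# cue definitions are hypothetical, not taken from the paper.

import cv2
import mediapipe as mp
import pyttsx3

mp_pose = mp.solutions.pose
LEFT, RIGHT, NOSE = (mp_pose.PoseLandmark.LEFT_SHOULDER,
                     mp_pose.PoseLandmark.RIGHT_SHOULDER,
                     mp_pose.PoseLandmark.NOSE)

def describe_frame(landmarks, prev_shoulder_y):
    """Turn pose landmarks into a short spoken description (hypothetical heuristics)."""
    ls, rs, nose = landmarks[LEFT], landmarks[RIGHT], landmarks[NOSE]
    messages = []

    # Interaction distance: apparent shoulder width in normalized image coordinates
    # grows as the interlocutor approaches the camera.
    width = abs(ls.x - rs.x)
    messages.append("person is close" if width > 0.45 else "person is at a distance")

    # Gaze / head-orientation proxy: a nose roughly centered between the shoulders
    # suggests the person is facing the camera, and therefore the user.
    mid_x = (ls.x + rs.x) / 2.0
    messages.append("facing you" if abs(nose.x - mid_x) < 0.05 else "looking away")

    # Shoulder movement: vertical displacement between consecutive frames
    # (e.g. a shrug) beyond a small threshold.
    shoulder_y = (ls.y + rs.y) / 2.0
    if prev_shoulder_y is not None and abs(shoulder_y - prev_shoulder_y) > 0.03:
        messages.append("shoulders moved")

    return ", ".join(messages), shoulder_y

def main():
    engine = pyttsx3.init()            # offline text-to-speech for audio feedback
    cap = cv2.VideoCapture(0)          # default webcam
    prev_shoulder_y = None
    with mp_pose.Pose(min_detection_confidence=0.5) as pose:
        for _ in range(300):           # analyse ~300 frames, then stop
            ok, frame = cap.read()
            if not ok:
                break
            results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.pose_landmarks is None:
                continue
            text, prev_shoulder_y = describe_frame(
                results.pose_landmarks.landmark, prev_shoulder_y)
            engine.say(text)           # real-time auditory feedback
            engine.runAndWait()
    cap.release()

if __name__ == "__main__":
    main()
```

In a deployed assistive device, the speech step would run asynchronously and the heuristics would be replaced by learned models, but the sketch shows how pose landmarks can be mapped to the kinds of audible cues the participants asked for.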
Persian
• Ahmadi, N., et al. (1398). E-learning for the blind: Examining the role of interactive technologies. Iranian Journal of Educational Sciences, 15(2), 45-60. [In Persian]
• Hosseini, M., et al. (1400). Communication challenges of the blind and the role of assistive technologies in improving quality of life. Journal of Social Welfare, 21(4), 11-35. [In Persian]
• Karimi, A. (1401). Assistive technologies for people with visual impairment. Tehran: Azad University Press. [In Persian]
• Zare, M. (1399). Body language and non-verbal communication in the interactions of the blind. Journal of Social Psychology, 12(1), 23-40. [In Persian]
• Sadeghi, A., et al. (1401). Obstacle-detection algorithms for the walking paths of the blind using smartphones. Iranian Journal of Information Technology, 10(3), 78-92. [In Persian]
• Rezaei, S., & Ahmadi, N. (1398). Grounded theory: An approach for discovering native models in the humanities. Social Sciences Research Journal, 13(2), 75-102. [In Persian]
English
• Ekman, P., & Friesen, W. V. (1969). The repertoire of nonverbal behavior: Categories, origins, usage, and coding. Semiotica, 1(1), 49-98.
• Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Chicago: Aldine.
• Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep learning. MIT Press.
• Guerrero, L. K., & Floyd, K. (2006). Nonverbal communication in close relationships. Routledge.
• Hersh, M., & Johnson, M. A. (2010). Assistive technology for visually impaired and blind people. London: Springer.
• Kelly, S. M. (2011). Assistive technology for visually impaired: Innovations and challenges. Journal of Visual Impairment & Blindness, 105(5), 301-310.
• Knapp, M. L., & Hall, J. A. (2010). Nonverbal communication in human interaction (7th ed.). Boston: Wadsworth.
• Mehrabian, A. (1972). Nonverbal communication. Chicago: Aldine-Atherton.
• Samadani, A., et al. (2019). Body language in education: Assessing student engagement. IEEE Transactions on Education, 62(4), 567-575.
• Sun, K., et al. (2019). Deep high-resolution representation learning for human pose estimation. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 569-577.
• Wang, S., et al. (2020). Depression detection via body language analysis. IEEE Journal of Biomedical and Health Informatics, 24(8), 2190-2200.
• Yang, J., et al. (2023). AI-based assistive systems for visually impaired users. ACM Transactions on Accessible Computing, 16(2), 1-25.
• Zadeh, A., Chen, M., Poria, S., Cambria, E., & Morency, L.-P. (2018). Multimodal sentiment analysis with word-level fusion and reinforcement learning. Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, 2237-2246.