Musical approach through a gestural interpretation

Keywords: Gesture recognition, interactive systems, virtual environments, natural languages

Abstract

This article describes the virtual environment of an interactive computer system that interprets hand gestures as pitches in a tonal scale. The methodology is based on a quantitative approach with three data-collection stages: sequence of gestures, Curwen's hand signs, and approximation to reality. The study was divided into three general stages: conceptualization, design and implementation, and evaluation, in order to expand the scope of hand gesture recognition. The system implements the Kodály method, as adapted in Colombia, which uses hand gestures to help students better understand musical concepts. The evaluation was conducted with a sample of 15 users in each data-collection stage. The results show that users adapted quickly to hand gesture recognition and its relationship with musical notes. The times and errors recorded in the tests were used to measure aspects of execution. A set of hand gestures was adapted to a virtual environment where users can play a melodica through gestural interpretation, which showed that gesture recognition can support music instruction. Finally, users achieved a high level of understanding thanks to the use of natural hand movements, and the process was meaningful for them because it was their first experience with hand gesture recognition.
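The core idea of the abstract, that recognized Curwen hand signs are interpreted as pitches of a tonal scale, can be sketched as a simple mapping. This is an illustrative example only, not the paper's implementation: the sign labels, the C-major tonic, and all function names here are assumptions.

```python
# Illustrative sketch (not the authors' code): mapping Curwen hand-sign
# labels, as a gesture recognizer might emit them, to pitches of a
# C-major scale for playback.

# Solfège syllables of the Kodály/Curwen system, in scale order.
CURWEN_SIGNS = ["do", "re", "mi", "fa", "sol", "la", "ti"]

# Semitone offsets of the major-scale degrees from the tonic.
MAJOR_SCALE_SEMITONES = [0, 2, 4, 5, 7, 9, 11]

def sign_to_midi(sign: str, tonic_midi: int = 60) -> int:
    """Return the MIDI note for a recognized hand sign (tonic = middle C)."""
    degree = CURWEN_SIGNS.index(sign.lower())
    return tonic_midi + MAJOR_SCALE_SEMITONES[degree]

def midi_to_frequency(midi_note: int) -> float:
    """Equal-temperament frequency in Hz (A4 = MIDI note 69 = 440 Hz)."""
    return 440.0 * 2.0 ** ((midi_note - 69) / 12.0)

# A recognized sequence of signs becomes a playable melody:
melody = [sign_to_midi(s) for s in ["do", "mi", "sol", "do"]]
```

A real system would feed the recognizer's output stream through such a mapping and synthesize the resulting frequencies, here with a melodica-like timbre.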

Author Biographies

Christian Quintero*, Universidad Militar Nueva Granada, Bogotá, Colombia. christian.quintero@unimilitar.edu.co

Diego Roa, Universidad Militar Nueva Granada, Bogotá, Colombia. u1201198@unimilitar.edu.co


How to Cite

[1] C. Quintero and D. Roa, "Musical approach through a gestural interpretation", TecnoL., vol. 25, no. 53, p. e2131, Mar. 2022.

Published
2022-03-24
Section
Research Papers
