A SURVEY ON MOVEMENT ANALYSIS (HAND, EYE, BODY) AND FACIAL EXPRESSION-BASED DIAGNOSIS OF AUTISM DISORDERS USING MICROSOFT KINECT V2
Kinect v2 may enhance the clinical practice of diagnosing autism spectrum disorders (ASD). ASD refers to lifelong neurodevelopmental disorders that emerge in early childhood and are usually associated with unusual movements and gait disturbances. Earlier diagnosis of ASD enables earlier intervention and a better understanding of these disorders. The methods adopted by experts for diagnosis are expensive, time-consuming, and difficult to replicate, as they rely on manual observation and standard questionnaires to look for certain behavioral signs. This paper is, to the best of our knowledge, the first attempt to collect previous research on using Kinect v2 for diagnosing these disorders. Relevant papers are divided into four groups: (1) papers proposing systems based on analysis of facial expressions, (2) papers proposing systems based on analysis of hand movement, (3) papers proposing systems based on analysis of eye movement, and (4) papers proposing systems based on analysis of body movement.
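To make concrete the kind of low-level feature the body-movement systems in group (4) extract from Kinect v2 skeleton streams (the sensor reports 25 joints per tracked body at up to 30 frames per second), the following Python sketch computes a simple movement-variability feature from one joint's position sequence. The function name and the feature itself are illustrative assumptions for this survey's context, not taken from any of the papers reviewed.

```python
import math

def joint_speed_stats(positions, dt=1.0 / 30.0):
    """Mean and variance of frame-to-frame speed for one skeleton joint.

    positions -- sequence of (x, y, z) joint coordinates in meters,
                 one per frame, as a Kinect v2 body stream would supply.
    dt        -- frame interval in seconds (Kinect v2 runs at ~30 fps).

    A hypothetical feature: high speed variance could flag the jerky,
    repetitive movements associated with ASD; real systems use richer
    descriptors over many joints.
    """
    # Euclidean distance between consecutive positions, divided by dt.
    speeds = []
    for (x0, y0, z0), (x1, y1, z1) in zip(positions, positions[1:]):
        dist = math.sqrt((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2)
        speeds.append(dist / dt)
    mean = sum(speeds) / len(speeds)
    var = sum((s - mean) ** 2 for s in speeds) / len(speeds)
    return mean, var
```

For a joint moving at a constant 0.1 m per frame with dt=1.0, the mean speed is 0.1 and the variance is 0, whereas irregular motion raises the variance; such per-joint statistics are one plausible input to the classifiers described in the surveyed work.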
American Psychiatric Association, Diagnostic and Statistical Manual of Mental Disorders, 5th ed. Arlington, VA: American Psychiatric Publishing, 2013.
C. Rhind et al., "An examination of autism spectrum traits in adolescents with anorexia nervosa and their parents," Molecular Autism, vol. 5, no. 1, p. 56, 20 Dec. 2014. doi:10.1186/2040-2392-5-56.
A. Tapus et al., "Children with Autism Social Engagement in Interaction with Nao, an Imitative Robot: A Series of Single Case Experiments," Interaction Studies: Social Behaviour and Communication in Biological and Artificial Systems, vol. 13, no. 3, pp. 315–347, Jan 2012. doi:10.1075/is.13.3.01tap.
S. Jaiswal et al., "Automatic Detection of ADHD and ASD from Expressive Behaviour in RGBD Data," 2017 12th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2017), Washington, DC, USA, 30 May–3 June 2017, pp. 715–722. doi:10.1109/FG.2017.95.
J. Bloom, American Council On Science And Health, June 7, 2017. [Online]. Available: https://www.acsh.org/news/2017/06/07/mit-researcher-glyphosate-will-cause-half-all-children-be-autistic-2025-yeah-sure-11337. [Last Accessed: July 2019]
J. Orquin and K. Holmqvist, "Threats to the validity of eye-movement research in psychology," Behavior Research Methods, vol. 50, pp. 1645–1656, Aug 2018. doi:10.3758/s13428-017-0998-z.
T. Y. Tang, "Helping Neuro-typical Individuals to 'Read' the Emotion of Children with Autism Spectrum Disorder: an Internet-of-Things Approach," IDC '16: Proceedings of the 15th International Conference on Interaction Design and Children, Manchester, United Kingdom, June 21–24, 2016, pp. 666–671. doi:10.1145/2930674.2936009.
A. E. Youssef et al., "Auto-Optimized Multimodal Expression Recognition Framework Using 3D Kinect Data for ASD Therapeutic Aid," International Journal of Modeling and Optimization, vol. 3, no. 2, pp. 112–115, 2013. doi:10.7763/IJMO.2013.V3.247.
D. Zhao et al., "Facial Expression Detection Employing a Brain Computer Interface," 2018 9th International Conference on Information, Intelligence, Systems and Applications (IISA), Zakynthos, Greece, 23–25 July 2018, pp. 1–2. doi:10.1109/IISA.2018.8633661.
M. Pantic and L. J. M. Rothkrantz, "Automatic analysis of facial expressions: The state of the art," IEEE Trans. Pattern Anal. Mach. Intell., vol. 22, no. 12, pp. 1424–1445, Dec 2000. doi:10.1109/34.895976.
T. Baudel and M. Beaudouin-Lafon, "Charade: Remote Control of Objects Using Free-Hand Gestures," Commun. ACM, vol. 36, no. 7, pp. 28–35, Jul 1993. doi:10.1145/159544.159562.
GRS Murthy and RS Jadon, "A review of vision based hand gestures recognition," International Journal of Information Technology and Knowledge Management, vol. 2, no.2, pp. 405-410. Jul 2009.
L. Chen et al., "A survey of human motion analysis using depth imagery," Pattern Recognition Letters, vol. 34, no. 15, pp. 1995–2006, 1 November 2013. doi:10.1016/j.patrec.2013.02.006.
L. M. Pedro and G. Augusto, "Kinect evaluation for human body movement analysis," 2012 4th IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), Rome, Italy, 24–27 June 2012. doi:10.1109/BioRob.2012.6290751.
Y. Li and A. S. Elmaghraby, "A framework for using games for behavioral analysis of autistic children," 2014 Computer Games: AI, Animation, Mobile, Multimedia, Educational and Serious Games (CGAMES), Louisville, KY, USA, 28–30 July 2014, pp. 130–133. doi:10.1109/CGames.2014.6934157.
A. Ramírez-Duque et al., "Robot-Assisted Autism Spectrum Disorder Diagnostic Based on Artificial Reasoning," J Intell Robot Syst, vol. 96, pp. 267–281, 29 March 2019. doi:10.1007/s10846-018-00975-y.
F. Gomez-Donoso et al., "Automatic Schaeffer's Gestures Recognition System," Expert Systems, vol. 33, no. 5, pp. 480–488, 13 July 2016. doi:10.1111/exsy.12160.
S. Oprea et al., "A recurrent neural network based Schaeffer gesture recognition system," 2017 International Joint Conference on Neural Networks (IJCNN), Anchorage, AK, USA, 14–19 May 2017, pp. 425–431. doi:10.1109/IJCNN.2017.7965885.
E. Marinoiu et al., "3D Human Sensing, Action and Emotion Recognition in Robot Assisted Therapy of Children with Autism," 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018, pp. 2158–2167. doi:10.1109/CVPR.2018.00230.
M. Magrini et al., "An Interactive Multimedia System for Treating Autism Spectrum Disorder," Springer, Cham, vol. 9914, pp. 331–342, November 2016. doi:10.1007/978-3-319-48881-3_23.
B. Ge, "Detecting engagement levels for autism intervention therapy using RGB-D camera," M.S. thesis, School of Electrical and Computer Engineering, Georgia Institute of Technology, May 2016. Accessed on: 25 June 2019. http://hdl.handle.net/1853/55043
J. Y. Kang et al., "Automated Tracking and Quantification of Autistic Behavioral Symptoms Using Microsoft Kinect," Stud Health Technol Inform, vol. 220, pp. 167–170, 2016. doi:10.3233/978-1-61499-625-5-167.
R. Barmaki, "Gesture Assessment of Teachers in an Immersive Rehearsal Environment," Ph.D. dissertation, Computer Science, University of Central Florida, August 2016. [Online]. Available: http://purl.fcla.edu/fcla/etd/CFE0006260
A. Zaraki et al., "Toward autonomous child-robot interaction: development of an interactive architecture for the humanoid Kaspar robot," in 3rd Workshop on Child-Robot Interaction (CRI2017) at the International Conference on Human-Robot Interaction (ACM/IEEE HRI 2017), Vienna, Austria, Mar 2017, pp. 6–9.
I. Budman et al., "Quantifying the social symptoms of autism using motion capture," Scientific Reports, vol. 9, Article 7712, pp. 1–8, May 2019. doi:10.1038/s41598-019-44180-9.
S. Piana et al., "Effects of Computerized Emotional Training on Children with High Functioning Autism," IEEE Trans. Affect. Comput., p. 1, May 2019. doi:10.1109/TAFFC.2019.2916023.
M. Uljarevic and A. Hamilton, "Recognition of Emotions in Autism: A Formal Meta-Analysis," J Autism Dev Disord, vol. 43, pp. 1517–1526, 2013. doi:10.1007/s10803-012-1695-5.
K. A. Pelphrey et al., "Visual Scanning of Faces in Autism," J Autism Dev Disord, vol. 32, no. 4, pp. 249–261, Aug 2002. doi:10.1023/a:1016374617369.
K. Humphreys et al., "A fine-grained analysis of facial expression processing in high-functioning adults with autism," Neuropsychologia, vol. 45, no. 4, pp. 685–695, 2007. doi:10.1016/j.neuropsychologia.2006.08.003.
C. Cook et al., "Alexithymia, not autism, predicts poor recognition of emotional facial expressions," Psychol Sci, vol. 24, no. 5, pp. 723–732, May 2013. doi:10.1177/0956797612463582.
C. Nolker and H. Ritter, "Detection of Fingertips in Human Hand Movement Sequences," Springer, Berlin, Heidelberg, vol. 1371, pp. 209–218, May 2006. doi:10.1007/BFb0053001.
A. Bulling et al., "Eye Movement Analysis for Activity Recognition Using Electrooculography," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, no. 4, pp. 741–753, 2010. doi:10.1109/TPAMI.2010.86.
J. Hyönä, "The usefulness and limitations of eye-tracking in the study of reading (and writing)," 2011. [Online]. Available: http://www.writingpro.eu/upload/presentations/Summerschool_EyeTracking_JukkaHyona.pdf
M. L. Spezio et al., "Analysis of face gaze in autism using 'Bubbles'," Neuropsychologia, vol. 45, pp. 144–151, 2007. doi:10.1016/j.neuropsychologia.2006.04.027.
D. Fiedler and H. Muller, "Impact of Thermal and Environmental Conditions on the Kinect Sensor," Advances in Depth Image Analysis and Applications, vol. 7854, 2013. doi:10.1007/978-3-642-40303-3_3.
N. Smolyanskiy et al., "Real-time 3D face tracking based on active appearance model constrained by depth data," Image and Vision Computing, vol. 32, no. 11, pp. 860–869, Nov 2014. doi:10.1016/j.imavis.2014.08.005.
E. Lachat et al., "Assessment and Calibration of a RGB-D Camera (Kinect v2 Sensor) Towards a Potential Use for Close-Range 3D Modeling," Remote Sensing, vol. 7, pp. 13070–13097, 2015. doi:10.3390/rs71013070.
R. Seggers, "People Tracking in Outdoor Environments: Evaluating the Kinect 2 Performance in Different Lighting Conditions," Computer Science, 26 June 2015.
G. R. S. Murthy and R. S. Jadon, "Computer Vision Based Human Computer Interaction," Journal of Artificial Intelligence, vol. 4, pp. 245–256, 8 December 2011. doi:10.3923/jai.2011.245.256.