
COMSATS Institute of Information Technology,

Abbottabad Campus

Synopsis for the degree of ✔ M.S./M.Phil. Ph.D.

PART-1 (to be completed by the student)

Name of Student SARA SHAFIQUE

Department COMPUTER SCIENCE

Registration No. SP16-R01-005    Date of Thesis Registration: 6 March 2018
Name of Research Supervisor DR. IFTIKHAR AHMED KHAN

Members of Supervisory Committee

1. Dr. IFTIKHAR AHMED KHAN


2. Dr. ZIA-UR REHMAN
3. Dr. SAJJID SHAH
4. Dr. WAQAS JADOON
Title of Research Proposal Towards Automated Detection of Social Anxiety Via Gaze
Interaction in a Live Communication
Signature of Student:

Summary of the Research

Social anxiety disorder (SAD) is a psychological disorder. A person with social anxiety feels a strong
fear that others may judge him or her [1]. This fear can have a negative impact on quality of life, social
functioning, family life, and relationships [2]. Gaze interaction is necessary for successful social
situations, whereas gaze avoidance is a symptom of SAD [3]. Researchers have proposed methods for gaze
interaction detection such as 1) detection in static mode (images), 2) detection in simulated (acted)
videos portraying people having SAD, and 3) detection in live video of real people with the
disorder. In the live communication method, social anxiety detection has involved manual coding. To
the best of the authors' knowledge, there is no related work that automatically detects socially
anxious people in live communication. Therefore, in this research study, we aim to develop a method that
detects gaze interaction to predict social anxiety in people during live one-to-one communication.
Furthermore, the results will be validated via subjective social anxiety assessment questionnaires.

Introduction

Gaze interaction leads to healthy social communication, whereas gaze avoidance damages
human functioning in social situations [3]. Jordan et al. [4] defined social anxiety as
“an individual's level of fear or anxiety associated with either anticipated or real
communication with other person or persons.” The study in [5] estimated that about 80% of
people with social anxiety disorder also experience other psychiatric conditions. Gaze
avoidance is characteristic of the psychiatric disorder known as social anxiety. In recent years,
many techniques for social anxiety detection using eye contact have been proposed (see, e.g.,
[6-12]). These methods used either simulated interaction or manual coding to measure gaze
avoidance. Computers are already being used to diagnose various physical diseases [13] and
psychological disorders [6], and there are several advantages to using computers to diagnose
conditions such as social anxiety. The following are the two major reasons that justify
computer-aided diagnosis.
1. To avoid fear
Despite the damage caused by SAD, barely half of affected adults seek treatment [1]. One of
the reasons is that people fear being negatively evaluated by a healthcare professional,
which is itself a symptom of social anxiety.
2. Financial issues
SAD imposes extensive economic costs on individuals. Treatment costs are significantly
higher for people who have social anxiety together with other psychiatric disorders. The studies in [14,
15] analyzed data from a population of 100,000 people attending primary care services
and estimated that treating SAD would cost a total of £195,000 per annum.
However, to measure gaze interaction and detect SAD via computers, we face two major
challenges:
1. Techniques to detect gaze avoidance use simulated social interactions that are entirely
different from live communication [8, 12].
2. Live communication studies rely on manual detection of gaze avoidance, such as [9, 11],
in which human coders detect gaze avoidance.
The problems mentioned above may be addressed with a camera and modern image-processing
technology; however, the effectiveness of this technique is still unknown. We will use a
simple, arbitrarily positioned (potentially head-mounted) camera, and the system will
automatically collect evidence for eye-contact detection. We will follow the technique
described in [16], in which gaze interaction is measured using a simple camera. A minimal
sketch of the kind of per-frame detection we have in mind follows.
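
The sketch below assumes an off-the-shelf webcam and uses OpenCV's stock Haar-cascade
detectors as a stand-in for the final gaze estimator; the camera index and the
both-eyes-visible heuristic are illustrative assumptions only, not the proposed method.

    # Hypothetical sketch: per-frame face/eye detection as the first stage
    # of automatic eye-contact measurement. The camera index and the
    # eye-contact heuristic below are illustrative assumptions.
    import cv2

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")

    cap = cv2.VideoCapture(0)   # simple off-the-shelf camera
    frame_labels = []           # 1 = candidate eye contact, 0 = looking away

    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, 1.3, 5)
        contact = 0
        for (x, y, w, h) in faces:
            # A frontal face with both eyes visible is a weak proxy for gaze
            # toward the camera; a trained gaze estimator such as the one
            # in [16] would replace this heuristic.
            eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w])
            if len(eyes) >= 2:
                contact = 1
        frame_labels.append(contact)

    cap.release()

A trained appearance-based gaze estimator would replace the crude heuristic, but the
per-frame labeling loop would remain the same.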

Statement of the Problem

We aim to propose an everyday gaze interaction detection method to measure social
anxiety in live communication. In this study we will consider the following issues.
 In the most recent method, gaze interaction in live communication is detected by manual
coding, where coders press keys on the keyboard when the participant makes gaze
interaction or looks away while watching a video. This can lead to various problems:

1) Coder bias: a coder may place a value in one category when it should have fallen in the
other.

2) Even an unbiased coder can make an error and place a value in the wrong category.

3) The occurrences of each symbol must be counted manually after coding to obtain the
number of times the participant engaged in eye contact or looked away.

4) Coders may be unable to code all videos properly, owing to monotony, tiredness, or lack
of concentration.

A minimal sketch of how the counting in 3) could be automated is given below.
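
As a sketch, the per-frame labels produced by a detector can be aggregated into the same
counts the coders currently produce by hand; the frame rate and labels below are
illustrative assumptions.

    # Hypothetical sketch: aggregating per-frame eye-contact labels
    # (1 = contact, 0 = looking away) into the counts that coders currently
    # produce manually. The frame rate and labels are illustrative.
    def summarize_gaze(frame_labels, fps=30.0):
        """Return (eye-contact seconds, look-away seconds, contact ratio)."""
        if not frame_labels:
            return 0.0, 0.0, 0.0
        contact_frames = sum(frame_labels)
        away_frames = len(frame_labels) - contact_frames
        return (contact_frames / fps,
                away_frames / fps,
                contact_frames / len(frame_labels))

    # Example: 2 s of eye contact followed by 1 s of looking away at 30 fps.
    print(summarize_gaze([1] * 60 + [0] * 30))  # (2.0, 1.0, 0.666...)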

Motivations

Social anxiety is the world’s fourth most prevalent psychiatric disorder and affects 13% of
people during their lifetime [5, 17]. A person suffering from social anxiety cannot live
a life of quality and faces problems achieving goals because of fear. Success in social
situations depends largely on gaze interaction: making gaze interaction in a social
situation is considered a sign of confidence and success, whereas weak gaze interaction and
gaze avoidance are associated with behavior perceived as rude and with low self-esteem [3].
Qualified psychiatrists are not available everywhere, and even where they are available, a
socially anxious person may not go to them because of:

 fear of the social situation, and/or

 financial issues.

In such situations, computers can be used to detect such disorders and could provide effective
treatments. Our motivation is to make this detection as accurate as possible and to make
treatment economical.

Related Work

Studies such as [8, 12, 18-20] suggest that SAD is associated with gaze avoidance, which can
be observed while a subject views a picture or video clip of another person. The approaches
in the literature that use gaze avoidance to detect social anxiety in a live interaction are
described below. The research work in [10] estimated gaze avoidance in a live situation
where women watched videos of peers. The videos were pre-recorded, but the participants did
not know this; for them, it was a live interaction. The research work in [11] measured gaze
avoidance in a live interaction where all participants took part in a role-play task, a
conversation with an unknown female. The participants were told that the unknown female was
also a study participant. Two video cameras were mounted on the wall of the behavior test
room, and a stopwatch was used to determine the total number of seconds in which the
participants maintained eye contact with their peers during each minute of the interaction.
The research work in [8] measured gaze avoidance in a simulated environment with a live
human through a webcam. The research work in [9] measured gaze behavior in a live
interaction where participants talked with their real friends. To measure gaze avoidance,
the researchers hired coders whose job was to watch the videos and manually code the gaze
interaction of the participants. The coders entered key presses into Microsoft Word (2003)
that corresponded to the participants’ behavior: they held down one key (‘‘[’’) when the
participant was making eye contact with his/her partner and another key (‘‘]’’) when the
participant was looking away. The number of times each symbol was pressed was counted after
the experiment, yielding the amount of time the participant engaged in eye contact and the
amount of time the participant looked away. The study reported that the coders were unable
to code all the videos; the reasons might be the length of the videos as well as the
cognitive effort required. To overcome these limitations, in this research we intend to
develop an automatic gaze detection technique using an RGB camera.
Manually annotated video datasets are available for gaze interaction [21, 22].
However, unlike the existing research, this work aims to provide automated annotation of
gaze interactions to SAD experts, giving them extra support in detecting the disorder.
Many past studies have also proposed mechanisms to browse various kinds of video
efficiently, such as surveillance [24, 25], education [23], first-person video [27], and
sports [26]. Such studies use computer-vision techniques and develop GUI support for video
browsing, such as a colored video timeline [33], content-aware fast-forwarding [31, 32],
and direct manipulation [28-30]. A colored timeline marks the part of the video the viewer
is focusing on, cueing viewers to significant scenes. Similarly, our research could detect
gaze interaction in real time, immediately highlighting the participant’s interaction
behavior.
With advances in machine-learning methodology, automatic gaze estimation has improved
significantly. State-of-the-art face and facial-landmark detection procedures [34-36] are
effective even in difficult conditions, and appearance-based, calibration-free gaze
estimation procedures have evolved with deep learning techniques and large datasets
[37, 38]. Past studies have also aimed to automate detection of gaze interaction [39-42]
with eye trackers or wearable cameras. However, there are several basic problems in
detecting gaze interaction with wearable technologies. First, it is hard for participants
to wear eye trackers or cameras, because they feel more self-conscious if they have social
anxiety disorder, and this method is not always appropriate in practical situations.
Secondly, as mentioned in [16], gaze interaction depends on the object being targeted, and
it is almost impractical to train gaze interaction detectors beforehand without knowing
that object. Finally, gaze interaction is heavily dependent on the conditions of the
interaction: gaze interaction in social communication has various meanings [43], and
observers need to be familiar with the definition of social communication. These issues
also motivated us to include automatic video analysis in a supporting role, guiding human
judgment. The issues are summarized in Table 1, where a tick mark indicates the presence of
an issue in the corresponding study. We aim to propose an everyday gaze interaction
detection method to measure social anxiety in live communication.

Reference   Live Environment   Automatic Coding   Dataset of SAD   Results (Relationship between SAD and Gaze Interaction)
[8]         ×                                    ×                
[9]                           ×                  ×                
[10]                          ×                  ×                ×
[11]                          ×                  ×                ×

Table 1: Comparison of studies on gaze interaction detection methods to measure SAD in live
communication.
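
To illustrate the maturity of the landmark-detection pipelines cited above [34-36], the
following hypothetical sketch uses MediaPipe Face Mesh as a stand-in; the input frame name
is an illustrative assumption.

    # Hypothetical sketch: dense facial-landmark detection of the kind cited
    # in [34-36], using MediaPipe Face Mesh as a stand-in.
    import cv2
    import mediapipe as mp

    face_mesh = mp.solutions.face_mesh.FaceMesh(static_image_mode=True,
                                                max_num_faces=1)

    image = cv2.imread("participant_frame.jpg")   # hypothetical session frame
    assert image is not None, "provide a frame extracted from the session video"

    # MediaPipe expects RGB input; OpenCV loads BGR.
    results = face_mesh.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

    if results.multi_face_landmarks:
        landmarks = results.multi_face_landmarks[0].landmark
        # 468 dense 3D landmarks per face; landmarks of this kind are the
        # usual input to appearance-based gaze estimators such as [38].
        print(f"{len(landmarks)} landmarks detected")
    face_mesh.close()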

Research Methodology

1. We will carry out a literature review, covering most of the related work on social
anxiety and gaze avoidance, and select the approaches that best satisfy our research
objectives for application and comparison.

2. We will develop a method for measuring social anxiety in a live interaction using an
everyday gaze-contact technique that minimizes the issues stated in our problem statement.

3. We will implement our proposed approach.

4. The implemented gaze-avoidance detection approach for social anxiety will be tested in
the real world to obtain results, and the detected gaze measures will be validated against
subjective social anxiety questionnaires (a sketch of this validation follows).

5. Finally, the research process will be written up as a thesis.
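
The validation in step 4 could, as a minimal sketch, correlate the detector's
per-participant eye-contact ratio with questionnaire scores (e.g., a social anxiety
inventory); all numbers below are illustrative assumptions, not collected data.

    # Hypothetical sketch: validating detected gaze measures against
    # subjective questionnaire scores. All values are illustrative.
    from scipy.stats import pearsonr

    # Per-participant eye-contact ratio from the automatic detector (0..1).
    contact_ratio = [0.62, 0.35, 0.71, 0.28, 0.55, 0.19]
    # Matching self-reported social anxiety scores (higher = more anxious).
    anxiety_score = [21, 44, 15, 52, 30, 58]

    r, p = pearsonr(contact_ratio, anxiety_score)
    # A strong negative correlation would support gaze interaction as a
    # predictor of social anxiety.
    print(f"Pearson r = {r:.2f}, p = {p:.3f}")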

(Flow chart: Start → Motivation → Literature Review → Problem Statement → Gaze Interaction
in a Live Communication → Implementation → Results → Thesis Writing → End)

Bibliography
1. Grant, B.F., et al., The epidemiology of social anxiety disorder in the United
States: results from the National Epidemiologic Survey on Alcohol and Related
Conditions. The Journal of clinical psychiatry, 2005. 66(11): p. 1351-61.
2. Carla Oliveira, A.S.P., Social Anxiety Mobile Application to Enhance University
Psychological Services. Journal of Medical Research and Health Education,
2017. 1(1): p. 1-5.
3. Dalton, K.M., Nacewicz, B. M., Johnstone, T., Schaefer, H. S., Gernsbacher, M.
A., Goldsmith, H., et al, Gaze fixation and the neural circuitry of face processing
in autism. Nature Neuroscience, 2005. 8(4): p. 519–526.
4. Rasmussen, J.J., Relationship Between Social Anxiety & Facebook Surveillance.
2017.
5. Farmer, A.S., et al., Clinical predictors of diagnostic status in individuals with
social anxiety disorder. Comprehensive psychiatry, 2014. 55(8): p. 1906-1913.
6. Bhakta, I. and A. Sau, Prediction of depression among senior citizens using
machine learning classifiers. International Journal of Computer Applications,
2016. 144(7): p. 11-16.
7. Farabee, D.J., S.L. Ramsey, and S.G. Cole, Social anxiety and speaker gaze in a
persuasive atmosphere. Journal of Research in Personality, 1993. 27(4): p.
365-376.
8. Howell, A.N., et al., Relations among social anxiety, eye contact avoidance, state
anxiety, and perception of interaction performance during a live conversation.
Cognitive behaviour therapy, 2016. 45(2): p. 111-122.
9. Langer, J.K., et al., Social anxiety disorder is associated with reduced eye contact
during conversation primed for conflict. Cognitive therapy and research, 2017.
41(2): p. 220-229.
10. Walters, K.S. and D.A. Hope, Analysis of social behavior in individuals with
social phobia and nonanxious participants using a psychobiological model.
Behavior Therapy, 1998. 29(3): p. 387-407.
11. Weeks, J.W., R.G. Heimberg, and R. Heuer, Exploring the role of behavioral
submissiveness in social anxiety. Journal of Social and Clinical Psychology, 2011.
30(3): p. 217-249.
12. Weeks, J.W., A.N. Howell, and P.R. Goldin, Gaze avoidance in social anxiety
disorder. Depression and anxiety, 2013. 30(8): p. 749-756.
13. McBride, M., Ranking Top 10 Hospital EMR Vendors by Number of Installed
Systems. Dark Daily, 2014.
14. Brown, G.D., et al., Discussing out-of-pocket expenses during clinical
appointments: An observational study of patient-psychiatrist interactions.
Psychiatric Services, 2017. 68(6): p. 610-617.
15. Patel, A., et al., The economic consequences of social phobia. Journal of affective
disorders, 2002. 68(2): p. 221-233.
16. Zhang, X., Y. Sugano, and A. Bulling. Everyday eye contact detection using
unsupervised gaze target discovery. in Proceedings of the 30th Annual ACM
Symposium on User Interface Software and Technology. 2017. ACM.
17. R.K.N. and B.A. Teachman, Using Advances from Cognitive Behavioral Models of
Anxiety to Guide the Treatment for Social Anxiety Disorder. Journal of
Clinical Psychology, 2017. 73: p. 524-535.

18. Horley, K., et al., Social phobics do not see eye to eye: A visual scanpath study of
emotional expression processing. Journal of anxiety disorders, 2003. 17(1): p.
33-44.
19. Horley, K., et al., Face to face: visual scanpath evidence for abnormal processing
of facial expressions in social phobia. Psychiatry research, 2004. 127(1): p. 43-53.
20. Schneier, F.R., et al., Neural circuitry of submissive behavior in social anxiety
disorder: a preliminary study of response to direct eye gaze. Psychiatry Research:
Neuroimaging, 2009. 173(3): p. 248-250.
21. Weeks, J.W., R.G. Heimberg, and R. Heuer, Exploring the role of behavioral
submissiveness in social anxiety. Journal of Social and Clinical Psychology, 2011.
30(3): p. 217-249.
22. Grant, B.F., et al., The epidemiology of social anxiety disorder in the United
States: results from the National Epidemiologic Survey on Alcohol and Related
Conditions. The Journal of clinical psychiatry, 2005. 66(11): p. 1351-61.
23. Juho Kim, Philip J. Guo, Carrie J. Cai, Shang-Wen (Daniel) Li, Krzysztof Z.
Gajos, and Robert C. Miller. 2014. Data-driven Interaction Techniques for
Improving Navigation of Educational Videos. In Proceedings of the ACM
Symposium on User Interface Software and Technology (UIST). ACM, New
York, NY, USA, 563–572. DOI: http://dx.doi.org/10.1145/2642918.2647389
24. Yael Pritch, Alex Rav-Acha, Avital Gutman, and Shmuel Peleg. 2007. Webcam
Synopsis: Peeking Around the World. In Proceedings of the IEEE International
Conference on Computer Vision (ICCV). IEEE, Washington, DC, USA, 1–8.
DOI: http://dx.doi.org/10.1109/ICCV.2007.4408934
25. Yael Pritch, Alex Rav-Acha, Avital Gutman, and Shmuel Peleg. 2007. Webcam
Synopsis: Peeking Around the World. In Proceedings of the IEEE International
Conference on Computer Vision (ICCV). IEEE, Washington, DC, USA, 1–8.
DOI: http://dx.doi.org/10.1109/ICCV.2007.4408934
26. Justin Matejka, Tovi Grossman, and George Fitzmaurice. 2014. Video Lens:
Rapid Playback and Exploration of Large Video Collections and Associated
Metadata. In Proceedings of the ACM Symposium on User Interface Software
and Technology (UIST). ACM, New York, NY, USA, 541–550. DOI:
http://dx.doi.org/10.1145/2642918.2647366
27. Keita Higuchi, Ryo Yonetani, and Yoichi Sato. 2017. EgoScanning: Quickly
Scanning First-Person Videos with Egocentric Elastic Timelines. In Proceedings
of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ’17).
6536–6546. DOI: http://dx.doi.org/10.1145/3025453.3025821
28. Cuong Nguyen, Yuzhen Niu, and Feng Liu. 2013. Direct Manipulation Video
Navigation in 3D. In Proceedings of the ACM CHI Conference on Human Factors
in Computing Systems (CHI). ACM, New York, NY, USA, 1169–1172. DOI:
http://dx.doi.org/10.1145/2470654.2466150
29. Thorsten Karrer, Malte Weiss, Eric Lee, and Jan Borchers. 2008. DRAGON: A
Direct Manipulation Interface for Frame-accurate In-scene Video Navigation. In
Proceedings of the ACM CHI Conference on Human Factors in Computing
Systems (CHI). ACM, New York, NY, USA, 247–250. DOI:
http://dx.doi.org/10.1145/1357054.1357097

30. Thorsten Karrer, Moritz Wittenhagen, and Jan Borchers. 2012. DragLocks:
Handling Temporal Ambiguities in Direct Manipulation Video Navigation. In
Proceedings of the ACM CHI Conference on Human Factors in Computing

Systems (CHI). ACM, New York, NY, USA, 623–626. DOI:


http://dx.doi.org/10.1145/2207676.2207764
31. Kai-Yin Cheng, Sheng-Jie Luo, Bing-Yu Chen, and Hao-Hua Chu. 2009.
SmartPlayer: User-centric Video Fast-forwarding. In Proceedings of the ACM
CHI Conference on Human Factors in Computing Systems (CHI). ACM, New
York, NY, USA, 789–798. DOI: http://dx.doi.org/10.1145/1518701.1518823
32. Suporn Pongnumkul, Jue Wang, Gonzalo Ramos, and Michael Cohen. 2010.
Content-aware Dynamic Timeline for Video Browsing. In Proceedings of the
ACM Symposium on User Interface Software and Technology (UIST). ACM,
New York, NY, USA, 139–142. DOI: http://dx.doi.org/10.1145/1866029.1866053
33. Manfred Del Fabro, Bernd Münzer, and Laszlo Böszörmenyi. 2013. Smart Video
Browsing with Augmented Navigation Bars. Springer Berlin Heidelberg, Berlin,
Heidelberg, 88–98. DOI: http://dx.doi.org/10.1007/978-3-642-35728-2_9
34. T. Baltrusaitis, P. Robinson, and L. P. Morency. 2013. Constrained Local Neural
Fields for Robust Facial Landmark Detection in the Wild. In 2013 IEEE
International Conference on Computer Vision Workshops. 354–361. DOI:
http://dx.doi.org/10.1109/ICCVW.2013.54
35. T. Baltrusaitis, P. Robinson, and L. P. Morency. 2016. OpenFace: An open source
facial behavior analysis toolkit. In 2016 IEEE Winter Conference on Applications
of Computer Vision (WACV). 1–10. DOI:
http://dx.doi.org/10.1109/WACV.2016.7477553
36. Yandong Wen, Kaipeng Zhang, Zhifeng Li, and Yu Qiao. 2016. A Discriminative
Feature Learning Approach for Deep Face Recognition. Springer International
Publishing, Cham, 499–515. DOI: http://dx.doi.org/10.1007/978-3-319-46478-7_31
37. Hanan Makki Zakari, Minhua Ma, and David Simmons. 2014. A Review of
Serious Games for Children with Autism Spectrum Disorders (ASD). Springer
International Publishing, Cham, 93–106. DOI: http://dx.doi.org/10.1007/978-3-319-11623-5_9
38. Xucong Zhang, Yusuke Sugano, Mario Fritz, and Andreas Bulling. 2017. It’s
Written All Over Your Face: Full-Face Appearance-Based Gaze Estimation. In
Proc. of the IEEE Conference on Computer Vision and Pattern Recognition
Workshops (CVPRW) (2017-05-18).
https://perceptual.mpi-inf.mpg.de/files/2017/05/zhang_cvprw2017.pdf
39. Zhefan Ye, Yin Li, Alireza Fathi, Yi Han, Agata Rozga, Gregory D. Abowd, and
James M. Rehg. 2012. Detecting Eye Contact Using Wearable Eye-tracking
Glasses. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing
(UbiComp ’12). ACM, New York, NY, USA, 699–704. DOI:
http://dx.doi.org/10.1145/2370216.2370368

40. Z. Ye, Y. Li, Y. Liu, C. Bridges, A. Rozga, and J. M. Rehg. 2015. Detecting bids
for eye contact using a wearable camera. In 2015 11th IEEE International
Conference and Workshops on Automatic Face and Gesture Recognition (FG),
Vol. 1. 1–8. DOI: http://dx.doi.org/10.1109/FG.2015.7163095
41. Brian A. Smith, Qi Yin, Steven K. Feiner, and Shree K. Nayar. 2013. Gaze
Locking: Passive Eye Contact Detection for Human-object Interaction. In
Proceedings of the 26th Annual ACM Symposium on User Interface Software and
Technology (UIST ’13). 271–280. DOI:
http://dx.doi.org/10.1145/2501988.2501994

42. Eunji Chong, Katha Chanda, Zhefan Ye, Audrey Southerland, Nataniel Ruiz,
Rebecca M. Jones, Agata Rozga, and James M. Rehg. 2017. Detecting Gaze
Towards Eyes in Natural Social Interactions and Its Use in Child Assessment.
Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 1, 3, Article 43 (Sept.
2017), 20 pages. DOI: http://dx.doi.org/10.1145/3131902

43. Peter Mundy, Christine Delgado, Jessica Block, Meg Venezia, Anne Hogan, and
Jeffrey Seibert. 2003. Early social communication scales (ESCS).

PART II

Recommendation by the Research Supervisor

Name _________________________ Signature _________________ Date___________

Signed by Supervisory Committee

S. # Name of Committee Member Designation Signature & Date


1 Dr. IFTIKHAR AHMED KHAN
2 Dr. ZIA-UR REHMAN
3 Dr. SAJJID SHAH
4 Dr. WAQAS JADOON

Approved by Departmental Advisory Committee

Certified that the synopsis has been seen by the members of the DAC, who consider it
suitable for putting up to BASAR.

Secretary
Departmental Advisory Committee

Name: ______________________________ Signature: __________________________

Date: _______________

Chairman/HoD: ______________________________

Signature: ______________________________

Date: ________________

PART III

Dean, Faculty of Information Sciences & Technology

________________ Approved for placement before BASAR.

________________ Not Approved on the basis of the following reasons

Signature ______________________________ Date ________________

Secretary BASAR

________________ Approved from BASAR.

________________ Not Approved on the basis of the following reasons

Signature: ______________________________ Date ________________

Dean, Faculty of Information Sciences & Technology

___________________________________________________________________________

___________________________________________________________________________

___________________________________________________________________________

Signature: _____________________________ Date______________________

Please provide the list of courses studied

Sr. # Course No Course Title


1 CSC511 Advanced Algorithms Analysis
2 CSC522 Advanced Topics in Operating Systems
3 CSC521 Advanced Topics in Computer Architecture
4 CSC618 Advanced Topics in Grid Computing
5 CSC650 Advanced Topics in Digital Image Processing
6 CSC668 Machine Learning
7 CSC670 Advanced Topics in Robotics
8 CSC626 Advanced Topics in Human Computer Interaction
