1. Visual multistability: influencing factors and analogies to auditory streaming. Wegner, Thomas, 03 May 2023.
Sensory inputs can be ambiguous: a physically constant stimulus that induces several perceptual alternatives is called multistable, and many factors can influence which alternative is perceived. In this thesis I investigate factors that affect visual multistability. All presented studies use a pattern-component rivalry stimulus consisting of two gratings drifting in different directions (the plaid stimulus). This induces either an “integrated” percept of a single moving plaid (the pattern) or a “segregated” percept of two overlaid gratings (the components). One study (chapter 2) investigates how the parameters of the plaid stimulus affect perception, with particular emphasis on the first percept. Specifically, it addresses how the enclosed angle between the gratings (the opening angle) affects perception at stimulus onset and during prolonged viewing; the observed effects persist even when the stimulus is rotated. On a more abstract level, it is shown that percepts can influence each other over time (chapter 3), which emphasizes the importance of instructions and report mode: which percepts participants are instructed to report at all, which percepts can be reported as separate entities, and which are pooled into the same response option. At a further abstract level (the predictability of a stimulus change, chapter 5), it is shown that transferring effects from one modality to another (specifically from audition to vision) requires a careful choice of stimulus parameters. In this context, we consider the proposal of a wider use of sequential stopping rules (SSRs, chapter 4), especially in studies where effect sizes are hard to estimate a priori. This thesis contributes to the field of visual multistability by providing novel experimental insights into pattern-component rivalry and by linking these findings to data on sequential dependencies, to the optimization of experimental designs, and to models and results from another sensory modality. (Minimal illustrative sketches of a plaid stimulus and of a first-order analysis of sequential dependencies are given after the table of contents below.)
Bibliographic Description 3
Acknowledgments 4
CONTENTS 5
Collaborations 7
List of Figures 8
List of Tables 8
1. Introduction 9
1.1. Tristability 10
1.2. Two or more interpretations? 11
1.3. Multistability in different modalities 12
1.3.1. Auditory multistability 12
1.3.2. Haptic multistability 13
1.3.3. Olfactory multistability 13
1.4. Multistability with several interpretations 13
1.5. Measuring multistability 14
1.5.1. The optokinetic nystagmus 14
1.5.2. Pupillometry 15
1.5.3. Measuring auditory multistability 15
1.5.4. Crossmodal multistability 16
1.6. Factors governing multistability 16
1.6.1. Manipulations that do not involve the stimulus 16
1.6.2. Manipulation of the stimulus 17
1.6.2.1. Factors affecting the plaid stimulus 17
1.6.2.2. Factors affecting the auditory streaming stimulus 18
1.7. Goals of this thesis 18
1.7.1. Overview of the thesis 18
2. Parameter dependence in visual pattern-component rivalry at onset and during prolonged viewing 21
2.1. Introduction 21
2.2. Methods 24
2.2.1. Participants 24
2.2.2. Setup 24
2.2.3. Stimuli 25
2.2.4. Procedure 26
2.2.5. Analysis 27
2.2.6. (Generalized) linear mixed-effects models 30
2.3. Results 30
2.3.1. Experiment 1 30
2.3.1.1. Relative number of integrated percepts 31
2.3.1.2. Generalized linear mixed-effects model 32
2.3.1.3. Dominance durations 33
2.3.1.4. Linear mixed-effects models 33
2.3.1.5. Control: Disambiguated trials 33
2.3.1.6. Time course of percept reports at onset 34
2.3.1.7. Eye movements 35
2.3.2. Experiment 2 36
2.3.2.1. Relative number of percepts 36
2.3.2.2. Generalized linear mixed-effects model 37
2.3.2.3. Dominance durations 38
2.3.2.4. Linear mixed-effects model 38
2.3.2.5. Control: Disambiguated trials 40
2.3.2.6. Time course of percept reports at onset 42
2.3.2.7. Eye movements 44
2.4. Discussion 45
2.5. Appendix 49
2.5.1. Appendix A 49
3. Perceptual history 51
3.1. Markov chains 52
3.1.1. Markov chains of order 1 and 2 52
3.2. Testing for Markov chains 55
3.2.1. The method of Naber and colleagues (2010) 56
3.2.1.1. The method 56
3.2.1.2. Advantages and disadvantages of the method 56
3.2.2. Further methods for testing Markov chains 57
3.3. Summary and discussion 58
4. Sequential stopping rules 60
4.1. The COAST rule 61
4.2. The CLAST rule 61
4.3. The variable criteria sequential stopping rule 61
4.4. Discussion 62
4.5. Using the vcSSR when transferring an effect from audition to vision 64
5. Predictability in visual multistability 66
5.1. Pretests 66
5.2. Predictability effects in visual pattern-component rivalry 69
5.2.1. Introduction 69
5.2.2. Methods 71
5.2.2.1. Participants 71
5.2.2.2. Setup 72
5.2.2.3. Stimuli 73
5.2.2.4. Conditions 73
5.2.2.5. Design and procedure 73
5.2.2.6. Analysis 74
5.2.3. Results 75
5.2.3.1. Valid reports 75
5.2.3.2. Verification of reports by eye movements 76
5.2.3.3. Onset latency 76
5.2.3.4. Dominance durations 78
5.2.3.5. Relative dominance of the segregated percept 78
5.2.4. Discussion 78
6. General discussion 83
6.1. Reporting percepts 83
6.1.1. Providing two versus three response options 83
6.1.2. Stimuli with more than three percepts 84
6.1.3. When to pool percepts together and when not 84
6.1.4. Leaving out percepts 87
6.1.5. Measuring (unreported) percepts 88
6.2. Comparing influencing factors on different levels 88
6.3. The use of the vcSSR 90
6.4. Valid reports 90
6.5. Conclusion 93
References 94
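As a minimal illustrative sketch of the plaid stimulus described in the abstract (assumed Python/NumPy code, not taken from the thesis; all parameter names and values are placeholders), a plaid can be generated as the sum of two drifting sinusoidal gratings whose drift directions are separated by the opening angle:

import numpy as np

def plaid_frame(t, opening_angle_deg=120.0, spatial_freq=2.0,
                speed=1.0, size=256, extent=2.0):
    """One frame of a plaid: the sum of two sinusoidal gratings whose
    drift directions are separated by the opening angle. All parameter
    names and values are illustrative, not taken from the thesis."""
    axis = np.linspace(-extent / 2, extent / 2, size)
    x, y = np.meshgrid(axis, axis)
    half = np.deg2rad(opening_angle_deg) / 2.0
    frame = np.zeros((size, size))
    for direction in (-half, +half):            # the two component directions
        dx, dy = np.cos(direction), np.sin(direction)
        # Each grating drifts along its own direction at the given speed.
        phase = 2 * np.pi * spatial_freq * (x * dx + y * dy - speed * t)
        frame += 0.5 * np.sin(phase)
    return frame  # luminance contrast values in roughly [-1, 1]

# A short movie; perceptually this can be seen as one coherent plaid
# (integrated) or as two transparent gratings (segregated).
movie = np.stack([plaid_frame(t) for t in np.linspace(0, 1, 30)])
print(movie.shape)  # (30, 256, 256)

Varying opening_angle_deg changes how far the two component motions diverge, which is the kind of parameter manipulation chapter 2 examines.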
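Chapter 3 concerns sequential dependencies between percepts. A hedged sketch of a common first step, estimating a first-order Markov transition matrix from a sequence of percept reports, is shown below; the state coding and function name are illustrative assumptions, not the thesis's method:

import numpy as np

def transition_matrix(reports, n_states=3):
    """Estimate first-order Markov transition probabilities from a sequence
    of percept reports (illustrative coding: 0 = integrated, 1 and 2 = the
    two segregated components)."""
    counts = np.zeros((n_states, n_states))
    for prev, nxt in zip(reports[:-1], reports[1:]):
        counts[prev, nxt] += 1
    # Normalize each row to probabilities; rows without observations stay 0.
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts),
                     where=row_sums > 0)

# If percepts were independent of history, every row would match the overall
# percept frequencies; systematic deviations indicate first-order
# sequential dependencies.
reports = [0, 1, 1, 2, 0, 0, 1, 2, 2, 0, 1]
print(transition_matrix(reports))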
2. Effects of Gaze on Displays in the Context of Human-Machine Interaction. Schmitz, Inka, 13 May 2024.
Gaze is an important social signal in interactions between humans, but also in interactions between humans and artificial agents such as robots, autonomous vehicles, or avatars presented via displays. Gaze helps to recognize which persons or objects an interaction partner is attending to and to infer their intentions to act. By consciously directing the gaze, it is also possible to point to a specific position.
In many studies, arrows are used as learned, artificial directional cues for comparison with gaze cues. Three studies on different aspects of gaze perception form the core of this thesis. Study 1 deals with the estimation of gaze target positions in videoconferencing settings. For this purpose, pictures of persons ('senders' of gaze) who looked at certain positions on a screen were shown. The task of the participants ('receivers' of the gaze) was to estimate these positions and mark them on their own screen by mouse click. The results show that the precision of such estimates is greater in the horizontal than in the vertical direction. Furthermore, the estimates were biased toward the senders' face or eyes.
Studies 2 and 3 investigated the influence of cues on the control of visual attention using a so-called spatial cueing paradigm. After the appearance of a central cue, the participants' task was to respond by key press to target stimuli presented away from the center of the screen; whether the cue direction predicted the target position depended on the condition. In Study 2, schematic faces were compared with arrows. Data obtained with cues that always pointed in the direction opposite to the target suggest that the directional effect of arrows is more easily overridden than that of gaze. A dynamic, geometric cue stimulus was investigated in Study 3: a blue disc with two red lines moving from the center to the side served as the cue. This abstract stimulus produced a reaction-time advantage in the direction of motion of the lines. After several trials, videos were shown in which a human avatar wearing a blue helmet, also marked with red lines, was seen from behind. When the avatar looked to the side and turned its head, the lines moved in the direction opposite to its gaze. This was intended to elicit an alternative interpretation of the abstract cue as the back of the helmet. The results show a partial reduction of the initial effect of the abstract cue, suggesting that stimuli that do not resemble eyes can nevertheless be learned as gaze and used as cues.
In the context of human-machine interaction, the three studies provide fundamental insights into the human ability to estimate gaze orientation and into the effect of gaze cues, which are particularly relevant for the design of gaze displays. (A minimal sketch of how such a cueing effect can be quantified is given after the table of contents below.)
Contents
Bibliographic Description
Summary
1 Introduction 1
1.1 Gaze as social cue 2
1.1.1 Joint attention 2
1.1.2 Biological and evolutionary perspective 3
1.1.3 Reflective and volitional processes 3
1.2 Cueing paradigms 4
1.2.1 Arrows predicting target position 5
1.2.2 Schematic faces that look only randomly in the direction of the target 5
1.3 Human gaze 6
1.3.1 Attention orientation and eye movements 6
1.3.2 Visual sensors in humans 6
1.4 Artificial gaze 8
1.4.1 Stationary displays 8
1.4.2 Precision of gaze direction estimation 9
1.4.3 Sensor types 9
1.4.4 Gaze of robots and mobile vehicles 10
1.5 Gaze as a communicative sensor system 11
1.6 Goals of the thesis 12
2 Summary of individual studies 14
2.1 Study 1: Gaze estimation in videoconferencing settings 14
2.2 Study 2: Attentional cueing: gaze is harder to override than arrows 15
2.3 Study 3: Effects of Interpreting a Dynamic Geometric Cue as Gaze on Attention Allocation 17
3 Discussion 19
3.1 Performance of gaze orientation estimates 19
3.2 Social cues 20
3.3 Sensor-display linkage 21
3.4 Future Perspectives 23
3.4.1 Receivers’ eye movements 23
3.4.2 Gaze in complex and interactive settings 24
3.4.3 Spatial cueing paradigm in product development 25
3.5 Conclusions 26
4 References 28
5 Appendix 36
Author’s Contributions 36
Gaze estimation in videoconferencing settings 37
Attentional cueing: gaze is harder to override than arrows 76
Accuracy 90
Cue Type 90
Cueing 90
Cue Type : Cueing 90
Effects of Interpreting a Dynamic Geometric Cue as Gaze on Attention Allocation 107
Acknowledgments 130
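As a minimal sketch of how the cueing (validity) effect in a spatial cueing paradigm like the one used in Studies 2 and 3 might be quantified (assumed Python/NumPy code with made-up data; the thesis's actual analyses are more involved):

import numpy as np

def cueing_effect(rt_ms, cue_valid):
    """Mean reaction-time benefit of valid over invalid central cues.
    rt_ms: reaction times in milliseconds, one per trial.
    cue_valid: True where the cue pointed to the upcoming target position.
    The simple mean difference is illustrative only."""
    rt_ms = np.asarray(rt_ms, dtype=float)
    cue_valid = np.asarray(cue_valid, dtype=bool)
    return rt_ms[~cue_valid].mean() - rt_ms[cue_valid].mean()

# Made-up data: a positive value means faster responses after valid cues,
# i.e. attention was shifted in the cued direction.
rng = np.random.default_rng(0)
valid = rng.random(200) < 0.5
rts = 350 + 40 * (~valid) + rng.normal(0, 30, size=200)
print(round(cueing_effect(rts, valid), 1))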