1

Can participants extract subtle information from gesturelike visual stimuli that are coordinated with speech without using any other cues?

Abdalla, Marwa 01 May 2012
Embodied cognition is the reflection of an organism's interaction with its environment in its cognitive processes. We explored whether participants can pick up on subtle cues from gestures, using the Tower of Hanoi task. Previous research has shown that listeners are sensitive to the height of the gestures they observe and reflect this knowledge in their mouse movements (Cook & Tanenhaus, 2009). Participants in our study watched a modified video of someone explaining the solution to the Tower of Hanoi puzzle, in which they saw only a black background with two moving dots representing the hand positions from the original explanation in space and time. We parametrically manipulated the location of the dots to examine whether listeners were sensitive to this subtle variation. We selected the transfer gestures from the original explanation and tracked the hand positions with dots at varying heights relative to the original gesture height; the experimental gesture heights reflected 0%, 25%, 50%, 75% and 100% of this original height. We predicted, based on previous research (Cook, in prep.), that participants would be able to extract the differences in gesture height and reflect them in their mouse movements when solving the problem. Using a linear model for our analysis, we found that the starting trajectory confirmed our hypothesis. However, when we averaged the first 15 moves (the minimum needed to solve the puzzle) across the five conditions, the ordered effect of gesture height was lost, although there were still apparent differences between the gesture heights. This is an important finding because it shows that participants are able to glean subtle height information from gestures. Listeners truly interpret iconic gestures iconically.
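The analysis described above relates mouse-movement trajectories to the five gesture-height conditions with a linear model. As a purely illustrative sketch (the data, condition coding and trajectory measure below are invented, not taken from the thesis), such a fit could look like this:

```python
import numpy as np

# Hypothetical data: one row per trial, with the gesture-height condition
# (proportion of the original height) and the initial vertical angle of the
# participant's mouse trajectory at the start of the corresponding move.
conditions = np.array([0.00, 0.25, 0.50, 0.75, 1.00] * 20)  # 5 conditions x 20 trials
rng = np.random.default_rng(0)
start_angle = 10 + 8 * conditions + rng.normal(0, 2, conditions.size)  # simulated effect

# Simple linear model: start_angle ~ intercept + slope * condition.
slope, intercept = np.polyfit(conditions, start_angle, deg=1)
predicted = intercept + slope * conditions
r_squared = 1 - np.sum((start_angle - predicted) ** 2) / np.sum((start_angle - start_angle.mean()) ** 2)

print(f"slope = {slope:.2f} (trajectory angle per unit of gesture height)")
print(f"intercept = {intercept:.2f}, R^2 = {r_squared:.2f}")
```

A positive, reliable slope in such a model would correspond to the ordered effect of gesture height that the abstract reports for the starting trajectory.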
2

Crossmodal correspondences between visual, olfactory and auditory information

Persson, Viktor January 2011
Our senses take in a large amount of information, information that is sometimes congruent across sensory modalities. The study of crossmodal correspondences examines how this information is integrated across modalities by the brain, along which dimensions the correspondences exist, and how they affect us. In the present paper four experiments were conducted, in which potential crossmodal correspondences between audition, vision and olfaction were investigated. It was hypothesized that crossmodal correspondences between olfaction, vision and audition exist along different dimensions. The results showed significant correlations between olfaction and audition when volume varied, i.e., a high volume was associated with a high concentration of an odor and a low volume with a low concentration. Furthermore, existing correspondences between vision and audition were reconfirmed. In conclusion, the results provide support for the notion that crossmodal correspondences exist between all sensory modalities, although along different dimensions.
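The olfaction-audition result above is a correlation between odor concentration and sound volume. As an illustration only, with invented data and variable names rather than the thesis's actual ratings, such a correlation could be computed as follows:

```python
import numpy as np
from scipy import stats

# Hypothetical trial data: the concentration of the presented odor (arbitrary
# units) and the loudness participants matched to it (arbitrary units).
odor_concentration = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0])
matched_volume = np.array([22, 25, 31, 30, 38, 41, 45, 44, 52, 55])

# A positive correlation would be consistent with "high volume goes with
# high odor concentration" as reported in the abstract.
r, p = stats.pearsonr(odor_concentration, matched_volume)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```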
3

Do Crossmodal Correspondences Found between Marketed Shampoo Fragrances and the Angularity of Shapes Transfer to the Shape of 2-dimensional and 3-dimensional Shampoo Bottle Designs?

Cessna, Trevor C. 02 June 2016
No description available.
4

Musik som "krydda" : En studie i hur musik kan användas på stjärnkrogar för att förhöja upplevelsen av mat och dryck [Music as "seasoning": a study of how music can be used at starred restaurants to enhance the experience of food and drink]

Pettersson, Kristian January 2016
When the restaurant guide Guide Michelin awards its prestigious stars, the review is to be based solely on the quality of the food and beverages. However, when visiting a starred restaurant we encounter a whole atmosphere that we experience through our senses (sight, hearing, touch, smell and taste). The experience of food and beverages is a multisensory experience created by neurological processes in the brain, so everything our senses register before and during the meal will affect how we experience what we are eating and drinking. In light of this, questions can be raised about the reviews of the Guide Michelin: are there other factors besides the quality of the food and beverages influencing the assessment of the restaurants? Can music be such an influence? This study has examined in which ways starred restaurants can develop the meal experience, specifically how music can enhance the experience of food and beverages. Semi-structured interviews were conducted with four starred restaurants in Stockholm. In addition, the study surveys and gives an insight into the research that attempts to find connections between music, food and drink. The results show that it is possible for starred restaurants to use music in the design of the meal experience in order to enhance the experience of food and beverages. It has not, however, been possible to construct an explicit model for how this should be done; only recommendations and suggestions are given, which in turn would require the starred restaurants studied to change their current ways of working. Further research is seen as necessary before research on how music can influence the experience of food and drink can gain a broader practical application.
5

Crossmodal correspondences and attention in the context of multisensory (product) packaging design : applied crossmodal correspondences

Velasco, Carlos January 2015
The term 'crossmodal correspondence' refers to the tendency for people to match information across the senses. In this thesis, the associations between taste/flavour information (tastants and words) and shapes and colours are investigated. Furthermore, such correspondences are addressed in the context of multisensory packaging design. The focus of this thesis is on the way in which taste/flavour information can be communicated by means of the visual elements of product packaging. Through a series of experiments, I demonstrate that people associate tastes with the roundness/angularity of shapes, and that taste quality, hedonics, and intensity influence such correspondences. However, packaging roundness/angularity does not seem to drive these associations. Additionally, I demonstrate that culture and context systematically influence colour/flavour associations. Importantly, the results reported in this thesis suggest that taste/shape correspondences can influence taste expectations as a function of the visual attributes of product packaging. The results reported here also reveal that colour can influence the classification of, and search for, flavour information on a product's packaging; it turns out that the strength of the association between a flavour category and a colour is crucial to such an effect. The implications of these findings are discussed in light of theories of crossmodal correspondences, their applications, and directions for future research.
6

Investigating the Neural Correlates of Crossmodal Facilitation as a Result of Attentional Cueing: An Event-Related fMRI Study

Fatima, Zainab 25 July 2008
Attentional cueing modulated neural processes differently depending on input modality. I used event-related fMRI to investigate how auditory and visual cues affected reaction times to auditory and visual targets. Behavioural results showed that responses were faster when cues preceded targets and when cues were auditory rather than visual. The first result was supported by an increase in BOLD percent signal change in sensory cortices upon cue, but not target, presentation. Task-related activation patterns showed that the auditory cue activated auditory and visual cortices, while the visual cue activated the visual cortices and the fronto-polar cortex. Next, I computed brain-behaviour correlations for both cue types, which revealed that the auditory cue recruited medial visual areas and a fronto-parietal attentional network to mediate behaviour, while the visual cue engaged a posterior network composed of lateral visual areas and subcortical structures. The results suggest that crossmodal facilitation occurs via independent neural pathways depending on cue modality.
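The cue-related BOLD effect above is reported as percent signal change. A minimal sketch of that kind of computation, using invented ROI values rather than the study's data, might look like this:

```python
import numpy as np

# Hypothetical event-related averages (one value per TR) for a sensory ROI,
# in arbitrary scanner units; these are not values from the thesis.
baseline = 1000.0  # assumed pre-stimulus ROI mean
cue_epoch = np.array([1001, 1006, 1012, 1015, 1010, 1004], dtype=float)
target_epoch = np.array([1000, 1001, 1002, 1002, 1001, 1000], dtype=float)

def percent_signal_change(epoch, baseline):
    """Peak BOLD percent signal change of an event-related epoch vs. baseline."""
    return 100.0 * (epoch.max() - baseline) / baseline

print(f"cue PSC    = {percent_signal_change(cue_epoch, baseline):.2f}%")
print(f"target PSC = {percent_signal_change(target_epoch, baseline):.2f}%")
```

A larger value for the cue epoch than the target epoch would mirror the pattern the abstract describes (signal change upon cue but not target presentation).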
7

The influence of auditory cues on visual spatial perception

Geeseman, Joseph W. 01 December 2010
Traditional psychophysical studies have been primarily unimodal experiments due to the ease with which a single sense can be isolated in a laboratory setting. This study, however, presents participants with auditory and visual stimuli to better understand the interaction of the two senses in visuospatial perception. Visual stimuli, presented as Gaussian-distributed blobs, moved laterally across a computer monitor to a central location and "bounced" back to their starting position. During this passage across the screen, a brief auditory "click" was presented via headphones. Participants were asked to respond to the bounce of the ball, and response latency was recorded. Response latency to the bounce varied as a function of condition: baseline (no sound) versus the varying sound-offset locations.
8

AUDITORY CUES AND RESPONSE MODES MEDIATE PERIPHERAL VISUAL MISLOCALIZATION

Geeseman, Joseph W. 01 August 2012
The current study investigates the influence of auditory cues on the localization of briefly presented peripheral visual stimuli. Because the brief presentation of peripheral visual stimuli often leads to mislocalization (Binda, Morrone, & Burr, 2010; Bocianski, Musseler, & Erlhagen, 2008; Musseler, Heijden, Mahmud, Dubel, & Ertsey, 1999), these types of stimuli are the most commonly studied and form the basis of the current study. Musseler et al. (1999) found that peripheral mislocalization toward the fovea occurred during asynchronous presentations of a pair of visual stimuli in the retinal periphery, but not during synchronous presentations. The current project investigates how sound influences the mislocalization of briefly presented peripheral stimuli. If the mechanism of mislocalization is an increased variability of responses when the peripheral stimuli are presented asynchronously, could sound reduce the variability of localization judgments and thus reduce or eliminate the mislocalization effect? Does sound influence peripheral mislocalization in some other way? This study found that during a relative judgment task, a brief, laterally presented sound leads to mislocalization of a target stimulus toward the direction of the sound (Experiment 1). During an absolute judgment task, however, the brief, laterally presented sound no longer evokes mislocalization in the direction of the sound; rather, stimulus onset asynchrony elicits mislocalization similar to the results of Musseler et al. (Experiment 2). When a dynamic sound stimulus occurs prior to the onset of the target stimulus during an absolute judgment task, however, sound idiosyncratically influences the localization of the target stimulus toward the onset of the sound stimulus or the direction of the apparent motion of the sound stimulus (Experiment 3).
9

Auditory cuing of visual attention : spatial and sound parameters

Lee, Jae Won January 2017
The experiments reported in this thesis investigate whether the current understanding of crossmodal spatial attention can be applied to rear space, and how sound parameters can modulate crossmodal spatial cuing effects. It is generally accepted that the presentation of a brief auditory cue can exogenously orient spatial attention to the cued region of space, so that reaction times (RTs) to visual targets presented there are faster than to those presented elsewhere. Contrary to the conventional view of such crossmodal spatial cuing effects, RTs to visual targets were facilitated equally by an auditory cue presented in front or behind, as long as the stimuli were presented ipsilaterally. Moreover, when an auditory cue and a visual target were presented from one of two lateral positions on each side in front, the spatial co-location of the two stimuli did not always lead to the fastest target RTs. Although this contrasts with the traditional view of the importance of cue-target spatial co-location in exogenous crossmodal cuing effects, such findings are consistent with the evidence concerning multisensory integration in the superior colliculus (SC). Further investigation revealed that the presentation of an auditory cue with an exponential intensity change might be able to exogenously orient crossmodal spatial attention narrowly to the cued region of space. Taken together, the findings reported in this thesis suggest that not only the location but also the sound parameters (e.g., intensity change) of auditory cues can modulate the crossmodal exogenous orienting of spatial attention.
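The cuing effects above are differences in reaction time between cued and uncued visual targets. A minimal sketch of how such an exogenous cuing effect is commonly summarised, using invented RTs rather than the thesis's data, could look like this:

```python
import numpy as np

# Hypothetical reaction times (ms) to visual targets, split by whether the
# auditory cue appeared on the same (cued) or opposite (uncued) side.
rt_cued = np.array([312, 298, 305, 321, 290, 308, 315, 300])
rt_uncued = np.array([334, 341, 322, 350, 338, 329, 345, 336])

# The exogenous crossmodal cuing effect is often expressed as the mean RT
# difference between uncued and cued targets.
cuing_effect = rt_uncued.mean() - rt_cued.mean()
print(f"cuing effect = {cuing_effect:.1f} ms (positive = faster at cued location)")
```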
