  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
231

Análise de consistência de traçado de uma rodovia de múltiplas faixas. / Geometric design consistency analysis of a multilane highway.

Ana Luísa Martins Torres 22 October 2015 (has links)
In Brazil, the number of traffic accidents has been increasing in recent decades, and speeding is one of the main causes of accidents on Brazilian highways. The speeds drivers adopt are also a function of the road's geometric elements (radius, grade, lane width, etc.). A consistent geometric design does not violate drivers' expectations and thus supports safe operation. Most drivers can perceive coordination failures but are, technically, unaware of their origin. This research performs a consistency analysis of a stretch of a Brazilian multilane highway with a high accident rate and a large volume of commercial vehicles. The sections with the highest accident occurrence were identified, and speeds were measured to develop a model for predicting the operating speed (V85) along the study stretch. With this model, design consistency was analyzed using the safety-criteria method, which identified two sections with consistency problems. Finally, it was verified whether these sections corresponded to the sites with the most accidents: tangent T5 precedes a curve with a high accident rate (km 511+000), and the site with the highest accident concentration (km 514) was classified as FAIR.
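The operating speed V85 used in this analysis is, by convention, the 85th percentile of measured spot speeds. A minimal sketch of that computation, using linear interpolation between order statistics (the speed sample below is hypothetical, not the thesis's data):

```python
def v85(speeds):
    """85th-percentile operating speed, via linear interpolation
    between the two nearest order statistics."""
    s = sorted(speeds)
    pos = 0.85 * (len(s) - 1)   # fractional rank of the 85th percentile
    lo = int(pos)
    frac = pos - lo
    if lo + 1 < len(s):
        return s[lo] + frac * (s[lo + 1] - s[lo])
    return s[lo]

# hypothetical spot speeds (km/h) at one measurement section
speeds = [78, 82, 85, 88, 90, 91, 93, 95, 98, 104]
print(v85(speeds))
```

In practice, spot speeds of free-flowing vehicles would be collected at each tangent and curve, and V85 regressed against the geometric elements (radius, grade, lane width) to obtain the prediction model.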
232

Use of Global Consistency Checking for Exploring and Refining Relationships between Distributed Models : A Case Study

Rad, Yasaman Talaei, Jabbari, Ramtin January 2012 (has links)
Context. Software systems are becoming larger and more complex, making software development processes harder to understand and manage. Many companies have adopted distributed software engineering practices that allow teams to work across different organizations and/or geographical locations, and model-driven engineering methods are used in such global software engineering projects. Among the activities in model-based software development, consistency checking is one of the most widely known. It is concerned with keeping models consistent, in particular across a group of models that together describe a whole system, e.g., models produced by distributed teams. Objectives. This thesis investigates how Global Consistency Checking (GCC) can be used to explore inconsistency problems between distributed models, particularly among UML class diagram relationships, and how GCC scales with large numbers of models and relationships. The approach also aims to resolve these inconsistencies incrementally. Methods. We reviewed the literature on distributed software development and model management, in particular methods for consistency checking between Distributed Models (DM). We then conducted two case studies in two problem domains to apply our consistency checking methodology. In parallel, we formulated and implemented new consistency rules, most of which were gathered from the literature and from discussions with our supervisors. In general, the method consists of implementing the case studies' models with tool support, identifying overlaps, merging the models, checking the merged model against the consistency rules, and evaluating the results of GCC.
Existing work mostly addresses consistency checking of individual models and the mappings between them, e.g., pair-wise consistency checking (PCC), which cannot fully check arbitrary consistency rules in distributed environments. Results. We identified seven types of inconsistency, divided into two groups: global inconsistencies and pair-wise inconsistencies. The first case study yielded 94 global and 73 pair-wise inconsistencies; the second yielded 14 global and 25 pair-wise inconsistencies. In the resolution approach, we followed six steps as a systematic procedure for resolving these inconsistencies, constructing a new merged model in each iteration. The initial (inconsistent) merged model used as input to the first step had 1267 elements; the consistent merged model output by the sixth step had 686 elements. The time and effort required to check consistency against each rule were recorded, analyzed, and illustrated in Sections 4.1.5 and 4.2.4. Conclusions. We conclude that GCC enables exploring inconsistencies, resolving them, and thereby refining the relationships between different models, which are difficult to detect with, e.g., a pair-wise method. The most important findings are: the number of model comparisons required by PCC; PCC's inability to identify some inconsistencies; and that refining and classifying model relationships with PCC alone does not lead to a final consistent DM, whereas GCC guarantees it. The application of consistency rules, and the identification and resolution of inconsistencies, can be generalized to any UML class diagram representing a problem domain within the field of consistency checking in software engineering.
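The thesis's actual rules and tooling are not reproduced here, but the core argument, that merging all models and checking globally can catch inconsistencies that no pair-wise check sees, can be illustrated with a toy example. The rule (no cycles among UML generalizations) and the model contents below are assumptions for illustration only:

```python
from itertools import combinations

def has_cycle(edges):
    """Detect a cycle in a generalization graph (child -> parent edges)."""
    graph = {}
    for child, parent in edges:
        graph.setdefault(child, []).append(parent)

    def visit(node, stack):
        if node in stack:
            return True
        stack.add(node)
        found = any(visit(p, stack) for p in graph.get(node, []))
        stack.discard(node)
        return found

    return any(visit(n, set()) for n in list(graph))

# three distributed models, each contributing one generalization
model_a = [("Dog", "Animal")]
model_b = [("Animal", "Pet")]
model_c = [("Pet", "Dog")]
models = [model_a, model_b, model_c]

# pair-wise checking (PCC): merge and check two models at a time
pairwise = [has_cycle(m1 + m2) for m1, m2 in combinations(models, 2)]
# global checking (GCC): merge all models, then check the merged model
global_check = has_cycle(model_a + model_b + model_c)

print(pairwise, global_check)
```

Every pair of models passes the rule, yet the fully merged model contains an inheritance cycle: a "global inconsistency" in the terminology above, invisible to PCC by construction.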
233

Classifying Research on UML model Inconsistencies with Systematic Mapping

Thalanki, Pavan Kumar, Maddukuri, Vinay Kiran January 2013 (has links)
Context: The Unified Modeling Language (UML) is a universal, standard modeling language that is extensively used in the software development process. Due to the overlap and synchronization required among different UML modeling artefacts, consistency issues have been identified in many software development projects and may lead to project failure. To reduce this threat, substantial research addressing these problems has been carried out over the past decade in both academia and industry. This study investigates the reported research and provides a systematic picture of the different researched aspects of UML model inconsistencies, using the systematic mapping method. Objectives: The overall goal was pursued through two main objectives: elaborating a proper, justified mapping process, and then applying it to obtain a systematic, multidimensional picture of the approaches and research on inconsistencies when using UML in software development. Research Methods: To ensure the quality of the final systematic picture, considerable effort was first put into preparing the tool used to obtain the mapping: a rigorous process based on classification methods and mapping guidelines obtained from a systematic literature review on systematic mapping in software engineering. The tool was then applied systematically to obtain a number of mappings, followed by analysis of the results. Results: The systematic literature review identified 5 mapping guidelines, 21 classifications, and 2 categorization methods. After analyzing them, a justified mapping process was developed by selecting standard guidelines and appropriate classifications and categorization methods.
The mapping process, applied to the period 1999-2012, revealed 198 relevant studies by 321 researchers. On the basis of this evidence, a number of mappings illustrating the research on UML model inconsistencies were obtained. The mapping revealed that the published research focuses mostly on formal issues such as semantic, syntactic, intra-model, inter-model, and evolution problems, while less attention is paid to the more practical time and security problems. Regarding research quality, 38% of the papers proposed solutions and validated them in academia, industry, or both, while 35% proposed solutions only. Regarding empirical methods, case studies are the most frequently used (in almost half of the relevant papers), followed by experiments (reported in 15% of papers), while 25% of the works do not report a systematic method. Conclusions: The findings of the systematic mapping study reveal that some aspects of consistency, such as time and security, receive little attention. Identification and in-depth study of inconsistencies in UML designs, along with their dependencies, are also missing. Most investigations are academic, with no evidence of whether these reports generate industry interest. State-of-the-art studies followed by state-of-the-practice studies on consistency checking techniques, validated in real industrial settings, are recommended.
235

How to implement Bounded-Delay replication in DeeDS

Eriksson, Daniel January 2002 (has links)
In a distributed database system, pessimistic concurrency control is often used to ensure consistency, which makes the execution time of a transaction unpredictable: it depends not only on local transactions but on every transaction in the system. In real-time database systems it is important that transactions are predictable. One way to make them predictable is to use eventual consistency, where transactions commit locally before being propagated to other nodes. Transactions can then be predictable because their execution time depends only on concurrent transactions on the local node, not on delays at other nodes or in the network. This report investigates how a replication protocol using eventual consistency can be designed for, and implemented in, DeeDS, a distributed real-time database prototype. The protocol consists of three parts: a propagation method, a conflict detection algorithm, and a conflict resolution mechanism. The conflict detection algorithm is based on version vectors, and the focus is on the propagation mechanism and the conflict detection algorithm. An implementation design of the replication protocol is presented, together with a discussion of how version vectors may be applied at different granularities (container, page, object, or attribute) and how the log filter should be designed and implemented to suit the conflict detection algorithm. A number of test cases, focused on regression testing, have been defined. It is concluded that the feasibility of the conflict detection algorithm depends on the type of application that uses DeeDS.
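DeeDS's actual implementation is not shown here; the following is a generic sketch of how version-vector comparison detects update conflicts between replicas after commit-then-propagate. Node names and counter values are illustrative:

```python
def compare(vv_a, vv_b):
    """Compare two version vectors (dicts of node -> update counter).
    Returns the causal relation between the two replica states."""
    nodes = set(vv_a) | set(vv_b)
    a_le_b = all(vv_a.get(n, 0) <= vv_b.get(n, 0) for n in nodes)
    b_le_a = all(vv_b.get(n, 0) <= vv_a.get(n, 0) for n in nodes)
    if a_le_b and b_le_a:
        return "equal"
    if a_le_b:
        return "a precedes b"   # b's state dominates; safe to apply b
    if b_le_a:
        return "b precedes a"   # a's state dominates; safe to keep a
    return "conflict"           # concurrent updates: needs resolution

# hypothetical replica states for one data item
local  = {"n1": 2, "n2": 1}
remote = {"n1": 1, "n2": 3}
print(compare(local, remote))
```

Because neither vector dominates the other here, the updates were concurrent, which is exactly the case the protocol's conflict resolution mechanism must handle. The granularity question discussed above amounts to choosing what a vector is attached to: a container, a page, an object, or a single attribute.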
236

On recovery and consistency preservation in distributed real-time database systems

Gustavsson, Sanny January 2000 (has links)
In this dissertation, we consider the problem of recovering a crashed node in a distributed database. We especially focus on real-time recovery in eventually consistent databases, where the consistency of replicated data is traded off for increased predictability, availability and performance. To achieve this focus, we consider consistency preservation techniques as well as recovery mechanisms. Our approach is to perform a thorough literature survey of these two fields. The literature survey considers not only recovery in real-time, distributed, eventually consistent databases, but also related techniques, such as recovery in main-memory resident or immediately consistent databases. We also examine different techniques for consistency preservation. Based on this literature survey, we present a taxonomy and state-of-the-art report on recovery mechanisms and consistency preservation techniques. We contrast different recovery mechanisms, and highlight properties and aspects of these that make them more or less suitable for use in an eventually consistent database. We also identify unexplored areas and uninvestigated problems within the fields of database recovery and consistency preservation. We find that research on real-time recovery in distributed databases is lacking, and we also propose further investigation of how the choice of consistency preservation technique affects (or should affect) the design of a recovery mechanism for the system.
237

Only a Shadow : Industrial computed tomography investigation, and method development, concerning complex material systems

Jansson, Anton January 2016 (has links)
The complexity of components fabricated in today's industry is ever increasing. This increase is partly due to market pressure, but it is also a result of progress in fabrication technologies that opens up new possibilities. The increased use of additive manufacturing and multi-material systems, especially, has driven the complexity of parts to new heights. These new complex material systems bring benefits in many areas, such as mechanical properties, weight optimisation, and sustainability. However, the increased complexity also makes material integrity investigations and dimensional control more difficult. In additive manufacturing, for example, internal features can be fabricated that cannot be seen or measured with conventional tools. There is thus a need for non-destructive inspection methods that can measure these geometries. One such method is X-ray computed tomography, which uses the ability of X-rays to penetrate material to create 3D digital volumes of components. Measurements and material investigations can be performed on these volumes without any damage to the investigated component. However, computed tomography in materials science is not yet a fully mature method, and many uncertainties are associated with the technique. In the work presented in this thesis, geometries fabricated by various additive manufacturing processes have been investigated using computed tomography. A dual-energy computed tomography tool has also been developed, with the aim of increasing the measurement consistency of computed tomography when investigating complex geometries and material combinations. / MultiMatCT
238

Photon migration in pulp and paper

Saarela, J. (Juha) 07 December 2004 (has links)
Abstract: The thesis demonstrates that photon migration measurements allow characterization of pulp and paper properties, especially the fines and filler content of pulp, and the basis weight, thickness, and porosity of paper. Pulp and paper are materials of worldwide significance whose properties depend strongly on the manufacturing process used. For efficient process control, monitoring and measurement must be fast, so it is worthwhile to develop new approaches and techniques for such measurements; recent advances in optics offer new possibilities. If two samples have different optical properties, their photon migration distributions differ, so measuring a photon migration distribution makes it possible to distinguish features of two optically slightly dissimilar samples. Simple measurements, yielding only the photons' average time of flight, were made with an oscilloscope and a time-of-flight lidar. More precise measurements, yielding the photon pathway distribution or selected characteristics such as light-pulse rise time, broadening, or fall time, were made with a streak camera. Two methods to assess the photon path length distribution were introduced: particle determination with simulation, and streak-camera measurement with deconvolution. The basic properties are consistency and fines content for pulp, and thickness, basis weight, and porosity for paper; the influence of changes in these basic properties on photon migration was determined. As pulp and paper are rarely this basic, an additional property was demonstrated for each material: for pulp, the content of filler talc, and for paper, the use of beaten pulp as a raw material. These additional properties were also distinguishable.
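As an illustration of the kind of quantity such measurements yield, a detected photon pulse recorded as a histogram of arrival times can be reduced to an average time of flight and, given an assumed effective refractive index of the medium, a mean optical path length. The histogram values and the index n = 1.33 below are hypothetical, not the thesis's data:

```python
def mean_time_of_flight(times, counts):
    """Centroid of a detected photon pulse: the photons' average
    time of flight, weighted by detected counts per time bin."""
    return sum(t * c for t, c in zip(times, counts)) / sum(counts)

def mean_path_length(times, counts, n=1.33):
    """Convert mean flight time to mean optical path length,
    assuming an effective refractive index n for the medium."""
    c_vacuum = 2.998e8   # speed of light in vacuum, m/s
    return mean_time_of_flight(times, counts) * c_vacuum / n

# hypothetical streak-camera histogram: time bins (s), photon counts
times  = [t * 1e-12 for t in (0, 50, 100, 150, 200)]
counts = [0, 10, 40, 30, 10]
print(mean_path_length(times, counts))
```

Differences in scattering and absorption (from fines, fillers, or porosity) shift this distribution, which is why its shape carries information about the sample.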
239

A general framework for modifying health-relevant behavior: reducing undergraduate binge drinking by appealing to commitment and reciprocity

Conner, Amy E. January 1900 (has links)
Doctor of Philosophy / Department of Psychology / Laura A. Brannon / Binge drinking is a serious health problem among American college students (Wechsler, Lee, Kuo, & Lee, 2000a). One technique that may reduce binge drinking is compliance. Cialdini (2001) defined compliance as taking an action because it has been requested and described sequential request tactics, including the commitment/consistency-based foot-in-the-door (FITD) tactic, and the reciprocity-based door-in-the-face (DITF) tactic. Cialdini claimed that these tactics yield automatic compliance. The present research investigated Cialdini’s automaticity assumption within the context of reducing binge drinking, by including a neutral or weak message along with the compliance request (consistent with Brannon & Brock, 2001). The main hypothesis was that compliance is not automatic, as demonstrated by differential compliance consistent with message strength. Parallel experiments investigated compliance with requests to reduce one’s drinking behavior (Experiment 1, N=129) or communicate about responsible drinking (Experiment 2, N=122). Participants were randomly assigned to one of six conditions in each experiment. Consistent with the purpose of each experiment, participants indicated whether they would comply with initial requests consistent with FITD and DITF methodology, or were not asked to comply with an initial request (control); read either a neutral or weak message about the importance of moderate alcohol consumption; then responded to the target request (dependent variable) by reporting the likelihood that they would not drink excessively for one week (Experiment 1) or would discuss responsible drinking with someone (Experiment 2). Participants in both experiments completed demographic and alcohol consumption information and a social desirability measure (Strahan & Gerbasi, 1972). 
Data were submitted to 2(Strength) × 3(Appeal) × 2(Gender) ANCOVAs (drinks per occasion and social desirability were covariates). Experiment 1 revealed a significant Strength × Appeal interaction, with the DITF and FITD appeals eliciting lower compliance rates than the control appeal when accompanied by a weak persuasive message, thereby refuting Cialdini’s automaticity assumption. A significant main effect for appeal in Experiment 2 (DITF yielded lower compliance than FITD or control appeal) did not support Cialdini’s (2001) claim. Correlates of drinking behavior among college students are discussed, as are implications of the present research for compliance theory and reducing binge drinking on American college campuses.
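The reported Strength × Appeal interaction means the effect of the appeal type changed with message strength. A toy sketch of that logic, computing cell means and the appeal gap under each message strength, on hypothetical data (not the dissertation's):

```python
from statistics import mean

def cell_means(rows):
    """Mean compliance rating per (strength, appeal) cell.
    rows: (strength, appeal, compliance) tuples."""
    cells = {}
    for strength, appeal, y in rows:
        cells.setdefault((strength, appeal), []).append(y)
    return {cell: mean(ys) for cell, ys in cells.items()}

# hypothetical ratings mirroring the reported direction: FITD falls
# below control under a weak message, but not under a neutral one
rows = [
    ("weak", "FITD", 3.0), ("weak", "FITD", 3.2),
    ("weak", "control", 4.1), ("weak", "control", 4.3),
    ("neutral", "FITD", 4.2), ("neutral", "FITD", 4.4),
    ("neutral", "control", 4.1), ("neutral", "control", 4.3),
]
m = cell_means(rows)
weak_gap = m[("weak", "FITD")] - m[("weak", "control")]
neutral_gap = m[("neutral", "FITD")] - m[("neutral", "control")]
print(weak_gap, neutral_gap)
```

A negative gap under the weak message with no comparable gap under the neutral message is the pattern that argues against automatic compliance: if the tactic worked automatically, message strength would not matter.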
240

Cause Placement: A Conceptual Framework and Empirical Findings

Shoreibah, Ream A. 22 June 2016 (has links)
The use of embedded marketing, the practice of seamlessly integrating advertising messages into entertainment vehicles, continues to grow as media consumption shifts to on-demand forms and reaching audiences with traditional advertising becomes more challenging. This dissertation investigates cause placement, the term proposed for the social marketing equivalent of product placement, the more widely known form of embedded marketing. Cause placement is the promotion of pro-social causes by verbally and/or visually inserting related elements into entertainment programming. Cause placement merits its own stream of research because consumers are expected to react differently to the placement of social issues than to the placement of commercial products; however, it has received little empirical attention. This two-essay dissertation proposes a theoretical framework for the effects of six independent variables, three of which have not previously been investigated in the embedded marketing literature, on three dependent variables that measure the effectiveness of cause placement. The independent variables are placement modality, placement centrality, programming genre, image of the character, consistency of the behavior, and brandedness of the cause. The dependent variables are recall of the cause, attitude toward the cause, and intention to support the cause. Each essay tests a portion of the proposed framework. Essay 1 (Chapter 4) investigates the effects of brandedness of the cause and placement modality on the three dependent variables using a 2 (branded/unbranded) × 3 (verbal/visual/both) between-subjects design. As hypothesized, a branded cause yielded better recall than an unbranded one regardless of modality. Contrary to expectations, however, there was no interaction effect between modality and brandedness on attitude toward the cause or intention to support the cause.
The branded cause produced more favorable attitudes than the unbranded one, and there were no significant differences among the groups in intention to support the cause, likely due to a ceiling effect caused by the familiar cause used. The pattern of results for attitude toward the cause was in the predicted direction: in the unbranded conditions, the combined verbal and visual modality produced the highest attitude, while in the branded conditions the opposite was true. Essay 2 (Chapter 5) investigates the effects of image of the character and consistency of the behavior on the three dependent variables using a 2 ("good guy"/"bad guy") × 2 (consistent/inconsistent) between-subjects design. As hypothesized, recall of the cause was higher when the main character's behavior was consistent with his personality, regardless of the image of the character. Also as predicted, there was an interaction between image of the character and consistency of the behavior: attitude toward the cause was higher for consistent than for inconsistent behavior when the character was a "bad guy," but there was no significant difference between consistent and inconsistent behavior when the character was a "good guy." The analogous pattern hypothesized for intention to support the cause did not hold, however, perhaps due to a moral obligation participants may have felt to follow the promoted behavior regardless of their personal attitude toward the cause. Limitations of both essays are discussed, as well as areas for future research.
