71

Contamination modeling of Listeria monocytogenes to improve surveillance in the food industry

Commeau, Natalie 04 June 2012 (has links)
Food business operators are responsible for the quality of the products they sell. One way to assess food safety is to determine the contamination distribution. In this thesis, we used data on L. monocytogenes collected during the manufacture of diced bacon and of cold smoked salmon. We then constructed several hierarchical models to describe the concentration, with and without various kinds of variability such as between-batch variability, estimated the parameters by Bayesian inference, and compared the capacity of each model to simulate data close to the observations. We also compared parameter estimation by frequentist inference on two models, using both the raw data from the microbiological analyses and the same data converted into concentrations. In addition to the models describing contamination at a single step of the process, we improved an existing model describing the fate of L. monocytogenes throughout the diced bacon process. Since the sampling plan is the tool used to assess product quality, we applied Bayesian decision theory to the pairs L. monocytogenes/diced bacon and L. monocytogenes/cold smoked salmon at the end of the process, to determine the optimal number of samples analysed per batch so that the average cost to the manufacturer is as low as possible. Finally, we compared several sampling plans for measuring the temperature of a dish in sauce prepared in an institutional kitchen and placed in a blast chiller immediately after cooking. The aim was to select the best sampling plan given the risk of C. perfringens growth that the manager is prepared to accept.
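The sampling-plan optimisation described above (choosing the per-batch sample size that minimises the manufacturer's average cost) can be illustrated with a toy expected-cost calculation. All figures and names below are invented for illustration; the thesis's actual decision model is Bayesian and far richer.

```python
def expected_cost(n, p_contaminated=0.05, test_cost=15.0,
                  recall_cost=50_000.0, detect_prob=0.6):
    """Toy average cost per batch when n units are analysed.

    Hypothetical figures: each analysed unit costs `test_cost`; a
    contaminated batch that escapes detection costs `recall_cost`;
    each unit drawn from a contaminated batch tests positive with
    probability `detect_prob`, independently.
    """
    p_miss = (1 - detect_prob) ** n          # all n analyses come back negative
    return n * test_cost + p_contaminated * p_miss * recall_cost

# The optimal sample size balances analysis costs against recall risk.
best_n = min(range(51), key=expected_cost)
```

With these invented numbers the average cost first falls, because sampling catches more contaminated batches, then rises once analysis costs dominate, so a finite optimum exists.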
72

Semantic Inferences in Information Retrieval for Hypermedia Applications

Cristiano Braz Rocha 27 October 2003 (has links)
The information overload problem is one of the most challenging problems faced today. To address it, areas as different as Knowledge Management, the Semantic Web and Hypermedia Application Modeling have adopted similar solutions, which consist basically of structuring information semantically so that it can be accessed more easily. This dissertation proposes an infrastructure based on classic Artificial Intelligence techniques and algorithms that exploits the increasing availability of domain models to enable the applications in which they are defined to make inferences about those domains. These inferences enable several new functionalities in such applications. Four new functionalities were proposed and implemented, the most important being semantic search. The new functionalities were successfully tested in two existing applications: the website of the Computer Science Department of PUC-Rio and the Portinari Knowledge Portal, which presents the work of the famous Brazilian painter Candido Portinari.
73

Standard and Non-standard reasoning in Description Logics

Brandt, Sebastian-Philipp 23 May 2006 (has links) (PDF)
The present work deals with Description Logics (DLs), a class of knowledge representation formalisms used to represent and reason about classes of individuals and relations between such classes in a formally well-defined way. We provide novel results in three main directions. (1) Tractable reasoning revisited: in the 1990s, DL research has largely answered the question for practically relevant yet tractable DL formalisms in the negative. Due to novel application domains, especially the Life Sciences, and a surprising tractability result by Baader, we have re-visited this question, this time looking in a new direction: general terminologies (TBoxes) and extensions thereof defined over the DL EL and extensions thereof. As main positive result, we devise EL++(D)-CBoxes as a tractable DL formalism with optimal expressivity in the sense that every additional standard DL constructor, every extension of the TBox formalism, or every more powerful concrete domain, makes reasoning intractable. (2) Non-standard inferences for knowledge maintenance: non-standard inferences, such as matching, can support domain experts in maintaining DL knowledge bases in a structured and well-defined way. In order to extend their availability and promote their use, the present work extends the state of the art of non-standard inferences both w.r.t. theory and implementation. Our main results are implementations and performance evaluations of known matching algorithms for the DLs ALE and ALN, optimal non-deterministic polynomial time algorithms for matching under acyclic side conditions in ALN and sublanguages, and optimal algorithms for matching w.r.t. cyclic (and hybrid) EL-TBoxes. (3) Non-standard inferences over general concept inclusion (GCI) axioms: the utility of GCIs in modern DL knowledge bases and the relevance of non-standard inferences to knowledge maintenance naturally motivate the question for tractable DL formalism in which both can be provided. 
As main result, we propose hybrid EL-TBoxes as a solution to this hitherto open question.
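As a concrete illustration of the kind of reasoning involved, subsumption between EL concepts over an empty TBox can be decided by a simple structural recursion. The sketch below covers only that base case; reasoning with general TBoxes, the setting of the work above, requires the polynomial-time completion algorithm instead.

```python
# EL concept descriptions as nested tuples:
#   ("top",)             the top concept
#   ("name", "A")        a concept name
#   ("and", C, D)        conjunction
#   ("exists", "r", C)   existential restriction

def subsumed(c, d):
    """Decide C ⊑ D for EL concept descriptions w.r.t. the *empty* TBox."""
    if d[0] == "top":
        return True                      # everything is subsumed by top
    if d[0] == "and":                    # C ⊑ D1 ⊓ D2 iff both conjuncts hold
        return subsumed(c, d[1]) and subsumed(c, d[2])
    if c[0] == "and":                    # some conjunct of C must do the work
        return subsumed(c[1], d) or subsumed(c[2], d)
    if d[0] == "name":
        return c[0] == "name" and c[1] == d[1]
    # d is an existential restriction: match the role, recurse on the fillers
    return c[0] == "exists" and c[1] == d[1] and subsumed(c[2], d[2])
```

For example, Person ⊓ ∃hasChild.Person is subsumed by ∃hasChild.⊤, but not vice versa.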
74

The evolution of expert teachers' read-aloud practices and their influence on the development of preschool students' ability to make inferences

Dupin de Saint-André, Marie 08 1900 (has links)
This collaborative research studies expert teachers' read-aloud practices and their influence on the development of preschool students' ability to make inferences. First, we describe the interventions, during read-alouds, of four expert teachers whom we trained to work on inferential comprehension, and compare them with those of two non-trained expert teachers (objective 1). Next, we examine the influence of the six teachers' read-aloud practices on the development of their students' (n = 92) ability to make inferences (objective 2). Finally, as an exploratory question, we look into collaborative research as a route for teachers' professional development (objective 3). To meet these objectives, we observed the teachers four times while they read children's books aloud. Data from these observed practices were supplemented by the teachers' reported practices over the full nine weeks of the study. In addition, we assessed the students' inferencing ability twice, at the beginning and at the end of the study. Finally, the teachers completed two written questionnaires on the impact of the research on their practices, one at the end of the study and the other two years later. Our results indicate that although all the teachers worked on inferences, there were notable differences in how they did so. Three of the trained teachers (teachers 1, 2 and 3) mostly favoured co-elaborating the meaning of implicit episodes with their students and proved very effective at scaffolding this meaning-construction work. The fourth trained teacher (teacher 4), who had difficulty assimilating the training content, did not manage to offer her students adequate scaffolding despite numerous attempts to get them to make inferences. The non-trained teachers (teachers 5 and 6) more often simply transmitted the meaning of implicit episodes and supported their students' meaning-making less effectively. These differences in how inferences were worked on carried over to the students' progress between the beginning and the end of the study: the students of teachers 1, 2 and 3 obtained significantly higher scores than those of the other three teachers. Working on inferences during read-alouds is therefore not, by itself, enough to ensure student progress. Other factors matter greatly: choosing high-quality books, engaging students in discussions to co-elaborate meaning, and adequate teacher scaffolding. Finally, the reflective activity prompted by taking part in collaborative research, together with the support offered by the student researcher, seems to have enabled the teachers to make lasting changes to their practices. In this sense, collaborative research appears to be a promising route for teachers' professional development.
75

The moderating impact of need for cognition on the effect of reference prices: a contingency model

李景浩, Li, Ching-how Unknown Date (has links)
One main concern about the use of reference prices in advertisements is the possibility of perceived deception when consumers react to exaggerated or implausible price claims. This study examines the moderating role of an individual-level variable, need for cognition (NFC), in consumers' evaluation of reference prices, using a two-stage experiment. The results support the hypothesised effects of need for cognition. In the first stage, consumers with a high need for cognition assimilated a smaller portion of the external reference prices (ERPs) into their existing internal reference prices (IRPs), while for consumers with a low need for cognition, a higher reference price had positive effects on value perception, brand attitudes and purchase intention. In the second stage, inferences of manipulative intent (IMI) were introduced as a dependent variable. The results suggest that for consumers with a high need for cognition, the greater the discrepancy between their estimated price and the real price, the higher their perceived manipulative intent on the part of the advertiser. This leads to negative attitudes toward the advertiser and, in turn, negative effects on brand attitudes and purchase intention. Implications for research and for practitioners are discussed.
76

Default reasoning and neural networks

Govender, I. (Irene) 06 1900 (has links)
In this dissertation a formalisation of nonmonotonic reasoning, namely Default logic, is discussed. A proof theory for Default logic and a variant, Prioritised Default logic, is presented. We also investigate the relationship between default reasoning and making inferences in a neural network. The inference problem shifts from the logical problem in Default logic to an optimisation problem in neural networks, in which maximum consistency is the goal. Inference is realised as an adaptation process that identifies and resolves conflicts between existing knowledge about the relevant world and external information. Knowledge and data are transformed into constraint equations, and the nodes in the network represent propositions and constraint equations. The violation of constraints is formulated in terms of an energy function. The Hopfield network is shown to be suitable for modelling optimisation problems and default reasoning. / Computer Science / M.Sc. (Computer Science)
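The mapping sketched above, where constraint violations define an energy that the network minimises, can be caricatured in a few lines. The greedy single-flip descent below is a stand-in for Hopfield dynamics, and the bird/penguin constraints are a made-up example, not taken from the dissertation.

```python
import random

def energy(state, constraints):
    """Energy of a truth assignment: the number of violated constraints."""
    return sum(not c(state) for c in constraints)

def settle(props, constraints, steps=5000, seed=0):
    """Flip one proposition at a time, undoing flips that raise the energy
    (a greedy caricature of a Hopfield network settling into a minimum)."""
    rng = random.Random(seed)
    state = {p: rng.choice([False, True]) for p in props}
    for _ in range(steps):
        p = rng.choice(props)
        before = energy(state, constraints)
        state[p] = not state[p]
        if energy(state, constraints) > before:
            state[p] = not state[p]      # undo: the flip made things worse
    return state

# "Birds normally fly" as a soft default with a penguin exception:
props = ["bird", "penguin", "flies"]
constraints = [
    lambda s: s["bird"],                                     # observed fact
    lambda s: not s["penguin"],                              # no such evidence
    lambda s: not s["bird"] or s["penguin"] or s["flies"],   # default rule
    lambda s: not s["penguin"] or not s["flies"],            # exception
]
final = settle(props, constraints)
```

The unique zero-energy (maximally consistent) assignment here is bird ∧ ¬penguin ∧ flies, which the descent settles on: the default applies because nothing blocks it.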
77

Tools for developing waste-recovery strategies based on knowledge management: application to process engineering

Chazara, Philippe 06 November 2015 (has links)
This work develops a methodology for generating and assessing new recovery pathways for waste, addressed through three sub-problems. The first is a modelling framework that gives every pathway a structured, homogeneous representation, together with the indicators chosen to assess and later rank them. The second is a methodology, and a tool implementing it, that generates new pathways from known ones. The third is a second tool for modelling and evaluating the generated pathways. The modelling framework organises unit operations into three levels of description: at the top, the generic configuration describes a pathway as broad modelling stages; the intermediate level, generic treatment, offers sets of generic treatment structures that recur across recovery pathways; the lowest level models the unit operations themselves. A second, more conceptual framework was also created, consisting of two elements: blocks and systems. These frameworks are complemented by a set of indicators; in keeping with a sustainable-development approach, one indicator was selected for each pillar: economic, environmental and social. In our study the social impact is limited to the number of jobs created, and a new approach for estimating it from a company's economic results was proposed and validated. The pathway-generation tool relies on knowledge management through case-based reasoning (CBR), and adapting CBR to this problem required resolving several difficult points. Data structuring, and more broadly the generation of source cases, is handled by a system based on semantic networks and inference mechanisms. A new similarity measure was developed by introducing the notion of a common definition, which links states (descriptions of situations) to states representing general definitions of sets of states; common definitions allow sets of states to be built at different levels of abstraction and conceptualisation. Finally, pathways are decomposed so that a problem can be solved through its associated sub-problems, which eases both the adaptation of pathways and the estimation of the results of transformations. A tool based on this method was implemented in logic programming, in Prolog, whose inference rules and backtracking support the exploration of the possible solutions. The modelling and assessment of recovery pathways is carried out by a separate tool, written in Python, which uses metaprogramming to build structural models dynamically. The behaviour of these structures is governed by constraints on the flows circulating through the pathway; when a pathway is modelled, a parser converts these constraints into a coherent constraint-programming model, which can then be solved by external solvers through an interface developed for and integrated into the system. The user can also add solvers, as well as plug-ins that evaluate the activity, so that the same criterion can be assessed in different ways; three plug-ins were developed, one for each selected criterion. Both methods were tested to evaluate the proposed model and to check their behaviour and limits; for these tests, a case base on waste was created. Finally, the modelling and assessment tool was applied to a case study on recovering used tyres as new raw material.
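The "common definition" similarity idea, linking concrete states to increasingly general definitions and comparing two states through the most specific definition they share, can be sketched as follows. The taxonomy and the scoring formula are invented for illustration and are not the measure developed in the thesis.

```python
def ancestors(state, parent):
    """The chain of ever more general definitions above a state."""
    chain = [state]
    while state in parent:
        state = parent[state]
        chain.append(state)
    return chain

def similarity(a, b, parent):
    """1.0 for identical states, decreasing with the number of
    generalisation steps needed to reach a shared common definition;
    0.0 when the states share no definition at all."""
    ca, cb = ancestors(a, parent), ancestors(b, parent)
    shared = set(ca) & set(cb)
    if not shared:
        return 0.0
    steps = min(ca.index(s) + cb.index(s) for s in shared)
    return 1.0 / (1.0 + steps)

# Hypothetical mini-taxonomy of waste states:
parent = {
    "used tyre": "rubber waste",
    "rubber waste": "polymer waste",
    "PET bottle": "plastic waste",
    "plastic waste": "polymer waste",
}
```

Here "used tyre" and "PET bottle" only meet at the abstract "polymer waste" definition, so their similarity is low but non-zero, which is exactly the graded, abstraction-aware matching the source-case retrieval needs.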
79

Inference generation in the reading of expository texts by university students

Pretorius, Elizabeth Josephine 02 1900 (has links)
The continued underperformance of many L2 students at primary, secondary and tertiary level is a cause for grave concern in South Africa. In an attempt to better understand the cognitive-linguistic conditions and processes that underlie academic performance and underperformance, this study looks at the problem of differential academic performance by focussing on the inferential ability of undergraduate L2 students during the reading of expository texts. The study works within a constructivist theory of reading, where the successful understanding of a text is seen to involve the construction of a mental representation of what the text is about. Inferencing plays an important role in constructing meaning during reading because it enables the reader to link incoming information with already given information, and it enables the reader to construct a mental representation of the meaning of a text by converting the linear input into a hierarchical mental representation of interrelated information. The main finding showed that the ability to make inferences during the reading of expository texts was strongly related to academic performance: the more inferences students made during the reading of expository texts, the better they performed academically. This relationship held across the making of various inferences, such as anaphoric inferences, vocabulary inferences, inferences about various semantic relations, and thematic inferences. In particular, the ability to make anaphoric, contrastive and causal inferences emerged as the strongest predictors of academic performance. The study provides strong empirical evidence that the ability to make inferences during reading enables a reader to construct meaning and thereby also to acquire new knowledge. Reading is not only a tool for independently accessing information in an information-driven society, it is fundamentally a tool for constructing meaning.
Reading and inferencing are not additional tools that students need to master in the learning context: they constitute the very process whereby learning occurs. / Linguistics and Modern Languages / D.Litt. et Phil. (Linguistics)
80

A comparison of cross-message synergy effects between Facebook and Instagram

陳又瑈, Chen, Yu Jou Unknown Date (has links)
The wave of mobile digital technology has swept the globe, and the spread of smartphones has made mobile marketing a major trend; how to exploit the characteristics of mobile media to increase the synergy of integrated marketing communications has become a central question in digital marketing. Apps are the mobile-media tool that reaches consumers most directly, and social apps, whose content is immediate and social, have become the main battleground on which brands compete for consumers' attention. Among social apps, Facebook's advertising service has operated for years, while the newer Instagram opened its advertising features to Taiwanese businesses in September 2015, letting advertisers pay to reach more users. The two platforms' advertising mechanisms are very similar: consumers first receive a sponsored post and then follow it to the brand account's main page, so the two brand messages together may produce cross-message synergy. The platforms differ, however, in how content is presented, and Instagram's powerful filters make photos more aesthetically appealing, which can change how users interpret and respond to a brand, so the resulting synergy may also differ. This study therefore takes Facebook and Instagram as its objects, with the social app platform as the independent variable and attitude toward the ad and attitude toward the brand as dependent variables, to test whether cross-message synergy arises and to compare the two platforms' advertising information processing and effects. Based on an experiment with 100 undergraduate and graduate students of National Chengchi University from outside the College of Communication, the findings are: (1) cross-message synergy within a single medium is not significant, but the two media (Facebook and Instagram) differ in information processing, attitude formation and cross-message effects; (2) compared with persuasion knowledge, aesthetic response is the main variable driving inferences of manipulative intent: consumers who received Facebook ads formed stronger inferences of manipulative intent because of weaker aesthetic responses, whereas those who received Instagram ads formed weaker inferences because of stronger aesthetic responses; (3) inferences of manipulative intent have a significant negative effect on message involvement: the stronger the inference, the lower the involvement, and vice versa. Facebook's higher inferences of manipulative intent thus led to lower message involvement and weaker motivation to process the message, while Instagram's lower inferences led to higher involvement and stronger processing motivation.
