61 |
Cause-Related Marketing: A Study of Generation Y's Attitudes toward the CRM Concept. Fehrm, Camilla; Wikström, Erik, January 2009 (has links)
No description available.
|
62 |
Optimal coordinate sensor placements for estimating mean and variance components of variation sources. Liu, Qinyan, 29 August 2005 (has links)
In-process Optical Coordinate Measuring Machines (OCMMs) offer the potential to diagnose, in a timely manner, the variation sources that are responsible for product quality defects. Such a sensor system can help manufacturers improve product quality and reduce process downtime. Effective use of sensory data in diagnosing variation sources depends on the optimal design of the sensor system, a problem often known as sensor placement. This thesis addresses coordinate sensor placement for diagnosing dimensional variation sources in assembly processes. Sensitivity indices for detecting process mean and variance components are defined as the design criteria and are derived in terms of process layout and sensor deployment information. Exchange algorithms, originally developed in research on optimal experiment design, are employed and revised to maximize the detection sensitivity. A sort-and-cut procedure is used, which remarkably improves the efficiency of the current exchange routine. The resulting optimal sensor layouts and their implications are illustrated in the specific context of a panel assembly process.
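For illustration only, an exchange algorithm of the general kind the abstract describes can be sketched as follows. The `sensitivity` criterion below is a generic D-optimality-style log-determinant stand-in, not the thesis's actual mean/variance sensitivity indices, and the function names, toy design matrix, and omission of the sort-and-cut speedup are all assumptions of this sketch:

```python
import itertools
import numpy as np

def sensitivity(A, subset):
    """Hypothetical detection-sensitivity criterion: log-determinant of the
    information matrix formed by the selected sensor rows (a D-optimality
    stand-in for the thesis's mean/variance sensitivity indices)."""
    M = A[list(subset)]
    sign, logdet = np.linalg.slogdet(M.T @ M)
    return logdet if sign > 0 else -np.inf

def exchange_placement(A, k, iters=50, seed=0):
    """Basic exchange routine: start from a random k-sensor layout and
    repeatedly swap one chosen location for one unchosen location whenever
    the swap improves the criterion; stop when no swap helps."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    chosen = set(rng.choice(n, size=k, replace=False).tolist())
    for _ in range(iters):
        improved = False
        for out_i, in_j in itertools.product(sorted(chosen), range(n)):
            if in_j in chosen:
                continue
            trial = (chosen - {out_i}) | {in_j}
            if sensitivity(A, trial) > sensitivity(A, chosen):
                chosen, improved = trial, True
                break
        if not improved:
            break
    return sorted(chosen)

# Toy design matrix: 8 candidate sensor locations, 3 variation sources.
A = np.array([[1.0, 0.2, 0.0],
              [0.1, 1.0, 0.3],
              [0.0, 0.1, 1.0],
              [0.5, 0.5, 0.0],
              [0.9, 0.0, 0.1],
              [0.0, 0.8, 0.2],
              [0.3, 0.0, 0.9],
              [0.2, 0.2, 0.2]])
layout = exchange_placement(A, k=4)
print(layout)  # indices of the 4 selected sensor locations
```

The sort-and-cut procedure mentioned in the abstract would prune the candidate swaps examined in the inner loop, which is where this naive exchange spends most of its time.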
|
63 |
Teacher-centred Classrooms and Passive Resistance: Implications for Inclusive Schooling. Sium, Bairu, 07 January 2013 (has links)
This thesis is based on an ethnographic study conducted in a split grade five and six classroom in Toronto during the 1985/86 academic year. Data were collected through participatory observation, as well as through individual and focus group interviews. A group of eight activist African-Canadian high school students, as well as 26 Euro-Canadian “drop-backs”, were also interviewed. The study took place during a period of intensive education activism by parents and the community in Toronto. I was interested in determining whether this activism was reflected at the school level and, if so, how. I also wanted to examine whether the historically supportive auxiliary role that parents played during this period was elevated to more substantive and meaningful active involvement in the education of their children during the last half of the 1980s.
This study shows that activities in the classroom were driven by pre-packaged curriculum materials and were implemented with very few modifications. Coupled with teacher-centred practice, this closed the door for any diversifying opportunities that could have found their way into the classroom, not only from the homes of the children and the school community, but also from critics of the use of prepackaged material and, most importantly, from the students themselves.
Furthermore, teacher-centred classroom discourse pushed students to develop a cynical attitude towards schooling. Having no say in what or how they were taught left the children with few choices but to develop a coping mechanism of passive resistance. Their short-term survival strategies included appearing as though they were striding along while not embracing their school experiences fully. By the same token, they were not challenged to think critically, to evaluate, or to problem-solve. A link was also established between the students’ passive resistance at the elementary level, ‘fading out’ or ‘dropping out’, and successful resistance at the high school level.
|
64 |
The Resurfice Exception: Causation in Negligence Without Probability. Cheifetz, David, 21 November 2012 (has links)
Resurfice Corp. v. Hanke, [2007] 1 S.C.R. 333, 2007 SCC 7, creates a new causation doctrine in Canadian negligence law that is available to plaintiffs only in exceptional cases. Under this doctrine, negligence and the possibility of specific factual causation may be sufficient to satisfy the causation requirements of a cause of action in negligence. Proof of specific factual causation on the balance of probability is not required. The justification for this doctrine is fairness and justice. The application of the doctrine does not produce a decision that the negligence did cause the injury. Where the requirements of the Resurfice doctrine are satisfied, the causation requirements of the cause of action are deemed to be satisfied despite the finding that factual causation was not established on the balance of probability. The authorities cited are current to June 21, 2012.
|
66 |
Log Event Filtering Using Clustering Techniques. Wasfy, Ahmed, January 2009 (has links)
Large software systems are composed of various run-time components, partner applications, and processes. When such systems operate, they are monitored so that audits can be performed once a failure occurs or when maintenance operations are performed. However, log files are usually sizeable and require filtering and reduction to be processed efficiently. Furthermore, there is no apparent correspondence between logged events and the particular use cases the system may be performing. In this thesis, we have developed a framework based on heuristic clustering algorithms to achieve log filtering, log reduction, and log interpretation. More specifically, we define the concept of the Event Dependency Graph, and we present event filtering and use case identification techniques based on event clustering. The clustering process groups together all events that relate to a collection of initial significant events associated with a use case. We refer to these significant events as beacon events. Beacon events can be identified automatically or semi-automatically by examining log event types or event names against those in the corresponding specification of the use case being considered (e.g., events in sequence diagrams). Furthermore, the user can select other or additional initial clustering conditions based on his or her domain knowledge of the system. The clustering technique can be used in two possible ways. The first is for large logs to be reduced or sliced with respect to a particular use case, so that operators can better focus their attention on the specific events that relate to specific operations. The second is for the determination of active use cases: operators select particular seed events of interest and then examine the resulting reduced logs against events or event types stemming from different alternative known use cases, in order to identify the best match and consequently gain insight into which of these alternative use cases may be running at any given time. The approach has shown very promising results in identifying executing use cases among various alternatives across several runs of the Session Initiation Protocol.
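The core idea of slicing a log around beacon events can be sketched in miniature. Note that the dependency graph below is hand-written and the SIP-flavoured event names are invented for illustration; in the thesis the Event Dependency Graph and beacon identification are derived from the log and the use-case specification:

```python
from collections import deque

# Hypothetical event dependency graph: an edge from A to B means event
# type A tends to precede/trigger event type B (given directly here; in
# the thesis this structure is built from the log).
dependency_graph = {
    "INVITE":    ["TRYING", "RINGING"],
    "TRYING":    [],
    "RINGING":   ["OK"],
    "OK":        ["ACK"],
    "ACK":       ["BYE"],
    "BYE":       [],
    "HEARTBEAT": [],  # unrelated housekeeping event
}

def cluster_around_beacons(graph, beacons):
    """Group together every event type reachable from the beacon events by
    following dependency edges (breadth-first traversal); anything not in
    the cluster is filtered out of the reduced log."""
    seen, queue = set(beacons), deque(beacons)
    while queue:
        ev = queue.popleft()
        for nxt in graph.get(ev, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Slice a raw log down to the events related to a "call setup" use case,
# using INVITE as the beacon event.
raw_log = ["HEARTBEAT", "INVITE", "TRYING", "HEARTBEAT", "RINGING", "OK", "ACK"]
cluster = cluster_around_beacons(dependency_graph, {"INVITE"})
reduced = [e for e in raw_log if e in cluster]
print(reduced)  # the HEARTBEAT noise events are filtered out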
|
68 |
Product or Cause? Influences of Donation Magnitude and Consumer Mood. Yang, Chia-yen, 12 August 2010 (has links)
Cause-Related Marketing (CRM) was initiated by American Express in 1981 to raise funds in support of the arts in San Francisco. Since the 1990s, much academic research has tried to identify the benefits and risks of CRM. Print advertising is a main channel for CRM; therefore, how CRM ads are structured, and especially where their visual component focuses, is important in CRM campaigns. Based on previous studies of charitable donations, this study compares the effects of cause-focused and product-focused CRM ads through an experimental design. In addition, donation magnitude and consumer mood are also considered, to observe how they sway the effectiveness of CRM ads.
The present study employs an experimental design to investigate the effects of the type of visual component (cause-focused vs. product-focused), donation magnitude (5% of invoice price vs. 20% of invoice price), and consumer mood (positive vs. negative) on CRM effectiveness. A 2x2x2 factorial design is conducted. Eight different scenarios are established, and the ad effects are measured by purchase intention and attitude toward the brand.
The results indicate that the cause-focused ads are more effective than the product-focused ads. Although donation magnitude does not make a difference for a cause-focused ad, low donation magnitude leads to higher purchase intention when a product-focused ad is presented. Positive mood facilitates the advertising effects of cause-focused ads. Finally, low donation magnitude and positive mood enhance the advertising effects of product-focused ads. The implications of these findings are discussed, as well as the limitations and directions for future research.
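The eight scenarios of a 2x2x2 factorial design can be enumerated mechanically by crossing the factor levels. The factor and level labels below are paraphrased from the abstract, not taken from the study's materials:

```python
import itertools

# The three two-level factors from the study (labels paraphrased).
factors = {
    "visual_focus": ["cause-focused", "product-focused"],
    "donation_magnitude": ["5% of invoice price", "20% of invoice price"],
    "consumer_mood": ["positive", "negative"],
}

# Crossing the levels yields the eight experimental scenarios (2 x 2 x 2).
scenarios = [dict(zip(factors, combo))
             for combo in itertools.product(*factors.values())]
print(len(scenarios))  # 8
for s in scenarios:
    print(s)
```

Each scenario corresponds to one cell of the design, to which participants would be assigned before measuring purchase intention and brand attitude.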
|
69 |
An efficient logic fault diagnosis framework based on effect-cause approach. Wu, Lei, 15 May 2009 (has links)
Fault diagnosis plays an important role in improving the circuit design process and the manufacturing yield. With the increasing number of gates in modern circuits, determining the source of failure in a defective circuit is becoming more and more challenging. In this research, we present an efficient effect-cause diagnosis framework for combinational VLSI circuits. The framework consists of three stages to obtain an accurate and reasonably precise diagnosis. First, an improved critical path tracing algorithm is proposed to identify an initial suspect list by backtracing from faulty primary outputs toward primary inputs. Compared to the traditional critical path tracing approach, our algorithm is faster and exact. Second, a novel probabilistic ranking model is applied to rank the suspects so that the most suspicious one will be ranked at or near the top. Several fast filtering methods are used to prune unrelated suspects. Finally, to refine the diagnosis, fault simulation is performed on the top suspect nets using several common fault models. The difference between the observed faulty behavior and the simulated behavior is used to rank each suspect. Experimental results on ISCAS85 benchmark circuits show that this diagnosis approach is efficient in terms of both memory space and CPU time, and that the diagnosis results are accurate and reasonably precise.
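The first stage, backtracing from faulty primary outputs to build an initial suspect list, can be illustrated with a toy cone-of-influence traversal. The netlist below is invented, and a real critical path tracing algorithm (including the thesis's improved variant) also uses logic values to keep only sensitized paths, a pruning step this sketch omits:

```python
# Hypothetical gate-level netlist: each net maps to the nets that drive it
# (its gate's inputs); primary inputs a, b, c have no drivers.
netlist = {
    "out1": ["n1", "n2"],
    "out2": ["n2", "n3"],
    "n1":   ["a", "b"],
    "n2":   ["b", "c"],
    "n3":   ["c"],
    "a": [], "b": [], "c": [],
}

def backtrace_suspects(netlist, faulty_outputs):
    """Simplified first diagnosis stage: walk backwards from each faulty
    primary output through its cone of influence; every net reached is an
    initial suspect.  Critical path tracing would additionally drop nets
    that are not on sensitized (critical) paths."""
    suspects, stack = set(), list(faulty_outputs)
    while stack:
        net = stack.pop()
        if net in suspects:
            continue
        suspects.add(net)
        stack.extend(netlist[net])
    return suspects

# If only out1 misbehaves, the net feeding only out2 (n3) is never suspected.
suspects = backtrace_suspects(netlist, ["out1"])
print(sorted(suspects))  # ['a', 'b', 'c', 'n1', 'n2', 'out1']
```

The later stages would then rank this suspect set probabilistically and confirm the top candidates by fault simulation, as the abstract outlines.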
|