21

An Investigation of the Impact of Note Taking on the Quality of Mock Jurors’ Decisions

Tanya Strub Unknown Date (has links)
Abstract This research investigated the extent to which taking notes influenced the quality of mock jurors’ decisions. High quality decisions were defined in this research as those which did not reflect the influence of the offender stereotype. The impact of note taking on the quality of jurors’ decisions is central to the judicial community’s concerns about note taking as a jury aid and its willingness to offer it in trial contexts. Previous research has argued that note takers make better quality decisions than non-note takers because note takers recall more trial content and make judgements that better reflect the evidence presented. However, according to dual process models of persuasion, high quality decisions should show evidence of both effortful processing of information and no influence of peripheral cues, such as stereotypes. To date, the existing literature has neglected to consider the extent to which note takers, as compared to non-note takers, are influenced by peripheral cues. The current research sought to address this by investigating the extent to which note taking and non-note taking mock jurors were influenced by stereotypes when making decisions in a mock criminal trial. In particular, note taking and non-note taking mock jurors were presented with a criminal trial in which either a male or female defendant had been charged with a stereotypically masculine crime (e.g., aggravated robbery or murder). The extent to which mock jurors were more likely to convict the male defendant and acquit the female defendant was used as a marker of the extent to which stereotypes about offenders influenced participants in these studies. Across studies, note takers’ perceptions of guilt, evaluation of the defendant, and, in some instances, recall of trial content, reflected stereotype-based processing while the corresponding measures for non-note takers did not.
This research then went on to investigate why note takers were more vulnerable to the influence of stereotypes than non-note takers. It was proposed that one reason might be the requirement that note takers simultaneously record and evaluate trial content. Previous research has shown that persons engaged in dual tasks rely on stereotypes to increase information-processing efficiency and are thereby able to re-direct cognitive resources to the additional task. Consistent with previous studies, the current research found that both note takers and mock jurors engaged in an additional task during the trial were more vulnerable to the influence of stereotypes than non-note takers. Furthermore, an investigation of interventions designed to reduce the influence of stereotypes on note takers’ decisions revealed that such interventions were less successful in improving decision quality than interventions that removed the requirement to engage in dual tasks. In particular, the influence of stereotypes was reduced when note takers were encouraged to elaborate on the content of their notes during designated review periods. Whilst methodological features of this research program (namely a reliance on student samples and the relative brevity of the mock trials used) may have led to an underestimation of note takers’ reliance on stereotypes, the research has implications for the instructions given to jurors about note taking in judicial contexts. Specifically, the central conclusion of the thesis is that it would seem prudent to amend instructions to direct note takers to engage in effortful review of their notes before coming together to reach a verdict.
23

A Tool-Supported Method for Fallacies Detection in Process-Based Argumentation

Gómez Rodríguez, Laura January 2018 (has links)
Process-based arguments aim at demonstrating that a process, compliant with a standard, has been followed during the development of a safety-critical system. Compliance with these processes is mandatory for certification purposes, so the generation of process-based arguments is essential, but also a very costly and time-consuming task. In addition, inappropriate reasoning in the argumentation, such as insufficient evidence (i.e. a fallacious argumentation), may result in a loss of quality of the system, leading to safety-related failures. Therefore, avoiding or detecting fallacies in process-based arguments is crucial. However, reviewing such arguments is currently done manually, based on expert knowledge, and is thus a laborious and error-prone task. In this thesis, an approach to automatically generate fallacy-free process-based arguments is proposed and implemented. The solution is composed of two parts: (i) detecting omission-of-key-evidence fallacies in the modelled processes, and (ii) transforming those processes into process-based safety arguments. The former automatically checks whether the process model, compliant with the Software & Systems Process Engineering Metamodel (SPEM) 2.0, contains sufficient information to avoid committing an omission-of-key-evidence fallacy. If fallacies are detected, the tool provides appropriate recommendations to resolve them. Once the safety engineers/process engineers have modified the process model following the provided recommendations, the second part of the solution can be applied. It automatically generates the process-based argument, compliant with the Structured Assurance Case Metamodel (SACM), and displays it, rendered via Goal Structuring Notation (GSN), in the OpenCert assurance case editor within the AMASS platform. The applicability of the solution is validated in the context of the ECSS-E-ST-40C standard.
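As a rough illustration of the first part of this approach, the omission-of-key-evidence check can be sketched as follows. The `Task` structure, the work-product names and the completeness rule (every required evidence item must be produced by its task) are illustrative assumptions only, not the thesis's actual SPEM 2.0 implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    outputs: set = field(default_factory=set)   # work products the task produces
    required: set = field(default_factory=set)  # evidence the standard demands

def detect_omissions(tasks):
    """Return (task, missing evidence) pairs, each a potential
    omission-of-key-evidence fallacy in the eventual argument."""
    findings = []
    for t in tasks:
        missing = t.required - t.outputs
        if missing:
            findings.append((t.name, sorted(missing)))
    return findings

process = [
    Task("Unit testing", outputs={"test report"},
         required={"test report", "coverage report"}),
    Task("Code review", outputs={"review minutes"},
         required={"review minutes"}),
]
for task, missing in detect_omissions(process):
    print(f"{task}: add evidence {missing}")
```

A real implementation would traverse SPEM 2.0 model elements rather than plain Python objects; the shape of the check (required evidence minus produced evidence, plus a recommendation per gap) is the point.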
24

Development processes flexibility at design- and enactment-times: application to Human-Computer Interfaces plasticity

Ceret, Eric 04 July 2014 (has links)
The increasing diversity of devices and services makes the engineering of user interfaces (UIs) more complex: in particular, UIs need to be capable of dynamic adaptation to the user's context of use. This property is named plasticity and has so far been addressed by model-based approaches. However, these approaches suffer from a high threshold of use. There is therefore a need to support designers and developers with flexible guidance, i.e. guidance capable of adapting to an evolving variety of skills and practices. Software development methods engineering has long been concerned with the flexibility of process models at design time, but very little work has been done on enactment time, although several studies show that designers and developers, who are the primary users of methods, call for such flexibility. For instance, they expect process models to be expressed in languages they master, to let them make decisions about design choices, and to help them in learning the approach. Our proposition of process model flexibility at both design time and enactment time meets these expectations and thus opens the possibility of providing adequate guidance for the development of plastic UIs.
We first focused on the conceptualization of flexibility. From this study, we elaborated Promote, a taxonomy of process models, which defines and graduates flexibility along six dimensions. We then transcribed this definition of flexibility into M2Flex, a flexible process metamodel, and implemented it in two tools: D2Flex (D for design time), a collaborative tool for the design of process models, and R2Flex (R for runtime), a tool for enacting the process models defined in D2Flex. We applied our approach to the development of plastic UIs by making the UsiXML methodology flexible. FlexiLab, our software environment, is currently undergoing technological maturation for transfer to industry. These contributions have been validated, particularly with novice designers, in the fields of plastic UI engineering and information systems.
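A minimal sketch of what enactment-time flexibility can mean, assuming two plausible dimensions of flexibility (optional activities and substitutable activities); the activity names and the `enact` function are hypothetical illustrations, not part of M2Flex or R2Flex:

```python
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    optional: bool = False      # may be skipped when the process is enacted
    substitutes: tuple = ()     # alternatives the enactor is allowed to pick

def enact(process, skips=(), picks=None):
    """Resolve a flexible process model into the concrete sequence of
    activities actually followed, honouring skips and substitutions."""
    picks = picks or {}
    trace = []
    for act in process:
        if act.name in skips:
            if not act.optional:
                raise ValueError(f"{act.name} is mandatory")
            continue
        chosen = picks.get(act.name, act.name)
        if chosen != act.name and chosen not in act.substitutes:
            raise ValueError(f"{chosen} cannot replace {act.name}")
        trace.append(chosen)
    return trace

process = [
    Activity("Sketch the UI"),
    Activity("Formal usability review", optional=True),
    Activity("Build prototype", substitutes=("Paper mock-up",)),
]
```

The design point is that the model itself records where deviation is allowed, so an enactment tool can accept or reject a designer's choices instead of forcing a single rigid sequence.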
25

Risk maturity at a life insurer

Mokgoantle, Oupa Joseph 17 June 2014 (has links)
M.Com. (Business Management) / Risk management is an important factor in ensuring business and project success. Thus, risk management methodologies are constantly being developed and improved. In order to define the goals, specify the process and manage progress, it is necessary to have a clear view of the enterprise's current approach to risk, as well as a definition of the intended destination. Benchmarking offers the opportunity to determine current maturity capability against agreed frameworks, and also provides a structured route to improvement. A generally accepted framework is needed for an organisation to benchmark its current maturity and capability in managing risk, and this framework should also assist in defining progress towards increased maturity. As an assessment tool, a risk maturity model is designed to measure risk management capability and to provide objectives for improvement. The purpose of the research is to identify, adapt and recommend a sound risk maturity model, together with an easily applicable and effective questionnaire, for measuring the risk capability maturity of a life insurer (“Liberty Life”). To achieve this aim, six risk management maturity models were identified through a literature review, and the proposed model was further supported with long-term-insurance-specific attributes of risk management as advocated by leading corporate governance codes and regulations such as King III and the newly proposed Financial Services Board (FSB) Solvency Assessment and Management (SAM) regime. Despite the widening consensus on the value of risk management, effective implementations of risk processes in organisations are not common. The benefits of mature risk management are discussed in Chapter 2. By adopting an exploratory approach, the researcher conducted a qualitative research project, in the form of an in-depth case study, on a multinational financial services organisation.
Unstructured face-to-face interviews were held with senior executives and risk managers in order to gather data regarding what they perceive as key attributes, including acceptable measurement criteria, of a risk maturity model appropriate and effective for implementation in their organisation.
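The ordinal-measurement idea behind such maturity models can be sketched as follows; the attribute names, the five-level scale and the weakest-link aggregation rule are assumptions for illustration, not the model the thesis recommends:

```python
LEVELS = ["initial", "repeatable", "defined", "managed", "optimised"]

def maturity(scores):
    """Map ordinal questionnaire scores (1-5 per attribute) to an overall
    maturity level; the weakest attribute caps the overall rating, so one
    lagging area cannot be averaged away."""
    if not scores or any(not 1 <= s <= 5 for s in scores.values()):
        raise ValueError("each attribute needs a score from 1 to 5")
    return LEVELS[min(scores.values()) - 1]

answers = {"risk culture": 3, "governance": 4, "risk reporting": 2}
print(maturity(answers))  # -> repeatable
```

Because the scale is ordinal, only order matters: taking the minimum is meaningful, whereas an arithmetic mean of the scores would not be.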
26

The role of heterogeneity in spatial plant population dynamics

van Waveren, Clara-Sophie 24 October 2016 (has links)
No description available.
27

Integrated digital forensic process model

Kohn, Michael Donovan 10 June 2013 (has links)
The Information and Communications Technology (ICT) environment constitutes an integral part of our daily lives. Individual computer users and large corporate companies are increasingly dependent on services provided by ICT. These services range from basic communication to managing large databases with corporate client information. Within these ICT environments something is bound to go wrong for a number of reasons, which include an intentional attack on information services provided by an organisation. These organisations have in turn become interested in tracing the root cause of such an incident with the intent of successfully prosecuting a suspected malicious user. Digital forensics has developed significantly to support the prosecution of such criminals. The volumes of information and rapid technological developments have contributed to making simple investigations rather cumbersome. In the digital forensics community a number of digital forensic process models have been proposed, each encapsulating a complete methodology for an investigation. Software developers have also greatly contributed to the development of digital forensics tools. These developments have resulted in divergent views on digital forensic investigations. This dissertation presents the IDFPM - Integrated Digital Forensic Process Model. The model is presented after examining digital forensic process models within the current academic and law enforcement literature. An adapted sequential logic notation is used to represent the forensic models. The terminology used in the various models is examined and standardised to suit the IDFPM. Finally, a prototype supports a limited selection of the IDFPM processes, which will aid a digital forensic investigator. / Dissertation (MSc)--University of Pretoria, 2012. / Computer Science / unrestricted
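The sequential logic notation used to represent such process models can be loosely illustrated as an ordering check over investigation phases; the phase names below are illustrative placeholders, not the IDFPM's actual process list:

```python
PHASES = ["preparation", "incident response", "acquisition",
          "analysis", "presentation", "documentation"]

def check_sequence(logged):
    """Return True if the phases logged for an investigation respect the
    prescribed order (phases may be absent, but never out of sequence)."""
    order = {p: i for i, p in enumerate(PHASES)}
    indices = [order[p] for p in logged]  # KeyError flags an unknown phase
    return all(a < b for a, b in zip(indices, indices[1:]))
```

A prototype along these lines could warn an investigator as soon as a recorded action falls outside the model's sequence, rather than at review time.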
28

Beyond process tracing: The response dynamics of preferential choice

Koop, Gregory James 25 July 2012 (has links)
No description available.
29

Evaluation of process models to estimate time to produce aircraft engine part features

Pepper, Brian P. January 2002 (has links)
No description available.
30

Towards a reusable process model structure for higher education institutions

Van der Merwe, Aletta Johanna 30 June 2005 (has links)
One of the tools used during re-engineering of an environment is the process model as a modelling tool. The identification of process models within an institution is a difficult and tedious task. A problem is that process model structures are often identified for one specific project and not stored for future reuse. The ideal for institutions is to reuse process model structures within the institution. This study focused on generic structures within the higher education application domain. The hypothesis for this study was that a generic educational process model structure for higher education institutions can be established; that a process management flow procedure can be used to manage the flow within an educational process model; and that an educational process model can be stored and reused in re-engineering efforts. The study was divided into three research questions: the first focused on the identification of generic process model structures, the second on the usability of the process model structures within a re-engineering effort, and the last on the preservation of a process model structure. For the first research question, the identification of process model structures, three institutions were used for data collection. It was necessary to develop a requirements elicitation procedure for data collection. The structure derived was confirmed at a fourth institution. For the second research question, which focused on the usability of process model structures, an ordinal measurement was defined to measure the usefulness of the process model structures in a re-engineering effort. A re-engineering procedure, called the process management flow procedure, was developed for re-engineering the application domain and used in a re-engineering effort at one institution.
Lastly, for the third research question, the preservation of the process model structures, the abstraction of the process model structure was investigated, as well as the feasibility of physically implementing the process model structures using existing repository software. The conclusion, after investigating the three research questions, was that the hypothesis was confirmed: there is indeed a set of process model structures within higher education institutions that are generic, preservable and reusable in a re-engineering effort. / Computing / Ph. D. (Computer Science)
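The store-and-reuse idea can be sketched as follows, assuming a simple key-value repository and placeholder-based specialisation; the structure name, the steps and the `{institution}` binding are hypothetical, not the repository design the thesis investigated:

```python
import copy

REPOSITORY = {}  # stands in for the repository software the study evaluated

def store(name, structure):
    """Archive a generic process model structure for future reuse."""
    REPOSITORY[name] = structure

def instantiate(name, **bindings):
    """Retrieve a stored structure and bind institution-specific details,
    leaving the archived generic copy untouched."""
    model = copy.deepcopy(REPOSITORY[name])
    model["steps"] = [step.format(**bindings) for step in model["steps"]]
    return model

store("student registration", {"steps": [
    "apply at {institution}", "verify qualifications",
    "pay registration fees", "enrol in modules"]})
local = instantiate("student registration", institution="Example University")
```

The deep copy is what makes the stored structure reusable: each re-engineering effort specialises its own instance while the generic archived version stays intact.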
