841 |
A Computational Approach to the Analysis and Generation of Emotion in Text. Keshtkar, Fazel (09 August 2011)
Sentiment analysis is a field of computational linguistics involving identification,
extraction, and classification of opinions, sentiments, and emotions expressed in
natural language. Sentiment classification algorithms aim to identify whether the author of a text has a positive or a negative opinion about a topic. One of the main indicators that help detect an opinion is the choice of words in the text. Needless to say, the sentiments expressed in a text also depend on its syntactic structure and discourse context. Supervised machine learning approaches to sentiment classification have been shown to achieve good results. Classifying texts by emotion requires finer-grained analysis than sentiment classification. In this thesis, we explore the task of emotion and mood classification for blog postings. We propose a
novel approach that uses the hierarchy of possible moods to achieve better results than
a standard flat classification approach. We also show that using sentiment orientation features improves the performance of classification. We used the LiveJournal blog corpus as a dataset to train and evaluate our method.
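The hierarchical idea can be sketched in a few lines. The following is a hypothetical illustration, not the thesis's actual classifier or mood taxonomy: a post is first assigned to a coarse mood group, then to a leaf mood within that group, with invented keyword features standing in for the real learned features.

```python
# Hypothetical hierarchical mood classification: classify coarse group first,
# then a specific mood among only that group's children. The hierarchy and
# keyword lists below are illustrative assumptions.

MOOD_HIERARCHY = {
    "positive": ["cheerful", "excited"],
    "negative": ["sad", "angry"],
}

GROUP_KEYWORDS = {
    "positive": {"great", "love", "happy", "fun"},
    "negative": {"terrible", "hate", "cry", "furious"},
}

LEAF_KEYWORDS = {
    "cheerful": {"happy", "fun"},
    "excited": {"great", "love"},
    "sad": {"cry", "terrible"},
    "angry": {"furious", "hate"},
}

def classify(post: str) -> str:
    tokens = set(post.lower().split())
    # Step 1: pick the coarse group with the most keyword overlap.
    group = max(GROUP_KEYWORDS, key=lambda g: len(tokens & GROUP_KEYWORDS[g]))
    # Step 2: pick the leaf mood only among that group's children,
    # rather than competing against every mood at once (the flat approach).
    return max(MOOD_HIERARCHY[group],
               key=lambda m: len(tokens & LEAF_KEYWORDS[m]))

print(classify("I cry every time, it was terrible"))  # sad
```

The point of the hierarchy is that the second decision is made among far fewer, more closely related labels than in flat classification.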
Another contribution of this work is extracting paraphrases for emotion terms based on the six basic emotions proposed by Ekman (happiness, anger, sadness, disgust, surprise, fear). Paraphrases are different ways to express the same information. Algorithms to extract and automatically identify paraphrases are of interest from both linguistic and practical points of view. Our paraphrase extraction method is based on a bootstrapping algorithm that starts with seed words. Unlike previous work, our algorithm does not require a parallel corpus.
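As a rough illustration of the bootstrapping idea (the corpus, seed words, and promotion threshold below are toy assumptions, not the thesis's data or exact algorithm):

```python
# Sketch of seed-based bootstrapping: find sentences containing known emotion
# words, count co-occurring candidate terms, and promote frequent candidates
# back into the known set for the next round. No parallel corpus is needed.

from collections import Counter

def bootstrap(corpus, seeds, rounds=2, min_count=2):
    known = set(seeds)
    for _ in range(rounds):
        candidates = Counter()
        for sentence in corpus:
            tokens = sentence.lower().split()
            if known & set(tokens):
                # Every other token in a seed-bearing sentence is a candidate.
                candidates.update(t for t in tokens if t not in known)
        # Promote terms that co-occur with seeds often enough.
        known |= {t for t, c in candidates.items() if c >= min_count}
    return known
```

In a realistic setting the candidate scoring would be far more selective (e.g. contextual similarity rather than raw co-occurrence counts), but the expand-and-promote loop is the same.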
In Natural Language Generation (NLG), paraphrasing is employed to create more varied and natural text. In our research, we extract paraphrases for emotions, with the goal of using them to automatically generate emotional texts (such as friendly or hostile texts) for conversations between intelligent agents and characters in educational games.
Nowadays, online services are popular in many areas, such as e-learning, interactive games, educational games, stock markets, and chat rooms. NLG methods can be used to generate more interesting and natural texts for such applications. Generating text with emotions is one of the contributions of our work.
In the last part of this thesis, we give an overview of NLG from an applied system's point of view. We discuss when NLG techniques can be used, and we explain the requirements analysis and specification of NLG systems. We also describe the main NLG tasks of content determination, discourse planning, sentence aggregation, lexicalization, referring expression generation, and linguistic realisation. Moreover, we describe the Authoring Tool that we developed to allow writers without programming skills to automatically generate texts for educational games.
We develop an NLG system that can generate text with different emotions. To do this, we introduce our pattern-based model for generation. Our model starts with initial patterns, then constructs extended patterns from which we choose "final" patterns suitable for generating emotional sentences. A user can generate sentences expressing the desired emotions by using our patterns directly, or can use our Authoring Tool to generate sentences with emotions. The acquired paraphrases are employed by the tool to produce more varied outputs.
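A minimal sketch of what pattern-based generation with emotion paraphrases might look like; the patterns and word lists below are invented for illustration and are not taken from the thesis:

```python
# Hypothetical pattern-based emotion generation: "final" patterns contain
# slots that are filled from emotion-specific paraphrase lists, so the same
# pattern set yields varied sentences for different emotions.

import random

PATTERNS = ["I feel so {emo_word} about {topic}.",
            "This {topic} makes me {emo_word}."]

PARAPHRASES = {"happiness": ["delighted", "thrilled"],
               "anger": ["furious", "outraged"]}

def generate(emotion, topic, rng=random):
    pattern = rng.choice(PATTERNS)
    word = rng.choice(PARAPHRASES[emotion])
    return pattern.format(emo_word=word, topic=topic)

print(generate("anger", "game"))
```

Swapping in a larger, automatically acquired paraphrase table is what lets such a generator scale beyond a handful of hand-written sentences.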
|
842 |
Development and Usability Testing of a Neonatal Intensive Care Unit Physician-Parent Decision Support Tool (PPADS). Weyand, Sabine A (09 August 2011)
This thesis presents the development and evaluation of a computerized physician-parent decision support tool for a neonatal intensive care unit (NICU), known as Physician and Parent Decision Support (PPADS). The NICU is a specialized hospital unit that treats very ill neonates. Many difficult care decisions are made daily for this vulnerable population. The PPADS tool aims to augment current NICU decision-making by helping parents and physicians make more informed decisions, improving physician-parent communication, increasing parent decision-making satisfaction, decreasing conflict, and increasing decision efficiency. The development of the PPADS tool followed a five-step methodology: assessing the clinical environment, establishing the design criteria, developing the system design, implementing the system, and performing usability testing. Usability testing of the PPADS tool was performed with neonatologists and with parents of neonates who have graduated (survived) from a tertiary-level NICU. The usability testing demonstrated the usefulness and ease of use of the tool.
|
843 |
Design and Development of a Quote Validation Tool for Arabic Scripts. Alshareef, Abdulrhman (20 December 2012)
Over the past decade, there has been tremendous development in e-publishing tools. The Arab world's move towards electronic publishing has facilitated the growth of Arabic e-publishing over the Internet. Likewise, it has enabled ordinary users to publish documents, letters, opinions, and ideas with freedom and ease. Although freedom of expression should be guaranteed to everyone, it may be used to disseminate false or distorted information, which can undermine ordinary users' confidence in e-content. Conversely, users' confidence in e-content will increase if the credibility of the content can be established. Many factors challenge this task, including not only the rapid growth of Arabic digital publishing, the absence of control over electronic content, and the lack of e-publishing regulations and laws, but also the difficulty of developing an efficient framework to confirm the authenticity of digital content. Therefore, the need to monitor the credibility of Internet content while maintaining freedom of expression for its users has become an urgent matter of debate. A flexible framework is needed that overcomes these issues and provides a comprehensible and comfortable content validation environment satisfying end users' needs. This thesis proposes a framework that serves to confirm fundamental text authenticity in Arabic scripts on the Internet. It presents the design and development of a new quote verification algorithm and the necessary components of the framework's design, development, and implementation, based on a Service-Oriented Architecture.
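One hedged sketch of how a quote might be checked against a trusted source text: Python's difflib similarity stands in here for whatever matching the thesis's algorithm actually performs, and the threshold is an assumption; real Arabic text would additionally need script-specific normalization.

```python
# Illustrative quote validation: slide a window of the quote's length over the
# trusted source text and keep the best similarity score. A quote is accepted
# as authentic if the best match clears a (here assumed) threshold.

from difflib import SequenceMatcher

def validate_quote(quote, source_text, threshold=0.8):
    words = source_text.split()
    qlen = len(quote.split())
    best = 0.0
    for i in range(max(1, len(words) - qlen + 1)):
        window = " ".join(words[i:i + qlen])
        best = max(best, SequenceMatcher(None, quote, window).ratio())
    return best >= threshold, best

ok, score = validate_quote("to be or not to be",
                           "the question is to be or not to be that is")
print(ok, round(score, 2))  # True 1.0
```

Exposing such a check as a web service call is one natural fit with the Service-Oriented Architecture the thesis adopts.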
|
844 |
MPEG-V Based Web Haptic Authoring Tool. Gao, Yu (31 March 2014)
Nowadays the World Wide Web increasingly provides rich multimedia content to its users. In order to further enhance the experience of web users, researchers have sought solutions to integrate yet another modality into the web experience by augmenting web content with haptic properties. In such applications, users are able to interact with web virtual environments (such as games and e-learning systems) enriched with haptic content. However, it is not easy for designers without a proficient web programming background and basic knowledge of haptics to develop a web application with haptic content enabled. Additionally, there is currently no standard to describe and reuse a well-described haptic application that can be played in web browsers. In this thesis, I present an MPEG-V based authoring tool for facilitating the development of haptics-enabled web applications. The system provides an interface for users to create their own application, add custom 3D models, and modify their graphic and haptic properties. Haptic properties include the specification of the collision detection mechanism and object surface properties, which in turn directly affect the force simulations. Finally, the user is able to export a haptic-enabled 3D scene in a standard MPEG-V format which can be reconstructed in a web haptic player. A detailed experiment is conducted to evaluate the force simulations, the application development process, and the design of the user interface. The results not only verify my proposed methodology, but also show a high level of acceptance of the system by users with all levels of programming knowledge.
|
845 |
Dynamic fixture planning in virtual environments. Kang, Xiu Mei (23 September 2010)
Computer-aided fixture planning (CAFP) is an essential part of computer-aided design and manufacturing (CAD/CAM) integration. Proper fixture planning can dramatically reduce the manufacturing cost, the lead time, and the labor skill requirements in product manufacturing. However, fixture planning is a highly experience-based activity. Due to the extreme diversity and complexity of manufacturing workpieces and processes, few fixture planning tools are available for industrial applications. Moreover, existing CAFP methods rarely consider integrating fixture environmental factors into fixture planning. Automatic fixture planning using virtual reality (VR) can provide a viable approach for industry.
This thesis develops automated approaches to fixture planning in a virtual environment (VE). It addresses two important issues: automatic algorithms for fixture planning, and a VE to support high-fidelity evaluation of fixture planning. The system consists of three parts: fixture assembly planning, feasibility analysis of assembly tools, and motion planning for fixture loading and unloading. The virtual fixture planning system provides the fixture designer with a tool for fixture planning and evaluation. Geometrical algorithms are developed to facilitate the automatic reasoning.
A Web-based VE for fixture planning is implemented. The three-dimensional (3D) model visualization enables the fixture simulation and validation effectively to investigate existing problems. Approaches to construct desktop-based large VEs are also investigated. Cell segmentation methods and dynamic loading strategies are investigated to improve the rendering performance. Case studies of virtual building navigation and product assembly simulations are conducted.
The developed algorithms can successfully generate the assembly plan, validate the assembly tools, and generate moving paths for fixture design and applications. The VE is intuitive and sufficient to support fixture planning, as well as other virtual design and manufacturing tasks.
|
846 |
An energy efficient mass transportation model for Gauteng. Nassiep, Kadri Middlekoop (January 2011)
The demand for forensic social work as a specialist field is increasing rapidly, due to the increasing moral decline of the community and the consequent higher demands placed on generic social workers. Amendments to existing acts, as well as the development of new legislation, have led to more opportunities for the prosecution of perpetrators, and therefore to higher utilization of the forensic social worker.
A need was identified for research into the gaps experienced by social workers, or any other workers, who currently conduct forensic assessments of sexually traumatised children.
The aim of the investigation was to determine which gaps social workers experience in the field when assessing a child forensically.
A recording procedure was used to obtain qualitative as well as quantitative data.
Purposive sampling was used, where interviews were held with five participants to obtain the data. A self-developed questionnaire was used as the measuring instrument.
It is clear from the findings that there are definite gaps within the field of forensic social work, and the need for further research in this field in South Africa is highlighted. / Thesis (MIng (Mechanical Engineering))--North-West University, Potchefstroom Campus, 2012.
|
847 |
Reasons for the non-use of Project Risk Tools and Techniques in the Manufacturing Sector. Rastrelli, Giulio; Ricca, Eugenio (January 2015)
Project Risk Management (PRM) plays an important part in determining project success and is considered an essential activity for companies. The literature provides a vast number of tools and techniques created to help project managers deal with project risks. In practice, however, project managers use few of them. The aim of this research is to understand the reasons for the non-use of PRM tools and techniques by project managers when dealing with risks in the Swedish manufacturing sector. In order to provide evidence on why project managers do not use PRM tools and techniques, this study identifies a list of tools and techniques to investigate, and a list of possible reasons; both lists derive from the existing literature and past research. This qualitative study is based on multiple case studies of seven companies with nine respondents. The companies are based in the Umeå region and operate in the manufacturing sector. The study reveals that project managers within the sample prefer to use qualitative tools and techniques, such as meetings and expert judgement, when dealing with risks. By contrast, most of the tools and techniques for quantitative risk analysis are not used. There is a lack of awareness regarding the existing tools and techniques, and more generally regarding the basic concepts of Risk Management (RM). Project managers tend to rely heavily on intuition and past experience when dealing with project risks. Other reasons that account for the non-use of tools and techniques are a lack of resources and their use being unwarranted for the project type. In some cases project managers may avoid or delay the management of negative risks and therefore do not use tools and techniques. Furthermore, two more reasons emerge from the analysis of the data, which contribute to a better understanding of the non-use of PRM tools and techniques.
These reasons are the small increment in RM quality gained by using PRM tools and techniques, and complacency among project managers when using them. This research extends prior literature by providing evidence on the use and non-use of PRM tools and techniques, and the reasons for their non-use, in a sector where research is lacking. Finally, the two newly discovered reasons can contribute to a better understanding of the existing gap between the theory and practice of RM.
|
848 |
Rekonstruktion av logaritmer med tallinjer som medierande redskap / Reconstructing logarithms using number lines as mediated tools. Fermsjö, Roger (January 2014)
The aim of the research reported in this licentiate thesis was to create an environment that could support students' learning about logarithms. To develop such a learning environment, Davydov's 'learning activity' was used as a theoretical framework for the design. A new tool was created, which the students used to unfold and single out some of the unique properties of logarithms when solving different learning tasks. The construction of the model was inspired by Napier's original idea from 1614, i.e. exactly 400 years ago, of using two number lines: one arithmetic (i.e. based on addition) and one geometric (i.e. based on multiplication). The research approach used was learning study, in which teachers and the researcher worked collaboratively in an iterative process to refine the research lesson. The study was conducted in six groups with six teachers in upper secondary school in a major city in Sweden. The sample comprised about 150 students, and data were collected by filming lessons and by interviews with some of the students. The data were analysed using an analytic framework derived from 'learning activity', and the results show what supports, but also what does not support, the creation of an environment for students' learning of logarithms. The results of the study are related to former research on instrumental/procedural vis-à-vis relational/conceptual understanding, and to research on students' 'errors and misconceptions'. It is argued that the formal definition of logarithms, y = 10^x <-> x = lg y (y > 0), should not be used to introduce the concept; instead, a new way is proposed. One conclusion is that it is possible to reconstruct logarithms without using the definition as a tool. The results from the analysed lessons show how students looked for ways to solve learning tasks using the new tool. The definition and the identities regarding logarithms appear as by-products of the students' learning activity.
When analysing the students' actions, it was found that they rarely over-generalised mathematical rules (e.g. used the distributive law) or separated log expressions (e.g. adding log expressions part by part), issues that former research had highlighted.
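The two-number-line idea behind the lesson model can be illustrated numerically; the base-10 pairing here is a simplifying assumption (Napier's own construction used a different ratio):

```python
# Pairing an arithmetic number line (steps of +1) with a geometric one
# (steps of x10) realises the defining property of logarithms:
# multiplying positions on the geometric line corresponds to adding
# positions on the arithmetic line, i.e. log(a*b) = log(a) + log(b).

arithmetic = list(range(5))                 # 0, 1, 2, 3, 4
geometric = [10 ** n for n in arithmetic]   # 1, 10, 100, 1000, 10000

a, b = geometric[1], geometric[2]           # 10 and 100
# 10 * 100 sits at arithmetic position 1 + 2 = 3 on the geometric line:
print(a * b == geometric[arithmetic[1] + arithmetic[2]])  # True
```

This is precisely the multiplicative-to-additive correspondence that the students can discover from the paired lines without being given the formal definition first.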
|
849 |
Construction of a support tool for the design of the activity structures based computer system architectures. Mohamad, Sabah Mohamad Amin (1986)
This thesis is a rapprochement of diverse design concepts, brought to bear upon the computer system engineering problem of identification and control of highly constrained multiprocessing (HCM) computer machines. It contributes to the area of meta/general systems methodology, and brings new insight into the design formalisms and results afforded by bringing together various design concepts that can be used for the construction of highly constrained computer system architectures. A unique point of view is taken by assuming the process of identification and control of HCM computer systems to be the process generated by the Activity Structures Methodology (ASM). Research in ASM has emerged from neuroscience research, aiming at providing techniques for combining the diverse knowledge sources that capture the 'deep knowledge' of this application field in an effective, formal, and computer-representable form. To apply the ASM design guidelines in the realm of distributed computer system design, we provide new design definitions for the identification and control of such machines in terms of realisations. These realisation definitions characterise the various classes of the identification and control problem. The classes covered consist of: 1. the identification of the designer activities; 2. the identification and control of the machine's distributed structures of behaviour; 3. the identification and control of the conversational environment activities (i.e. the randomised/adaptive activities and interactions of both the user and the machine environments); 4. the identification and control of the substrata needed for the realisation of the machine; and 5. the identification of the admissible design data, both user-oriented and machine-oriented, that can force the conversational environment to act in a self-regulating manner.
All extant results are considered in this context, allowing the development of both necessary conditions for machine identification, in terms of the distributed behaviours as well as the substrata structures of the unknown machine, and sufficient conditions, in terms of experiments on the unknown machine, to achieve self-regulating behaviour. We provide a detailed description of the design and implementation of the support software tool, which can be used to aid the process of constructing effective HCM computer systems based on various classes of identification and control. The design data of a highly constrained system, the NUKE, are used to verify the tool logic as well as the various identification and control procedures. Possible extensions, as well as future work implied by the results, are considered.
|
850 |
Analysis of heat dissipation from railway and automotive friction brakes. Voller, Gordon Paul (2003)
The thesis presents research into the understanding and improvement of heat dissipation from friction brakes. The investigations involved the two brake types considered to be the most thermally loaded, and therefore most challenging: axle-mounted high-speed railway disc brakes and commercial vehicle disc brakes. All three modes of heat transfer (conduction, convection and radiation) and the airflow characteristics have been analysed experimentally and theoretically in order to increase the understanding of heat dissipation. Despite the very practical aspects of this research, a 'generic heat transfer approach' was applied, enabling wider engineering applications of the results. Experimental analyses conducted on a specially developed Spin Rig allowed measurements of cooling and airflow characteristics for different designs. Methodologies have been developed to determine thermal contact resistance, heat transfer coefficients, emissivity, and aerodynamic (pumping) losses. The established values and relationships compared very favourably with the theoretical work. Analytical, FE and CFD analyses were employed to further investigate design variations and perform sensitivity studies. Inertia dynamometer route simulations provided disc temperatures for validation of the overall work. Recommendations have been made for optimising heat dissipation by proposing practically acceptable and economically viable design solutions. A proposed ventilated disc design efficiency ratio allows large, high-speed ventilated disc designs to be efficiently and accurately evaluated and compared, providing a valuable disc design optimisation tool. The determination of the methodologies, parameters and functions defining cooling characteristics enables heat dissipation to be predicted confidently and accurately for brakes and other engineering assemblies at early design stages.
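As a generic illustration of how a measured convective heat transfer coefficient feeds a dissipation estimate (this is Newton's law of cooling with assumed values, not the thesis's model or data):

```python
# Toy convective dissipation estimate: q = h * A * (T_surface - T_ambient).
# The coefficient h, wetted area A, and temperatures below are illustrative
# assumptions, chosen only to show the calculation, not measured brake values.

def convective_power(h, area, t_surface, t_ambient):
    """Heat dissipated by convection, in watts.

    h         -- convective heat transfer coefficient, W/(m^2*K)
    area      -- wetted surface area, m^2
    t_surface -- disc surface temperature, degrees C (or K; only the
                 difference matters)
    t_ambient -- surrounding air temperature, same units as t_surface
    """
    return h * area * (t_surface - t_ambient)

q = convective_power(h=60.0, area=0.25, t_surface=400.0, t_ambient=20.0)
print(q)  # 5700.0 (watts, with these assumed values)
```

In the thesis's setting, h is exactly the quantity determined experimentally on the Spin Rig, which is why establishing it accurately matters for predicting dissipation at the design stage.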
|