161

Anforderungen an die virtuelle Vernetzung von Personen im Alter von 55+ [Requirements for the virtual networking of people aged 55+]

Fahrni, Renate. January 2008 (has links) (PDF)
Master's thesis, University of St. Gallen, 2008.
162

The dominant role of users in the scientific instrument innovation process

January 1975 (has links)
[by] Eric von Hippel. / Includes bibliographical references. / The research reported in this paper was supported by the Office of National R&D Assessment, NSF (Grant no. DA-44366) and the Office of Experimental R&D Incentives, NSF (Grant no. CG-00002).
163

A tree-based measure for hierarchical data in mixed databases

Hassan, Diman January 2016 (has links)
The structure of the data in a mixed database can be a barrier when clustering that database into meaningful groups. A hierarchically structured database necessitates efficient distance measures and clustering algorithms to locate similarities between data objects. Therefore, existing literature proposes hierarchical distance measures to measure the similarities between the records in hierarchical databases. The main contribution of this research is to create and test a new distance measure for large hierarchical databases consisting of mixed data types and attributes, based on an existing tree-based (hierarchical) distance metric, the pq-gram distance metric. Several aims and objectives were pursued to fill a number of gaps in the current body of knowledge. One of these goals was to verify the validity of the pq-gram distance metric when applied to different data sets, and to compare and combine it with a number of different distance measures to demonstrate its usefulness across large mixed databases. To achieve this, further work focused on exploring how to exploit the existing method as a measure of hierarchical data attributes in mixed data sets, and to ascertain whether the new method would produce better results with large mixed databases. For evaluation purposes, the pq-gram metric was applied to The Health Improvement Network (THIN) database to determine if it could identify similarities between the records in the database. After this, it was applied to mixed data to examine different distance measures, including non-hierarchical and other hierarchical measures, and to combine them into a Combined Distance Function (CDF). The CDF improved the results when applied to different data sets, such as the hierarchical National Bureau of Economic Research (NBER) US Patent data set and the mixed THIN data set. The CDF was then modified to create a New-CDF, which uses only the hierarchical pq-gram metric to measure the hierarchical attributes in the mixed data set. The New-CDF worked well, finding the most similar data records when applied to the THIN data set and grouping them in one cluster using the Balanced Iterative Reducing and Clustering using Hierarchies (BIRCH) clustering algorithm. The quality of the clusters was explored using two internal validation indices, Silhouette and C-Index, whose values showed good compactness and quality for the clusters obtained using the new method.
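
For context, the pq-gram distance on which this thesis builds (Augsten et al.) compares two ordered, labelled trees by the overlap of their bags of pq-grams. The sketch below is a minimal, generic Python illustration of that published metric, not the thesis's CDF or its THIN/NBER pipeline; the Node class, the toy trees and the choice p=2, q=3 are assumptions made purely for illustration.

```python
from collections import Counter

class Node:
    """A labelled, ordered tree node (an illustrative structure, not the THIN schema)."""
    def __init__(self, label, children=None):
        self.label = label
        self.children = children or []

def pq_gram_profile(root, p=2, q=3):
    """Return the bag (multiset) of pq-grams of a tree, following Augsten et al."""
    profile = Counter()

    def visit(node, stem):
        stem = stem[1:] + (node.label,)           # shift the ancestor register
        base = ('*',) * q                         # sibling register, padded with dummy nodes
        if not node.children:
            profile[stem + base] += 1
        else:
            for child in node.children:
                base = base[1:] + (child.label,)
                profile[stem + base] += 1
                visit(child, stem)
            for _ in range(q - 1):                # flush the sibling register with dummies
                base = base[1:] + ('*',)
                profile[stem + base] += 1

    visit(root, ('*',) * p)
    return profile

def pq_gram_distance(t1, t2, p=2, q=3):
    """1 - 2*|bag intersection| / (|P1| + |P2|): 0 for identical trees, approaching 1 for disjoint ones."""
    p1, p2 = pq_gram_profile(t1, p, q), pq_gram_profile(t2, p, q)
    shared = sum((p1 & p2).values())
    return 1.0 - 2.0 * shared / (sum(p1.values()) + sum(p2.values()))

# Toy example: two small hierarchies that differ in a single leaf.
a = Node('a', [Node('b'), Node('c', [Node('d')])])
b = Node('a', [Node('b'), Node('c', [Node('e')])])
print(pq_gram_distance(a, b))   # a value strictly between 0 and 1
```

A combined distance function in the spirit of the abstract would then mix this tree distance with, for example, a Gower-style distance over the non-hierarchical attributes, with weights chosen per data set.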
164

Real-time physiological measure and feedback of workload

Maior, Horia Alexandru January 2017 (has links)
Understanding and identifying individuals' capabilities and limitations has always been a challenge within work contexts, but its importance cannot be overstated. Humans have a limited mental capacity [142], which means that they can only perform a finite set of tasks in any given period of time. Identifying these limitations is a key factor in the reduction and prevention of what is referred to as Mental Workload Overload. Measures of mental workload are used in research and industry to evaluate the interaction of users with new systems and tasks. Current techniques involve asking users to subjectively assess and self-report their levels of workload using questionnaires such as NASA-TLX and Instantaneous Self-Assessment (ISA). Subjective measures become particularly important when evaluating more complex systems and tasks, where performance-based measures become very difficult to obtain. Even though they are critical for evaluating such systems, they have certain limitations that cannot be overlooked. Firstly, subjective measures rely on the participants' ability to judge and report their state throughout the task; this requires not only extra effort from the operator, but also skill and potentially training. Secondly, subjective measures, if used in real time, have the potential to interrupt and negatively affect performance; if used post-task, they rely on the operator's ability to recall what happened at particular moments in the past. Direct physiological measures offer an opportunity to capture workload whilst overcoming these limitations. However, new research is needed to understand how physiological data can be interpreted within the context of theories of mental workload. The research presented in this thesis explores the use of one particular physiological approach, functional Near Infrared Spectroscopy (fNIRS), to assess workload in controlled laboratory settings, overcoming these limitations and complementing the use of subjective measures: a measure based on participants' brain and physiological responses to task demand that is independent of the task and the operator, neither interrupting the task nor relying on the operator's skill in self-reporting. We have examined the reliability of the technique and significantly extended our understanding of how artefacts affect recordings during two tasks: a verbal memory task of remembering a seven-digit number and a spatial memory task of remembering a 6x6 grid. Our results showed that artefacts have a significantly different impact during the two types of task, contributing further insights to the existing guidelines for using fNIRS to assess workload in typical human-computer interaction evaluation settings. We have further evaluated the sensitivity of the tool and examined the potential implications of using fNIRS as a real-time measure. Our findings validated fNIRS as a sensitive workload measure, producing results consistent with subjective measures and confirming a correlation between fNIRS and the subjective workload questionnaires NASA-TLX and ISA. Having shown the relationship between fNIRS and workload, the last part of this thesis explores the use of fNIRS as a novel approach to providing users with concurrent feedback on their Mental Workload, based on the measurements obtained objectively from fNIRS. We compare this feedback to traditional methods of asking users to self-assess and report their own mental workload during an Air Traffic Controller simulation game.
In line with previous work, we confirm that self-reporting methods affect both perceived and actual performance. Furthermore, we found that our objective concurrent feedback technique allowed participants to reflect metacognitively on their Mental Workload during tasks, without reducing either actual or perceived performance. fNIRS showed potential to be a useful and reliable additional channel of information about the user during interaction, without further restricting the user in typical evaluation settings. We found it sensitive to workload, able to distinguish between various levels of workload, and with great potential for real-time, continuous use during tasks. Finally, we explored a new direction, using fNIRS's assessment of workload in real time, and investigated how users can use feedback on their current workload state during tasks. This proved to allow users to think metacognitively about their workload during tasks, without negatively affecting their performance or workload.
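
As background on the subjective instruments the thesis contrasts with fNIRS, the sketch below shows how a weighted NASA-TLX workload score is conventionally computed: ratings on six subscales are weighted by the outcome of 15 pairwise comparisons. The participant data are invented for illustration, and the code is not drawn from the thesis.

```python
# The six NASA-TLX subscales.
TLX_DIMENSIONS = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def nasa_tlx_weighted(ratings, pairwise_winners):
    """ratings: dict mapping each dimension to a 0-100 rating.
    pairwise_winners: the dimension judged more important in each of the 15 comparisons.
    Returns the overall weighted workload score (0-100)."""
    assert len(pairwise_winners) == 15, "NASA-TLX uses 15 pairwise comparisons"
    weights = {d: 0 for d in TLX_DIMENSIONS}
    for winner in pairwise_winners:
        weights[winner] += 1   # weight = number of comparisons the dimension won
    return sum(ratings[d] * weights[d] for d in TLX_DIMENSIONS) / 15.0

# Hypothetical participant data, for illustration only.
ratings = {"mental": 75, "physical": 20, "temporal": 60,
           "performance": 40, "effort": 65, "frustration": 55}
winners = (["mental"] * 5 + ["temporal"] * 4 + ["effort"] * 3 +
           ["frustration"] * 2 + ["performance"] * 1)
print(nasa_tlx_weighted(ratings, winners))   # overall weighted workload
```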
165

Genetic algorithms for workforce scheduling and routing problem

Algethami, Haneen January 2017 (has links)
The Workforce Scheduling and Routing Problem (WSRP) is described as the assignment of personnel to visits across various geographical locations. Solving this problem demands tackling scheduling and routing constraints while aiming to minimise the total operational cost. With current computational capabilities, small WSRPs are solvable using exact methods; however, larger instances are difficult to solve. The difficulty of the WSRP is further increased when processing conflicting assignments or dealing with workers' unavailability in customers' areas. Genetic Algorithms (GAs) have proved their effectiveness in this regard, because of their search capability to acquire good solutions in reasonable computational time. A GA consists of many components, which can be chosen and combined in numerous ways. Different GAs have been proposed for solving scheduling and routing problems separately, and when solving WSRP instances it has been quite common to use design components intended for scheduling or routing problems alone. In this thesis, 42 real-world Home Health Care (HHC) planning problem datasets are used as instances of the WSRP. Different GA components, tailored for the combined setting, are presented in this study; this makes major contributions to understanding how GAs work on a challenging real-world problem. The research interests in this work fall into two parts. The first part aims to understand how to employ different genetic operators effectively when solving WSRPs, and to design and select the combination of components that provides the best solutions. Accordingly, seven well-known crossover operators, three mutation operators and eight cost-based operators are implemented, in addition to two repair heuristics to tackle infeasibility. Nevertheless, a direct chromosome representation resulted in poor solutions, indicating a need for components more tailored to this problem. Therefore, an indirect chromosome representation, designed specifically to tackle WSRPs, is presented, with the aim of ensuring the feasibility of initial solutions. Given the quality of its solutions, the GA introduced is considered an effective baseline evolutionary algorithm for the WSRP. This work also suggests that each problem set requires different parameter settings. The second part aims to increase the GA's efficiency. One approach is to investigate the effect of using adaptive components on the quality of WSRP solutions: the aim is to adaptively alter parameter values instead of tuning the algorithm to a specific instance. Three aspects are adjusted during the run according to different rules: operator rates, population size, and the crossover operator itself. Thus, six variations of a diversity-based adaptive GA are presented. Not only does the adaptive GA improve the results, especially for large WSRP scenarios, but it also reduces the computational time. Another aspect investigated is the effect of using a group of crossover operators rather than a single operator throughout the search. Six crossover operators, both well known and problem-specific, are used as part of a multiple-crossover GA framework. To evaluate operator effectiveness, a reinforcement-learning model is developed with three performance measurements. The most successful variant of this algorithm finds the best-known results for the larger problem instances and matches the best-known results for some of the smaller ones.
When this method is combined with the adaptive GA, it provides some of the best results compared to established algorithms. The presented methods have contributed to reducing the operational costs of this constrained combinatorial optimisation problem.
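
To make the adaptive and multiple-crossover ideas above concrete, here is a minimal sketch of probability-matching operator selection, a common way to let a GA adapt which crossover it applies based on the fitness improvements each operator produces. The constants (p_min, decay), the reward rule and the toy crossovers are illustrative assumptions, not the thesis's reinforcement-learning model or its three performance measurements.

```python
import random

class OperatorSelector:
    """Adaptive (probability-matching) choice among crossover operators."""
    def __init__(self, operators, p_min=0.1, decay=0.8):
        self.operators = operators                    # list of crossover callables
        self.quality = {op: 1.0 for op in operators}  # running quality estimate per operator
        self.p_min = p_min                            # selection floor so no operator is starved
        self.decay = decay                            # memory of past rewards

    def probabilities(self):
        total = max(sum(self.quality.values()), 1e-9)
        k = len(self.operators)
        return {op: self.p_min + (1 - k * self.p_min) * q / total
                for op, q in self.quality.items()}

    def select(self):
        probs = self.probabilities()
        return random.choices(list(probs), weights=list(probs.values()))[0]

    def reward(self, op, improvement):
        """Credit an operator with the fitness improvement its offspring achieved."""
        self.quality[op] = self.decay * self.quality[op] + (1 - self.decay) * max(improvement, 0.0)

# Two toy crossovers over list-encoded chromosomes (illustrative only).
def one_point(a, b):
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + b[cut:]

def uniform(a, b):
    return [x if random.random() < 0.5 else y for x, y in zip(a, b)]

selector = OperatorSelector([one_point, uniform])
op = selector.select()                   # pick an operator for this mating
child = op([0, 1, 2, 3], [4, 5, 6, 7])
selector.reward(op, improvement=0.05)    # e.g. parent cost minus child cost
```

Operator rates, population size or mutation strength can be adapted with the same pattern, driven by a population-diversity measure rather than fitness improvement.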
166

Simulation of strategic management process of software projects : a dynamic approach

Masood, Uzzafer January 2014 (has links)
This research presents and validates a simulation model for the strategic management process of software projects. The proposed simulation model maps strategic decisions to parameters of strategic importance and links them to project management plans. Hence, the proposed simulation model is a complete framework for the analysis and selection of strategic decisions for the development of software projects. The proposed framework integrates critical elements of software development projects, i.e. risk assessment, cost estimation and project management planning, for the analysis of strategic decisions, which helps in choosing, among the various strategic alternatives for a project, the decision that best suits an organization's requirements. The simulation model captures the effects of strategic decisions on parameters of software projects in dynamic settings, during the simulation of different phases of development. The dynamic variations in project parameters affect project management plans, and capturing these variations of strategic parameters in dynamic settings brings out critical information about strategic decisions for effective project management planning. This research argues that risk measurement and contingency estimation are fundamental, in addition to risk assessment and cost estimation, for the strategic management of software development projects; therefore, risk measure and contingency estimation models are developed for software projects. The proposed simulation model is generic, i.e. it has generic components with plug-and-play interfaces; hence, it is independent of any particular risk assessment, cost estimation, risk measurement or contingency estimation model, and of any project management tool for software development projects. This research presents a successful case study which shows that different strategic management decisions produce different sets of risk and cost options, as well as different project management plans, for the development of software projects.
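
The abstract does not spell out its risk-measure or contingency models, so the following is only a generic illustration of one common approach to contingency estimation, Monte Carlo simulation over a register of discrete cost risks; the risk register, impact distributions and confidence level are invented for the example and are not the models proposed in the thesis.

```python
import random

def simulate_contingency(base_cost, risks, confidence=0.8, runs=10_000):
    """Monte Carlo sketch of cost contingency.
    risks: list of (probability, impact_low, impact_high) tuples.
    Returns the contingency = simulated cost at the chosen confidence level minus the base estimate."""
    totals = []
    for _ in range(runs):
        total = base_cost
        for prob, low, high in risks:
            if random.random() < prob:              # the risk occurs in this run
                total += random.uniform(low, high)  # sample its cost impact
        totals.append(total)
    totals.sort()
    at_confidence = totals[int(confidence * runs) - 1]
    return at_confidence - base_cost

# Hypothetical project: 100 units of base cost and three cost risks.
risks = [(0.3, 5, 15), (0.5, 2, 8), (0.1, 20, 40)]
print(simulate_contingency(100, risks))   # contingency at the 80% confidence level
```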
167

Robust execution of belief-desire-intention-based agent programs

Yao, Yuan January 2017 (has links)
Belief-Desire-Intention (BDI) agent systems are a popular approach to building intelligent agents for complex and dynamic domains. In the BDI approach, agents select plans to achieve their goals based on their beliefs. When BDI agents pursue multiple goals in parallel, the interleaving of steps in different plans to achieve goals may result in conflicts, e.g., where the execution of a step in one plan makes the execution of a step in another concurrently executing plan impossible. Conversely, plans may also interact positively with each other, e.g., where the execution of a step in one plan assists the execution of a step in other concurrently executing plans. To avoid negative interactions and exploit positive interactions, an intelligent agent should have the ability to reason about the interactions between its intended plans. We propose SAM, an approach to scheduling the progression of an agent’s intentions (intended plans) based on Monte-Carlo Tree Search and its variant Single-Player Monte-Carlo Tree Search. SAM is capable of selecting plans to achieve an agent’s goals and interleaving the execution steps in these plans in a domain-independent way. In addition, SAM also allows developers to customise how the agent’s goals should be achieved, and schedules the progression of the agent’s intentions in a way that best satisfies the requirements of a particular application. To illustrate the flexibility of SAM, we show how our approach can be configured to prioritise criteria relevant in a range of different scenarios. In each of these scenarios, we evaluate the performance of SAM and compare it with previous approaches to intention progression in both synthetic and real-world domains.
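
SAM's intention-progression machinery is not reproduced here, but the core search procedure the abstract names, Monte-Carlo Tree Search with UCT selection, can be sketched generically as below. The state interface (actions, step, is_terminal, reward) and the constants are assumptions made for illustration, not SAM's actual scheduling model.

```python
import math
import random

class MCTSNode:
    """A search-tree node; `state` is any object exposing actions(), step(action) -> successor,
    is_terminal() and reward(). Terminal states are expected to return an empty action list."""
    def __init__(self, state, parent=None, action=None):
        self.state = state
        self.parent = parent
        self.action = action                  # action that led here from the parent
        self.children = []
        self.untried = list(state.actions())  # actions not yet expanded
        self.visits = 0
        self.value = 0.0                      # sum of rollout rewards

    def uct_child(self, c=1.4):
        # UCB1: balance average reward (exploitation) against rarely visited children (exploration).
        return max(self.children,
                   key=lambda ch: ch.value / ch.visits
                                  + c * math.sqrt(math.log(self.visits) / ch.visits))

def mcts(root_state, iterations=1000):
    """Return the root action judged best after the given number of simulations
    (assumes the root state is non-terminal)."""
    root = MCTSNode(root_state)
    for _ in range(iterations):
        node = root
        # 1. Selection: descend while fully expanded and not a leaf.
        while not node.untried and node.children:
            node = node.uct_child()
        # 2. Expansion: add one child for a previously untried action.
        if node.untried:
            action = node.untried.pop(random.randrange(len(node.untried)))
            child = MCTSNode(node.state.step(action), parent=node, action=action)
            node.children.append(child)
            node = child
        # 3. Simulation: random rollout to a terminal state.
        state = node.state
        while not state.is_terminal():
            state = state.step(random.choice(state.actions()))
        outcome = state.reward()
        # 4. Backpropagation: update statistics along the selected path.
        while node is not None:
            node.visits += 1
            node.value += outcome
            node = node.parent
    return max(root.children, key=lambda ch: ch.visits).action
```

In an intention-scheduling setting along the lines the abstract describes, a state would capture the agent's current set of intentions, an action would progress one step of one intention, and the reward would score how well the resulting execution achieves the agent's goals.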
168

Gifting personalised trajectories in museums and galleries

Fosh, Lesley January 2016 (has links)
The designers of digital technologies for museums and galleries are increasingly interested in facilitating rich interpretations of a collection’s exhibits that can be personalised to meet the needs of a diverse range of individual visitors. However, it is commonplace to visit these settings in small groups, with friends or family. This sociality of a visit can significantly affect how visitors experience museums and their objects, but current guides can inhibit group interaction, especially when the focus is on personalisation towards individuals. This thesis develops an approach to tackling the combined challenge of fostering rich interpretation, delivering personalised content and supporting a social visit. Three studies were undertaken in three different museum and gallery settings. A visiting experience was developed for pairs of visitors to a sculpture garden, drawing upon concepts from the trajectories framework (Benford et al., 2009). Next, a study at a contemporary art gallery investigated how gift-giving could be used as a mechanism for personalisation between visitors who know each other well. Finally, the third study, at an arts and history museum, explored how gift-giving could be applied to small groups of friends and family. The thesis reports on how the approach enabled visitors to design highly personal experiences for one another and analyses how groups of visitors negotiated these experiences together in the museum visit, to reveal how this type of self-design framework for engaging audiences in a socially coherent way leads to rich, stimulating visits for the whole group and each individual member. The thesis concludes by recommending the design and gifting of museum and gallery interpretation experiences as a method for providing deeply personalised experiences, increasing visitor participation, and delivering meaningful group experiences.
169

Quotient inductive-inductive definitions

Dijkstra, Gabe January 2017 (has links)
In this thesis we present a theory of quotient inductive-inductive definitions, which are inductive-inductive definitions extended with constructors for equations. The resulting theory is an improvement over previous treatments of inductive-inductive and indexed inductive definitions in that it unifies and generalises these into a single framework. The framework can also be seen as a first approximation towards a theory of higher inductive types, but done in a set-truncated setting. We give the type of specifications of quotient inductive-inductive definitions mutually with its interpretation as categories of algebras. A categorical characterisation of the induction principle is given and is shown to coincide with the property of being an initial object in the categories of algebras. From the categorical characterisation of induction, we derive a more type-theoretic induction principle for our quotient inductive-inductive definitions that looks like the usual induction principles. The existence of initial objects in the categories of algebras associated to quotient inductive-inductive definitions is established for a class of definitions. This is done by a colimit construction that can be carried out in type theory itself in the presence of natural numbers, sum types and quotients (or, equivalently, coequalisers).
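
As a concrete illustration of the kind of definition being treated (standard examples from the literature, not necessarily the thesis's own running examples): finite multisets form a quotient inductive type, a single sort with point constructors plus an equation constructor, while contexts and types form the classic inductive-inductive pair of sorts; a quotient inductive-inductive definition combines both features, sorts indexed over one another together with equation constructors.

```latex
% Finite multisets over A: a quotient inductive type (one sort, one equation constructor).
\begin{aligned}
  &\mathsf{Mset}\,A : \mathsf{Set} \\
  &[\,] : \mathsf{Mset}\,A \\
  &\_{::}\_ : A \to \mathsf{Mset}\,A \to \mathsf{Mset}\,A \\
  &\mathsf{swap} : \textstyle\prod_{a,b:A}\prod_{xs:\mathsf{Mset}\,A}\;
      a :: b :: xs = b :: a :: xs
\end{aligned}

% Contexts and types: the classic inductive-inductive pair of sorts; adding
% equation constructors to such a pair yields a quotient inductive-inductive definition.
\begin{aligned}
  &\mathsf{Con} : \mathsf{Set} \qquad \mathsf{Ty} : \mathsf{Con} \to \mathsf{Set} \\
  &\epsilon : \mathsf{Con} \qquad
   \_\rhd\_ : \textstyle\prod_{\Gamma:\mathsf{Con}} \mathsf{Ty}\,\Gamma \to \mathsf{Con} \\
  &\iota : \textstyle\prod_{\Gamma:\mathsf{Con}} \mathsf{Ty}\,\Gamma \qquad
   \pi : \textstyle\prod_{\Gamma:\mathsf{Con}}\prod_{A:\mathsf{Ty}\,\Gamma}
       \mathsf{Ty}\,(\Gamma \rhd A) \to \mathsf{Ty}\,\Gamma
\end{aligned}
```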
170

Exploring the use of brain-sensing technologies for natural interactions

Pike, Matthew January 2017 (has links)
Recent technical innovation in the field of Brain-Computer Interfaces (BCIs) has increased the opportunity for including physical, brain-sensing devices as a part of our day-to-day lives. The potential for obtaining a time-correlated, direct, brain-based measure of a participant's mental activity is an alluring and important development for HCI researchers. In this work, we investigate the application of BCI hardware for answering HCI-centred research questions, in turn fusing the two disciplines to form an approach we name Brain-based Human-Computer Interaction (BHCI). We investigate the possibility of using BHCI to provide natural interaction: an ideal form of HCI where communication between human and machine is indistinguishable from everyday forms of interaction such as speaking and gesturing. We present the development, execution and output of three user studies investigating the application of BHCI. We evaluate two technologies, fNIRS and EEG, and investigate their suitability for supporting BHCI-based interactions. Through our initial studies, we identify that the lightweight and portable attributes of EEG make it preferable for use in developing natural interactions. Building upon this, we develop an EEG-based cinematic experience exploring natural forms of interaction through the mind of the viewer. In studying the viewers' responses to this experience, we were able to develop a taxonomy of control based on how viewers discovered and exerted control over the experience.
