  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Intelligent interface design for a question answering system

Antonio, Nicholas. January 2001 (has links)
Thesis (M.S.)--University of Florida, 2001. / Title from title page of source document. Document formatted into pages; contains x, 58 p.; also contains graphics. Includes vita. Includes bibliographical references.
12

Theological method and the research programme of Imre Lakatos

Gross, Don Paul. January 1992 (has links)
Thesis (M.A.)--Trinity Evangelical Divinity School, 1992. / Abstract. Includes bibliographical references (leaves 97-100).
13

Analytische und dialektische Vernunft: Versuch einer Entwicklung der Begriffe von Kant bis zur Gegenwart am Leitfaden der Philosophiegeschichte [Analytic and dialectical reason: an attempt to develop the concepts from Kant to the present along the guiding thread of the history of philosophy]

Cassing, Wolfgang. January 1978 (has links)
Thesis--Münster. / Vita. Bibliography: p. 293-296.
14

Framework for an expert system generator

Cernik, Jacob A., January 2009 (has links)
Thesis (M.S.)--University of Akron, Dept. of Computer Science, 2009. / "May, 2009." Title from electronic thesis title page (viewed 11/18/2009). Advisor, Chien-Chung Chan; Committee members, Kathy J. Liszka, Zhong-Hui Duan; Department Chair, Wolfgang Pelz; Dean of the College, Chand Midha; Dean of the Graduate School, George R. Newkome. Includes bibliographical references.
15

Unified GUI adaptation in Dynamic Software Product Lines

Kramer, Dean January 2014 (has links)
In the modern world of mobile computing and ubiquitous technology, society is able to interact with technology in new and fascinating ways. To help provide an improved user experience, mobile software should be able to adapt itself to suit the user. By monitoring context information based on the environment and user, an application can better meet the user's dynamic requirements. Similarly, programs can require different static changes to suit static requirements. This program commonality and variability can benefit from Software Product Line Engineering, which reuses artefacts over a set of similar programs, called a Software Product Line (SPL). Historically, SPLs have been limited to static compile-time adaptation. Dynamic Software Product Lines (DSPLs), however, allow the program configuration to change at runtime, allowing compile-time and runtime adaptation to be developed in a single unified approach. While DSPLs currently provide methods for handling program logic adaptation, variability in the Graphical User Interface (GUI) has largely been neglected; as a result, different approaches are required depending on when GUI adaptation is to be applied. The main goal of this work is to extend a unified representation of variability to the GUI, whereby GUI adaptation can be applied both at compile time and at runtime. In this thesis, an approach to handling GUI adaptation within DSPLs is presented, providing a unified representation of GUI variability. The approach is based on Feature-Oriented Programming (FOP), enabling developers to implement GUI adaptation along with program logic in feature modules, and is applied to Document-Oriented GUIs, also known as GUI description languages. In addition to GUI unification, we present an approach to unifying context and feature modelling, handling context dynamically at runtime as features of the DSPL. This unification can allow for more dynamic and self-aware context acquisition. To validate our approach, we implemented tool support and middleware prototypes. These artefacts are then tested using a combination of scenarios and scalability tests, which first demonstrates the versatility and relevance of the different aspects of the approach, and then gives insight into how the approach scales with DSPL size.
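To make the unified compile-time/runtime idea concrete, here is a minimal sketch (Python is used purely for illustration; the thesis works with FOP feature modules and GUI description languages, and every name below is hypothetical):

```python
# Minimal sketch of feature modules superimposing changes on a GUI model.
# All names are invented for illustration; this is not the thesis's tooling.
import copy
from dataclasses import dataclass
from typing import Callable, Dict, List, Set

GuiModel = Dict[str, object]  # stand-in for a document-oriented GUI description

@dataclass
class Feature:
    name: str
    refine: Callable[[GuiModel], None]  # superimposes this feature's GUI changes

def compose(base: GuiModel, features: List[Feature], enabled: Set[str]) -> GuiModel:
    """Apply the refinements of all enabled features to the base GUI.

    The same composition step can run at build time (static SPL
    configuration) or at runtime (DSPL reconfiguration): the 'unified'
    representation the abstract describes."""
    gui = copy.deepcopy(base)
    for f in features:
        if f.name in enabled:
            f.refine(gui)
    return gui

base_gui: GuiModel = {"widgets": ["map_view"], "theme": "light"}
night_mode = Feature("night_mode", lambda g: g.update(theme="dark"))
voice_input = Feature("voice_input", lambda g: g["widgets"].append("mic_button"))

# Static configuration at build time, then a runtime reconfiguration:
gui = compose(base_gui, [night_mode, voice_input], enabled={"night_mode"})
gui = compose(base_gui, [night_mode, voice_input], enabled={"night_mode", "voice_input"})
```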
16

Distributed collaborative context-aware content-centric workflow management for mobile devices

Kocurova, Anna January 2013 (has links)
Ubiquitous mobile devices have become a necessity in today's society, opening new opportunities for interaction and collaboration between geographically distributed people. With the increased use of mobile phones, people can collaborate while on the move. Collaborators expect technologies that enhance their teamwork and respond to their individual needs. Workflow is a widely used technology that supports collaboration and can be adapted for a variety of collaborative scenarios. Although workflow technology, originally computer-based, has expanded onto mobile devices, there are still research challenges in the development of user-focused, device-oriented collaborative workflows. As opposed to desktop computers, mobile devices provide a different, more personalised user experience and are carried by their owners everywhere. Mobile devices can capture user context and behave as digitalised user complements. By integrating context awareness into workflow technology, workflow decisions can be based on local context information and therefore be better adapted to individual collaborators' circumstances and expectations. Knowing the current context of collaborators and their mobile devices is useful, especially in mobile peer-to-peer collaboration, where workflow process execution can be driven by devices according to the situation. In mobile collaboration, team workers share pictures, videos, and other content. Monitoring and exchanging information on the current state of the content processed on devices can enhance the overall workflow execution. As mobile devices in peer-to-peer collaboration are not aware of a global workflow state, content state information can be used to communicate progress among collaborators. However, there is still a lack of support for integrating content lifecycles in process-oriented workflows. The aim of this research was therefore to investigate how workflow technology can be adapted for mobile peer-to-peer collaboration; in particular, how the level of context awareness in mobile collaborative workflows can be increased and how extra content lifecycle management support can be integrated. The collaborative workflow technology has been adapted for mobile peer-to-peer collaboration by integrating context and content awareness. First, a workflow-specific context management approach has been developed that allows workflow-specific context models to be defined and supports the integration of context models with collaborative workflows; the workflow process has been adapted to make decisions based on context information. Second, extra content management support has been added to the workflow technology: a representation for content lifecycles has been designed, and content lifecycles have been integrated with the workflow process. In this thesis, the MobWEL workflow approach is introduced. The MobWEL workflow approach allows mobile context-aware, content-centric workflows to be defined, managed and executed. MobWEL is a workflow execution language that extends BPEL, using constructs from the existing workflow approaches Context4BPEL and BPELlight, and adopting elements from the BALSA workflow model. The MobWEL workflow management approach is a technology-based solution designed to provide workflow management support to a specific class of mobile applications.
17

On the performance of markup language compression

Kheirkhahzadeh, Antonio January 2015 (has links)
Data compression is used in everyday life to improve computer interaction or simply for storage purposes. Lossless data compression refers to those techniques that can compress a file in such a way that the decompressed output is an exact replica of the original. These techniques, unlike lossy data compression, are necessary and heavily used to reduce resource usage and improve storage and transmission speeds. Prior research has led to huge improvements in compression performance and efficiency for general-purpose tools, which are mainly based on statistical and dictionary encoding techniques. Extensible Markup Language (XML) is based on redundant data, which general-purpose compressors parse as normal text. Several tools for compressing XML data have been developed, improving compression size and speed using different compression techniques; these tools are mostly based on algorithms that rely on variable-length encoding. XML Schema is a language used to define the structure and data types of an XML document. As a result, it provides XML compression tools with additional information that can be used to improve compression efficiency; XML Schema is also used for validating XML data. For document compression there is a need to generate the schema dynamically for each XML file, a solution that can be applied to improve the efficiency of XML compressors. This research investigates a dynamic approach to compressing XML data using a hybrid compression tool. This model allows XML data to be compressed using variable- and fixed-length encoding techniques whenever their best use cases are triggered. The aim of this research is to investigate the use of fixed-length encoding techniques to support general-purpose XML compressors. The results demonstrate the possibility of improving on compression size when a fixed-length encoder is used to compress most XML data types.
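As a rough illustration of why schema information lets fixed-length encoding beat a purely textual representation for typed values, consider this minimal sketch (illustrative only; it is not the hybrid tool described above, and the element name is invented):

```python
# A typed XML value as text versus a schema-informed fixed-length encoding.
import struct

text_form = "<temperature>-1734</temperature>"  # hypothetical element
print(len(text_form.encode("utf-8")))           # 32 bytes as XML text

# If the schema declares <temperature> as xs:int, the value fits in a
# fixed 4-byte field, and the tag itself need not be stored per value:
fixed_form = struct.pack(">i", -1734)
print(len(fixed_form))                          # 4 bytes
```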
18

Model driven validation approach for enterprise architecture and motivation extensions

Essien, Joe January 2015 (has links)
As the endorsement of Enterprise Architecture (EA) modelling continues to grow in diversity and complexity, management of its schema, artefacts, semantics and relationships has become an important business concern. To maintain agility and flexibility within competitive markets, organizations have also been compelled to explore ways of adjusting proactively to innovations, changes and complex events, in part by using EA concepts to model business processes and strategies. The need to ensure appropriate validation of EA taxonomies has thus repeatedly been identified as an essential requirement for these processes, in order to express business motivation and relate information systems to technological infrastructure. However, since many taxonomies deployed today use widespread and disparate modelling methodologies, adopting a generic validation approach remains a challenge. The proliferation of EA methodologies and perspectives has also led to intricacies in the formalization and validation of EA constructs, as models often have variant schematic interpretations. Thus, disparate implementations and inconsistent simulation of alignment between business architectures and heterogeneous application systems are common within the EA domain (Jonkers et al., 2003). In this research, the Model Driven Validation Approach (MDVA) is introduced. MDVA allows EA to be modelled with validation attributes, the validation concepts to be formalized, and model artefacts to be transformed to ontologies. The transformation simplifies querying based on motivation and constraints. As the extended methodology is grounded on the semiotics of existing tools, validation is executed using a ubiquitous query language. The major contributions of this work are the extension of the Business Layer metamodel of an Enterprise Architecture Framework (EAF) with a Validation Element, and the development of an EAF model-to-ontology transformation approach. With this innovation, domain-driven design and object-oriented analysis concepts are applied to achieve validation of EAF models using an ontology querying methodology. Additionally, MDVA facilitates the traceability of EA artefacts using ontology graph patterns.
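The transform-then-query pattern the abstract describes can be sketched with a small, hypothetical example (rdflib is used for illustration; the vocabulary below is invented and is not MDVA's actual metamodel):

```python
# Sketch: a business-layer element is transformed into ontology triples
# carrying a validation attribute, and validation becomes a query.
from rdflib import Graph, Namespace, RDF

EA = Namespace("http://example.org/ea#")  # hypothetical namespace
g = Graph()

g.add((EA.ProcessOrders, RDF.type, EA.BusinessProcess))
g.add((EA.ProcessOrders, EA.realisesGoal, EA.IncreaseRevenue))
g.add((EA.ShipGoods, RDF.type, EA.BusinessProcess))  # traces to no goal

# Motivation-traceability check: find processes that realise no goal.
query = """
SELECT ?p WHERE {
  ?p a ea:BusinessProcess .
  FILTER NOT EXISTS { ?p ea:realisesGoal ?goal . }
}
"""
for row in g.query(query, initNs={"ea": EA}):
    print("fails motivation traceability:", row.p)
```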
19

On the performance of emerging wireless mesh networks

Bagale, Jiva Nath January 2015 (has links)
Wireless networks are increasingly used within pervasive computing. The recent development of low-cost sensors, coupled with the decline in prices of embedded hardware and improvements in low-power, low-rate wireless networks, has made them ubiquitous. The sensors are becoming smaller and smarter, enabling them to be embedded inside tiny hardware, and they are already being used in areas such as health care, industrial automation and environment monitoring. The data to be communicated can thus include room temperature, heartbeats, users' activities or seismic events. Such networks have been deployed in a wide range of areas and at various scales, from a couple of sensors inside the human body to hundreds of sensors monitoring the environment. The sensors are capable of generating a huge amount of information when data is sensed regularly. This information has to be communicated to a central node in the sensor network or to the Internet. A sensor may be connected directly to the central node, but it may also be connected via other sensor nodes acting as intermediate routers/forwarders. The bandwidth of a typical wireless sensor network is already small, and the use of forwarders to pass data to the central node decreases the network capacity even further; wireless networks also suffer from high packet loss ratios on top of their low bandwidth. The data transfer time from the sensor nodes to the central node increases with network size, so it becomes challenging to communicate sensed data regularly, especially as the network grows, and very difficult to create a scalable sensor network that can do so. The problem can be tackled either by improving the available network bandwidth or by reducing the amount of data communicated in the network. It is not possible to improve the network bandwidth, as power limitations on the devices restrict the use of faster network standards. Nor is it acceptable to reduce the quality of the sensed data, losing information before communication. However, the data can be modified without losing any information using compression techniques, and the processing power of embedded devices is improving enough to make this possible. In this research, the challenges and impacts of data compression on embedded devices are studied with the aim of improving the network performance and scalability of sensor networks. To evaluate this, messaging protocols suitable for embedded devices are first studied and a messaging model for communicating sensor data is determined. Then, data compression techniques that can be implemented on devices with limited resources and are suitable for compressing typical sensor data are studied. Although compression can reduce the amount of data to be communicated over a wireless network, the time and energy costs of the process must be considered to justify the benefits: the combined compression and data transfer time must be smaller than the uncompressed data transfer time, and the compression and data transfer process must consume less energy than the uncompressed data transfer process. Network communication is known to be more expensive than on-device computation in terms of energy consumption. A data sharing system is created to study the time and energy consumption trade-offs of compression techniques, and a mathematical model is used to study the impact of compression on the overall network performance of sensor networks at various scales.
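The time side of that trade-off reduces to a simple inequality; the sketch below works it through with invented, purely illustrative numbers (none come from the thesis):

```python
# Compression pays off when
#   t_compress + size_compressed / bandwidth < size_raw / bandwidth
# (and analogously for energy). All values here are assumptions.

size_raw = 10_000          # bytes of sensed data
ratio = 0.4                # compressed/raw size, assumed
bandwidth = 250_000 / 8    # bytes/s, an IEEE 802.15.4-class link
t_compress = 0.08          # seconds of on-device CPU time, assumed

t_send_raw = size_raw / bandwidth
t_send_compressed = t_compress + (size_raw * ratio) / bandwidth

print(f"send raw:        {t_send_raw:.3f} s")         # 0.320 s
print(f"compress + send: {t_send_compressed:.3f} s")  # 0.208 s
print("compression worthwhile:", t_send_compressed < t_send_raw)
```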
20

An empirical examination of interdisciplinary collaboration within the practice of localisation and development of international software

Ressin, Malte January 2015 (has links)
Acceptance on international markets is an important selling proposition for software products and a key to new markets. The adaptation of software products for specific markets is called software localisation. Practitioner reports and research suggest that the activities of developers and translators do not mesh seamlessly, leading to problems such as disproportionate cost, lack of quality, and delayed product release. Yet there is little research on localisation as a comprehensive activity and on its human factors. This thesis examines how software localisation is handled in practice, how the localisation process is integrated into development, and how software developers and localisers work individually and collaboratively on international software. The research aims to understand how the above-mentioned issues of cost, quality and time arise. Qualitative and quantitative data were gathered through semi-structured interviews and an online survey. The interviews focused on the individual experiences of localisation and development professionals in a range of relevant roles. The online survey measured cultural competence, attitude towards and self-efficacy in localisation, and properties of localisation projects. Interviews were conducted and analysed following Straussian Grounded Theory, and the survey was statistically analysed to test a number of hypotheses regarding differences between localisers and developers, as well as relationships between project properties and software quality. Results suggest gaps in knowledge, procedure and motivation between developers and translators, as well as a lack of cross-disciplinary knowledge and coordination. Further, a grounded theory of interdisciplinary collaboration in software localisation explains how collaboration strategies and conflicts reciprocally affect each other and are affected by external influences. A number of statistically significant differences between developers and localisers, and the relevance of certain project properties to localisation, were confirmed. The findings give new insights into interdisciplinary issues in the development of international software and suggest new ways to handle interdisciplinary collaboration in general.
