1

Architecting the safety assessment of large-scale systems integration

Tong, Choon Yin. January 2009 (PDF)
Thesis (M.S. in Systems Engineering and Analysis)--Naval Postgraduate School, December 2009. / Thesis Advisor(s): Paulo, Eugene. Second Reader: Rhoades, Mark. "December 2009." Description based on title screen as viewed on January 27, 2010. Author(s) subject terms: Systems integration, System safety, System-of-Systems safety. Includes bibliographical references (p. 51-52). Also available in print.
2

A semantics-based approach to processing formal languages

Wang, Qian. January 2007
Thesis (Ph.D.)--University of Texas at Dallas, 2007. / Includes vita. Includes bibliographical references (leaves 137-146).
3

Ontology alignment: bridging the semantic gap

Ehrig, Marc. January 2007 (PDF)
Also published as: Karlsruhe, Univ., dissertation, 2006.
4

A federated approach to enterprise integration

Fernandez, George. January 2006
Thesis (Ph.D.)--Swinburne University of Technology, Faculty of Information & Communication Technologies, 2006. / A thesis submitted in total fulfillment of the requirements for the degree of Doctor of Philosophy, Faculty of Information and Communication Technologies, Swinburne University of Technology, 2006. Typescript. Bibliography p. 194-201.
5

Ontology alignment: bridging the semantic gap

Ehrig, Marc. January 2007
Karlsruhe, Univ., dissertation, 2005. / Bibliography p. [227]-243.
6

Clearwater: an extensible, pliable, and customizable approach to code generation

Swint, Galen Steen. January 2006
Thesis (Ph. D.)--Computing, Georgia Institute of Technology, 2007. / Calton Pu, Committee Chair ; Ling Liu, Committee Member ; Karsten Schwan, Committee Member ; Olin Shivers, Committee Member ; Donald F. Ferguson, Committee Member.
7

Achieving customer data integration through master data management in enterprise information management

Lerche, Stephen 19 June 2014
M.Com. (Business Management) / Data, and the use thereof, are considered to be a source of competitive advantage for organisations. To achieve this, data needs to be managed appropriately, and existing literature considers enterprise information management (EIM) to be the foundation for organisations to manage information successfully. (For the purposes of this dissertation, data and information will be treated as analogous concepts.) Key contributing factors to the success of EIM have been identified as ensuring that data governance is in place and that there is a focus on data quality. Within EIM, a key set of data that must be managed effectively is customer data. In many organisations – including the financial services organisation that is the focus of this study – customer data is held in disparate systems across the organisation. Creating a single view of how customers interact with an organisation is deemed of crucial importance for future organisational growth, and the initiatives that organisations undertake to create this view are referred to in the literature as customer data integration (CDI). For CDI to be successful, master data management (MDM) needs to be addressed; this ensures that core data is managed consistently across the disparate systems. This study sought to determine how the concepts of EIM, CDI and MDM were being applied in the organisation under review, and how closely this application matched the recommendations of the literature. In addition, the study sought to uncover additional factors that had an impact on customer data integration, and on information management in general, in the organisation. The findings show that the way information management is addressed in the organisation in question is consistent with the literature in some areas – primarily the importance of a single view of customer and the supporting roles of information governance and data quality – but divergent in others, the key area being that there is no EIM strategy in the organisation that drives a consolidated approach to information management. Organisational culture was also highlighted by the literature as a critical influencer of how information is managed, and this was supported by the findings. Additional factors found to have a significant influence on data management, which were not highlighted by the literature, included the importance of processes and, especially for CDI, the critical role played by legislation, in particular the Financial Intelligence Centre Act (FICA). A further crucial factor, again not highlighted by the literature, is the difficulty organisations have in placing an actual financial value on the use of information. Although it is intrinsically understood that information is valuable, the difficulty in ascribing an explicit value to it is a key inhibitor (in conjunction with organisational culture) to the organisation initiating data management projects at a strategic level; instead, it has to address data management as a component of projects driven by individual business units.
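By way of illustration (not part of the dissertation; the records, field names, and survivorship rule below are invented), a minimal Python sketch of the MDM-style consolidation that CDI relies on: customer records for the same person, held in disparate systems, are merged into a single "golden record", i.e. the single view of customer.

```python
from datetime import date

# Toy customer records for the same person, held in two disparate systems.
records = [
    {"system": "crm", "id_number": "7501010000000", "name": "J. Smith",
     "email": None, "updated": date(2013, 5, 1)},
    {"system": "billing", "id_number": "7501010000000", "name": "Jane Smith",
     "email": "jane.smith@example.com", "updated": date(2014, 2, 10)},
]

def golden_record(recs):
    """Merge records matched on a shared key (here a national ID number):
    for each attribute, keep the most recently updated non-empty value."""
    merged = {}
    for rec in sorted(recs, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if field != "system" and value is not None:
                merged[field] = value  # newer records overwrite older values
    return merged

print(golden_record(records))
# {'id_number': '7501010000000', 'name': 'Jane Smith',
#  'email': 'jane.smith@example.com', 'updated': datetime.date(2014, 2, 10)}
```

The matching key and the "most recent non-empty value wins" rule are deliberate simplifications; real MDM platforms use probabilistic matching and configurable survivorship rules per attribute.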
8

Integration of Service-Oriented Embedded Systems with External Systems in Software Product Lines

Johansson, Nils January 2024
Developing software for complicated systems is often done collaboratively and consists of deliverables from a multitude of organisations. The deliverables can range from smaller devices and commercial-off-the-shelf software components to larger systems. This is the situation during the development of the embedded systems of large vehicles or machines. Many companies within the embedded industry are transitioning to service orientation to develop high-quality software and reduce costs. However, when integrating different external systems with an internal, service-oriented system, difficulties may arise, since the communication patterns, i.e. the interfaces, cannot be changed to fit the internal system. This study aims to develop a design solution that can be used to integrate different external systems with an internally developed service-oriented system in an entire software product line, including the handling of variability by parametrization. The solution is evaluated by software developers at a company in such a situation. To develop the design solution, design science methodology is applied: an iterative process that continuously improves the candidate solution until it is satisfactory according to various stakeholders. The resultant design solution includes wrapper-based interaction between systems, where so-called adapters are used when the internal system acts as a client to an external system, and gateways are used when the internal system acts as a server to an external system. We also observe the need for a system integration view to describe the relations and available communication mechanisms between systems, i.e. the gateways and adapters. We conclude that, to integrate a service-oriented software system with non-service-oriented systems, there can be benefits to using an abstraction layer between systems to protect the internally developed software architecture from being affected by the nature of the external system. Attempting to integrate external systems with an internal system as if they were also developed internally may become troublesome in terms of defining and upholding an appropriate service-oriented architecture. This is especially important when considering variability of the complete system, where different external systems are used or replaced in specific variants.
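The adapter/gateway split described above can be sketched in a few lines. The following Python code is a structural illustration only (all class and method names are hypothetical; the thesis concerns embedded systems, not Python): internal code depends on an internal service contract, an adapter is used when the internal system is a client of an external system, and a gateway is used when it acts as a server.

```python
from abc import ABC, abstractmethod


class PositionService(ABC):
    """Internal service contract; internal components depend only on this."""

    @abstractmethod
    def current_position(self) -> tuple[float, float]: ...


class LegacyGps:
    """Stand-in for an external system whose interface cannot be changed."""

    def read_nmea_sentence(self) -> str:
        return "$GPGLL,4916.45,N,12311.12,W"


class GpsAdapter(PositionService):
    """Adapter: used when the internal system acts as a *client* of the
    external system. Translates the external interface into the internal
    service contract, so internal code never sees NMEA strings."""

    def __init__(self, gps: LegacyGps) -> None:
        self._gps = gps

    def current_position(self) -> tuple[float, float]:
        fields = self._gps.read_nmea_sentence().split(",")
        return float(fields[1]), float(fields[3])


class DiagnosticsGateway:
    """Gateway: used when the internal system acts as a *server*. Exposes a
    narrow entry point to the external system and forwards to internal
    services, hiding the internal architecture."""

    def __init__(self, position: PositionService) -> None:
        self._position = position

    def handle_external_request(self, request: str) -> str:
        if request == "POS?":
            lat, lon = self._position.current_position()
            return f"POS {lat:.2f} {lon:.2f}"
        return "ERR unknown request"


gateway = DiagnosticsGateway(GpsAdapter(LegacyGps()))
print(gateway.handle_external_request("POS?"))  # POS 4916.45 12311.12
```

Because internal components depend only on the service contract, replacing the external system in a product-line variant touches only the adapter or gateway, which is the protection of the internal architecture argued for above.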
9

Cross-modality semantic integration and robust interpretation of multimodal user interactions. / CUHK electronic theses & dissertations collection

January 2010
Multimodal systems can represent and manipulate semantics from different human communication modalities at different levels of abstraction, in which multimodal integration is required to integrate the semantics from two or more modalities and generate an interpretable output for further processing. In this work, we develop a framework pertaining to automatic cross-modality semantic integration of multimodal user interactions using speech and pen gestures. It begins by generating partial interpretations for each input event as a ranked list of hypothesized semantics. We devise a cross-modality semantic integration procedure to align the pair of hypothesis lists between every speech input event and every pen input event in a multimodal expression. This is achieved by the Viterbi alignment that enforces the temporal ordering and semantic compatibility constraints of aligned events. The alignment enables generation of a unimodal paraphrase that is semantically equivalent to the original multimodal expression. Our experiments are based on a multimodal corpus in the navigation domain. Application of the integration procedure to manual transcripts shows that correct unimodal paraphrases are generated for around 96% of the multimodal inquiries in the test set. However, if we replace this with automatic speech and pen recognition transcripts, the performance drops to around 53% of the test set. In order to address this issue, we devised the hypothesis rescoring procedure that evaluates all candidates of cross-modality integration derived from multiple recognition hypotheses from each modality. The rescoring function incorporates the integration score, N-best purity of recognized spoken locative references (SLRs), as well as distances between coordinates of recognized pen gestures and their interpreted icons on the map. Application of cross-modality hypothesis rescoring improved the performance to generate correct unimodal paraphrases for over 72% of the multimodal inquiries of the test set. / We have also performed a latent semantic modeling (LSM) for interpreting multimodal user input consisting of speech and pen gestures. Each modality of a multimodal input carries semantics related to a domain-specific task goal (TG). Each input is annotated manually with a TG based on the semantics. Multimodal input usually has a simpler syntactic structure and different order of semantic constituents from unimodal input. Therefore, we proposed to use LSM to derive the latent semantics from the multimodal inputs. In order to achieve this, we characterized the cross-modal integration pattern as 3-tuple multimodal terms taking into account SLR, pen gesture type and their temporal relation. The correlation term matrix is then decomposed using singular value decomposition (SVD) to derive the latent semantics automatically. TG inference on disjoint test set based on the latent semantics achieves accurate performance for 99% of the multimodal inquiries. / Hui, Pui Yu. / Adviser: Helen Meng. / Source: Dissertation Abstracts International, Volume: 73-02, Section: B, page: . / Thesis (Ph.D.)--Chinese University of Hong Kong, 2010. / Includes bibliographical references (leaves 294-306). / Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web. / Electronic reproduction. [Ann Arbor, MI] : ProQuest Information and Learning, [201-] System requirements: Adobe Acrobat Reader. Available via World Wide Web. / Abstract also in Chinese.
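The latent semantic modeling step lends itself to a compact sketch. The following Python code is illustrative only (vocabulary, counts, and task goals are invented; the thesis derives its terms and matrix from a real multimodal corpus): a term-by-input matrix over 3-tuple multimodal terms is factored with SVD, and the task goal of a new input is inferred by cosine similarity in the latent space.

```python
import numpy as np

# Rows: 3-tuple multimodal terms (SLR, pen gesture type, temporal relation).
terms = [("this_one", "point", "overlap"),
         ("these", "circle", "overlap"),
         ("from_here", "point", "precede")]

# Columns: training inputs, each manually annotated with a task goal (TG).
X = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 0.0],
              [1.0, 0.0, 2.0]])
task_goals = ["map_query", "area_select", "route_plan"]

# Truncated SVD keeps k latent dimensions of the term-by-input matrix.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
Uk, sk = U[:, :k], s[:k]

def to_latent(term_counts):
    """Fold a term-count vector into the k-dimensional latent space."""
    return (term_counts @ Uk) / sk

def infer_tg(term_counts):
    """Return the TG of the most similar training input (cosine similarity)."""
    q = to_latent(term_counts)
    docs = Vt[:k].T * sk  # training inputs represented in latent space
    sims = docs @ q / (np.linalg.norm(docs, axis=1) * np.linalg.norm(q))
    return task_goals[int(np.argmax(sims))]

# A new input sharing terms with inputs 1 and 3 lands in their latent region.
print(infer_tg(np.array([1.0, 0.0, 1.0])))  # map_query or route_plan (a tie)
```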
10

Assessing operational impact in enterprise systems with dependency discovery and usage mining

Moss, Mark Bomi 15 July 2009
A framework for monitoring the dependencies between users, applications, and other system components, combined with the actual access times and frequencies, was proposed. Operating system commands were used to extract event information from the end-user workstations about the dependencies between system, application, and infrastructure components. Access times of system components were recorded, and data mining tools were leveraged to detect usage patterns. This information was integrated and used to predict whether or not the failure of a component would cause an operational impact during certain time periods. The framework was designed to minimize installation and management overhead, to consume minimal system resources (e.g. network bandwidth), and to be deployable on a variety of enterprise systems, including those with low-bandwidth and partial-connectivity characteristics. The framework was implemented in a test environment to demonstrate the feasibility of this approach. The system was tested on small-scale (6 computers in the GT CERCS Laboratory over 35 days) and large-scale (76 CPR nodes across the entire GT campus over 4 months) data sets. The average size of the impact topology was shown to be approximately 4% of the complete topology; this size reduction gives system administrators the capability to better identify those users and resources most likely to be affected by a designated set of component failures during a designated time period.
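As an illustration of the core idea (the components, users, and hours below are hypothetical, not the author's data), the following Python sketch combines a dependency graph with mined usage windows, so that a component failure is flagged only for users who transitively depend on it and are active during the affected period.

```python
# component -> dependents (applications or users that rely on it)
dependents = {
    "db-server": ["payroll-app", "hr-app"],
    "payroll-app": ["alice", "bob"],
    "hr-app": ["carol"],
}

# mined usage pattern: user -> hours of day with observed access
usage_hours = {
    "alice": {9, 10, 11},
    "bob": {14, 15},
    "carol": {9, 16},
}

def affected_users(failed, hour):
    """Users transitively dependent on `failed` and active at `hour`."""
    stack, seen = [failed], set()
    while stack:
        node = stack.pop()
        for dep in dependents.get(node, []):
            if dep not in seen:
                seen.add(dep)
                stack.append(dep)
    return {u for u in seen if hour in usage_hours.get(u, set())}

print(sorted(affected_users("db-server", 9)))   # ['alice', 'carol']
print(sorted(affected_users("db-server", 15)))  # ['bob']
```

Restricting the impact set to the relevant time window is what shrinks the impact topology relative to the complete dependency topology.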
