1

Integrating research root cause analysis tools into a commercial IT service manager

Li, Xiaochun 13 December 2011 (has links)
IT environments are growing more complex by the day, and this trend is expected to continue in the coming years. To manage IT resources and maximize productivity, large organizations are seeking better methods to control their current environments while preparing for further complexity growth as those environments cater to growing IT needs. In the current economic recession, organizations are not only threatened by growing complexity but must also cope with limited personnel due to financial constraints. They are therefore keen to obtain new technology that gives them firmer control over different platforms, vendors, and solutions at a reasonable cost, while still delivering quality services that effectively fulfill customer needs. To address these IT management challenges, CA Inc. (formerly Computer Associates) developed Spectrum Service Assurance Manager (SAM) to solve service management problems in complex IT environments. SAM gives organizations a wide-ranging view of their multi-faceted IT environments by surfacing vital information that no other software can perceive, allowing it to monitor and manage systems, databases, networks, applications, and end-user experiences. Although this technology is able to detect many errors and problems, it still lacks a good mechanism to diagnose the detected problems and uncover their root causes for end users to fix. Four research groups from the Universities of Alberta, Toronto, Victoria, and Waterloo, working under the auspices of the Consortium for Software Engineering Research, built different tools for root-cause analysis and detection. To integrate these solutions, the research groups worked with CA Inc. to produce a web-based integration framework that adds these tools to the main SAM application. The resulting framework does not affect any of SAM's existing features: the additions only involve a new web communication layer, separate from the core of the software, that detects and presents root causes. The detection tools only parse the log files for vital information, so the core functionality of the software remains unaffected. This thesis presents my contributions to the project. It begins with background research on SAM and a description of how it addresses the increasing complexity of IT environments. I then propose two software integration approaches for integrating root-cause diagnosis tools with SAM and briefly describe CA's latest software integration framework, Catalyst. Finally, I compare our integration solution with Catalyst and discuss the advantages and disadvantages of each. / Graduate
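The integration layer described above works by parsing SAM's log files rather than modifying its core. A rough, hypothetical sketch of that style of log-based root-cause detection follows; the log format, component names, and occurrence threshold are illustrative assumptions, not taken from SAM or the thesis.

```python
import re
from collections import Counter

# Assumed log-line format: "<timestamp> <severity> <component>: <message>"
LOG_LINE = re.compile(
    r"^(?P<ts>\S+)\s+(?P<severity>ERROR|WARN|INFO)\s+(?P<component>\S+):\s+(?P<msg>.*)$"
)

def candidate_root_causes(log_lines, min_occurrences=3):
    """Count ERROR entries per component and return the components that
    fail most often as candidate root causes."""
    errors = Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m and m.group("severity") == "ERROR":
            errors[m.group("component")] += 1
    return [(comp, n) for comp, n in errors.most_common() if n >= min_occurrences]

if __name__ == "__main__":
    sample = [
        "2011-12-13T10:00:01 ERROR db-pool: connection refused",
        "2011-12-13T10:00:02 ERROR db-pool: connection refused",
        "2011-12-13T10:00:03 WARN  web-tier: slow response",
        "2011-12-13T10:00:04 ERROR db-pool: connection refused",
    ]
    print(candidate_root_causes(sample))  # [('db-pool', 3)]
```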
2

Implementation of a demand planning system using advance order information

Haberleitner, Helmut, Meyr, Herbert, Taudes, Alfred 08 July 2010 (has links) (PDF)
In times of demand shocks, when quantitative forecasting based on historical time series becomes obsolete, the only information about future demand is "advance demand information", i.e. early customer bookings interpreted as an indicator of not yet known demand. This paper deals with a forecasting method that selects the optimal forecasting model type and the level of integration of advance demand information, depending on the patterns of the particular time series. This establishes the applicability of the procedure in an industrial application where a large number of time series is forecast automatically in a flexible and data-driven way. The architecture of such a planning system is explained, and using real-world data from a make-to-order industry it is shown that the system is flexible enough to cover different demand patterns and is well suited to forecasting demand shocks. (authors' abstract)
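A deliberately simplified sketch of the general idea of combining advance demand information with a historical forecast follows; the ratio-based booking model and the shock threshold are illustrative assumptions, not the model-selection procedure of the paper.

```python
import statistics

def forecast_from_history(history):
    """Naive time-series forecast: the mean of past demand."""
    return statistics.mean(history)

def forecast_from_bookings(bookings, booking_ratio):
    """Scale up the advance orders already received using the historically
    observed share of total demand that is booked in advance."""
    return bookings / booking_ratio

def select_forecast(history, bookings, booking_ratio, shock_threshold=0.5):
    """Use the booking-based forecast when advance orders deviate strongly
    from what history suggests (a possible demand shock); otherwise fall
    back to the purely historical model. The threshold is an assumption."""
    hist = forecast_from_history(history)
    adv = forecast_from_bookings(bookings, booking_ratio)
    if abs(adv - hist) / hist > shock_threshold:
        return adv
    return hist

# Example: demand collapses, which only the advance orders reveal.
print(select_forecast(history=[100, 105, 98, 102], bookings=20, booking_ratio=0.6))
```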
3

Integrating Push Technology with the Ericsson Mobile Positioning Center / Ericsson Mobile Positioning Center integrerat med Pushteknik

Boström, Stellan January 2001 (has links)
Push is an Internet technology that allows people to subscribe to a content or service provider that automatically updates the subscriber's computer or Personal Digital Assistant (PDA) with the latest information, without the subscriber having to request new information first. The Ericsson Mobile Positioning Center (MPC) is a gateway that provides geographical positions of mobile stations to various applications. This Master's thesis gives the reader an overview of these technologies and presents an alternative way of integrating a third-party Push solution with the MPC. The integration proposal is also evaluated against the current Push functionality that Ericsson has developed and integrated into the MPC. / Push is an Internet technology that lets connected users subscribe to automatic updates from various information sources, delivered directly to their computer without the user having to request them repeatedly. The Ericsson Mobile Positioning Center (MPC) is a gateway that supplies various services with the geographical position of a mobile phone. This Master's thesis gives an overview of both these areas and presents a proposal for how these technologies can be integrated. The integration proposal is also compared with the solution Ericsson itself has chosen to implement.
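A toy sketch of the push pattern the thesis builds on appears below: services subscribe once and position updates are delivered to them instead of being polled for. The class and field names are hypothetical and do not reproduce the MPC's actual interfaces.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Position:
    msisdn: str       # mobile subscriber number (illustrative)
    latitude: float
    longitude: float

class PushGateway:
    """Minimal publish/subscribe hub: services register a callback and are
    pushed every new position rather than requesting it repeatedly."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[Position], None]]] = {}

    def subscribe(self, msisdn: str, callback: Callable[[Position], None]) -> None:
        self._subscribers.setdefault(msisdn, []).append(callback)

    def publish(self, position: Position) -> None:
        for callback in self._subscribers.get(position.msisdn, []):
            callback(position)

gateway = PushGateway()
gateway.subscribe("46701234567", lambda p: print(f"pushed: {p.latitude}, {p.longitude}"))
gateway.publish(Position("46701234567", 59.33, 18.06))
```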
4

Describing Integrated Power Electronics Modules using STEP AP210

Wu, Yingxiang 25 May 2004 (has links)
The software environment for power electronics design comprises tools that address many interrelated disciplines, including circuit design, physical layout, thermal management, structural mechanics, and electromagnetics. This usually results in a number of separate models that provide various views of a design, each of which is typically stored separately in proprietary formats. The problem is that the relationships between views (e.g., the circuit design that defines the functional connectivity between components, and the physical layout that provides physical paths to implement those connections) are not explicitly captured. This makes it difficult to synchronize and maintain data consistency across all models as changes are made to the respective views. This thesis addresses the problem by describing power electronics modules using STEP AP210, the STandard for the Exchange of Product data, Application Protocol 210, designated ISO 10303-210. A multidisciplinary model was implemented for an integrated power electronics module (IPEM). It consists of two views of the IPEM: a functional network definition, and a physical implementation that satisfies the functional connectivity requirements. The relationships between these two views are explicitly recorded in the model, which allows the development of a method that verifies whether the connectivity data in both views are consistent. Finally, this thesis provides guidance for deploying STEP AP210 to unify multidisciplinary data resources during the design of integrated power electronics. / Master of Science
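An illustrative sketch of the kind of cross-view consistency check described above, reduced to plain graph search, follows. It does not use STEP AP210 data structures, and the terminal and trace names are hypothetical.

```python
from collections import defaultdict, deque

def physically_connected(physical_edges, a, b):
    """Breadth-first search over the physical interconnect (pads, traces,
    vias) to see whether terminals a and b are joined by some physical path."""
    graph = defaultdict(set)
    for u, v in physical_edges:
        graph[u].add(v)
        graph[v].add(u)
    seen, queue = {a}, deque([a])
    while queue:
        node = queue.popleft()
        if node == b:
            return True
        for nxt in graph[node] - seen:
            seen.add(nxt)
            queue.append(nxt)
    return False

def check_consistency(functional_nets, physical_edges):
    """Report functional connections that have no physical realization."""
    return [(a, b) for a, b in functional_nets
            if not physically_connected(physical_edges, a, b)]

functional = [("Q1.gate", "driver.out"), ("Q1.drain", "bus+")]
physical = [("Q1.gate", "trace3"), ("trace3", "driver.out")]
print(check_consistency(functional, physical))  # [('Q1.drain', 'bus+')]
```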
5

Software Systems In-House Integration : Observations and Guidelines Concerning Architecture and Process

Land, Rikard January 2006 (has links)
Software evolution is a crucial activity for software organizations. A specific type of software evolution is the integration of previously isolated systems. The need for integration is often a consequence of organizational changes, including the merging of previously separate organizations. One goal of software integration is to increase the value to users of several systems by combining their functionality; another is to reduce functionality overlap. If the systems are completely owned and controlled in-house, there is an additional advantage in rationalizing the use of internal resources by decreasing the amount of software with essentially the same purpose. Despite in-house integration being common, the topic has received little attention from researchers. This thesis contributes to an increased understanding of the problems associated with in-house integration and provides guidelines for the more efficient utilization of existing systems and personnel. In the thesis, we combine two perspectives: software architecture and processes. The software architecture perspective is used to show how compatibility analysis and the development of integration alternatives can be performed rapidly at a high level of abstraction. The software process perspective has led to the identification of important characteristics and practices of the integration process. The guidelines provided in the thesis will help those performing future in-house integration to make well-founded decisions in a timely and efficient manner. The contributions are based on several integration projects in industry, which have been studied systematically in order to collect, evaluate, and generalize their experiences.
6

Enterprise Software System Integration : An Architectural Perspective

Johnson, Pontus January 2002 (has links)
7

Software integration for automated stability analysis and design optimization of a bearingless rotor blade

Gündüz, Mustafa Emre 06 April 2010 (has links)
The concept of applying several disciplines to the design and optimization process may not be new, but it does not currently seem to be widely accepted in industry. The reason for this might be the lack of well-known tools for realizing a complete multidisciplinary design and analysis of a product. This study proposes a method that enables engineers in several design disciplines to perform a fairly detailed analysis and optimization of a design using commercially available software as well as codes developed at Georgia Tech. The ultimate goal is that, once the system is set up properly, the CAD model of the design, including all subsystems, is automatically updated as soon as a new part or assembly is added, or whenever an analysis or optimization is performed and the geometry needs to be modified. Such a design process takes dramatically less time to complete and should therefore reduce development time and costs. The optimization method is demonstrated on an existing helicopter rotor originally designed in the 1960s. The rotor is already an effective design with novel features; however, applying the optimization principles together with high-speed computing resulted in an even better design. The objective function to be minimized is related to the vibrations of the rotor system under gusty wind conditions. The design parameters are all continuous variables. Optimization is performed in a number of steps. First, the design variables most crucial to the objective function are identified. With these variables, the Latin Hypercube Sampling method is used to probe the design space for local minima and maxima. After analysis of numerous samples, a configuration that is more stable than the initial design is reached. The process requires several software tools: CATIA as the CAD tool, ANSYS as the FEA tool, VABS for obtaining the cross-sectional structural properties, and DYMORE for the frequency and dynamic analysis of the rotor. MATLAB codes are also employed to generate input files for, and read output files of, DYMORE. All these tools are connected using ModelCenter.
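A minimal sketch of Latin Hypercube Sampling over a set of continuous design variables follows. The objective is a placeholder standing in for the vibration metric that the thesis obtains from its DYMORE analysis, and the variable bounds are illustrative.

```python
import random

def latin_hypercube(bounds, n_samples, seed=0):
    """Latin hypercube sample: each variable's range is split into
    n_samples equal strata, each stratum is used exactly once, and the
    strata are shuffled independently per variable."""
    rng = random.Random(seed)
    columns = []
    for lo, hi in bounds:
        strata = list(range(n_samples))
        rng.shuffle(strata)
        width = (hi - lo) / n_samples
        columns.append([lo + (s + rng.random()) * width for s in strata])
    return list(zip(*columns))  # one tuple of variable values per sample

def vibration_proxy(x):
    """Placeholder objective; in the thesis this value would come from the
    coupled CATIA/ANSYS/VABS/DYMORE analysis chain."""
    return sum((xi - 0.3) ** 2 for xi in x)

bounds = [(0.0, 1.0), (0.0, 1.0), (0.0, 1.0)]  # normalized design variables
samples = latin_hypercube(bounds, n_samples=20)
best = min(samples, key=vibration_proxy)
print(best, vibration_proxy(best))
```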
8

Enterprise Application Integration : Grundlagen, Integrationsprodukte, Anwendungsbeispiele

Kaib, Michael. January 2004 (has links) (PDF)
University dissertation, Marburg, 2002. / Bibliography pp. [227]–256.
9

Astronomy Software Integration with OpenSpace

Bihi, Aniisa, Granström, Johanna January 2020 (has links)
This thesis aimed to create a messaging protocol that lets OpenSpace interoperate with other astronomy software. The goal was a messaging standard that is not language-dependent and can be implemented by any astronomy software. To establish asynchronous communication between OpenSpace and connected software, the techniques mainly used were the Transmission Control Protocol (TCP), threading, and Peer-To-Peer (P2P) networking. TCP was used to achieve reliable communication between software connected to the network, two-way communication was enabled through threading, and P2P was used as the network communication architecture for sharing resources between the connected software. Messages are encoded with the Unicode Standard, using Unicode characters expressed in UTF-8. They are structured as combinations of byte fields of different sizes and are sent and received as binary strings. Every message contains a header followed by the data being sent, and different message types were defined to specify which type of data is sent. The protocol works primarily between OpenSpace and Glue but is not limited to these applications. The implementation serves as the basis of the messaging protocol for OpenSpace, with Glue representing future software integrations.
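A hedged sketch of the header-plus-UTF-8-payload message structure described above appears below. The field layout, sizes, and message-type codes are assumptions for illustration, not the actual OpenSpace/Glue wire format.

```python
import struct

# Assumed wire layout (illustrative, not the OpenSpace protocol):
# 1 byte protocol version | 1 byte message type | 4 bytes payload length (big-endian)
HEADER = struct.Struct(">BBI")
MSG_DATA, MSG_SUBSCRIBE = 1, 2  # hypothetical message-type codes

def pack_message(msg_type: int, text: str, version: int = 1) -> bytes:
    """Encode the text payload as UTF-8 and prepend the fixed-size header."""
    payload = text.encode("utf-8")
    return HEADER.pack(version, msg_type, len(payload)) + payload

def unpack_message(raw: bytes):
    """Split a binary string back into (version, message type, payload)."""
    version, msg_type, length = HEADER.unpack(raw[:HEADER.size])
    payload = raw[HEADER.size:HEADER.size + length].decode("utf-8")
    return version, msg_type, payload

frame = pack_message(MSG_DATA, '{"dataset": "example_stars"}')  # hypothetical payload
print(unpack_message(frame))  # (1, 1, '{"dataset": "example_stars"}')
```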
