181

Automated Analysis of Unified Modeling Language (UML) Specifications

Tanuan, Meyer C. January 2001 (has links)
The Unified Modeling Language (UML) is a standard language adopted by the Object Management Group (OMG) for writing object-oriented (OO) descriptions of software systems. UML allows the analyst to add class-level and system-level constraints. However, UML does not describe how to check the correctness of these constraints. Recent studies have shown that symbolic model checking can effectively verify large software specifications. In this thesis, we investigate how to use model checking to verify constraints of UML specifications. We describe the process of specifying, translating, and verifying UML specifications for an elevator example. We use the Cadence Symbolic Model Verifier (SMV) to verify the system properties. We demonstrate how to write a UML specification that can be easily translated to SMV. We propose a set of rules and guidelines for translating UML specifications to SMV, and then use these to translate a non-trivial UML elevator specification to SMV. Finally, we examine the errors detected throughout the specification, translation, and verification process to assess how well these activities reveal errors, ambiguities, and omissions in the user requirements.
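The SMV models themselves are not reproduced in the abstract; the following hypothetical sketch only illustrates the kind of safety property such a verification checks, using explicit-state reachability over a toy elevator state machine (SMV would do this symbolically). The model, states, and invariant are all invented for illustration.

```python
# Toy elevator model: a state is (floor, door_open, moving). We enumerate
# all reachable states and check the safety invariant "the door is never
# open while the cab is moving" -- the kind of constraint a tool such as
# SMV verifies over a translated UML specification.

FLOORS = range(1, 4)

def successors(state):
    floor, door_open, moving = state
    nxt = []
    if door_open:
        nxt.append((floor, False, False))          # close the door
    else:
        nxt.append((floor, True, False))           # open door only when stopped
        if floor < max(FLOORS):
            nxt.append((floor + 1, False, True))   # move up, door closed
        if floor > min(FLOORS):
            nxt.append((floor - 1, False, True))   # move down, door closed
    return nxt

def check_invariant(initial, invariant):
    seen, stack = set(), [initial]
    while stack:
        s = stack.pop()
        if s in seen:
            continue
        seen.add(s)
        if not invariant(s):
            return False, s          # counterexample state found
        stack.extend(successors(s))
    return True, None

ok, counterexample = check_invariant((1, False, False),
                                     lambda s: not (s[1] and s[2]))
print(ok)  # True: no reachable state has the door open while moving
```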
182

Increasing the Semantic Similarity of Object-Oriented Domain Models by Performing Behavioral Analysis First

Svetinovic, Davor January 2006 (has links)
The main goal of any object-oriented analysis (OOA) method is to produce a model that aids in understanding and communicating knowledge about a modeled domain. A higher degree of similarity among independently produced domain models provides an indication of how well the domain was understood by the different analysts; that is, more similar models indicate a closer and more common understanding of a domain. A common understanding is of critical importance for effective knowledge communication and sharing.

The core of any OOA method is discovering and understanding the concepts in a domain and their relationships. The main artifact produced by an OOA method is a domain model. A domain model often serves as the main source of design concepts during object-oriented design (OOD). This thesis evaluates two OOA methods by comparing the degree of similarity of the resulting domain models.

In particular, this work compares the semantic similarity of domain models extracted from use cases by (1) specification of sequence diagrams and then domain models, and (2) specification of unified use case statecharts and then domain models. The thesis makes case studies out of the application of the first method to 31 instances of a large Voice-over-IP (VoIP) system and its information management system (IMS) and to 3 small elevator systems, and of the second method to 46 instances of the same large VoIP system and its IMS and to 12 instances of a medium-sized elevator system.

From an analysis of data from these case studies, the thesis concludes that there is an increase of 10% in the semantic similarity of domain models produced using the second method, but at the cost of at most 25% more analysis time.
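The abstract does not define the thesis's semantic-similarity measure; as a purely illustrative sketch, one simple way to score how similar two independently produced domain models are is Jaccard similarity over their sets of concept names. The concept sets below are invented.

```python
# Jaccard similarity of two hypothetical domain models, represented as
# sets of concept names: |intersection| / |union|. A score of 1.0 means
# the two analysts extracted identical concept sets.

def jaccard(model_a, model_b):
    a, b = set(model_a), set(model_b)
    return len(a & b) / len(a | b)

analyst_1 = {"Elevator", "Floor", "Door", "Request"}
analyst_2 = {"Elevator", "Floor", "Door", "Button", "Scheduler"}

print(round(jaccard(analyst_1, analyst_2), 2))  # 0.5
```

A richer measure would also compare relationships and attributes, but the set-based score already shows the flavor of the comparison.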
183

A Comparison between Structured and Object-Oriented Systems Analysis and Development: Modeling Tools and Artifact

Lin, Chien-hung 07 July 2010 (has links)
Since the Software Engineering Institute published the Capability Maturity Model Integration (CMMI) in 2003, many software firms have implemented it to enhance their software quality assurance and international collaboration. Analysis and design are two important phases of software development. The artifacts of these two phases are represented mainly using either the structured technique or the object-oriented technique. This study proposes a methodology that provides guidelines for comparing the artifacts of these two techniques for an embedded system. The research methodology is articulated using the design science research methodology. A usability evaluation with a real-world embedded system case is performed to demonstrate its applicability. The results provide evidence that enhances our understanding of the strengths and weaknesses of these two analysis and design techniques.
184

A Study of Class Normalization

Chiu, Jui-Yuan 04 July 2005 (has links)
Class normalization is a process that can be used to organize the structure of an object schema to increase the cohesion of classes while minimizing the coupling between them. This research proposes a method for applying the rules of class normalization to class modeling during the object-oriented systems analysis and design process. A real-world case is presented to illustrate the concepts, application, and advantages of using the proposed method. Utilizing this method in class modeling can help the system developer ensure that the class diagram is in third object normal form and thereby enhance the effectiveness of system development.
185

A Study of Reverse Engineering in Software Modeling

Chen, Po-hsun 26 May 2006 (has links)
A system design document gives maintenance staff a quick understanding of how a system operates and of its design details, which is increasingly valuable as information systems grow more complicated, especially for comprehending the original design concepts behind a system. Moreover, a platform-independent design document allows a system to be moved to a new platform without being redesigned, greatly easing cross-platform operation. Besides the source code, the system design document is one of a system's most important assets; for instance, if the source code is lost, designers can quickly rebuild a functionally equivalent system from the design document. However, because of the constant evolution of platform technology and the prevalence of visual rapid-application-development tools, many deployed information systems ship without a complete design document, or the document has gone missing. A way is therefore needed to reverse-generate the system design document from the system's source code. Based on object-oriented technology and object-oriented models, this study proposes an approach by which a platform-independent system design document can be reverse-engineered from object-oriented source code, and generalizes the approach into a reverse modeling method. Finally, an implemented case is carried out and verified using the proposed method. With this method, maintenance staff can quickly transform code into a system design document and thereby improve the efficiency of system maintenance.
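The thesis's own reverse-modeling method is not detailed in the abstract; the following hypothetical sketch only shows the core idea of recovering a platform-independent class outline (class names, generalizations, operations) from source code by parsing it, here using Python's `ast` module on an invented example program.

```python
import ast

# Parse source code and recover a platform-independent class outline:
# for each class, its base classes (generalizations) and its operations.
# This outline is the raw material of a regenerated design document.

SOURCE = """
class Account:
    def deposit(self, amount): ...
    def withdraw(self, amount): ...

class SavingsAccount(Account):
    def add_interest(self, rate): ...
"""

def extract_model(source):
    model = {}
    for node in ast.parse(source).body:
        if isinstance(node, ast.ClassDef):
            bases = [b.id for b in node.bases if isinstance(b, ast.Name)]
            methods = [f.name for f in node.body
                       if isinstance(f, ast.FunctionDef)]
            model[node.name] = {"bases": bases, "operations": methods}
    return model

model = extract_model(SOURCE)
print(model["SavingsAccount"])
# {'bases': ['Account'], 'operations': ['add_interest']}
```

From such an outline one could emit, for example, UML class-diagram text, independently of the platform the code originally ran on.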
186

A Methodology for Class Normalization Analysis and Refinement

Chen, Chia-Hao 21 June 2007 (has links)
The object-oriented analysis and design approach has become the mainstream of today's systems development techniques. The class diagram in the Unified Modeling Language (UML) is the major tool for modeling the class structure in the object-oriented systems analysis and design process. Once the class diagram is constructed, class normalization needs to be performed to eliminate anomalies when designing a database. However, detailed guidelines for performing class normalization are lacking. Therefore, this study presents a class normalization methodology based on the object normal forms proposed by Ambler (1996). Two real-world cases are presented to illustrate the concepts, application, and advantages of using the proposed method. Using this methodology in class modeling will help system developers normalize the class design in advance, and thereby enhance the efficiency and effectiveness of system development.
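As a hypothetical before/after sketch in the spirit of Ambler's object normal forms (the thesis's actual methodology is not reproduced in the abstract): attributes that describe the customer rather than the order itself are extracted into their own class, raising cohesion and reducing coupling. All class and attribute names below are invented.

```python
from dataclasses import dataclass, field

@dataclass
class DenormalizedOrder:
    # Mixes order concerns with customer concerns: customer data would be
    # duplicated across every order the same customer places.
    order_id: int
    customer_name: str
    customer_address: str
    items: list = field(default_factory=list)

@dataclass
class Customer:
    # Customer concerns now live in one cohesive class.
    name: str
    address: str

@dataclass
class Order:
    # The normalized order refers to its customer instead of copying it.
    order_id: int
    customer: Customer
    items: list = field(default_factory=list)

c = Customer("Ada", "1 Queen St")
o = Order(101, c, ["widget"])
print(o.customer.name)  # Ada
```

After the refactoring, a change to a customer's address happens in exactly one place, which is the anomaly elimination class normalization is after.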
187

The Application of Immune Algorithm to Distribution Systems Operation

Wu, Chia-Jean 15 June 2001 (has links)
With the rapid growth of load demand, the distribution system has become so complicated that operation efficiency and service quality have deteriorated in recent years. Engineers have to solve these problems by applying new technologies to enhance the efficiency of the distribution system. In this thesis, an immune algorithm (IA) based on weighted selection as a decision maker is proposed to determine the desired switching operations so that transformer and feeder load balance can be achieved. The IA antigen and antibody are equivalent to the objective and the feasible solution of a conventional optimization method. The concept of information entropy is also introduced as a measure of population diversity to avoid falling into a local optimum. The algorithm prevents stagnation in the iteration process and achieves fast convergence to the global optimum. Using object-oriented programming (OOP), this research creates the relationships among distribution element objects and encapsulates the data of all 22 kV underground systems in the Taichung district. OOP provides an effective tool for managing the distribution system database and for the fault detection, isolation, and service restoration (FDIR) function of feeders and main transformers. According to the attributes of line switches, the 22 kV distribution system configuration is created with the topology processor. To calculate the current flows of line switches, this project also executes a three-phase load flow program with the customer information system (CIS), load survey, outage management information system (OMIS), and the data of all feeders and main transformers. In this thesis, the IA is used to solve the optimal switching problem by considering customer load characteristics for normal operation and for overload contingencies of the distribution system.
The efficiency of the immune algorithm is verified by comparing its computing time with that of conventional binary integer programming for switching-operation decision making. A Taichung district distribution system is selected for computer simulation to demonstrate the effectiveness of the proposed methodology for solving the optimal switching operation of the distribution system. The results of this thesis will be an important reference for distribution automation in Taiwan.
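The thesis's actual IA is not specified in the abstract; the following is only a minimal hypothetical sketch of the ingredients it names: antibodies as candidate switch settings, affinity as (negated) feeder-load imbalance, clonal selection with hypermutation, and Shannon entropy over the population's bits as the diversity measure that triggers re-injection. The section loads, population sizes, and thresholds are all invented.

```python
import random
import math

random.seed(7)

LOADS = [30, 10, 25, 15, 20, 40]          # load of each switchable section

def affinity(antibody):
    # An antibody assigns each section to feeder A (bit 1) or feeder B
    # (bit 0); affinity is the negated load imbalance, so higher is better.
    feeder_a = sum(l for l, bit in zip(LOADS, antibody) if bit)
    feeder_b = sum(LOADS) - feeder_a
    return -abs(feeder_a - feeder_b)

def entropy(population):
    # Shannon entropy summed over bit positions: low entropy means the
    # population has collapsed onto near-identical antibodies.
    h = 0.0
    for i in range(len(LOADS)):
        p = sum(ab[i] for ab in population) / len(population)
        for q in (p, 1 - p):
            if q > 0:
                h -= q * math.log2(q)
    return h

def immune_search(pop_size=20, generations=60, clones=3):
    pop = [[random.randint(0, 1) for _ in LOADS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=affinity, reverse=True)
        offspring = []
        for ab in pop[:pop_size // 2]:            # clone the best antibodies
            for _ in range(clones):
                clone = ab[:]
                clone[random.randrange(len(LOADS))] ^= 1   # hypermutation
                offspring.append(clone)
        pop = sorted(pop + offspring, key=affinity, reverse=True)[:pop_size]
        if entropy(pop) < 0.5:                    # diversity collapsed:
            pop[-3:] = [[random.randint(0, 1) for _ in LOADS]
                        for _ in range(3)]        # inject fresh antibodies
    return pop[0], -affinity(pop[0])

best, imbalance = immune_search()
print(imbalance)  # small residual imbalance between the two feeders
```

The real problem additionally respects network topology, switch attributes, and three-phase load flows, which this toy objective ignores.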
188

Package Implementation : A Methodology For Requirement Gap Analysis

Hsu, Cheng-Yi 24 June 2003 (has links)
The global business is facing the intensive competitive environment. To overcome the competition, most enterprises contentiously try different information systems to strengthen their competition and to extent their businesses. In order to execute the enterprises marketing strategies effectively, business organization applies various software from in-house development to standard package software either made in Taiwan or made in nations. However, while implementing the information system into specific project, this software caused many obstructions in both project time framing design and project cost control. To exam this problem, we found there is no appropriate requirement and package software or tool to manage the system gaps analysis. The main purpose of this thesis is to solve the present package software problem and develop a method to meet the enterprise needs in system difference analysis. We use the Unified Module Language (UML), use case diagram, drawing , activity diagram, use case description and decision table to complete the difference analysis between package software and specific project. Applying this method, the user not only can control the difference between the requirement system and object system precisely, but also build a communication channel among the users, system analysis, and programmers. In addition, the method provides a trend of customized system to succeed the project on line
189

Simulation of anisotropic wave propagation in Vertical Seismic Profiles

Durussel, Vincent Bernard 30 September 2004 (has links)
The influence of elastic anisotropy on seismic wave propagation is often neglected for the sake of simplicity. However, ignoring anisotropy may lead to significant errors in the processing of seismic data and ultimately to a poor image of the subsurface. This is especially true in wide-aperture Vertical Seismic Profiles, where waves travel both vertically and horizontally. Anisotropy was neglected in wavefront-construction methods of seismic ray tracing until Gibson (2000) showed that they are powerful tools for simulating seismic wave propagation in three-dimensional anisotropic subsurface models. The code is currently under development using a C++ object-oriented programming approach because it provides high flexibility in the design of new components and facilitates debugging and maintenance of a complex algorithm. So far, the code has been used to simulate propagation in homogeneous or simple heterogeneous anisotropic velocity models designed mainly for testing purposes. In particular, it has never been applied to simulate a field dataset. We propose here an analytical method that involves little algebra and allows the design of realistic heterogeneous anisotropic models using the C++ object-oriented programming approach. The new model class can model smooth multi-layered subsurfaces with gradients, or models with many dip variations. It has been used to model first-arrival times of a wide-aperture VSP dataset from the Gulf of Mexico to estimate the amount of anisotropy. The proposed velocity model is transversely isotropic. The anisotropy is constant throughout the model and is defined via Thomsen's parameters. Values in the final model are epsilon = 0.055 and delta = -0.115. The model is compatible with the a priori knowledge of the local geology and reduces the RMS average time difference between measured and computed travel times by 51% in comparison to the initial isotropic model. These values are realistic and are similar to other measurements of anisotropy in the Gulf of Mexico.
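To show what the reported Thomsen parameters mean, here is a small sketch evaluating Thomsen's weak-anisotropy approximation for the P-wave phase velocity in a transversely isotropic medium, v(θ) ≈ v_p0 (1 + δ sin²θ cos²θ + ε sin⁴θ), with the abstract's epsilon = 0.055 and delta = -0.115; the vertical velocity v_p0 is an invented placeholder, not a value from the thesis.

```python
import math

EPSILON, DELTA = 0.055, -0.115   # Thomsen parameters from the abstract
V_P0 = 3000.0                    # m/s, hypothetical vertical P velocity

def vp(theta_deg):
    # Weak-anisotropy P-wave phase velocity; theta is measured from the
    # (vertical) symmetry axis of the transversely isotropic medium.
    s = math.sin(math.radians(theta_deg))
    c = math.cos(math.radians(theta_deg))
    return V_P0 * (1 + DELTA * s**2 * c**2 + EPSILON * s**4)

for theta in (0, 45, 90):
    print(theta, round(vp(theta), 1))
# vertical (0 deg): 3000.0; horizontal (90 deg): 3000*(1+epsilon) = 3165.0
```

With the negative delta, near-45-degree rays are slightly slower than vertical ones while horizontal rays are faster, which is exactly the regime a wide-aperture VSP samples.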
190

Untangling the Threads: Reduction for a Concurrent Object-Based Programming Model

Adams, William Edward, January 2000 (has links)
Thesis (Ph.D.)--University of Texas at Austin, 2000. Vita. Includes bibliographical references (leaves 288-291). Also available in a digital version from Dissertation Abstracts.
