  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Knowledge integration in distributed data mining

Sutiwaraphun, Janjao January 2001 (has links)
No description available.
2

Implementing software architecture practices in a new environment

Taylor, Paul Leonardo August 2009 (has links)
During a discussion about the need for software architecture practices at Temple-Inland Company, the head of the software infrastructure team responded that, since the company is not a software development company, "there are no real benefits to implementing software development practices in the company". This approach is taken by many companies whose software development activity primarily supports business activities, as is the case with manufacturing or financial companies. This paper examines the process of implementing software architecture practices in an organization. The information contained here should be useful to small startup software companies that might assume it is too costly to incorporate software architecture practices into their current development process. This paper should also benefit large organizations that primarily view software as a solution for short-term, immediate support rather than in terms of longer-term strategic goals. Software development teams with projects that suffer from cost overruns, scheduling problems, and user dissatisfaction should also find this information useful.
3

The Use of GIS for Integrated Watershed Analysis

Dietz, Robert W. 30 April 2000 (has links)
Practitioners of watershed management are increasingly turning to computer models to understand and make decisions about the diverse problems that occur in the watershed. Such models can provide insight into how human interactions with the landscape affect water quality and quantity. Additional modeling tools trace how those effects ripple through ecosystems, economies, and other systems. In the past, models were stand-alone and process-specific, aimed at solving problems related to a narrow discipline. For example, hydrologic models analyzed the quantity of flow through waterways. Separate ecological models probed the cycling of nutrients in those waterways. An emerging trend for watershed-based models is to link them to a geographic information system (GIS), which provides the basis for integrating data, algorithms, and methods from each discipline of interest. This integration capability makes GIS a very powerful tool for the watershed manager. The GIS in this study within the Upper Roanoke River Watershed integrates modeling efforts from the fields of hydrology, economics, and ecology. The main goal of this study is to demonstrate the effectiveness of GIS as an integrating and computational aid for making sound decisions about a watershed. A secondary goal is to include GIS functionality in a prototype software application for evaluating the effects of land management decisions. The application, named DesktopL2W, can be a significant tool for choosing how and where development should occur within the boundaries of a watershed. The three major results of the study are: (1) a library of spatial data that is valuable for watershed analysis; (2) a set of procedures for undertaking a GIS integration project; and (3) the DesktopL2W software product with its usefulness to planners and others who are interested in how development affects the watershed. 
In addition, discussion of technical issues, such as selection of data formats and spatial and temporal resolution, provides insight into the complexities associated with a GIS integration effort. / Master of Science
4

The Case Study of CMMI Implementation of A+ Technology Company

Chuang, Hsin-Ning 20 March 2008 (has links)
ABSTRACT The Capability Maturity Model Integration (CMMI) was officially released in 2002 by the Carnegie Mellon Software Engineering Institute (SEI). The SEI's core purpose is to help organizations continuously improve their software engineering process capability and to appraise the maturity level of a contractor's organization. CMMI provides a flexible framework of practices for software projects and for assuring software quality. Because CMMI does not prescribe how practices must be implemented, but only defines the goals that must be achieved, organizations in different technical and business domains can adapt their processes to meet its requirements according to their own characteristics. CMMI has accordingly been widely valued and adopted by both domestic and foreign companies; at present, 24 domestic companies have obtained CMMI appraisal. Meanwhile, the Government, seeking to raise the competitiveness of the domestic software industry in the global information-service market, has actively promoted CMMI through various policies, investing roughly seventy million New Taiwan Dollars per year with the goal of 50 Level 3 appraisals by 2007, and 100 Level 3 and 5 Level 5 appraisals by 2010, to encourage domestic software organizations to pursue CMMI appraisal. Software quality is thus a key driver of Taiwanese industry and a critical technology for increasing the future international competitiveness of the Taiwanese software industry. Software process improvement is how an enterprise guarantees the quality of the services and products it provides.
It is an important investment in competitiveness, one that involves organizational transformation and resistance from staff inside and outside the organization; the organization must therefore have policy commitment, strong intent, and the motivation to invest the necessary resources if software process improvement is to succeed. By analyzing the benefits realized and the difficulties commonly encountered in each process area during the initial stages at the 24 domestic CMMI-appraised companies, this case study of CMMI implementation offers reference templates for domestic enterprises now implementing CMMI or planning to adopt it. This research analyzed the substantive benefits reported by the 24 companies along six performance dimensions: customer, finance, quality, process, organization, and staff. We found mostly positive results after CMMI implementation; however, in the finance and staff dimensions, roughly 50% of respondents saw little improvement in either "commercial negotiation capability on projects undertaken" or reduced team-member workload after CMMI adoption. Drawing on lessons learned from domestic enterprises that obtained CMMI appraisal, from A+ Technology (which was implementing CMMI at the time), and from local consultancies, this article collects 35 common problems across 4 aspects: staff, the task force, internal and external resources, and the organizational environment.
It also documents 18 difficulties in the Maturity Level 2 (ML2) and Level 3 (ML3) process areas, covering specific practices, the general practices common at A+ Technology, and particular tasks during execution, and analyzes how these change across the stages of CMMI adoption, accumulating valuable experience for domestic enterprises implementing CMMI. The research integrates the motives, purposes, processes and invested resources, key success factors, frequent troubles and their solutions, together with the concrete results and future plans of the 3 cases, as a reference model for enterprises pursuing CMMI appraisal and seeking to establish a systematic software development and maintenance process that raises project productivity, organizational competitiveness, and software quality. Keywords: CMMI (Capability Maturity Model Integration); software process improvement; benefit appraisal
5

Integration of facies models in reservoir simulation

Chang, Lin 22 February 2011 (has links)
The primary controls on subsurface reservoir heterogeneities and fluid flow characteristics are sedimentary facies architecture and petrophysical rock fabric distribution in clastic reservoirs and in carbonate reservoirs, respectively. Facies models are critical and fundamental for summarizing facies and facies architecture in data-rich areas. Facies models also assist in predicting the spatial architectural trend of sedimentary facies in other areas where subsurface information is lacking. The method for transferring geological information from different facies models into digital data and then generating associated numerical models is called facies modeling or geological modeling. Facies modeling is also vital to reservoir simulation and reservoir characterization analysis. By extensively studying and reviewing the relevant research in the published literature, this report identifies and analyzes the best and most detailed geologic data that can be used in facies modeling, and the most current geostatistical and stochastic methods applicable to facies modeling. Through intensive study of recent literature, the author (1) summarizes the basic concepts and their applications to facies and facies models, and discusses a variety of numerical modeling methods, including geostatistical and stochastic facies modeling, such as variogram-based geostatistical modeling, object-based stochastic modeling, and multiple-point geostatistics modeling; and (2) recognizes that the most effective way to characterize a reservoir is to integrate data from multiple sources, such as well data, outcrop data, modern analogs, and seismic interpretation. Detailed and more accurate parameters used in facies modeling, including grain size, grain type, grain sorting, sedimentary structures, and diagenesis, are gained through this multidisciplinary analysis.
The report concludes that facies and facies models are scale dependent, and that attention should be paid to scale-related issues in order to choose appropriate methods and parameters to meet facies modeling requirements.
6

Sequence Diagrams Integration via Typed Graphs: Theory and Implementation

LIANG, HONGZHI 03 September 2009 (has links)
It is widely accepted within the software engineering community that support for integration is necessary for requirement models. Several methodologies that have appeared in the literature, such as role-based software development, rely on some kind of integration. However, current integration techniques and their tool support are insufficient. In this research, we discuss our solution to the problem. More precisely, we present a general integration approach for scenario-based models, particularly for UML Sequence Diagrams, based on the colimit construction known from category theory. In our approach, Sequence Diagrams are represented by SD-graphs, a special kind of typed graph. The merge algorithm for SD-graphs is an extension of existing merge operations on sets and graphs. On the one hand, the merge algorithm ensures traceability and guarantees key theoretical properties (e.g., "everything is represented and nothing extra is acquired" during the merge). On the other hand, our formalization of Sequence Diagrams as SD-graphs retains the graphical nature of Sequence Diagrams, yet is amenable to algebraic manipulation. Another important property of our process is that our approach is applicable to other kinds of models as long as they can be represented by typed graphs. A prototype Sequence Diagram integration tool following the approach has been implemented. The tool is not only a fully functional integration tool, but also served as a test bed for our theory and provided feedback for our theoretical framework. To support the discovery and specification of model relationships, we also present a list of high-level merge patterns in this dissertation. We believe our theory and tool are beneficial to both academia and industry, as the initial evaluation has shown that the ideas presented in this dissertation represent promising steps towards the more rigorous management of requirement models.
We also present an approach connecting model transformation with source transformation and allowing an existing source transformation language (TXL) to be used for model transformation. Our approach leverages grammar generators to ease the task of creating model transformations and inherits many of the strengths of the underlying transformation language (e.g., efficiency and maturity). / Thesis (Ph.D, Computing) -- Queen's University, 2009-08-28 13:03:08.607
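The colimit-style merge described in the abstract is easiest to picture on ordinary graphs. The sketch below is a hypothetical illustration, not the author's SD-graph algorithm: two graphs are glued along shared nodes named by a mapping, so shared elements appear once and nothing extra is introduced. The graph representation and node names are assumptions made for the example.

```python
# Hypothetical sketch of a colimit-style (pushout-like) graph merge.
# A graph is a pair (nodes, edges); `shared` maps g2 node names onto
# their g1 counterparts, identifying the "glued" interface.

def merge_graphs(g1, g2, shared):
    """Merge two (nodes, edges) graphs, identifying g2 nodes with g1
    nodes according to the `shared` mapping."""
    rename = lambda n: shared.get(n, n)
    nodes = g1[0] | {rename(n) for n in g2[0]}
    edges = g1[1] | {(rename(a), rename(b)) for (a, b) in g2[1]}
    return nodes, edges

# Two tiny graphs sharing one node ("user", called "usr" in the second).
a = ({"user", "login"}, {("user", "login")})
b = ({"usr", "logout"}, {("usr", "logout")})
nodes, edges = merge_graphs(a, b, {"usr": "user"})
print(sorted(nodes))  # the shared node appears exactly once
```

The real SD-graph merge additionally carries typing morphisms and traceability information, but the gluing step follows the same shape.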
7

A Study of Model Integration in Conjunction with the eXtensible Model Definition Format

Fife, Melanie A. 11 May 2006 (has links) (PDF)
A considerable amount of research has been done to connect or integrate separate numerical models. The Environmental Modeling Research Laboratory (EMRL) has developed a generic file format, the eXtensible Model Data Format (XMDF). One of the objectives of the XMDF project is to facilitate the spatial interpolation and data-sharing necessary when linking models. The objective of this research is to investigate how model linking capabilities can be added to the XMDF by defining a Model Linkage Object (MLO) that is compatible with current model linking frameworks. A potential design for an MLO is defined in this thesis after a detailed review of existing model linking frameworks. The major model-linking systems studied in this research are FRAMES, DIAS, and OpenMI. Other software was also researched and a brief description of each is provided. To limit the scope of this research the MLO is analyzed for a two-dimensional mesh using only two of the model linking systems. FRAMES was chosen because the Army Corps of Engineers has already done extensive work with this software. OpenMI was selected because of its ability to link models during runtime.
8

Avaliação de processos de Teste pelo Modelo de Maturidade TMMi em pequenas empresas

Costa, Daniella de Oliveira 22 August 2016 (has links)
Background. Test processes require constant follow-up so that their methodologies can evolve. To support process improvement there are test maturity models such as TMM and TMMi. However, the TMMi model does not provide instruments or frameworks that allow companies to check their adherence to the model; expert consulting is required. Owing to a shortage of certifiers in the country, the cost becomes high and hinders certification for small businesses. Aim. To propose a low-cost methodology for assessing test processes that supports the improvement and quality of the testing procedures employed in small software companies. Methodology. An exploratory and qualitative study, conducted by: (i) surveying, through a systematic review, which instruments or frameworks are available for assessing test processes against the TMMi levels; and (ii) abstracting evidence from the studies that contribute to test process assessment.
From the contributions identified and the mandatory TAMAR guidelines, an assessment process was defined with a focus on meeting the limitations of small companies. Results. An evaluation process covering the activities of Planning; Preparation; Application; Analysis of results; and Closing. The assessment instrument introduced a new approach to presenting the questions: they are grouped by affinity, focusing the respondent on one specific stage of the test process. Conclusion. The routine of preparing those involved gave them a brief overview of the maturity model and reduced uncertainty in indicating evidence, in addition to the new affinity-group arrangement of the questions. Validation showed that the instrument is simple and that the support provided throughout the process makes assessment of small-business processes feasible. / Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES
9

A ROADMAP TO STANDARDIZING THE IRIG 106 CHAPTER 10 COMPLIANT DATA FILTERING AND OVERWRITING SOFTWARE PROCESS

Berard, Alfredo; Manning, Dennis; Kim, Jeong Min October 2007 (has links)
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / High speed digital recorders have revolutionized the way Major Range and Test Facility Bases collect instrumentation data. One challenge facing these organizations is the need for a validated process for the separation of specific data channels and/or data from multiplexed recordings. Several organizations within Eglin Air Force Base have joined forces to establish the requirements and validate a software process compliant with the IRIG-106 Chapter 10 Digital Recording Standard (which defines allowable media access, data packetization, and error control mechanisms). This paper describes a roadmap to standardizing the process used to produce this software, the Data Overwriting and Filtering Application (DOFA).
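The channel-separation idea behind such filtering tools can be sketched in miniature: walk a multiplexed stream record by record and copy out only the records whose channel ID is on an allow-list. The record layout below (a 1-byte channel ID and 1-byte length per record) is purely illustrative and is not the actual IRIG 106 Chapter 10 packet format.

```python
# Illustrative channel filter over a toy multiplexed stream.
# Record layout (hypothetical): [1-byte channel id][1-byte length][payload].

def filter_stream(blob, keep):
    """Return a new stream containing only records whose channel id
    is in the `keep` set."""
    out, i = bytearray(), 0
    while i < len(blob):
        cid, length = blob[i], blob[i + 1]
        record = blob[i : i + 2 + length]  # header + payload
        if cid in keep:
            out += record
        i += 2 + length
    return bytes(out)

# A stream with a channel-1 record ("time") and a channel-2 record ("video").
blob = bytes([1, 4]) + b"time" + bytes([2, 5]) + b"video"
print(filter_stream(blob, keep={1}))  # only the channel-1 record survives
```

A real Chapter 10 filter must additionally honor packet sync patterns, checksums, and time-channel preservation rules, which is exactly why a validated, standardized process matters.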
10

Functional Ontologies and Their Application to Hydrologic Modeling: Development of an Integrated Semantic and Procedural Knowledge Model and Reasoning Engine

Byrd, Aaron R. 01 August 2013 (has links)
This dissertation represents the research and development of new concepts and techniques for modeling the knowledge about the many concepts we as hydrologists must understand such that we can execute models that operate in terms of conceptual abstractions and have those abstractions translate to the data, tools, and models we use every day. This hydrologic knowledge includes conceptual (i.e. semantic) knowledge, such as the hydrologic cycle concepts and relationships, as well as functional (i.e. procedural) knowledge, such as how to compute the area of a watershed polygon, average basin slope or topographic wetness index. This dissertation is presented as three papers and a reference manual for the software created. Because hydrologic knowledge includes both semantic aspects as well as procedural aspects, we have developed, in the first paper, a new form of reasoning engine and knowledge base that extends the general-purpose analysis and problem-solving capability of reasoning engines by incorporating procedural knowledge, represented as computer source code, into the knowledge base. The reasoning engine is able to compile the code and then, if need be, execute the procedural code as part of a query. The potential advantage to this approach is that it simplifies the description of procedural knowledge in a form that can be readily utilized by the reasoning engine to answer a query. Further, since the form of representation of the procedural knowledge is source code, the procedural knowledge has the full capabilities of the underlying language. We use the term "functional ontology" to refer to the new semantic and procedural knowledge models. The first paper applies the new knowledge model to describing and analyzing polygons. The second and third papers address the application of the new functional ontology reasoning engine and knowledge model to hydrologic applications. 
The second paper models concepts and procedures, including running external software, related to watershed delineation. The third paper models a project scenario that includes integrating several models. A key advance demonstrated in this paper is the use of functional ontologies to apply metamodeling concepts in a manner that both abstracts and fully utilizes computational models and data sets as part of the project modeling process.
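The knowledge-base idea described in this abstract, semantic facts plus procedural knowledge that is executed at query time, can be sketched as follows. This is a hypothetical illustration, not the author's reasoning engine: it uses plain Python callables rather than compiled source code, and the concept names ("polygon", "area") are assumptions for the example.

```python
# Sketch of a "functional ontology": a knowledge base holding both
# semantic facts and procedural knowledge, where procedures run on
# demand when a query cannot be answered from facts alone.

class FunctionalOntology:
    def __init__(self):
        self.facts = {}        # semantic knowledge: concept -> value
        self.procedures = {}   # procedural knowledge: concept -> callable

    def assert_fact(self, name, value):
        self.facts[name] = value

    def define(self, name, fn):
        self.procedures[name] = fn

    def query(self, name):
        # Answer from facts if known; otherwise execute the attached
        # procedure and cache its result as a new fact.
        if name not in self.facts:
            self.facts[name] = self.procedures[name](self)
        return self.facts[name]

def shoelace(kb):
    """Shoelace formula, attached as procedural knowledge for 'area'."""
    p = kb.query("polygon")
    return abs(sum(x1 * y2 - x2 * y1
                   for (x1, y1), (x2, y2) in zip(p, p[1:] + p[:1]))) / 2

kb = FunctionalOntology()
kb.assert_fact("polygon", [(0, 0), (4, 0), (4, 3), (0, 3)])
kb.define("area", shoelace)
print(kb.query("area"))  # 12.0 for the 4x3 rectangle
```

In the dissertation's approach the procedural knowledge is stored as source code in the knowledge base itself and compiled by the reasoning engine, so queries such as "average basin slope" can trigger real computations over watershed data.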
