61

GEMA, A NEW FRAMEWORK FOR PROTOTYPING, DEVELOPMENT AND INTEGRATION OF MULTIPHYSICS AND MULTISCALE SIMULATIONS IN MULTIDISCIPLINARY GROUPS

CARLOS AUGUSTO TEIXEIRA MENDES 29 August 2016 (has links)
Petroleum exploration and production is a complex task where the use of physical models is imperative to minimize exploration risks and maximize the return on invested capital during the production phase of new oil fields. Over time, these models have become more and more complex, giving rise to a tendency toward integration between distinct simulators and to the need for new multiphysics simulations, where single-physics models are solved together in a coupled way. This work presents the GeMA (Geo Modelling Analysis) framework, a library to support the development of new multiphysics simulators, allowing both the coupling of new models built with the framework as a base and the integration with pre-existing simulators. Its objective is to promote the use of software engineering techniques, such as extensibility, reusability, modularity and portability, in the construction of engineering physical models, allowing engineers to focus on the physical problem formulation, since the framework takes care of data management and other necessary support functions, speeding up code development. Built to aid during the entire multiphysics simulation workflow, the framework architecture supports multiple simulation and coupling paradigms, with special emphasis on the finite element method, and is capable of representing the spatial domain by multiple discretizations (meshes) and exchanging values between them. The framework also implements important extensibility concepts, through the combined use of plugins and abstract interfaces, as well as configurable orchestration and fast prototyping through the use of the Lua language.
This work also presents a set of test cases used to assess the framework's correctness and expressiveness, with particular emphasis on a 2D basin model that couples non-linear finite element temperature calculations, mechanical compaction, and hydrocarbon maturation and generation.
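A central capability described above is exchanging field values between the multiple discretizations (meshes) of a coupled simulation. The sketch below illustrates the idea with simple 1-D linear interpolation; it is a minimal, hypothetical illustration, not GeMA's actual API (function and variable names are invented).

```python
# Hypothetical sketch (not GeMA's actual API): transfer a nodal field from one
# 1-D discretization to another by linear interpolation -- the kind of
# mesh-to-mesh value exchange a coupled multiphysics simulation requires.

def interpolate_field(src_nodes, src_values, dst_nodes):
    """Linearly interpolate nodal values from a source mesh onto a destination mesh."""
    result = []
    for x in dst_nodes:
        # find the source interval [x0, x1] containing x
        for i in range(len(src_nodes) - 1):
            x0, x1 = src_nodes[i], src_nodes[i + 1]
            if x0 <= x <= x1:
                t = (x - x0) / (x1 - x0)
                result.append((1 - t) * src_values[i] + t * src_values[i + 1])
                break
    return result

# Coarse thermal mesh -> finer mechanical mesh
coarse = [0.0, 1.0, 2.0]
temps = [300.0, 350.0, 400.0]
fine = [0.0, 0.5, 1.0, 1.5, 2.0]
print(interpolate_field(coarse, temps, fine))  # [300.0, 325.0, 350.0, 375.0, 400.0]
```

In a real coupled run this transfer would be performed each time step, in both directions, and for 2D/3D meshes the search and interpolation are correspondingly more involved.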
62

An implementation framework for additive manufacturing

Mellor, Stephen January 2014 (has links)
The study presents a normative framework for the Additive Manufacturing (AM) implementation process in the UK manufacturing sector. The motivations for the study include the lack of socio-technical studies of the AM implementation process and the need for existing and potential future project managers to have an implementation model to guide their efforts in implementing these relatively new and potentially disruptive technologies. The study was conducted through case research, with primary data collected through in-depth semi-structured interviews with AM project managers. Seven case studies were conducted, representing AM implementation practice at different stages of the implementation cycle. The first stage involved a pilot study at a post-implementer to identify the main areas of interest for AM implementation research. The second involved a wider study of AM implementers at the post-implementation stage, with cross-case analysis of implementation practice. The final stage involved an investigation into pre-implementation of AM, applying the proposed framework in three companies yet to fully implement AM as a production method. The contribution to the existing body of literature takes the form of a normative framework for AM implementation in a variety of industrial sectors. The framework describes the main activities in the implementation process and supports a taxonomy of implementers.
63

Automated visual inspection in small and medium sized enterprises

Panayiotou, Panayiotis January 2000 (has links)
No description available.
64

A COMPONENT RANKING FRAMEWORK FOR MORE RELIABLE SOFTWARE

Chaudhari, Dhyanesh 10 September 2013 (has links)
Software components are meant to be reusable and flexible by design. These characteristics and others continue to attract software developers to adapt a component (typically designed elsewhere) into their systems. However, software components are also vulnerable to reliability and security problems due to the existence of non-obvious faults. We believe that a systematic approach to detecting a component's failures and prioritizing components using such failures can help developers decide on appropriate solutions to improve reliability. In this thesis, we present a framework that can help developers detect and rank component failures systematically so that more reliable software can be achieved. Our proposed framework allows monitoring of critical components within a system under instrumentation, detecting failures based on specifications, and using failure data and input from developers to rank the components. The proposed approach provides information for developers, who can decide whether reliability could be improved by trivial code modification or requires advanced reliability techniques. A prototype is designed along with a number of failure scenarios to detect specific failure types within a component. Four major failure types (value, timing, commission, and omission) are detected and used to rank software components. We conducted an experimental evaluation using two subject systems to assess the effectiveness of the proposed framework and to measure its performance overhead. Our experimental results show that the approach can benefit system developers by prioritizing components for effective maintenance with a minimal overhead. / Thesis (Master, Electrical & Computer Engineering) -- Queen's University, 2013-09-09 23:08:02.035
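The abstract above describes ranking components by their detected failures across four failure types. The sketch below shows one plausible shape of such a ranking; the weights and component names are invented for illustration and are not taken from the thesis.

```python
# Illustrative sketch (weights and names are assumptions, not the thesis's
# actual tool): rank components by a weighted count of the four detected
# failure types -- value, timing, commission, and omission.

FAILURE_WEIGHTS = {"value": 3, "timing": 2, "commission": 2, "omission": 1}

def rank_components(failures):
    """failures: {component: {failure_type: count}} -> [(component, score)], worst first."""
    scores = {
        comp: sum(FAILURE_WEIGHTS[ftype] * n for ftype, n in counts.items())
        for comp, counts in failures.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

observed = {
    "Parser": {"value": 2, "omission": 1},   # score 2*3 + 1*1 = 7
    "Cache": {"timing": 4},                  # score 4*2 = 8
    "Logger": {"commission": 1},             # score 1*2 = 2
}
print(rank_components(observed))  # [('Cache', 8), ('Parser', 7), ('Logger', 2)]
```

A real implementation would feed the counts from runtime monitors and let developers adjust the weights to reflect how critical each failure type is for their system.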
65

Issues in public information systems development : the impact of regionalised organisational structure

Folkerd, Christopher January 2011 (has links)
This thesis highlights the critical impact the effects of regionalised organisational structure and external political pressures have on the development of public sector information systems. Through the extension of a socio-technical systems (STS) model which encompasses these effects, a tool is provided for their investigation and evaluation in past and present information system (IS) developments. The foundations for this model were derived through an in-depth study of a large scale, national public IS development. Despite a large volume of research into the development and implementation of information systems, a high incidence of failure of such projects is still observed. With information systems now commonly integrated into many facets of an organisation’s business processes the costs and consequences of such failures can be far reaching. Given the additional scope and scale of many national public sector projects such consequences can be profound. While public sector IS failure has been studied in the literature, its focus is observed to be primarily that of an examination of e-government systems, neglecting the back-end (non-public facing) support systems. The focus of such studies is predominantly on the public’s interface and interaction with these systems together with their adoption and acceptance by the public. This view is a valid contribution but it does not inform the literature on the full range of unique problems that can be encountered across a complete IS development lifecycle within the public sector. Seeking to investigate these matters further, a collaboration was formed with a UK public body to facilitate the examination of the issues affecting the development and implementation of a national IS project. Onsite observations, interviews and document sampling were used across the development cycle to gather information from the perspectives of the stakeholders involved. 
The analysis of the data collected from this exercise highlighted a number of factors that were observed to have a significant effect on the project’s ultimate failure. Examination of this analysis from an STS perspective allowed for the extension of an existing STS model. It was extended to encompass the significant adverse effects that an organisational regionalised structure and external political pressure placed on the development of public information systems.
66

Verification and validation of a DEM-CFD model and multiscale modelling of cohesive fluidization regimes

Gupta, Prashant January 2015 (has links)
Fluidization of solid particles using gas flow is an important process in chemical and pharmaceutical industries. The dynamics of fluidisation are intricately related to particle scale physics. Fluid-particle interactions dominate gas-solid fluidization behaviour for particles with average size and density greater than 10^-4 m and 10^3 kg/m^3, respectively, classified as Geldart B and D particles. Inter-particle forces, such as cohesion, play an increasingly important role in the fluidization dynamics of smaller particles, which are classified as Geldart A and C. In particular, interesting fluidization regimes have been noticed for weakly cohesive Geldart A particles, exhibiting a window of uniform fluidization before the onset of bubbling behaviour. Despite widespread industrial interests, the fundamental understanding of the mechanisms that underlie these fluidization regimes is poor. The present study aims to improve the understanding of fluidization dynamics of Geldart A regimes using numerical simulations. A DEM-CFD model was employed to capture the widely separated spatial and temporal scales associated with fluidization behaviour. The model couples the locally averaged Navier-Stokes equation for fluid with a discrete description of the particles. The methodology and its computer implementation are verified and validated to assess the extent of fluidization physics that it is able to capture. Verification cases check the implementation of the inter-phase momentum transfer term, drag model implementation and pressure-velocity coupling. The test cases are employed in order to cover a wide range of flow conditions. Robust validation tests for complex fluidization phenomena such as bubbling, spouting and bidisperse beds have been conducted to assess the predictive capabilities of the DEM-CFD solver.
The simulation results for time- and spatially-averaged fluidization behaviour are compared to experimental measurements obtained from the literature, and are shown to capture the fluidization physics qualitatively. Robust features of bubbling fluidization, such as minimum fluidization velocity, frequency of pressure drop fluctuations, segregation rates and solid circulation patterns, were captured. Furthermore, the DEM-CFD model is critically assessed in terms of model conceptualization and parameter estimation, including those for drag closures, particle-wall boundary conditions, bed height and particle shape effects. The validation studies establish modelling best-practice guidelines and the level of discrepancy against analytical solutions or experimental measurements. Having developed the model and established its predictive capability, it is used to probe the hydrodynamics of weakly cohesive particles. Cohesive interactions are captured by employing a pair-wise van der Waals force model. The cohesive strength of the granular bed is quantified by the ratio of the maximum van der Waals force to the particle gravitational force, defined as the granular Bond number. The Bond number of the bed is increased systematically from 0 to 10 to examine the role of cohesion in the fluidization behaviour of fine powders, while keeping the particle size and density constant across all the simulations. The idea was to separate the hydrodynamics associated with the size and density of the particles from the inter-particle interactions. The size and density of the particles are carefully chosen at a scale where inter-particle forces are present but minimal [Seville et al., 2000]. Geldart A fluidization behaviour is captured for granular beds with Bond numbers ranging from 1 to 3.
Many robust features of Geldart A fluidization, such as pressure drop overshoot, delay in the onset of bubbling, macroscopic Umf predictions and uniform bed expansion, are captured in the DEM-CFD framework. The expanded bed was characterized according to the criteria that the particles are highly immobile in this regime and that the expanded porosity is related to inlet velocity by Richardson-Zaki correlations. Sudden jumps in the magnitude of the global granular temperature were found near the regime transitions. This observation was used as an indicator of the onset of bubbling and a quantification of the minimum bubbling velocity (Umb). The window of the expanded bed regime (quantified as Umb - Umf) was shown to be an increasing function of the cohesive strength of the bed. Furthermore, the stability of the expanded bed was probed by studying its response to sudden inertial and voidage shocks. A kinematic wave, generated as a response to the voidage shock, was shown to slow down with increasing cohesion and decreasing hydrodynamic forces. Furthermore, predictions of Umb by DEM-CFD simulations for weakly cohesive beds were compared against empirical correlations by Valverde [2013], with an excellent match. Stress analysis of the expanded bed revealed the presence of tensile stresses. As the inlet velocity is increased beyond the minimum fluidization velocity, a longitudinal shift of these negative stresses is observed until they reach the top of the bed. Negative stresses were seen at the bed surface at the onset of bubbling. The role of cohesive stresses in the formation of the expanded bed and the suppression of bubbling was highlighted. Finally, the microstructure of the expanded bed was probed at different local micro- and mesoscopic length scales. Evidence of clustering, agglomeration and cavities was presented in the expanded bed. The expanded bed was shown to contain mesostructural inhomogeneities, contrary to the belief of homogeneous expansion.
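Two quantities in the abstract above have standard closed forms that a short sketch can make concrete: the granular Bond number (maximum van der Waals force over particle weight) and the Richardson-Zaki relation linking superficial velocity to expanded-bed porosity. The constants and parameter values below are illustrative, not taken from the thesis.

```python
# Sketch of two standard formulas used in the study (parameter values here are
# illustrative assumptions): the granular Bond number and the Richardson-Zaki
# correlation u = u_t * eps**n for a uniformly expanded bed.

import math

def bond_number(hamaker, separation, diameter, density, g=9.81):
    """Bo = F_vdW,max / F_gravity for two equal smooth spheres in contact.

    Uses the standard sphere-sphere form F_vdW = A * d / (24 * h**2),
    where A is the Hamaker constant and h the minimum separation."""
    f_vdw = hamaker * diameter / (24.0 * separation**2)
    volume = math.pi * diameter**3 / 6.0
    f_grav = density * volume * g
    return f_vdw / f_grav

def richardson_zaki_velocity(u_t, porosity, n=4.65):
    """Superficial velocity sustaining a bed of given porosity (n ~ 4.65 at low Re)."""
    return u_t * porosity**n
```

Note that Bo scales as 1/d^2, which is consistent with the abstract's point that cohesion dominates for smaller (Geldart A and C) particles while hydrodynamics dominate for larger (Geldart B and D) ones.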
67

Unpleasant shocks or welcome surprises? What information is conveyed in merger announcements?

Tanyeri, Ayse Basak January 2006 (has links)
Thesis advisor: Edward J. Kane / This paper investigates two issues: how much merger announcements surprise the market, and what market responses to the announcement reveal about the motives underlying the proposed deal. Using a simultaneous-equations framework, we model investor anticipations in the first equation and abnormal returns in the second equation. Our analysis indicates that investors can successfully predict bidders but not target candidates. Cumulative abnormal returns to bidders whose candidacy was widely anticipated in the market prove significantly larger in magnitude than returns to bidders whose candidacy was not anticipated. Bidder abnormal returns differ insignificantly from zero when market expectations are met, whereas bidder returns prove significantly positive when markets are surprised that the firm made a bid. This favorable market response to the surprise in bidder identity suggests that, to an important extent, managerial merger motives serve shareholder interests. / Thesis (PhD) — Boston College, 2006. / Submitted to: Boston College. Carroll School of Management. / Discipline: Finance.
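The second equation of the framework above works with cumulative abnormal returns. As a minimal illustration (not the paper's actual estimation, which jointly models anticipation and returns), the sketch below computes a CAR against a market-model benchmark with assumed alpha and beta.

```python
# Minimal illustrative sketch (not the paper's estimation): cumulative abnormal
# return (CAR) over an event window, with the abnormal return on each day
# defined against a market-model benchmark r_expected = alpha + beta * r_market.

def cumulative_abnormal_return(stock_returns, market_returns, alpha, beta):
    """Sum of (actual - expected) daily returns over the event window."""
    return sum(r - (alpha + beta * rm)
               for r, rm in zip(stock_returns, market_returns))

# Hypothetical bidder returns over a 3-day window around the announcement
car = cumulative_abnormal_return([0.02, 0.01, -0.005], [0.01, 0.0, 0.0],
                                 alpha=0.0, beta=1.0)
print(round(car, 4))  # 0.015
```

In the paper's setting, the interesting comparison is between such CARs for anticipated versus surprise bidders, which requires the first (anticipation) equation as well.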
68

PARFAIT: a contribution to software reengineering based on pattern languages and frameworks

Cagnin, Maria Istela 17 June 2005 (has links)
The need to evolve legacy systems has increased significantly with the advent of new technologies. To support this tendency, several reengineering methods have been proposed. However, few have effective computing support, some use design patterns or reengineering-specific patterns, and none use pattern-language-based frameworks. This thesis belongs to the Information Systems domain. It proposes an agile, framework-based reengineering infrastructure that performs reverse engineering of the legacy system with the support of an analysis pattern language, providing the understanding and documentation necessary for framework instantiation. Understanding of the legacy system is also supported by its execution through test cases, which are subsequently used to validate the target system. The framework, whose construction is based on the analysis pattern language, is used to obtain the target system's design and implementation. To allow reengineering with the infrastructure's support, an agile reengineering process was created. As in software development, a large portion of reengineering time is spent on VV&T activities. To minimize this problem, a testing reuse approach is proposed in this thesis. This approach attaches test resources to the patterns of the analysis pattern language, allowing reuse not only of the analysis solutions but also of the associated test resources. The use of frameworks in software reengineering contributes to their evolution, as the domain to which they belong may evolve, and some of the framework's domain requirements might not have been elicited during its development.
Thus, in this thesis, a process for application framework evolution is also proposed. The processes and the approach are associated with the defined infrastructure to support their effectiveness. Furthermore, to evaluate the agile reengineering process, which provides reuse at several abstraction levels, an experimentation package is also partially defined. Case studies and examples of use have been conducted with the defined products. We stress that further studies must be conducted to obtain results with statistical significance.
69

Generating Members of a Software Product Line Using Combinatory Logic

Hoxha, Armend 04 May 2015 (has links)
A Product Line Family contains similar applications that differ only in the sets of supported features from the family. To properly engineer these product lines, programmers design a common code base used by all members of the product line. The structure of this common code base is often an Object-Oriented (OO) framework, designed to contain the detailed domain-specific knowledge needed to implement these applications. However, these frameworks are often quite complex and implement detailed dynamic behavior with complex coordination among their classes. Extending an OO framework to realize a single product line instance is a unique exercise in OO programming. The ultimate goal is to develop a consistent approach, for managing all instances, which relies on configuration rather than programming. In this thesis, we show the novel application of Combinatory Logic to automatically synthesize correct product line members using higher-level code fragments specified by means of combinators. Using the same starting point of an OO framework, we show how to design a repository of combinators using FeatureIDE, an extensible framework for Feature-Oriented Software Development. We demonstrate a proof of concept using two different Java-based frameworks: a card solitaire framework and a multi-objective optimization algorithms framework. These case studies rely on LaunchPad, an Eclipse plugin developed at WPI that extends FeatureIDE. The broader impact of this work is that it enables framework designers to formally encode the complex functional structure of an OO framework. Once this task is accomplished, then, generating product line instances becomes primarily a configuration process, which enables correct code to be generated by construction based on the combinatory logic.
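The combinator idea above — product-line members synthesized by composing higher-level fragments — can be sketched in miniature: treat each feature as a function from a partial product to an extended product, and a member of the family as a composition of such functions. This toy sketch is invented for illustration and is not LaunchPad's or FeatureIDE's encoding.

```python
# Toy sketch of the combinator idea (not LaunchPad's actual encoding):
# a feature is a combinator -- a function extending a partial product --
# and one product-line member is a composition of feature combinators.

def base():
    """The common code base shared by every member of the family."""
    return {"features": ["core"]}

def feature(name):
    """Combinator constructor: returns a function that adds one feature."""
    def apply(product):
        return {"features": product["features"] + [name]}
    return apply

def synthesize(*combinators):
    """Compose combinators over the base to synthesize one family member."""
    product = base()
    for c in combinators:
        product = c(product)
    return product

# One hypothetical solitaire variant = core + undo + scoring
variant = synthesize(feature("undo"), feature("scoring"))
print(variant["features"])  # ['core', 'undo', 'scoring']
```

In the actual approach the combinators carry types, so that only well-typed (hence valid) compositions — and therefore only correct product-line members — can be synthesized.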
70

FuncTest framework: applying software patterns to functional test automation

Oliveira, Rafael Braga de 28 December 2007 (has links)
Functional test automation has become a real attraction for software development teams, mainly because of the great cost reduction and productivity increase observed in the medium and long term with the use of this practice. This work proposes a framework to improve the reusability and maintainability of automated test suites. The proposal was developed at SERPRO and has been used in real projects. The framework, called FuncTest, applies software patterns and the Data-driven and Keyword-driven techniques to organize automated test suites. Ongoing efforts to improve FuncTest aim to adapt it for automatic test generation using the Model-based Testing technique.
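The Keyword-driven technique mentioned above expresses each test case as a table of keyword rows interpreted against a library of actions. The sketch below shows the general shape of such an interpreter; the keywords and actions are invented for illustration and are not FuncTest's actual vocabulary.

```python
# Hedged sketch of the Keyword-driven technique (keywords and actions invented
# for illustration, not FuncTest's API): a test case is a table of
# (keyword, *arguments) rows, executed by looking each keyword up in a library.

class KeywordRunner:
    def __init__(self):
        self.state = {}
        # the keyword library: maps each keyword to an executable action
        self.actions = {
            "open": lambda page: self.state.update(page=page),
            "type": lambda field, text: self.state.update({field: text}),
            "check": lambda field, expected: self.state.get(field) == expected,
        }

    def run(self, table):
        """Execute each row of the test table, returning each action's result."""
        results = []
        for keyword, *args in table:
            results.append(self.actions[keyword](*args))
        return results

test_case = [
    ("open", "login"),
    ("type", "user", "alice"),
    ("check", "user", "alice"),
]
runner = KeywordRunner()
print(runner.run(test_case)[-1])  # True
```

The payoff of this organization is maintainability: when the application under test changes, only the keyword library is updated, while the test tables (often written by non-programmers) stay the same.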
