  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
141

The Impacts of Real-time Knowledge Based Personal Lighting Control on Energy Consumption, User Satisfaction and Task Performance in Offices

Gu, Yun 01 May 2011 (has links)
Current building design and engineering practices that emphasize energy conservation can be improved further by developing methods focused on building occupants' needs and their interest in conservation. Specifically, energy-effective building performance improvements cannot reach the desired goals if the resulting indoor environmental conditions do not meet the thermal, visual, and air-quality needs of the occupants. Meeting both energy conservation and human performance requirements simultaneously requires giving occupants information about indoor environmental quality and the energy implications of their possible individual decisions. Building control components and systems must therefore enable occupants to understand how the building operates and how their own actions meet both their needs and the energy and environmental goals of the building project. The goal of the research and experiments in this dissertation is to explore whether real-time information about the visual comfort requirements of a variety of tasks, together with simultaneous energy-conservation feedback, improves occupant behavior toward both objectives. Two workplaces in the Robert L. Preger Intelligent Workplace were equipped to test the performance of 60 invited participants conducting computer-based tasks and a paper-based task under three different lighting controls: 1) centralized lighting control with no user choice; 2) user control of blind positions for daylight shading, ceiling-based lighting fixture luminance output level, and task lighting (on/off); 3) user control of the same three components, with simultaneous information about energy and related CO2-emission implications, appropriate light levels for the task requirements, and the best choices to meet both task requirements and energy conservation goals.
The main findings of the experiments are that real-time information (point 3 above) enables users to meet the visual quality requirements of both the computer tasks and the paper task, and to conserve significant amounts of lighting electricity. Furthermore, the 60 invited participants were asked to rate the importance of the four types of information provided under point 3. While individual users ranked the information categories differently, the overall assessments were significant.
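The kind of real-time feedback the study describes can be sketched as a small calculation: given a task illuminance target and the daylight available after the user's blind choice, estimate the electric top-up and its energy and CO2 implications. All coefficients below are illustrative assumptions, not values from the dissertation.

```python
def lighting_feedback(task_lux, daylight_lux, fixture_w_per_lux=0.12,
                      hours=1.0, kg_co2_per_kwh=0.5):
    """Return (electric_lux, kwh, kg_co2) needed to top up daylight to the task target."""
    electric_lux = max(0.0, task_lux - daylight_lux)   # shortfall after daylight
    kwh = electric_lux * fixture_w_per_lux * hours / 1000.0
    return electric_lux, kwh, kwh * kg_co2_per_kwh

# A paper task needing 500 lux with 300 lux of daylight available:
electric, kwh, co2 = lighting_feedback(500, 300)
```

Presenting such numbers alongside each control choice is the essence of the "information" condition tested in the experiments.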
142

Veiklos žinių baze grindžiamas UML klasių diagramos generavimo metodas / Method of generating a UML 2.0 class diagram based on an enterprise model

Ambraziūnas, Martas 23 June 2014 (has links)
This work covers knowledge-based IS engineering and the place of an enterprise metamodel in IS development, with the aim of extending the enterprise metamodel so that UML 2.0 class diagrams can be generated automatically for the IS design stage. To achieve this goal, the following tasks were carried out: familiarization with enterprise metamodels and the UML class diagram; identification and addition of the elements missing from the EMM for class-diagram generation; creation and implementation of a class-diagram generation algorithm; and creation of a prototype that applies the generation algorithm in IS development. The analysis found that the EMM lacks some elements that are crucial for generating class diagrams, so the enterprise metamodel was extended with two new elements. The generation algorithm was tested with case-study data to check the theoretical assumptions about the relationships between the EMM and the class diagram. The work showed that it is possible to generate class diagrams from the EMM, although some new elements had to be added for it.
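The generation step can be pictured as a mapping from enterprise-model elements to class-diagram elements. The element names and mapping rules below are illustrative assumptions, not the thesis's algorithm.

```python
def generate_classes(enterprise_model):
    """Map an enterprise model {'actors': [...], 'flows': [(src, dst), ...]}
    to UML-style classes and associations."""
    classes = {name: {"name": name, "attributes": [], "operations": []}
               for name in enterprise_model["actors"]}
    associations = [(src, dst) for src, dst in enterprise_model["flows"]
                    if src in classes and dst in classes]   # keep only resolvable links
    return classes, associations

emm = {"actors": ["Customer", "Order"], "flows": [("Customer", "Order")]}
classes, assocs = generate_classes(emm)
```

A real transformation would also carry attributes and operations across from the metamodel's new elements; the sketch only shows the structural skeleton of such a rule.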
143

Integrating Manufacturing Issues into Structural Optimization

Barton, Andrew Barton January 2002 (has links)
This dissertation aims to advance the field of structural optimization by creating and demonstrating new methodologies for the explicit inclusion of manufacturing issues. Composite aerospace structures were a main focus of this work, as that field presents some of the greatest manufacturing complexities yet also offers the greatest incentives to optimize structural performance. Firstly, the possibilities for modifying existing FEA-based structural optimization methods to better capture manufacturing constraints are investigated. Examples of brick-based topology optimization, shell-based topology optimization, parametric sizing optimization and manufacturing process optimization are given. From these examples, a number of fundamental limitations of these methods were observed and are discussed. The key limitation uncovered concerned a dichotomy between analytical methods (such as FEA) and CAD-type methods. Based on these observations, a new knowledge-based framework for structural optimization was suggested whereby manufacturing issues are integrally linked to the more conventional structural issues. A prototype system to implement this new framework was developed and is discussed. Finally, the validity of the framework was demonstrated by application to a generic composite rib design problem.
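A toy illustration of folding a manufacturing constraint into sizing optimization: a panel thickness must satisfy a stress limit, but a laminate can only be built in whole-ply increments, so the continuous optimum is rounded up to a buildable gauge. The loads, allowable and ply thickness are illustrative assumptions, not values from the dissertation.

```python
import math

def min_feasible_thickness(load_n_per_m, allowable_pa, ply_t=0.000125):
    """Smallest whole-ply laminate thickness keeping running-load stress under the allowable."""
    t_continuous = load_n_per_m / allowable_pa      # stress = load / thickness = allowable
    plies = math.ceil(t_continuous / ply_t - 1e-9)  # round UP to a buildable ply count
    return plies * ply_t

# 210 kN/m running load, 400 MPa allowable: 0.525 mm continuous -> 5 plies (0.625 mm)
t = min_feasible_thickness(210000.0, 400e6)
```

The rounding step is exactly the sort of constraint a pure FEA-driven optimizer ignores and a CAD/manufacturing view enforces, which is the dichotomy the dissertation identifies.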
144

The need for realignment of primary science assessment to contemporary needs : assessment of learning and assessment for learning

Lee, Iris Chai Hong January 2007 (has links)
The ultimate purpose of this study was to investigate how best to prepare Singapore students for the Knowledge-Based Economy (KBE). Investigating the possible need to realign primary science assessment to the KBE was of utmost interest, as assessment was viewed as the driver of the actual curriculum. This was a mixed-methods design study (Creswell, 2005). Fifteen teachers were first interviewed to ascertain the major features of primary school science assessment in Perth, Western Australia and in Singapore. A list of twelve questions was prepared for the eight teacher interviewees in Singapore and the seven in Perth. The NUD*IST program was used to help organise trends in the teachers' responses. Definitions of KBE skills were synthesised from literature reviews and validated by the fifteen teachers for the subsequent survey. The survey comprised a list of demographic questions and two matrices. The first matrix required the teachers to rate, on a four-point scale, the use of eleven assessment modes for twelve 'process' and KBE skills. The second matrix was a frequency check to determine whether the teachers had used a particular mode to assess a particular skill. One hundred and forty-five usable surveys were analysed. Rasch analysis was performed with the RUMM2020 program, and an unfolding model was fitted with the RateFOLD program. The interviews first established that KBE skills and a variety of assessment modes were needed for today's classes. The survey confirmed these needs and found that the paper-and-pencil test was the most frequently used assessment procedure in both Singapore and Perth. In both the interviews and the survey, teachers were asked to match the skill(s) to the appropriate assessment mode(s); the details and justification of such matches were explained by the teachers in the interviews.
In the process, other factors such as time constraints and the ranking of teachers were uncovered as hindrances to teachers assessing students appropriately for learning. The problems identified by the Singapore teachers were a lack of time, overloaded syllabi, and the crucial perceived need for assessment of learning (high-stakes summative tests). The results of both the interviews and the survey supported the need for a variety of assessment modes (Gray & Sharp, 2001; Hackling, 2004; National Research Council, 1996, 2001 & 2003; Sebatane, 1998; Sternberg, 1998) to help students learn science in today's contemporary classes. The Singapore teachers in this study also appealed for help from policy-makers in adopting a variety of assessment modes, as the system stipulating paper-and-pencil testing was beyond their control and jurisdiction. Recommendations stemming from this study include allowing teachers to use a variety of assessments to assess students' learning in the high-stakes Primary School Leaving Examination (PSLE), and not just the paper-and-pencil mode that has been in use for at least the last thirty years. There are important implications, as the learning theories currently used to support assessment of learning are neither sufficient nor in total alignment with the needs of today's class. For example, a behaviourist taxonomy of skills emphasises measurable output, not the process of learning. Socio-constructivist approaches that focus on the individual constructing meaning in his/her own context, such as the use of ongoing formative assessment to encourage feedback (Black & Wiliam, 1998a & b), may assist in engaging students in the lifelong learning required in the KBE. Lastly, the significance of this study lies in two aspects, the practical and the scholarly. This study provides evidence of the need for primary science assessment to be more aligned to contemporary needs.
This in turn will assist in better preparing the young of Singapore, who are the nation's only natural resource, for the workforce. This study also aims to contribute to the body of knowledge in three ways. Firstly, KBE needs will be connected to the primary science classroom via the assessment of skills. Secondly, both KBE and process skills were found to be more appropriately assessed by assessment modes such as the portfolio and paper-and-pencil respectively, as demonstrated through the Rasch and unfolding-model analyses. Thirdly, the gap between the implemented and the official curriculum will be narrowed with this proposed change in assessment processes.
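The Rasch analysis mentioned above rests on a simple model: the probability that a person of ability theta succeeds on an item of difficulty b. The study itself used the RUMM2020 package; the sketch below just shows the underlying dichotomous model.

```python
import math

def rasch_p(theta, b):
    """Dichotomous Rasch model: P(correct) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A person exactly as able as the item is difficult succeeds half the time:
p_even = rasch_p(0.0, 0.0)   # 0.5
```

Fitting the model to the 145 survey responses is what lets ratings of different assessment modes be placed on a common interval scale.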
145

Remote sensing, geographic information systems (GIS) and Bayesian knowledge-based methods for monitoring land condition

Caccetta, Peter A. January 1997 (has links)
This thesis considers various aspects of the use of remote sensing, geographical information systems and Bayesian knowledge-based expert system technologies for broad-scale monitoring of land condition in the Western Australian wheat belt. The use of remote sensing technologies for land condition monitoring in Western Australia had previously been established by other researchers, although significant limitations in the accuracy of the results remain. From a monitoring perspective, this thesis considers approaches for improving the accuracy of land condition monitoring by incorporating other data into the interpretation process. Digital elevation data provide one potentially useful source of information, and their use is considered extensively here. In particular, various methods for deriving landform-related variables from digital elevation data and remotely sensed data are reviewed and new techniques derived. Given that data from a number of sources may need to be combined to produce accurate interpretations of land use and condition, methods for combining data are reviewed. Of the many different approaches available, a Bayesian approach is adopted, based on relatively new developments in probabilistic expert systems. This thesis demonstrates how these developments provide a unified framework uniting traditional classification methods with methods for integrating information from other spatial data sets, including data derived from digital elevation models, remotely sensed imagery and human experts. Two applications of the techniques are primarily considered. Firstly, the techniques are applied to the task of salinity mapping and monitoring and compared to existing techniques; large improvements are apparent. Secondly, the techniques are applied to salinity prediction, an application not previously considered by other researchers in this domain; the results are encouraging. Finally, limitations of the approach are discussed.
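The Bayesian data-fusion idea at the heart of such work can be sketched as updating the probability that a site is salt-affected with evidence from several conditionally independent data layers (imagery, terrain). The likelihood values below are illustrative assumptions, not figures from the thesis.

```python
def fuse(prior, likelihoods):
    """Posterior P(saline | all evidence) for a binary class via Bayes' rule.

    likelihoods: list of (P(evidence | saline), P(evidence | not saline)),
    one pair per data layer, assumed conditionally independent.
    """
    p_s, p_n = prior, 1.0 - prior
    for l_s, l_n in likelihoods:          # multiply in each layer's evidence
        p_s, p_n = p_s * l_s, p_n * l_n
    return p_s / (p_s + p_n)              # renormalize over the two classes

# Two layers both pointing toward salinity sharpen a weak prior:
post = fuse(0.2, [(0.8, 0.3), (0.7, 0.4)])   # about 0.54
```

Probabilistic expert systems generalize this beyond naive independence, but the multiply-and-renormalize core is the same.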
146

Fuzzy framework for robust architecture identification in concept selection

Patterson, Frank H. 07 January 2016 (has links)
An evolving set of modern physics-based, multi-disciplinary conceptual design methods seeks to explore the feasibility of a new generation of systems with new capabilities, capable of missions that conventional vehicles cannot be empirically redesigned to perform. These methods provide a more complete understanding of a concept's design space, forecasting the feasibility of uncertain systems, but are often computationally expensive and time-consuming to prepare. This trend creates a unique and critical need to identify a manageable number of capable concept alternatives early in the design process. Ongoing efforts attempting to stretch capability through new architectures, like the U.S. Army's Future Vertical Lift effort and DARPA's Vertical Takeoff and Landing (VTOL) X-plane program, highlight this need. The process of identifying and selecting a concept configuration is often given insufficient attention, especially when a small subset of favorable concept families is not immediately apparent. Commonly used methods for concept generation, like filtered morphological analysis, often identify an exponential number of alternatives. Simple approaches to concept selection then rely on designers to identify a relatively small subset of alternatives for comparison through methods typically related to decision matrices (Pugh, TOPSIS, AHP, etc.). More in-depth approaches use modeling and simulation to compare concepts with techniques such as stochastic optimization or probabilistic decision making, but a complicated setup limits these approaches to a discrete few alternatives. A new framework to identify and select promising, robust concept configurations using fuzzy methods is proposed in this research and applied to the example problem of concept selection for DARPA's VTOL X-plane program.
The framework leverages fuzzy systems in conjunction with morphological analysis to assess large design spaces of potential architecture alternatives while capturing the inherent uncertainty and ambiguity in the evaluation of these early concepts. Experiments show how various fuzzy systems can be used to evaluate criteria of interest across disparate architectures by modeling expert knowledge as well as simple physics-based data. The models are integrated into a single environment, and variations on multi-criteria optimization are tested to demonstrate an ability to identify a non-dominated set of architectural families in a large combinatorial design space. The resulting framework is shown to provide an approach for quickly identifying promising concepts in the face of uncertainty early in the design process.
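Fuzzy scoring over a morphological design space can be sketched in miniature: each option carries rough performance estimates, every combination of options is enumerated, and alternatives are ranked by the minimum of their fuzzy memberships in criteria like "fast enough" and "affordable". The options, numbers and membership shapes below are illustrative assumptions, not the dissertation's models.

```python
from itertools import product

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Morphological matrix: each design decision offers options with rough
# (speed contribution, relative cost) estimates.
matrix = {"lift":   {"rotor": (150, 2.0), "tiltwing": (300, 4.0)},
          "cruise": {"prop": (250, 1.0), "jet": (450, 2.5)}}

def score(arch):
    speed = sum(matrix[k][v][0] for k, v in arch.items())
    cost = sum(matrix[k][v][1] for k, v in arch.items())
    # min-aggregation of the "fast enough" and "affordable" memberships
    return min(tri(speed, 300, 600, 900), tri(cost, 0.0, 3.0, 8.0))

alternatives = [dict(zip(matrix, combo))
                for combo in product(*(matrix[k] for k in matrix))]
best = max(alternatives, key=score)
```

A real framework would replace the tuples with expert-derived fuzzy systems and keep the whole non-dominated set rather than a single best, but the enumerate-evaluate-rank pattern is the same.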
147

Transferência de conhecimento nas franquias brasileiras

Gigliotti, Batista Salgado 24 February 2010 (has links)
The franchise system has been the subject of many academic studies in recent years because of its growing adoption as a company expansion strategy. Briefly, franchising is the system by which the franchisor licenses the use of its brand to the franchisee and grants the franchisee the right to access its knowledge of the business. Knowledge transfer in this process is desired and cultivated by the parties as a mechanism for achieving a successful partnership. This exploratory case study analysed how that transfer occurs in Brazilian brands, identifying the main challenges faced by the franchisor and the franchisee. In addition, the work sought to show the main skills the parties must possess for learning to take hold effectively. The theories supporting the analysis were the Resource-Based View (RBV), Dynamic Capabilities and Agency Theory, along with studies on Knowledge Management and on Franchising. The research was conducted through interviews with executives and franchisees of three Brazilian franchisors in the language-school, women's accessories and natural-products segments.
The main results can be summarized as follows: (a) the motivators for entering the system are quality expansion, on the franchisor's side, and the support offered, on the franchisee's side; (b) knowledge of how to operationalize the key element of the brand's success becomes explicit more easily than knowledge of how to manage the franchised unit; (c) although manuals and classroom training are the usual means of transferring explicit knowledge, it takes hold through field training; (d) explicit knowledge is generally passed on in large quantity, in little time, and at a moment of heightened franchisee anxiety, which hinders learning; (e) the most relevant means of transferring tacit knowledge are daily contact with the franchisor, meetings among franchisees, and the franchisor's visits to the unit, although the success of such visits depends on the visitor's profile and training; (f) no standard process for transferring tacit knowledge was identified; (g) despite the parties' willingness to share knowledge, there is frustration about its effective adoption; (h) the ideal profile of knowledge-transfer agents involves intangible characteristics, making their selection more complex. This work also serves as a basis for further studies on knowledge transfer in franchises, for example research relating the results obtained here to the performance of the franchisor, the franchised unit, or even the franchise system of a given brand.
148

Uma proposta de algoritmo memético baseado em conhecimento para o problema de predição de estruturas 3-D de proteínas

Correa, Leonardo de Lima January 2017 (has links)
Memetic algorithms are evolutionary metaheuristics intrinsically concerned with exploiting and incorporating available knowledge about the problem under study. In this dissertation, we present a knowledge-based memetic algorithm to tackle the three-dimensional protein structure prediction problem without the explicit use of experimentally determined template structures. The algorithm is divided into two main processing steps: (i) sampling and initialization of the algorithm's solutions; and (ii) optimization of the structural models from the previous stage. The first step generates and classifies several structural models for a given target protein using the Angle Probability List strategy, aiming to define different structural groups and to create better structures with which to initialize the individuals of the memetic algorithm. The Angle Probability List takes advantage of structural knowledge stored in the Protein Data Bank to reduce the complexity of the conformational search space.
The second step optimizes the structures generated in the first stage by applying the proposed memetic algorithm, which uses a tree-structured population: each node can be seen as an independent subpopulation that interacts with the others through global search operations, aiming at information sharing, population diversity, and better exploration of the problem's multimodal search space. The method also encompasses ad hoc global search operators, whose objective is to increase the exploration capacity of the method with respect to the characteristics of the protein structure prediction problem, combined with the Artificial Bee Colony algorithm used as a local search technique applied to each node of the tree. The proposed algorithm was tested on a set of 24 amino acid sequences and compared with two reference methods in the protein structure prediction area, Rosetta and QUARK. The results show the ability of the method to predict three-dimensional protein structures with folds similar to the experimentally determined structures, in terms of the structural metrics Root-Mean-Square Deviation and Global Distance Test Total Score. Our method reached results comparable to those of Rosetta and QUARK, and in some cases outperformed them, corroborating the effectiveness of our proposal.
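The core memetic pattern described above, global evolutionary variation plus a local-search refinement of each candidate, can be shown in miniature on a toy one-dimensional energy function. The thesis's tree-structured multipopulation and Artificial Bee Colony local search are replaced here by a flat population and simple hill climbing; this is a sketch of the pattern, not the dissertation's algorithm.

```python
import random

def energy(x):                      # toy stand-in for a protein energy function
    return (x - 3.0) ** 2

def local_search(x, step=0.1, iters=50):
    """Hill climbing: the 'meme' that refines every new individual."""
    for _ in range(iters):
        for cand in (x - step, x + step):
            if energy(cand) < energy(x):
                x = cand
    return x

def memetic(pop_size=10, gens=20, seed=1):
    rng = random.Random(seed)
    pop = [rng.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(gens):
        a, b = rng.sample(pop, 2)
        child = (a + b) / 2 + rng.gauss(0, 0.5)    # crossover + mutation
        child = local_search(child)                # refine BEFORE insertion
        worst = max(range(pop_size), key=lambda i: energy(pop[i]))
        if energy(child) < energy(pop[worst]):
            pop[worst] = child                     # steady-state replacement
    return min(pop, key=energy)
```

In the dissertation the population is a tree of such subpopulations exchanging individuals, and the local-search step is the bee colony, but the alternation of global variation and knowledge-driven refinement is the defining memetic ingredient.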
150

Discussão, sistematização e modelamento do processo de realização de estudos de simulação / Discussion, systematization and modelling of the process of conducting simulation studies

Elidio de Carvalho Lobão 16 October 2000 (has links)
An increase in the demand for simulation studies can currently be observed, caused mainly by two factors: the need to optimize products and processes due to the growing degree of competition among companies, and the increased ease of use of simulation systems brought about by the incorporation of new resources such as artificial intelligence, graphical programming, virtual reality, etc. This situation induces the false impression that this type of simulator reduces the knowledge the simulation analyst or designer needs to master to obtain valid and consistent results from a simulation study.
The objective of this work is to build a knowledge base on the conduct of simulation studies that makes it possible to systematize an appropriate methodology for conducting discrete-event simulation studies, and to investigate the knowledge the analyst/designer of a simulation study must master to obtain consistent results from it.
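The discrete-event mechanism these studies revolve around can be shown in miniature: a future-event list ordered by time drives the simulation clock forward from event to event. The single-server queue below is an illustrative example, not a model from the thesis.

```python
import heapq

def simulate(arrivals, service_time):
    """Single-server queue with fixed service time: arrival times in, departure times out."""
    events = [(t, "arrive") for t in arrivals]
    heapq.heapify(events)                        # future-event list, ordered by time
    server_free_at, departures = 0.0, []
    while events:
        t, kind = heapq.heappop(events)          # advance the clock to the next event
        if kind == "arrive":
            start = max(t, server_free_at)       # wait if the server is still busy
            server_free_at = start + service_time
            departures.append(server_free_at)
    return departures

# Three arrivals at t = 0, 1, 2 with 1.5 time units of service each:
deps = simulate([0.0, 1.0, 2.0], 1.5)            # [1.5, 3.0, 4.5]
```

Knowing that the simulator is just this clock-and-event-list loop, rather than a black box, is precisely the kind of analyst knowledge the thesis argues graphical simulation tools do not eliminate.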
