91

Modelo híbrido de banco de dados relacional, de alto desempenho e capacidade de armazenamento, para aplicações voltadas à engenharia nuclear / Relational database hybrid model, of high performance and storage capacity, for nuclear engineering applications

GOMES NETO, JOSE 09 October 2014 (has links)
Made available in DSpace on 2014-10-09T12:54:29Z (GMT). No. of bitstreams: 0 / Made available in DSpace on 2014-10-09T14:07:20Z (GMT). No. of bitstreams: 1 12769.pdf: 5367552 bytes, checksum: 1c6f3e52f8be9724413e2b8f8460395f (MD5) / Dissertação (Mestrado) / IPEN/D / Instituto de Pesquisas Energéticas e Nucleares - IPEN/CNEN-SP
92

The designer’s perception and the expert’s evaluation: testing techniques for problem exploration on a design methodology framework

VASCONCELOS, Luis Arthur de, CAMPOS, Fábio Ferreira da Costa 31 January 2012 (has links)
Submitted by Amanda Silva (amanda.osilva2@ufpe.br) on 2015-03-09T14:33:30Z No. of bitstreams: 2 license_rdf: 1232 bytes, checksum: 66e71c371cc565284e70f40736c94386 (MD5) dissertacao_lalv.pdf: 3428243 bytes, checksum: 4cfc2cf496677e2d7b64975aee50848d (MD5) / Made available in DSpace on 2015-03-09T14:33:30Z (GMT). No. of bitstreams: 2 license_rdf: 1232 bytes, checksum: 66e71c371cc565284e70f40736c94386 (MD5) dissertacao_lalv.pdf: 3428243 bytes, checksum: 4cfc2cf496677e2d7b64975aee50848d (MD5) Previous issue date: 2012 / CAPES; CNPq / Understanding design methodology as a schematized process composed of a set of steps to support problem solving, data mining and information-search activities are commonly performed at an initial step of this process, which can be called problem exploration. This work focuses on identifying the influence that executing the problem exploration step has on the design team and on the final solutions developed. To achieve this objective, an initial investigation searched for similar studies that had tested these influences. After confirming the absence of such research, an experiment was conducted to test problem exploration techniques as a way to identify the possible influence of this step within the design process. The experiment was carried out in an undergraduate design class over four months and involved five groups of about six members each, who had to develop concepts for problems about which they had no previous information. Except for the problem exploration step, which could be performed differently, the five groups executed an identical framework of problem-solving methods. All documentation produced by the design teams was evaluated by a selected group of experts. The experiment showed that using such problem exploration techniques gave designers more confidence when asked how well informed they were about the problem after confronting it; however, although the groups followed distinct procedures in the tested step, no difference could be perceived in the experts’ evaluation of the alternatives. Thus, using problem exploration techniques had no influence on the final solutions developed.
93

Monitoramento de icebergs no noroeste do mar de Weddell, Antártica, e sua associação com a circulação oceânica regional / Iceberg monitoring in the northwestern Weddell Sea, Antarctica, and its association with the regional ocean circulation

Collares, Lorena Luiz January 2011 (has links)
Dissertação(mestrado) - Universidade Federal do Rio Grande, Programa de Pós-Graduação em Oceanografia Física, Química e Geológica, Instituto de Oceanografia, 2011. / Submitted by Cristiane Silva (cristiane_gomides@hotmail.com) on 2013-03-12T12:42:52Z No. of bitstreams: 1 2011_lorena_collares.pdf: 4980623 bytes, checksum: fe164915b150fc2e5673516c2d5626b7 (MD5) / Approved for entry into archive by Bruna Vieira(bruninha_vieira@ibest.com.br) on 2013-07-15T17:50:41Z (GMT) No. of bitstreams: 1 2011_lorena_collares.pdf: 4980623 bytes, checksum: fe164915b150fc2e5673516c2d5626b7 (MD5) / Made available in DSpace on 2013-07-15T17:50:41Z (GMT). No. of bitstreams: 1 2011_lorena_collares.pdf: 4980623 bytes, checksum: fe164915b150fc2e5673516c2d5626b7 (MD5) Previous issue date: 2011 / Os icebergs representam uma distinta feição no Oceano Austral. As correntes oceânicas, o gelo marinho, a batimetria e os ventos são responsáveis por determinar a trajetória destes grandes blocos de gelo. Desta forma, informações sobre a distribuição e a concentração dos icebergs podem auxiliar no melhor entendimento da circulação oceânica e atmosférica nas regiões polares. Diferentes métodos de observação de icebergs têm sido utilizados ao longo do tempo para o entendimento desta componente da criosfera. Duas metodologias despontam para tal objetivo, plataformas de coleta de dados (PCDs) rastreadas via sistema satelital ARGOS e as imagens de radar. A fim de monitorar o deslocamento de icebergs, no noroeste do Mar de Weddell, foram utilizados dados de posição de PCDs fixadas em três icebergs (em 19 de fevereiro de 2009) nas proximidades da ilha James Ross. Imagens Advanced Synthetic Aperture Radar (ASAR) foram utilizadas como medida complementar no rastreamento de icebergs durante os anos de 2008 e 2009. A partir dos resultados foi possível associar a deriva dos icebergs monitorados aos principais sistemas de correntes e frentes desta região, como a Corrente Costeira Antártica, a Frente de Talude Antártico e a Frente de Weddell. Mais especificamente, pode-se observar aspectos da circulação regional, tal como a identificação de uma célula de circulação anticiclônica no entorno da ilha James Ross e a deriva de icebergs em direção ao Estreito de Bransfield. Um estudo de caso demonstrou a recirculação de um iceberg no interior do Estreito de Bransfield e sua desintegração associada. A estimativa média da taxa de desintegração dos icebergs monitorados foi de 19%, associadas com um fluxo de volume de água doce para o oceano de aproximadamente 0.57 m3 s-1 e 0.94 m3s-1, respectivamente durante o período de observações nos anos de 2008 e 2009. A velocidade média de deriva calculada através do monitoramento via PCDs e imagens ASAR foi de 3.04 ±1.9 cm s-1 e 5.97 ± 2.8cm s-1, respectivamente. / Icebergs represent a distinctive feature of the Southern Ocean. Ocean currents, sea ice, bathymetry and winds determine the icebergs trajectory and its drift. Thus, information about icebergs distribution and concentration help to better understand the ocean and atmospheric circulation in Polar Regions. Several methods to observe icebergs have been used to comprehend the behavior and the role of this component of the cryosphere. Two methodologies are emerging for this purpose recently, such as icebergs tagging (for satellite tracking) and orbital radar images. 
In order to monitor the displacement of icebergs in the northwestern Weddell Sea, we used position data from three icebergs tagged with Data Collection Platforms (DCPs) on 19 February 2009 in the vicinity of James Ross Island. Additionally, ASAR images were used as a complementary means of tracking icebergs in the same area during 2008 and 2009. From the results it was possible to associate the drift of the monitored icebergs with the main current and front systems of this region, such as the Antarctic Coastal Current, the Antarctic Slope Front and the Weddell Front. More specifically, aspects of the regional circulation could be observed, such as an anticyclonic circulation cell around James Ross Island and icebergs drifting into the Bransfield Strait. A case study demonstrated the recirculation of an iceberg within the Bransfield Strait and its corresponding loss of mass. The mean estimated disintegration rate of the monitored icebergs was 19%, associated with freshwater volume fluxes toward the ocean of approximately 0.57 m3 s-1 and 0.94 m3 s-1 for the 2008 and 2009 observation periods, respectively. The mean drift speeds determined by monitoring icebergs via DCPs and ASAR images were 3.04 ± 1.9 cm s-1 and 5.97 ± 2.8 cm s-1, respectively.
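As a rough, hypothetical illustration of the kind of arithmetic behind the drift speeds and freshwater fluxes quoted above (the thesis's actual processing chain is not described in this abstract), the Python sketch below estimates a mean drift speed from a few DCP position fixes and converts an assumed loss of ice volume into an equivalent freshwater flux; all numbers, densities and function names are assumptions for illustration only.

```python
# Rough sketch (not from the thesis): mean drift speed from DCP fixes and a
# freshwater flux from an assumed loss of iceberg volume over the observation period.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000.0
ICE_DENSITY = 917.0          # kg m^-3, typical glacial ice (assumed)
FRESHWATER_DENSITY = 1000.0  # kg m^-3

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon fixes (degrees)."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi, dlmb = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def mean_drift_speed_cm_s(fixes):
    """fixes: list of (unix_time_s, lat_deg, lon_deg) from a tracking platform."""
    dist = sum(haversine_m(a[1], a[2], b[1], b[2]) for a, b in zip(fixes, fixes[1:]))
    elapsed = fixes[-1][0] - fixes[0][0]
    return 100.0 * dist / elapsed  # cm s^-1

def freshwater_flux_m3_s(volume_lost_m3, period_s):
    """Equivalent freshwater volume flux from ice volume lost over a period."""
    return volume_lost_m3 * (ICE_DENSITY / FRESHWATER_DENSITY) / period_s

# Hypothetical example: three daily fixes near James Ross Island.
fixes = [(0, -64.20, -57.50), (86_400, -64.17, -57.42), (172_800, -64.13, -57.35)]
print(round(mean_drift_speed_cm_s(fixes), 2), "cm/s")
print(round(freshwater_flux_m3_s(3.0e6, 60 * 86_400), 2), "m^3/s")
```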
94

Modelo híbrido de banco de dados relacional, de alto desempenho e capacidade de armazenamento, para aplicações voltadas à engenharia nuclear / Relational database hybrid model, of high performance and storage capacity, for nuclear engineering applications

GOMES NETO, JOSE 09 October 2014 (has links)
Made available in DSpace on 2014-10-09T12:54:29Z (GMT). No. of bitstreams: 0 / Made available in DSpace on 2014-10-09T14:07:20Z (GMT). No. of bitstreams: 1 12769.pdf: 5367552 bytes, checksum: 1c6f3e52f8be9724413e2b8f8460395f (MD5) / The objective of this work is to present the relational database named FALCAO, which was created and implemented to store the variables monitored at the IEA-R1 research reactor, located at the Instituto de Pesquisas Energéticas e Nucleares, IPEN CNEN/SP. The logical data model and its direct influence on the integrity of the information provided are carefully considered. The concepts and steps of normalization and denormalization are presented, including the entities and relationships of the logical data model. The influence of the data model's relationships and rules on the acquisition, loading and delivery of the final information is also discussed from a performance standpoint, since these processes occur in batches and at short time intervals. The SACD application presents the information stored in the FALCAO database in a practical and optimized way. The implementation of the FALCAO database was as successful as expected, proving indispensable to the daily work of the researchers involved because of the substantial improvement in the processes and in their associated reliability. / Dissertação (Mestrado) / IPEN/D / Instituto de Pesquisas Energéticas e Nucleares - IPEN/CNEN-SP
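The abstract does not reproduce the FALCAO schema itself, so the sketch below only illustrates, with hypothetical table and column names and SQLite standing in for the real DBMS, what a small normalized layout for batch-loaded monitored variables of the kind described might look like.

```python
# Minimal sketch (hypothetical names, not the actual FALCAO schema): a normalized
# layout for batch-loading monitored process variables, using the standard-library
# sqlite3 module purely for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE variable (
        variable_id INTEGER PRIMARY KEY,
        tag         TEXT NOT NULL UNIQUE,   -- e.g. a sensor/channel identifier
        unit        TEXT NOT NULL
    );
    CREATE TABLE reading (
        reading_id  INTEGER PRIMARY KEY,
        variable_id INTEGER NOT NULL REFERENCES variable(variable_id),
        acquired_at TEXT    NOT NULL,       -- ISO-8601 timestamp
        value       REAL    NOT NULL
    );
    CREATE INDEX idx_reading_var_time ON reading(variable_id, acquired_at);
""")

conn.execute("INSERT INTO variable(tag, unit) VALUES (?, ?)", ("T-001", "degC"))

# Batch load: executemany keeps the periodic inserts in a single transaction,
# which matters when readings arrive in short-interval batches.
batch = [(1, "2014-01-01T00:00:00", 42.1), (1, "2014-01-01T00:00:10", 42.3)]
with conn:
    conn.executemany(
        "INSERT INTO reading(variable_id, acquired_at, value) VALUES (?, ?, ?)", batch
    )

for row in conn.execute(
    "SELECT v.tag, r.acquired_at, r.value FROM reading r JOIN variable v USING(variable_id)"
):
    print(row)
```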
95

Elaboração e validação de instrumentos para a coleta de dados na assistência de enfermagem / Development and validation of instruments for data collection in nursing care

Hermida, Patricia Madalena Vieira 02 March 2005 (has links)
Orientador: Izilda Esmenia Muglia Araujo / Dissertação (mestrado) - Universidade Estadual de Campinas, Faculdade de Ciências Médicas / Made available in DSpace on 2018-08-04T03:17:10Z (GMT). No. of bitstreams: 1 Hermida_PatriciaMadalenaVieira_M.pdf: 65723353 bytes, checksum: c593d4c8e6b2027b31d50a7f16d47a6d (MD5) Previous issue date: 2005 / Resumo: A Sistematização da Assistência de Enfermagem, realizada por meio do Processo de Enfermagem, tem sido ensinada nos cursos de graduação em enfermagem e utilizada em algumas instituições de saúde. É uma atividade privativa do enfermeiro, regulamentada pelo Conselho Federal de Enfermagem (Resolução COFEN n° 272/2002). Essa metodologia assistencial é composta por algumas fases, as quais diferem entre si, de acordo com vários autores. A fase da coleta de dados é descrita como a primeira delas, a partir da qual é possível fazer julgamentos, definindo os diagnósticos de enfermagem e, conseqüentemente, as intervenções de enfermagem para cada paciente. Este estudo teve como objetivos elaborar, validar e verificar a confiabilidade de dois instrumentos de coleta de dados para a assistência de enfermagem, guiado pelo referencial teórico de Dorothea Orem. Para a validação de conteúdo, os instrumentos foram encaminhados para apreciação de quatro juízes, assim como seus respectivos guias de apoio à decisão. O teste de concordância entre observadores foi realizado com uma amostra de 18 sujeitos internados num hospital universitário, pela pesquisadora e uma enfermeira convidada. A concordância entre os juízes foi analisada pelo Teste Qui-Quadrado de Cochran, que apontou diferença estatisticamente significativa no número de respostas quanto à objetividade do instrumento Entrevista de Enfermagem e em relação à organização, objetividade, clareza e facilidade de leitura do instrumento Exame Físico de Enfermagem. Entretanto, apesar dessa diferença, o número de respostas positivas foi maior, indicando que os instrumentos apresentam os quesitos pertinentes. A análise descritiva da concordância entre os juízes quanto ao guia de apoio à decisão do instrumento Entrevista de Enfermagem mostra que, dos cinco critérios avaliados, a objetividade obteve 50% de respostas negativas, e, no guia de apoio do instrumento Exame Físico de Enfermagem, esse mesmo percentual foi atribuído à objetividade e à clareza, dados que sugeriram a realização de algumas alterações apontadas pelos juízes para aperfeiçoá-los. A concordância entre observadores foi analisada pelo coeficiente de concordância Kappa, mostrando um total de 95,3% e 85,7% de coeficiente de concordância entre bom e excelente, para os instrumentos Entrevista e Exame Físico de Enfermagem, respectivamente. Os resultados demonstram que os instrumentos permitem coletar os dados dos pacientes de forma sistemática e voltados para o autocuidado, embasados no referencial teórico de Orem. Além disso, tais instrumentos apresentam validade e confiabilidade. Os registros dos dados coletados possibilitam a avaliação periódica do paciente, a continuidade do cuidado e contribuem para a melhoria da assistência de enfermagem / Abstract: The Systematization of Nursing Care using the Nursing Process has been taught in undergraduate nursing courses and is used in some health institutions. It is an activity exclusive to nurses, regulated by the Federal Nursing Council (Resolution COFEN n° 272/2002). This care methodology consists of several phases, which differ according to the authors.
The data collection phase is described as the first of these, since it allows judgments to be made, defining the nursing diagnoses and consequently the nursing interventions needed for each patient. The purpose of this study was to develop, validate and verify the reliability of two data collection instruments for nursing care, guided by Dorothea Orem's theoretical framework. For content validation, the instruments and their respective decision-support guides were referred to four judges for appraisal. The inter-observer agreement test was performed by the researcher and an invited nurse using a sample of 18 patients admitted to a university hospital. Agreement between the judges was analyzed using Cochran's chi-square test, which indicated a statistically significant difference in the number of answers regarding the objectivity of the Nursing Interview instrument and regarding the organization, objectivity, clarity and ease of reading of the Nursing Physical Examination instrument. Despite this difference, the number of positive answers was greater, indicating that the instruments present the pertinent criteria. The descriptive analysis of agreement between the judges on the decision-support guide of the Nursing Interview instrument showed that, of the five criteria evaluated, objectivity obtained 50% negative responses; for the decision-support guide of the Nursing Physical Examination instrument, the same percentage applied to objectivity and clarity, indicating the need for some of the changes suggested by the judges to improve the guides. Inter-observer agreement was analyzed with the Kappa coefficient, showing overall agreement of 95.3% and 85.7%, ranging from good to excellent, for the Nursing Interview and Nursing Physical Examination instruments, respectively. The results demonstrate that the instruments allow patient data to be collected systematically and with a focus on self-care, based on Orem's theoretical framework. Moreover, these instruments present validity and reliability. The records of the collected data make periodic patient assessment and continuity of care possible and contribute to improving nursing care / Mestrado / Enfermagem e Trabalho / Mestre em Enfermagem
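The item-level observations behind the reported Kappa values are not given in the abstract; purely as an illustration of the statistic used, the sketch below computes Cohen's kappa for two observers scoring the same 18 patients on a single binary checklist item, with made-up scores.

```python
# Illustration only (made-up data): Cohen's kappa for two observers scoring the
# same patients on a binary checklist item, as in an inter-observer agreement test.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Kappa = (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    chance = sum(counts_a[c] * counts_b[c] for c in categories) / (n * n)
    return (observed - chance) / (1 - chance)

# Hypothetical scores for 18 patients (1 = finding present, 0 = absent).
obs_a = [1, 1, 0, 1, 0, 1, 1, 1, 0, 0, 1, 1, 0, 1, 1, 0, 1, 1]
obs_b = [1, 1, 0, 1, 0, 1, 1, 0, 0, 0, 1, 1, 0, 1, 1, 0, 1, 1]
print(f"kappa = {cohens_kappa(obs_a, obs_b):.2f}")
```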
96

Development of a semantic data collection tool: The Wikidata Project as a step towards the semantic web

Ubah, Ifeanyichukwu January 2013 (has links)
The World Wide Web contains a vast amount of information. This makes it a very useful part of our everyday activities, but the information it holds forms an exponentially growing repository of semantically unstructured data. The semantic web movement involves the evolution of the existing World Wide Web to enable computers to make meaning of and understand the data they process, and consequently to increase their processing capabilities. Over the past decade a number of projects implementing semantic web technology have been developed, albeit still in their infancy. These projects are based on semantic data models, and one of them is the Wikidata project, which aims to provide a more semantic platform for editing and sharing data throughout the Wikipedia and Wikimedia communities. This thesis studies how the Wikidata project provides such a semantic platform for the Wikimedia communities and includes the development of an application utilizing the semantic capabilities of Wikidata. The objective is to develop an application capable of retrieving and presenting statistical data and of making missing or invalid data on Wikidata detectable. The result is an application currently aimed at researchers and students who need a convenient tool for statistical data collection and data mining projects. Usability and performance tests of the application were also conducted, with the results presented in the report. Keywords: semantic web, World Wide Web, semantic data model, Wikidata, data mining.
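The thesis application itself is not included here; as a minimal sketch of the kind of semantic retrieval described, the snippet below pulls labels and claims for one item from the public Wikidata API (the wbgetentities action) using only the Python standard library. The entity ID is an example, and the statement count is just a stand-in for whatever statistics the real tool would extract.

```python
# Sketch (not the thesis application): retrieving structured data for an item from
# the public Wikidata API using only the standard library.
import json
import urllib.parse
import urllib.request

API = "https://www.wikidata.org/w/api.php"

def get_entity(entity_id):
    """Fetch labels and claims for one Wikidata entity as a dict."""
    params = urllib.parse.urlencode({
        "action": "wbgetentities",
        "ids": entity_id,
        "props": "labels|claims",
        "languages": "en",
        "format": "json",
    })
    with urllib.request.urlopen(f"{API}?{params}") as resp:
        return json.load(resp)["entities"][entity_id]

entity = get_entity("Q1754")  # example item ID; assumed here to be Stockholm
label = entity["labels"]["en"]["value"]
n_statements = sum(len(v) for v in entity["claims"].values())
print(f"{label}: {n_statements} statements")
```

A real collection tool would iterate over many entity IDs and map specific property IDs to the statistics of interest, flagging items where an expected property is missing or malformed.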
97

Knowledge management in law firms in Botswana

Fombad, Madeleine C. 10 June 2010 (has links)
The literature reveals enormous potential of knowledge management for law firms, yet research in knowledge management seems fragmented, with extensive theoretical discussion but little empirical evidence. The aim of this study is to empirically determine guidelines and techniques for knowledge management in law firms in Botswana in the light of the rapidly changing legal environment. It identified the different categories of knowledge existing in law firms in Botswana and considered the factors that would motivate or inhibit the adoption of knowledge management. It also identified the tools and technologies for knowledge management and the agents and institutions necessary for knowledge management in law firms in the country. The study adopted a triangulation of qualitative and quantitative methods of data collection and analysis. Open- and closed-ended questionnaires and interview schedules were used to collect the qualitative and quantitative data that were analysed. A survey research design was adopted and a census of all the lawyers in the country was undertaken. Out of the 217 questionnaires distributed to the 115 registered firms, 140 completed questionnaires were returned, giving a return rate of 64.5%. From the study, it emerged that law firms in Botswana are significantly affected by changes in the legal environment. The adoption of formal knowledge management in law firms in Botswana is, however, still at an initial stage. Most of the law firms do not have knowledge management policies and guidelines, and there are still many challenges to the effective implementation of knowledge management. Nevertheless, it is clear that there is growing awareness of the key role, importance and potential of knowledge management in an increasingly competitive environment as a means of making law firms more innovative and cost effective. Guidelines for knowledge management in law firms were established, and several suggestions on how it can be successfully implemented were made, in the hope that this would not only improve the awareness and utilisation of knowledge management in the country but could also be adopted in other African countries whose legal environment is similar to that of Botswana. / Thesis (DPhil)--University of Pretoria, 2010. / Information Science / unrestricted
98

Bar code data collection system implementation and laboratory exercise

Librescu, Joseph 24 October 2009 (has links)
Bar coding is an alternative to manual data collection systems, and it provides the means to collect and report information quickly and accurately. Bar coding is widely used in manufacturing environments to track work-in-process, inventory, time and attendance, documents, capital assets, and product quality. In this project, an integrated data collection system using bar code technology is designed and installed in the Automatic Data Collection System Laboratory (ADCSL) at Virginia Tech. In addition, a series of lab exercises and technology demonstrations are developed to better introduce the technology into the Industrial and Systems Engineering curricula. This can provide engineers choosing this major the necessary background for making meaningful use of bar coding technology once they graduate. / Master of Science
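The ADCSL hardware and software are not detailed in the abstract; the sketch below only illustrates the common keyboard-wedge pattern such systems rely on, in which a scanner emits each decoded barcode as a line of text and the host appends a time-stamped record to a log. The file name and transaction label are hypothetical.

```python
# Minimal sketch of a keyboard-wedge style collection loop (not the ADCSL software):
# each scan arrives as one decoded line on standard input and is appended to a CSV
# log with a timestamp and an operator-chosen transaction type.
import csv
import sys
from datetime import datetime, timezone

LOG_FILE = "scan_log.csv"          # hypothetical log location
TRANSACTION = "work_in_process"    # e.g. inventory, time_and_attendance, ...

def log_scan(code, writer):
    """Append one scan event with a UTC timestamp."""
    writer.writerow([datetime.now(timezone.utc).isoformat(), TRANSACTION, code])

def main():
    with open(LOG_FILE, "a", newline="") as f:
        writer = csv.writer(f)
        print("Scan items (Ctrl-D to stop):")
        for line in sys.stdin:
            code = line.strip()
            if code:
                log_scan(code, writer)
                print(f"logged {code}")

if __name__ == "__main__":
    main()
```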
99

Effects of a Prototypical Training Program on the Implementation of Systematic Observational Data Collection on IEP Objectives for the Core Deficits of Autism Spectrum Disorders

Harkins, Jessica L. 05 1900 (has links)
Legal mandates and best practice recommendations for the education of students with autism spectrum disorders (ASD) emphasize the importance of systematic, ongoing observational data collection in order to monitor progress and demonstrate accountability. The absence of such documentation in decision-making on instructional objectives indicates a weakness in bridging the research-to-practice gap in special education. Utilizing a multiple baseline design across participants, the current study evaluated the effects of a prototypical teacher training program (i.e., workshop, checklist, in-classroom training with feedback, and maintenance with a thinned schedule of feedback) on the frequency of data collection on core deficits of ASD and the use of data-based decision-making. Results indicate increases in the daily mean frequency of data collection following intervention. Maintenance and generalization data indicate variable responding across participants. Effect size (Cohen's d) indicates a large, clinically significant effect of the training program. Results are discussed in relation to training models, maintenance, and future research.
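The raw observation counts behind the reported effect size are not given; as an illustration of the statistic named in the abstract, the sketch below computes Cohen's d with a pooled standard deviation from made-up baseline and post-training daily data-collection counts.

```python
# Illustration only (made-up data): Cohen's d with a pooled standard deviation for
# daily data-collection frequency before and after a training package.
from statistics import mean, stdev

def cohens_d(baseline, intervention):
    """(mean_post - mean_pre) / pooled SD; ~0.8 is the conventional large-effect threshold."""
    n1, n2 = len(baseline), len(intervention)
    s1, s2 = stdev(baseline), stdev(intervention)
    pooled = (((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)) ** 0.5
    return (mean(intervention) - mean(baseline)) / pooled

baseline = [0, 1, 0, 0, 2, 1, 0]   # hypothetical daily counts before training
post = [4, 6, 5, 7, 5, 6, 4]       # hypothetical daily counts after training
print(f"d = {cohens_d(baseline, post):.2f}")
```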
100

Development of a high altitude balloon payload data collection, telemetry, and recovery system

King, Nathan Michael 01 May 2010 (has links)
High altitude balloons are an effective, inexpensive and readily available means of conducting near-space and low Reynolds number experimentation. Experiments are being developed that will use high altitude balloons as carriers for near-space and low Reynolds number test vehicles. The first step in developing this capability is to create a system that is able to log collected data and track and control a high altitude balloon payload. It is also beneficial that this system be flexible enough to accept different sensor types, communication methods, and connection and release linkages. By combining the flexibility of microcontroller-based circuitry with the availability of commercial off-the-shelf products, an economical design solution to this problem has been achieved. Analysis of this system has been performed, and the design has been fabricated, tested and specially modified to withstand the extreme conditions of high altitude flight.
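The flight hardware and radio link are not specified beyond microcontroller-based circuitry and off-the-shelf components; the sketch below, with simulated sensor readings and an invented frame format, only illustrates the generic log-locally-and-downlink loop such a payload controller implements.

```python
# Sketch only (simulated sensors, hypothetical frame format): the basic loop of a
# balloon payload controller that logs readings locally and emits compact
# telemetry frames for downlink.
import random
import time

def read_sensors():
    """Stand-in for real ADC/GPS reads; returns altitude (m), temp (C), pressure (hPa)."""
    return {
        "alt_m": random.uniform(0, 30_000),
        "temp_c": random.uniform(-60, 25),
        "press_hpa": random.uniform(10, 1013),
    }

def make_frame(seq, sample):
    """Build a simple checksummed, comma-separated telemetry frame."""
    body = f"{seq},{sample['alt_m']:.0f},{sample['temp_c']:.1f},{sample['press_hpa']:.1f}"
    checksum = sum(body.encode()) % 256
    return f"$PAYLD,{body}*{checksum:02X}"

def main(cycles=3, period_s=1.0):
    with open("flight_log.csv", "a") as log:
        for seq in range(cycles):
            sample = read_sensors()
            log.write(f"{time.time():.0f},{sample['alt_m']:.0f},"
                      f"{sample['temp_c']:.1f},{sample['press_hpa']:.1f}\n")
            print(make_frame(seq, sample))  # would go to the radio/modem instead
            time.sleep(period_s)

if __name__ == "__main__":
    main()
```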
