  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
161

Tendência da produção científica em Comunicação no Brasil / Trends in scientific production in Communication in Brazil

Roberto Farias Silva 16 July 2004 (has links)
This dissertation aims to measure the scientific output of doctorate holders in Communication at higher education institutions in Brazil between 1990 and 2000. To that end, we begin with a survey of the doctors trained in graduate programs in Communication, Information Science, and Multimedia, in order to analyze their output and professional activity both during their training and in the period after obtaining the degree. As the instrument for accessing these doctors' professional CVs, we used the Lattes Curriculum Platform (CV-Lattes) of the Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq), a foundation linked to the Ministério da Ciência e Tecnologia (MCT) that supports Brazilian research. From this we obtain the characteristics of the doctoral theses; the nature of the main professional activities; the dynamics of the scientist's training; the characteristics of the communication knowledge disseminated; and weighted measures of the scientific and academic output of doctoral candidates and doctors in Communication.
162

Modelagem de ferramentas avaliativas de profissionais atuantes em pesquisa científica / Modeling evaluation tools for professionals working in scientific research

GIMENES, CELSO H. 12 November 2015 (has links)
Various methodologies seek to understand and measure individuals' performance within an organization by comparing expected behavior with the behavior those individuals actually display. This work develops a method that integrates objective performance evaluation, motivational factors, the strategic planning of R&D institutions, and the CNPq evaluation criteria. Starting from a literature review, practical applications, and academic productivity indicators, the proposed model was derived, with which we hope to contribute to the improvement of R&D institutions. The model is flexible: it can evaluate academic output in service, teaching, production, and development, and can be applied or adapted in any field of knowledge. Based on a quantitative and qualitative analysis of the data obtained, we conclude that the multidimensional performance evaluation methodology can, by means of indicators, objectively evaluate the academic output of researchers and scientific technologists. / Thesis (Doctorate in Nuclear Technology) / IPEN/T / Instituto de Pesquisas Energeticas e Nucleares - IPEN-CNEN/SP
163

Percurso e controvérsias: o jornalismo científico para além das páginas de educação do Jornal A Crítica / Trajectory and controversies: science journalism beyond the Education pages of the newspaper A Crítica

Figueiredo, Ana Celia Ossame de 21 December 2015 (has links)
CAPES - Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / This study investigates the production of science journalism (JC) stories on the Education page of the newspaper A CRÍTICA, with the aim of tracing the process from the selection of information as news, through the preparation of the story agenda, the interviews, and the writing of the text, up to publication, in order to identify the controversies arising from the final product, the report.
With the support of Bruno Latour's work and Actor-Network Theory (ANT), the study identifies the networks of human and non-human actors constituted, woven, or mobilized to produce the science-journalism text. Latour's framework is taken as the best alternative for showing the construction of a network involving the production of science. The network identified here includes the interviewee and his or her interest in disclosing the information; the press office that offers the story idea; the page editor's decision to accept it; and the reporter who conducts the interviews and writes the final text to be edited, laid out graphically, and published. The study covers 144 issues of the Education page published on Fridays in 2012, 2013, and 2014; re-readings of the selected material; and rounds of conversations with the original interviewees to learn their perception of the published reports, the stage at which controversies emerge. In this way, we reveal the existence of the so-called "black box" of journalistic facts, which offers a new way of understanding how the facts that became articles published on that page are produced.
164

The estimation and presentation of standard errors in a survey report

Swanepoel, Rene 26 May 2006 (has links)
The vast number of different study variables or population characteristics and the different domains of interest in a survey make it impractical and almost impossible to calculate and publish standard errors for each estimated value of a population variable or characteristic and each domain individually. Since estimated values are subject to statistical variation (resulting from the probability sampling), standard errors may not be omitted from the survey report. Estimated values can be evaluated only if their precision is known. The purpose of this research project is to study the feasibility of mathematical modeling to estimate the standard errors of estimated values of population parameters or characteristics in survey data sets, and to investigate effective and user-friendly methods of presenting these models in reports. The following data sets were used in the investigation:

• October Household Survey (OHS) 1995 - Workers and Household data set
• OHS 1996 - Workers and Household data set
• OHS 1997 - Workers and Household data set
• Victims of Crime Survey (VOC) 1998

The basic methodology consists of the estimation of standard errors of the statistics considered in the survey for a variety of domains (such as the whole country, provinces, urban/rural areas, population groups, gender and age groups, as well as combinations of these). This is done by means of a computer program that takes into account the complexity of the different sample designs. The directly calculated standard errors were obtained in this way. Different models are then fitted to the data by means of regression modeling in the search for a suitable standard error model. A function of the directly calculated standard error value served as the dependent variable, and a function of the size of the statistic served as the independent variable.
A linear model, equating the natural logarithm of the coefficient of relative variation of a statistic to a linear function of the natural logarithm of the size of the statistic, gave an adequate fit in most of the cases. Well-known tests for the occurrence of outliers were applied in the model fitting procedure. For each observation indicated as an outlier, it was established whether the observation could be deleted legitimately (e.g. when the domain sample size was too small, or the estimate biased). Afterwards the fitting procedure was repeated. The Australian Bureau of Statistics also uses the above model in similar surveys. They derived this model especially for variables that count people in a specific category. It was found that this model performs equally well when the variable of interest counts households or incidents as in the case of the VOC. The set of domains considered in the fitting procedure included segregated classes, mixed classes and cross-classes. Thus, the model can be used irrespective of the type of subclass domain. This result makes it possible to approximate standard errors for any type of domain with the same model. The fitted model, as a mathematical formula, is not a user-friendly presentation method of the precision of estimates. Consequently, user-friendly and effective presentation methods of standard errors are summarized in this report. The suitability of a specific presentation method, however, depends on the extent of the survey and the number of study variables involved. / Dissertation (MSc (Mathematical Statistics))--University of Pretoria, 2007. / Mathematics and Applied Mathematics / unrestricted
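The log-linear model described above can be sketched numerically. The (estimate, standard error) pairs below are invented for illustration, not taken from the OHS or VOC data sets; a minimal least-squares fit of log(CV) against log(size) then yields an approximating function for the standard error of any estimate:

```python
import numpy as np

# Invented (estimate, directly calculated SE) pairs for illustration only.
estimates = np.array([500.0, 2_000.0, 10_000.0, 50_000.0, 250_000.0])
direct_se = np.array([110.0, 260.0, 700.0, 1_900.0, 5_200.0])

log_cv = np.log(direct_se / estimates)   # dependent: ln of relative variation
log_size = np.log(estimates)             # independent: ln of statistic's size

# Ordinary least squares fit of log(CV) = a + b * log(size);
# polyfit returns the slope first, then the intercept.
b, a = np.polyfit(log_size, log_cv, 1)

def modelled_se(estimate):
    """Approximate the standard error of any estimate from the fitted model."""
    return estimate * np.exp(a + b * np.log(estimate))
```

Because relative variation shrinks as estimates grow, the fitted slope is negative; the same fitted curve can then be published once per survey instead of one standard error per cell.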
165

An action spectrum apparatus

Brooks, Donald Elliott January 1967 (has links)
An instrument is described which is capable of measuring the action spectrum of the removal of CO inhibition of respiration by light. In the method employed here, a cell suspension in a CO-O₂ atmosphere is alternately exposed to two wavelengths of light. Their photochemical effects are balanced using an O₂ electrode as the null detector. The light intensities at the balance points from a series of wavelength pairs are used to determine the ratios of the extinction coefficients of the CO-oxidase complex, at the various wavelengths, to the extinction coefficient at a standard wavelength. An action spectrum for baker's yeast is shown. / Science, Faculty of / Physics and Astronomy, Department of / Graduate
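The balance-point calculation described above reduces to a simple ratio. Assuming, as the null method implies, that the photochemical effect is proportional to intensity times extinction coefficient, equal effects at balance give ε_test/ε_ref = I_ref/I_test; the intensity values below are invented for illustration:

```python
# At the balance point the two wavelengths produce equal photochemical effect:
#     I_test * eps_test = I_ref * eps_ref
# so the extinction coefficient ratio follows from the measured intensities:
#     eps_test / eps_ref = I_ref / I_test

def extinction_ratio(i_ref, i_test):
    """Ratio of the extinction coefficient at the test wavelength to the
    coefficient at the standard wavelength, from balance-point intensities."""
    return i_ref / i_test

# Hypothetical example: balancing a test wavelength against the standard
# required intensities of 2.0 and 5.0 arbitrary units respectively.
ratio = extinction_ratio(i_ref=5.0, i_test=2.0)
```

Repeating this over a series of wavelength pairs, each ratioed to the same standard wavelength, traces out the relative action spectrum.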
166

Automating Laboratory Operations by Integrating Laboratory Information Management Systems (LIMS) with Analytical Instruments and a Scientific Data Management System (SDMS)

Zhu, Jianyong 06 1900 (has links)
Submitted to the faculty of the University Graduate School in partial fulfillment of the requirements for the degree Master of Science in the School of Informatics, Indiana University, June 2005 / The large volume of data generated by commercial and research laboratories, along with requirements mandated by regulatory agencies, has forced companies to use laboratory information management systems (LIMS) to improve efficiency in tracking and managing samples and precisely reporting test results. However, most general-purpose LIMS do not provide an interface to automatically collect data from analytical instruments and store it in a database. A scientific data management system (SDMS) provides a "Print-to-Database" technology, which captures reports generated by instruments directly into the SDMS database as Windows enhanced metafiles, thus minimizing data entry errors. Unfortunately, an SDMS does not allow further analysis to be performed. Many LIMS vendors provide plug-ins for single instruments, but none of them provides a general-purpose interface to extract the data from an SDMS and store it in a LIMS. In this project, a general-purpose middle layer named LabTechie is designed, built, and tested for seamless integration between instruments, SDMS, and LIMS. This project was conducted at American Institute of Technology (AIT) Laboratories, an analytical laboratory that specializes in trace chemical measurement of biological fluids. Data is generated by 20 analytical instruments, including gas chromatography/mass spectrometry (GC/MS), high-performance liquid chromatography (HPLC), and liquid chromatography/mass spectrometry (LC/MS), and is currently stored in NuGenesis SDMS (Waters, Milford, MA). This approach can be easily expanded to include additional instruments.
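The kind of middle layer described above can be sketched as follows. This is a hypothetical illustration, not the LabTechie implementation: the report format, the regular expression, the table schema, and the analyte name are all invented, and an in-memory SQLite database stands in for a real LIMS:

```python
import re
import sqlite3

# Invented plain-text report line format, standing in for a report exported
# from an SDMS; a real integration would parse the vendor's actual format.
REPORT_LINE = re.compile(
    r"sample=(?P<sample>\S+)\s+analyte=(?P<analyte>\S+)\s+value=(?P<value>[\d.]+)"
)

def load_report_into_lims(report_text, conn):
    """Parse each matching report line and insert it into the LIMS table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS results (sample TEXT, analyte TEXT, value REAL)"
    )
    for line in report_text.splitlines():
        m = REPORT_LINE.search(line)
        if m:
            conn.execute(
                "INSERT INTO results VALUES (?, ?, ?)",
                (m["sample"], m["analyte"], float(m["value"])),
            )
    conn.commit()

conn = sqlite3.connect(":memory:")
load_report_into_lims("sample=S1 analyte=THC value=0.35", conn)
```

Keeping the parsing in one middle layer is what makes the design general purpose: supporting a new instrument means adding a parser, not modifying the LIMS.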
167

Is It a Small World after all: An Examination of Scientific Collaborations in Public Administration

Orr, James Earl, Jr 15 December 2012 (has links)
Peer-reviewed journal articles are one way in which scholars communicate with each other and the public. Such publications create networks of collaboration. This study uses social network analysis techniques and theory to examine the network of collaborations that occur in public administration. Social network analysis is a perspective that takes into account the structure of relationships that can exist among individuals, organizations, or other entities (Wellman, 2008). The small world theory is the specific theoretical framework that guides this study. It is based on the notion that even in a very large population, individuals are still connected with each other within a few steps. The author constructs a scientific network of research collaborations by assigning a relationship to two actors who have co-published an article in the Public Administration Review, the American Review of Public Administration, or The Review of Public Personnel Administration between January 2003 and December 2011. The results of this analysis reveal that the public administration network consists primarily of faculty members. The network also exhibits a high degree of clustering and several cliques. On average, individuals in the network are only slightly farther apart from each other than would be expected in a small world network. This research contributes to public administration by introducing scientific networks of collaboration to the field. The field has not ignored who publishes in its journals, but it has not used network analysis techniques to examine such publications. This study demonstrates how network analysis techniques and methodology can be used to examine a large network. Finally, this research contributes to the small world theory by applying it to scientific networks in public administration.
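The co-authorship construction described above can be illustrated with a toy network. All author names below are invented; a real analysis would use the full 2003-2011 author lists from the three journals. The sketch builds the graph and computes two standard small-world indicators, clustering and shortest path length:

```python
from itertools import combinations
from collections import deque, defaultdict

# Invented author lists, one per article; every co-author pair gets a tie.
articles = [
    ["Adams", "Baker"],
    ["Baker", "Cruz", "Diaz"],
    ["Cruz", "Evans"],
    ["Diaz", "Evans"],
]

adj = defaultdict(set)
for authors in articles:
    for a, b in combinations(authors, 2):
        adj[a].add(b)
        adj[b].add(a)

def clustering(node):
    """Fraction of a node's co-author pairs that have also co-published."""
    nbrs = adj[node]
    if len(nbrs) < 2:
        return 0.0
    links = sum(1 for u, v in combinations(nbrs, 2) if v in adj[u])
    return 2 * links / (len(nbrs) * (len(nbrs) - 1))

def shortest_path(src, dst):
    """Breadth-first search distance (number of steps) between two authors."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, d = queue.popleft()
        if node == dst:
            return d
        for nbr in adj[node]:
            if nbr not in seen:
                seen.add(nbr)
                queue.append((nbr, d + 1))
    return None  # disconnected

avg_clustering = sum(clustering(n) for n in adj) / len(adj)
```

High average clustering combined with short average path lengths is exactly the signature the small world framework looks for in a large collaboration network.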
168

Using Virtual Environments to Visualize Atmospheric Data: Can It Improve a Meteorologist's Potential to Analyze the Information?

Ziegeler, Sean Bernard 11 May 2002 (has links)
Conventional analysis of atmospheric data relies on three-dimensional desktop-computer displays. One disadvantage is that they can reduce the ability to zoom in and see small-scale features while concurrently viewing other faraway features. This research aims to determine whether using virtual environments to examine atmospheric data can improve a meteorologist's ability to analyze the given information. In addition to possibly enhancing small-scale analysis, virtual environment technology offers an array of possible improvements. Presented is the theory behind designing an experiment to establish the extent to which virtual environments assist meteorologists in analysis, followed by the details of an implementation of such an experiment. Based on the quantitative results obtained, the conclusion is that immersion can significantly increase the accuracy of a meteorologist's analysis of an atmospheric data set.
169

ProLAS: a Novel Dynamic Load Balancing Library for Advanced Scientific Computing

Krishnan, Manoj Kumar 13 December 2003 (has links)
Scientific and engineering problems are often large, complex, irregular and data-parallel. The performance of many parallel applications is affected by factors such as irregular nature of the problem, the difference in processor characteristics and runtime loads, the non-uniform distribution of data, and the unpredictable system behavior. These factors give rise to load imbalance. In general, in order to achieve high performance, dynamic load balancing strategies are embedded into solution algorithms. Over time, a number of dynamic load balancing algorithms have been implemented into software tools and successfully used in scientific applications. However, most of these dynamic load balancing tools use an iterative static approach that does not address irregularities during the application execution, and the scheduling overhead incurred is high. During the last decade, a number of dynamic loop scheduling strategies have been proposed to address causes of load imbalance in scientific applications running in parallel and distributed environments. However, there is no single strategy that works well for all scientific applications, and it is up to the user to select the best strategy and integrate it into the application. In most applications using dynamic load balancing, the load balancing algorithm is directly embedded in the application, with close coupling between the data structures of the application and the load balancing algorithm. This typical approach leads to two disadvantages. First, the integration of each newly developed load balancing algorithm into the application needs to be performed from scratch. Second, it is unlikely that the user has incorporated the optimal load balancing algorithm into the application. Moreover, in a certain application (of various problem sizes and number of processors), it is difficult to assess in advance the advantage of incorporating one load balancing algorithm versus another. 
To overcome these drawbacks, there is a need for an application programming interface (API) for dynamically load balancing scientific applications using the recently developed dynamic loop scheduling algorithms. This thesis describes the design and development of such an API, called ProLAS, which is scalable and independent of the data structures of a host application. ProLAS performance is evaluated theoretically and experimentally (after being used in scientific applications). A qualitative and quantitative analysis of ProLAS is presented by comparing its performance with the state-of-the-art technology in dynamic load balancing tools (e.g. the CHARM++ library) for parallel applications. The analysis of the experimental results of using ProLAS in a few scientific applications indicates that it consistently outperforms the existing technology in dynamic load balancing.
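As one concrete example of the dynamic loop scheduling strategies this line of work builds on (the abstract does not list ProLAS's own algorithms, so this is illustrative only), guided self-scheduling hands each idle processor ceil(R/P) of the R remaining iterations, so chunks shrink as the loop drains and late-stage imbalance is smoothed out:

```python
import math

def guided_chunks(total_iterations, num_procs):
    """Chunk sizes produced by guided self-scheduling: each request takes
    ceil(remaining / num_procs) iterations until none remain."""
    chunks, remaining = [], total_iterations
    while remaining > 0:
        chunk = math.ceil(remaining / num_procs)
        chunks.append(chunk)
        remaining -= chunk
    return chunks

# Large chunks early keep scheduling overhead low; size-1 chunks at the
# end let fast processors absorb the stragglers' leftover work.
sizes = guided_chunks(100, 4)
```

A load balancing library decouples this policy from the application precisely so a different rule (e.g. factoring or adaptive weighted factoring) can be swapped in without touching application data structures.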
170

Assessing impact of instruction treatments on positive test selection in hypothesis testing

Carruth, Daniel Wade 09 August 2008 (has links)
The role of factors previously implicated as leading to confirmation bias during hypothesis testing was explored. Confirmation bias is a phenomenon in which people select cases for testing whose expected results are more likely to support their current belief than to falsify it. Klayman (1995) proposed three primary determinants for confirmation bias. Klayman and his colleagues proposed that a general positive testing strategy leads to the phenomenon of confirmation bias. According to Klayman's account, participants in previous research were not actively working to support their hypothesis. Rather, they were applying a valid hypothesis testing strategy that works well outside of laboratory tasks. In laboratory tasks, such as Wason's 2-4-6 task (Wason, 1960), the strategy failed because the nature of the task exploits particular flaws in the positive testing behavior participants learned through their real-world experience. Given Klayman's proposed set of determinants for the positive testing strategy phenomenon, treatments were developed that would directly violate the assumptions supporting application of the positive testing strategy. If participants were able to identify and act on these violations of the assumptions, the number of positive tests was expected to be reduced. The test selection portion of the Mynatt, Doherty, and Tweney (1977) microworld experiment was modified with additional instruction conditions and a new scenario description to investigate the impact of the treatments on reducing confirmation bias in test selection. Despite expectations, the thematic content modifications and determinant-targeting instruction conditions had no effect on participants' positive test selection.
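The failure mode of the positive testing strategy can be made concrete with a small sketch of Wason's 2-4-6 task (the triples below are invented here, not taken from the thesis materials). The experimenter's true rule is "any ascending triple"; a participant who believes the rule is "increases by 2" and tests only triples the hypothesis classifies as positive never encounters a falsifying case:

```python
def true_rule(triple):
    """The experimenter's actual rule: any strictly ascending triple."""
    a, b, c = triple
    return a < b < c

def hypothesis(triple):
    """The participant's narrower belief: each number increases by 2."""
    a, b, c = triple
    return b - a == 2 and c - b == 2

# Positive tests: cases the hypothesis predicts will fit the rule.
positive_tests = [(4, 6, 8), (10, 12, 14), (1, 3, 5)]
# A negative test: a case the hypothesis predicts will NOT fit the rule.
negative_test = (1, 2, 4)

# Every positive test agrees with the true rule, so the wrong hypothesis
# survives; only the negative test exposes the mismatch.
confirmations = all(true_rule(t) == hypothesis(t) for t in positive_tests)
mismatch = true_rule(negative_test) != hypothesis(negative_test)
```

Outside the laboratory, where hypotheses are rarely strict subsets of the true rule, the same positive testing habit performs well, which is Klayman's point about why the bias persists.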
