101

Padrões espaciais e variáveis climatológicas associados à dengue no município de Ribeirão Preto entre 2001 e 2010 / Spatial patterns and climatological variables associated with dengue in the city of Ribeirão Preto between 2001 and 2010

Gustavo Bussi Caminiti 03 July 2015
Dengue is the most important arbovirus affecting humans and constitutes a serious public health problem. In Brazil, dengue is present in all 27 states of the Federation, distributed across 3,794 municipalities, and is responsible for about 60% of the notifications in the Americas. One of the municipalities in the State of São Paulo with one of the highest numbers of confirmed cases is Ribeirão Preto. This study aimed to characterize the spatial pattern of dengue cases and to associate them with climatological variables in the municipality of Ribeirão Preto between 2001 and 2010. The research used a hybrid, ecological and temporal-trend design. The population comprised the confirmed dengue cases among residents. The data were collected from the National Disease Notification System, the Brazilian Institute of Geography and Statistics, DATASUS and the Instituto Agronômico de Campinas. Geocoding was performed using the addressing method. The spatial pattern of the geocoded case points was described using kernel density estimation and analyzed with the nearest-neighbor method. To calculate the bivariate linear correlation between new dengue cases and the climatological variables, Spearman and Pearson correlation coefficients were used, following the time-lag concept.
Approval for the project was obtained from the Research Ethics Committee of the University of São Paulo at Ribeirão Preto College of Nursing. The thematic maps, complemented by the neighborhood analysis, showed a concentration of cases in regions of the municipality with high population density, horizontal housing and lower socioeconomic conditions, with a visible accumulation of improperly stored recyclable materials favoring the formation of water deposits. The time-lag analysis showed that climatic conditions were positively related to dengue transmission two to four months after their occurrence. The results presented here can be used as a strategy for planning the actions of public bodies aimed at improving population health.
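The lagged correlation described above is straightforward to reproduce. Below is a minimal Python sketch using entirely hypothetical monthly series in place of the Ribeirão Preto case counts and climate records; only the lagged Spearman correlation itself reflects the method named in the abstract.

```python
import numpy as np
from scipy.stats import spearmanr

def lagged_correlation(cases, climate, max_lag=6):
    """Spearman correlation between monthly case counts and a climate
    series shifted by 0..max_lag months (the time-lag concept)."""
    results = {}
    for lag in range(max_lag + 1):
        # climate at month t-lag paired with cases at month t
        x = climate[:len(climate) - lag] if lag else climate
        y = cases[lag:]
        rho, p = spearmanr(x, y)
        results[lag] = (rho, p)
    return results

# Hypothetical monthly series (120 months, standing in for 2001-2010)
rng = np.random.default_rng(0)
rainfall = rng.gamma(2.0, 50.0, size=120)                     # mm/month
cases = np.roll(rainfall, 3) * 0.4 + rng.normal(0, 10, 120)   # ~3-month lag built in

for lag, (rho, p) in lagged_correlation(cases, rainfall).items():
    print(f"lag {lag} months: rho={rho:.2f}, p={p:.3f}")
```

Because the synthetic data embed a roughly three-month delay, the correlation peaks near lag 3, mirroring the two-to-four-month window reported in the abstract.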
102

Multi-scale analysis of the energy performance of supermarkets

Spyrou, Maria S. January 2015
The retail sector accounts for more than 3% of the total electricity consumption in the UK and approximately 1% of total UK CO2 emissions. The overarching aim of this project was to understand the energy consumption of the Tesco estate (the market leader), identify best practice, and find ways to identify opportunities for energy reduction. The literature review covered energy consumption in the retail sector and reviewed benchmarks for this type of building from the UK, Europe and the US. Related data analysis techniques used in industry or presented in the literature were also reviewed. This revealed that many different analysis and forecasting techniques are available, and that they fall into two categories: techniques that require past energy consumption data in order to forecast future consumption, such as statistical regression, and techniques that can estimate the energy consumption of a building from its specific characteristics, such as thermal simulation models. The latter are usually applied to new buildings, but they could also be used in benchmarking exercises to produce best-practice guides. Gaps in industry knowledge were identified, and it was suggested that better analytical tools would enable the industry to create more accurate energy budgets for the year ahead, leading to better operating margins. Benchmarks for the organisation's buildings were calculated. Retail buildings in the Tesco estate were found to have electrical intensity values between 230 kWh/m2 and 2000 kWh/m2 per year. Still, the average electrical intensity of these buildings in 2010-11 was found to be less than the calculated UK average for the 2006-07 period. The effect of weather on gas and electricity consumption was investigated and found to be significant (p < 0.001). There was a day-of-the-week effect, but it was found to be related mainly to the sales volume on those days. Sales volume was used as a proxy for the number of customers walking through the stores. The construction date of the building was also considered an interesting factor, as building regulations changed significantly over the years and the sponsor did not usually carry out any fabric work when refurbishing stores. User behaviour was also identified as an important factor needing further investigation, relating both to how staff perceive and manage the energy consumption of their work environment and to how customers use the refrigeration equipment. Following a statistical analysis, significant factors were determined and used to create multiple linear regression models for electricity and gas demands in hypermarkets. Significant factors included the sales floor area of the store, the stock composition, and a factor representing the thermo-physical characteristics of the envelope. Two key findings are the statistical significance of operational usage factors, represented by volume of sales, on annual electricity demand, and the absence of any statistically significant operational or weather-related factors on annual gas demand. The results suggest that by knowing as few as four characteristics of a food retail store (size of sales area, sales volume, product mix, year of construction) one can confidently calculate its annual electricity demand (R2=0.75, p < 0.001).
Similarly, by knowing the size of the sales area, product mix, ceiling height and number of floors, one can calculate the annual gas demand (R2=0.5, p < 0.001). Using the models created, along with the actual energy consumption of stores, stores that are not as energy efficient as expected can be isolated and investigated further to understand the reason for their poor energy performance. Refrigeration data from 10 stores were investigated, including the electricity consumption of the pack, outside air temperature, discharge and suction pressure, and the percentage of refrigerant gas in the receiver. Data mining methods (regression and Fourier transforms) were employed to remove known operational patterns (e.g. defrost cycles) and seasonal variations. Events that affected the electricity consumption of the system were highlighted, and faults that had been identified by the existing methodology were filtered out. The resulting dataset was then analysed further to understand the events that increase the electricity demand of the systems, in order to create an automatic identification method. The cases analysed demonstrated that the method presented could form part of a more advanced automatic fault detection solution; potential faults were difficult to identify in the original electricity dataset. However, treating the data with the method designed in this work made it simpler to identify potential faults and isolate probable causes. It was also shown that by monitoring the suction pressure of the packs, alongside the compressor run-times, one could identify further opportunities for reducing electricity consumption.
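The four-characteristic electricity model lends itself to a compact illustration. The Python sketch below fits a multiple linear regression on synthetic store data; the variable ranges and coefficients are hypothetical and are not taken from the thesis.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Hypothetical store data: sales floor area (m2), sales volume (index),
# product mix (share of food), and year of construction
rng = np.random.default_rng(1)
n = 200
X = np.column_stack([
    rng.uniform(500, 10000, n),     # sales floor area
    rng.uniform(0.5, 2.0, n),       # sales volume proxy
    rng.uniform(0.3, 0.9, n),       # product mix (food share)
    rng.integers(1970, 2012, n),    # year of construction
])
# Synthetic annual electricity demand (kWh/year) with noise
y = (400 * X[:, 0] + 2e5 * X[:, 1] + 3e5 * X[:, 2]
     - 1e3 * (X[:, 3] - 1970) + rng.normal(0, 2e5, n))

model = LinearRegression().fit(X, y)
print("R^2 =", round(r2_score(y, model.predict(X)), 2))
print("coefficients:", model.coef_)
```

In the thesis workflow, residuals from such a model (predicted minus actual consumption) would flag the under-performing stores worth investigating further.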
103

WHY WE SING ALONG: MEASURABLE TRAITS OF SUCCESSFUL CONGREGATIONAL SONGS

Read, Daniel 01 January 2017
Songwriters have been creating music for the church for hundreds of years. The songs have gone through many stylistic changes from generation to generation, yet each song has generated congregational participation. What measurable, traceable qualities of congregational songs persist from one generation to the next? This document explores the history and development of Congregational Christian Song (CCS) to discover and document the similarities between seemingly contrasting styles of music. The songs analyzed in this study were chosen because of their wide popularity and broad dissemination among non-denominational churches in the United States. While not an exhaustive study, this paper reviews over 200 songs spanning 300 years of CCS. The study finds that songs proven to be successful in eliciting participation all contain five common elements. These elements encourage congregations to participate in singing when an anticipation cue is triggered and then realized. The anticipation/reward theory used in this study is based on David Huron's ITPRA (Imagination-Tension-Prediction-Reaction-Appraisal) Theory of Expectation. This thesis is designed to help songwriters and music theorists quickly identify whether a CCS can be measured as successful (i.e., predictable).
104

Posouzení eliminace léčiv při úpravě pitné vody umělou infiltrací / Assessment of drugs elimination in the treatment of drinking water by artificial recharge

Chupík, Jan January 2019
The Káraný waterworks supplies approximately one third of Prague's total drinking water consumption. It uses two main production processes: artificial infiltration and bank infiltration. Two-year monitoring of 90 drugs and metabolites evaluated the occurrence of these substances in the Jizera River and in both production processes. The results point to a systematic occurrence of drugs in the Jizera River downstream of Mladá Boleslav, in concentrations ranging from tens to hundreds of ng/l (Acesulfame and Oxypurinol). Artificial infiltration failed to remove the following substances from the water: Primidone, Sulfamethoxazole, Carbamazepine, Lamotrigine, Ibuprofen, Gabapentin, Acesulfame and Oxypurinol. Only four substances (Ibuprofen, Caffeine, Oxypurinol and Acesulfame) were found in the monitoring results from bank infiltration, which makes bank infiltration the more effective method of drug elimination. Keywords: drugs, drinking water, statistical analysis, monitoring
105

Characterization and Correlation Analysis of Pharmaceutical Gelatin

Felix, Pascal Georges 18 November 2003
The properties of the aged gel and the resulting softgels were examined using mechanical and chemical testing methods. Our hypothesis was that negligible variation would exist between aged gels of the same type, with the greater difference expected between the two gel types, described as 150 Bloom (alkaline-treated collagen) and 195 Bloom (acid-treated collagen). The types of gelatin used were acid-processed (195 Acid Bone) and alkaline-processed (150 Lime Bone). Because of the differences resulting from their manufacturing sequence (namely their molecular weights), it follows that physical attributes will further contribute to their distinction. In addition to observing different characteristics between the gel types, we aged the gelatin and produced softgel capsules to qualify and quantify the changes that occur as a function of time. Two production lots of over 1 million softgel capsules were run to produce a population that lends itself to statistical analysis. Softgel capsules were manufactured with gelatin aged at intervals of 0-8 hrs, 32-40 hrs, 66-72 hrs and 88-96 hrs. This strategy was applied to both the acid- and alkaline-treated gelatin, for a total of eight lots (4 acid and 4 alkaline). One hundred thousand softgels were manufactured per lot for the acid-processed gelatin, and one hundred and fifty thousand per lot for the alkaline-processed gelatin. The results of the different tests showed trends that were not solely a function of time. Gel extensibility for both gel types showed a decrease in the force needed to rupture the gelatin ribbon as a function of time. The resilience of the tested ribbon remained constant throughout the aging process. Burst strength was the only test showing an inverse relationship between the two gel types: the force needed to rupture the 150 Bloom softgels decreased with time, whereas the force needed to rupture the 195 Bloom softgels increased with time. Rheological testing was described in the literature as being associated with the molecular weight distribution. Such an association was seen in our research, and the results of both the rheological and the molecular weight tests decreased with the aging process.
106

Portable X-ray Fluorescence Analysis of Pottery at the Bayshore Homes Site in Pinellas County, Florida

Nostrom, Rachel 01 August 2014
The Bayshore Homes site was occupied intermittently over a period of approximately twelve hundred years, with the two main occupation periods being CE 150-550 and CE 900-1350. During those lengthy occupations a substantial amount of plain and decorated pottery was discarded at the site. A portable X-ray fluorescence (pXRF) spectrometer was utilized to analyze the elemental composition of 133 sherds, both decorated and plain. The resulting elemental composition data were then analyzed using multivariate statistics in an attempt to discern discrete clay sources that may have been exploited by inhabitants of the Bayshore Homes site. Principal component analysis (PCA) and discriminant function analysis (DA) were employed to identify three discrete clay sources exploited in the production of pottery. The results of the statistical analyses were then used to answer two basic, yet pertinent, questions about the Bayshore pottery: 1) Were the same clay sources exploited during both occupation periods? 2) Were the same clay sources exploited for both decorated and plain pottery? The results of the statistical analyses indicate that the same clay sources were exploited for both occupation periods, though evidence suggests that the dominant clay source in use did change over time. The results also imply that the same clay sources were utilized in the production of plain and decorated pottery, which suggests that at least some portion of the decorated pottery excavated from the Bayshore site was produced locally, and not obtained through trade. Finally, the results of this research demonstrate that pXRF is a useful tool for preliminary differentiation of clay sources in Florida.
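The PCA-plus-discriminant-analysis workflow described above can be sketched in a few lines of Python. Everything below is illustrative: the element count, group sizes and simulated readings are hypothetical stand-ins, not the actual Bayshore pXRF data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.preprocessing import StandardScaler

# Hypothetical pXRF readings: 133 sherds x 8 elements (ppm), with a
# provisional three-source grouping to seed the discriminant step
rng = np.random.default_rng(2)
readings = np.vstack([
    rng.normal(loc, 5.0, size=(45, 8))
    for loc in (50.0, 70.0, 90.0)        # three simulated clay sources
])[:133]
labels = np.repeat([0, 1, 2], 45)[:133]

scaled = StandardScaler().fit_transform(readings)

# PCA to explore structure in the elemental composition data
pca = PCA(n_components=3).fit(scaled)
print("explained variance:", pca.explained_variance_ratio_.round(2))

# Discriminant function analysis to test the three-source grouping
lda = LinearDiscriminantAnalysis().fit(scaled, labels)
print("resubstitution accuracy:", round(lda.score(scaled, labels), 2))
```

Standardizing the readings before PCA keeps high-concentration elements from dominating the components, a common choice with compositional pXRF data.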
107

Adjusting the parameter estimation of the parentage analysis software MasterBayes to the presence of siblings : a thesis presented in partial fulfillment of the requirements for the degree of Master of Applied Statistics at Massey University, Albany, New Zealand

Heller, Florian January 2009
Parentage analysis is concerned with the estimation of a sample's pedigree structure, which is often essential knowledge for estimating population parameters of animal species, such as reproductive success. While it is often easy to relate one parent to an offspring simply by observation, the second parent frequently remains unknown. Parentage analysis uses genotypic data to estimate the pedigree, which then allows the desired parameters to be inferred. Several software applications are available for parentage analysis, one of which is MasterBayes, an extension to the statistical software package R. MasterBayes makes use of behavioural, phenotypic, spatial and genetic data, providing a Bayesian approach to simultaneously estimating pedigree and population parameters of interest and allowing for a range of covariate models. MasterBayes, however, assumes the sample to be randomly collected from the population of interest. Often, though, collected data will come from nests or other groups that are likely to contain siblings. If siblings are present, the assumption of a random population sample is no longer met and, as a result, the parameter variance will be underestimated. This thesis presents four methods to adjust MasterBayes' parameter estimates to the presence of siblings, all of which are based on the pedigree structure as estimated by MasterBayes. One approach, denoted DEP, provides a Bayesian estimate similar to MasterBayes' approach but incorporating the presence of siblings. Three further approaches, denoted W1, W2 and W3, apply importance sampling to re-weight parameter estimates obtained from MasterBayes and DEP. Though a fully satisfying adjustment of the estimate's variance is only achieved at nearly perfect pedigree assignment, the presented methods considerably improve MasterBayes' parameter estimation in the presence of siblings when the pedigree is uncertain. DEP and W3 prove to be the most successful adjustment methods, providing comparatively accurate, though still underestimated, variances for small family sizes. W3 is the superior approach when the pedigree is highly uncertain, whereas DEP becomes superior when about half of all parental assignments are correct. Large family sizes introduce a tendency to underestimate the parameter variance in all approaches, with the degree of underestimation depending on the certainty of the pedigree. Additionally, at high pedigree uncertainty the importance sampling schemes provide comparatively good estimates of the parameters' expected values, where the non-importance-sampling approaches severely fail.
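Importance re-weighting of the kind W1-W3 perform can be illustrated generically. The Python sketch below re-weights posterior draws from a proposal density that ignores sibling structure toward a target density that accounts for it; the densities and parameter values are hypothetical, and this is not the MasterBayes API.

```python
import numpy as np

def importance_reweight(draws, log_p_target, log_p_proposal):
    """Re-weight posterior draws from a proposal (e.g. a model that
    ignores siblings) toward a target that accounts for them. Weights
    are normalised via the log-sum-exp trick for numerical stability."""
    log_w = log_p_target - log_p_proposal
    log_w -= log_w.max()
    w = np.exp(log_w)
    w /= w.sum()
    mean = np.sum(w * draws)
    var = np.sum(w * (draws - mean) ** 2)
    ess = 1.0 / np.sum(w ** 2)          # effective sample size
    return mean, var, ess

# Hypothetical draws of a reproductive-success parameter
rng = np.random.default_rng(3)
draws = rng.normal(1.5, 0.2, size=5000)
log_q = -0.5 * ((draws - 1.5) / 0.2) ** 2   # proposal: siblings ignored
log_p = -0.5 * ((draws - 1.5) / 0.3) ** 2   # target: wider, as siblings inflate variance
print(importance_reweight(draws, log_p, log_q))
```

The effective sample size reported alongside the adjusted mean and variance indicates how much the re-weighting degrades the Monte Carlo sample, which matters most when the pedigree is highly uncertain.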
108

Analysis of traffic accidents before and after resurfacing

Geedipally, Srinivas January 2005
This dissertation includes a statistical analysis of traffic accidents, followed by a test of the effect of new pavement on traffic safety. The accident data cover roads in Region South-East Sweden that received new pavement during 2001. In Sweden, this is the fourth study concerning the before-and-after effect of new pavement. Johansson (1997) studied the change in the number of accidents between the before-years and after-years. Tholén (1999) and Velin et al (2002) additionally compared this change with the change in the number of accidents in a reference road network (also called control sites), consisting of all public roads in Region West Sweden that were not resurfaced during the study period.
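The reference-network comparison credited to Tholén (1999) and Velin et al (2002) amounts to a simple comparison-group estimator. A minimal sketch follows, with entirely hypothetical accident counts:

```python
def before_after_effect(treated_before, treated_after,
                        control_before, control_after):
    """Simple comparison-group estimate of a safety effect: the change
    at resurfaced sites divided by the change at control sites over
    the same before/after periods."""
    treated_ratio = treated_after / treated_before
    control_ratio = control_after / control_before
    return treated_ratio / control_ratio   # <1 suggests fewer accidents

# Hypothetical accident counts
effect = before_after_effect(treated_before=120, treated_after=95,
                             control_before=1000, control_after=980)
print(f"estimated effect: {effect:.2f}")   # 0.81 -> roughly a 19% reduction
```

Dividing by the control-site ratio removes trend effects common to all roads (weather, traffic growth, reporting changes) that a naive before-after comparison would wrongly attribute to the new pavement.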
109

Railway Safety - Risks and Economics

Bäckman, Johan January 2002
Safety analysis is a process involving several techniques. The purpose of this thesis is to test and develop methods suitable for the safety analysis of railway risks and railway safety measures. Safety analysis is a process comprising problem identification, risk estimation, valuation of safety and economic analysis. The main steps are described in separate chapters, each of which includes a discussion of the methods and a review of previous research, followed by the contribution of this author. Although the safety analysis procedure described can be used for analysing railway safety, it has such general foundations that it can be used wherever safety is important and wherever safety measures are evaluated. It combines cost-benefit analysis with criteria for the distribution and the absolute levels of risk. Risks are estimated with both statistical and risk analysis methods. Historical data on railway accidents are analysed and statistical models fitted to describe trends in accident rates and consequences. A risk analysis model is developed using fault tree and event tree techniques, together with Monte Carlo simulation, to calculate risks for passenger train derailments. The results are compared with the statistical analysis of historical data. People's valuation of safety in different contexts is analysed, with relative values estimated in a willingness-to-pay study. A combination of focus groups and individual questionnaires is used. Two different methods are used to estimate the value of safety and the results are compared. Comparisons are also made with other studies. Different approaches for safety analysis and methods for economic analysis of safety are reviewed. Cost-benefit analysis as a decision criterion is discussed and a study on the economic effects of a traffic control system is presented. There are several results of the work. Historical data shows a decrease in the accident rate. The average consequence of each accident has not changed over time. The risk analysis model produces comparable results and enables analysis of various safety measures. The valuation study shows that people prefer the prevention of small-scale accidents over the prevention of larger, catastrophic accidents. There are only small differences in the valuation of safety in different contexts.
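An event tree driven by Monte Carlo simulation, as used in the thesis for passenger train derailments, can be sketched in a few lines. The branch probabilities and consequence model below are hypothetical toy values chosen for illustration, not figures from the thesis.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000          # simulated train journeys

# Hypothetical branch probabilities for a derailment event tree
p_derail = 1e-5                        # derailment per journey
p_overturn_given_derail = 0.15         # carriage overturns
p_high_speed_given_derail = 0.30       # derailment occurs at high speed

derail = rng.random(n) < p_derail
overturn = derail & (rng.random(n) < p_overturn_given_derail)
high_speed = derail & (rng.random(n) < p_high_speed_given_derail)

# Hypothetical consequence model: casualties drawn per severe outcome
casualties = np.zeros(n)
severe = overturn & high_speed
casualties[severe] = rng.poisson(5.0, severe.sum())

print("derailments:", derail.sum())
print("severe outcomes:", severe.sum())
print("expected casualties per million journeys:", casualties.sum())
```

Sampling every branch jointly, rather than multiplying point probabilities, makes it easy to attach uncertainty distributions to each branch and to read off the spread of outcomes, which is the main advantage of the Monte Carlo approach over a purely analytical event tree.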
110

Evaluation Of Effect Of Fillet Rolling Process On The Fatigue Performance Of A Diesel Engine Crankshaft

Cevik, Gul 01 September 2012
In this study, the effect of the fillet rolling process on the fatigue performance of a diesel engine crankshaft was investigated. Crankshafts from two different materials were studied: ductile cast iron EN-GJS 800-2 and micro-alloyed steel 38MnVS6. Resonance bending fatigue tests were conducted with crankshaft samples. A test plan according to the staircase test methodology was used. Statistical analyses were carried out on the test data by the Maximum Likelihood Estimation method in order to calculate the fatigue limits and construct the S-N curves based on the Random Fatigue Limit (RFL) and Modified Basquin models. Fatigue limit calculations were also conducted by the Dixon-Mood method and by Maximum Likelihood Estimation for Normal and Weibull distributions. The fillet rolling process was simulated by computer-based analysis in order to calculate the compressive residual stress profile at the fillet region and shed more light on the mechanisms and effect of fillet rolling. The fatigue performance of crankshafts of both materials was evaluated in both the unrolled and fillet-rolled states. The effect of the fillet rolling load on fatigue performance was also evaluated with steel crankshafts. It was found that ductile cast iron performed better under bending fatigue tests than the steel crankshaft in both the fillet-rolled and unrolled conditions. On the other hand, the fillet rolling process was found to be more effective on the steel crankshaft than on the ductile cast iron crankshaft in terms of fatigue performance improvement. It was also seen that the fatigue limit increases with the fillet rolling load up to a limit where the surface quality deteriorates. Residual stress analysis showed that a residual stress of higher magnitude can develop in the steel crankshaft fillet region, whereas the effective depth of the residual stress is greater in the ductile cast iron crankshaft under the same rolling condition. Residual stress analysis of steel crankshafts rolled under different conditions shows that the increase in peak residual stress with increasing rolling load is not significant, and that the main effect of an increased rolling load is an increased effective depth of the residual stresses. The MLE methodology used in the statistical analysis of the test data was found to be effective for life regression and fatigue strength distribution analysis. The RFL model provided better life regression analysis and fatigue limit calculations than the Modified Basquin model. The Dixon-Mood method was found to overestimate the fatigue limit.
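The Dixon-Mood evaluation of a staircase test can be illustrated compactly. The Python sketch below implements the standard Dixon-Mood estimator on a hypothetical staircase sequence; the stress levels, step size and outcomes are invented for illustration and are not the crankshaft test data.

```python
import numpy as np

def dixon_mood(levels, events, step):
    """Dixon-Mood estimate of the fatigue limit from a staircase test.
    levels: stress applied to each specimen; events: 1 = failure,
    0 = runout; step: stress increment d between adjacent levels."""
    events = np.asarray(events)
    levels = np.asarray(levels, dtype=float)
    # The analysis is based on the less frequent outcome
    use_failures = events.sum() <= len(events) - events.sum()
    mask = (events == 1) if use_failures else (events == 0)
    used = levels[mask]
    s0 = used.min()
    i = np.rint((used - s0) / step).astype(int)   # level index 0..k
    N, A, B = len(i), i.sum(), (i ** 2).sum()
    sign = -0.5 if use_failures else 0.5
    mean = s0 + step * (A / N + sign)
    var_term = (N * B - A ** 2) / N ** 2
    # Standard Dixon-Mood rule for the standard deviation estimate
    std = 1.62 * step * (var_term + 0.029) if var_term > 0.3 else 0.53 * step
    return mean, std

# Hypothetical staircase sequence (stress in MPa, 20 MPa step)
levels = [300, 320, 300, 280, 300, 320, 340, 320, 300, 320]
events = [0,   1,   0,   0,   0,   1,   1,   1,   0,   1]
print(dixon_mood(levels, events, step=20))
```

Because the Dixon-Mood formulas assume a normal distribution of the fatigue limit and use only the less frequent outcome, they are quick but less flexible than the MLE approach the abstract favours, which is consistent with the reported overestimation.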
