131

Bayesian Estimation of Material Properties in Case of Correlated and Insufficient Data

Giugno, Matteo 02 October 2013 (has links)
Identification of material properties has received considerable attention in recent years, thanks to improved technology and its application to the field of experimental mechanics. Bayesian approaches such as Markov chain Monte Carlo (MCMC) methods have proven to be reliable and suitable tools for processing data, describing probability distributions and uncertainty bounds for the investigated parameters in the absence of explicit inverse analytical expressions. Although good estimates require experiments to be repeated multiple times, this is not always feasible because of practical limitations: the thesis addresses the problem of material-property estimation in the presence of correlated and insufficient data, which leads to multivariate error modeling and a highly unstable sample covariance matrix. To compensate for the lack of information about the true covariance, we analyze two different methodologies: first, hierarchical covariance modeling is investigated; then a method based on covariance shrinkage is employed. A numerical study comparing both approaches and employing finite element analysis within the MCMC iterations is presented, showing that the method based on covariance shrinkage is more suitable for post-processing data for the range of problems under investigation.
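As a hedged illustration of the covariance-shrinkage idea (not the thesis's exact estimator), the sketch below blends an unstable sample covariance with a simple diagonal target; the shrinkage intensity `lam` and the target choice are assumptions made for illustration only.

```python
import numpy as np

def shrunk_covariance(residuals, lam=0.3):
    """Linear shrinkage of the sample covariance toward a scaled-identity target.

    residuals : (n_repeats, n_outputs) array of model-data mismatches.
    lam       : shrinkage intensity in [0, 1]; 0 returns the raw sample covariance.
    """
    S = np.cov(residuals, rowvar=False)            # unstable when n_repeats is small
    target = np.mean(np.diag(S)) * np.eye(S.shape[0])
    return (1.0 - lam) * S + lam * target          # well-conditioned convex combination

# Illustrative data: 5 repeated experiments, 8 correlated measurement channels
rng = np.random.default_rng(0)
true_cov = 0.5 * np.ones((8, 8)) + 0.5 * np.eye(8)
data = rng.multivariate_normal(np.zeros(8), true_cov, size=5)
Sigma = shrunk_covariance(data)
print(np.linalg.cond(Sigma))   # finite, unlike the rank-deficient raw sample covariance
```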
132

Quantifying the Permeability Heterogeneity of Sandstone Reservoirs in Boonsville Field, Texas by Integrating Core, Well Log and 3D Seismic Data

Song, Qian 03 October 2013 (has links)
Increasing hydrocarbon reserves by finding new resources in frontier areas and improving recovery in mature fields, in order to meet high energy demand, is very challenging for the oil industry. Reservoir characterization and heterogeneity studies play an important role in better understanding reservoir performance to meet this goal. This study was conducted on the Boonsville Bend Conglomerate reservoir system, located in the Fort Worth Basin of north-central Texas. The primary reservoir is characterized as a highly heterogeneous conglomeratic sandstone. To find additional potential and optimize field exploitation, it is critical to better understand the reservoir's connectivity and heterogeneity. The goal of this multidisciplinary study was to quantify the permeability heterogeneity of the target reservoir by integrating core, well log, and 3D seismic data. A set of permeability heterogeneity coefficients (the variation coefficient, the dart coefficient, and the contrast coefficient) was defined in this study to quantitatively identify reservoir heterogeneity levels and to characterize both intra-bed and inter-bed heterogeneity. Post-stack seismic inversion was conducted to produce the key attribute, acoustic impedance, for calibrating log properties against the seismic data. The inverted acoustic impedance was then used to derive a porosity volume in Emerge (a module of the Hampson-Russell software) by means of single- and multi-attribute transforms and a neural network. Establishing the correlation between permeability and porosity is critical for the permeability conversion; this was achieved using porosity-permeability pairs measured from four cores. A permeability volume was then obtained by applying this correlation. Finally, the three heterogeneity coefficients were applied to the permeability volume to quantitatively characterize the target reservoir's heterogeneity. The results show that the target interval is highly heterogeneous both vertically and laterally. The resulting heterogeneity distribution can help optimize field exploitation and infill drilling designs.
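A minimal sketch of how such heterogeneity coefficients can be computed from a permeability array; the definitions used below (variation = standard deviation over mean, dart = maximum over mean, contrast = maximum over minimum) are commonly used conventions assumed here and may differ from the exact definitions adopted in the thesis.

```python
import numpy as np

def heterogeneity_coefficients(perm):
    """Compute three common permeability heterogeneity coefficients.

    perm : 1D array of permeability values (e.g., mD) along a layer or cored interval.
    The definitions are assumed standard conventions, not the thesis's exact ones.
    """
    perm = np.asarray(perm, dtype=float)
    mean_k = perm.mean()
    return {
        "variation": perm.std(ddof=1) / mean_k,   # dispersion relative to the mean
        "dart": perm.max() / mean_k,              # dominance of the most permeable streak
        "contrast": perm.max() / perm.min(),      # max-to-min permeability ratio
    }

# Illustrative permeability samples (mD) from a cored interval
print(heterogeneity_coefficients([12.0, 85.0, 40.0, 5.0, 150.0, 33.0]))
```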
133

Criteria for the recognition of rival relationships: An analysis of free descriptions written by university students

Ota, Nobuyuki, 太田, 伸幸 12 1900 (has links)
This record uses content digitized by the National Institute of Informatics.
134

The initial atmospheric corrosion of copper and zinc induced by carboxylic acids : Quantitative in situ analysis and computer simulations

Gil, Harveth January 2011 (has links)
Degradation of metals through atmospheric corrosion is a highly important and costly phenomenon, with significant effects on, e.g., the lifespan of industrial materials, the reliability of electronic components and military equipment, and the aesthetic appearance of our cultural heritage. Atmospheric corrosion is the result of the interaction between a metal and its atmospheric environment and occurs in the presence of a thin aqueous adlayer. The incorporation of pollutant species into this adlayer usually enhances the degradation process. During indoor atmospheric corrosion, low concentrations of organic atmospheric constituents, such as formic, acetic, propionic, butyric, and oxalic acids, have been found to play an accelerating role for a broad range of metals and alloys, including lead, steel, nickel, copper, cadmium, magnesium, and zinc. In this doctoral thesis the initial stages of the atmospheric corrosion of copper exposed to synthetic air, aimed at simulating representative indoor atmospheric environments, have been investigated both experimentally and through a computational method. The experiments were based on a unique analytical setup in which a quartz crystal microbalance (QCM) was integrated with infrared reflection absorption spectroscopy (IRAS). This enabled the initial atmospheric corrosion of copper to be analyzed during ongoing corrosion in humidified air at room temperature with additions of 120 ppb (parts per billion by volume) of acetic, formic, or propionic acid. The main phases identified were copper (I) oxide (Cu2O) and various forms of copper carboxylate, and their amounts deduced with the different analytical techniques agree to a relative accuracy of 12% or better. Particular emphasis has been placed on the identification of the different forms of copper (I) oxide generated during these exposures. An electrochemically based model has been proposed to describe how copper oxides, formed in the presence of acetic acid, are electrochemically reduced in neutral solution. The model includes the electrochemical reduction of copper (II) oxide (CuO), amorphous copper (I) oxide (Cu2O)am, intermediate copper (I) oxide (Cu2O)in, and crystalline copper (I) oxide (Cu2O)cr. Good agreement is obtained between the model and the experimental data, which supports a reduction sequence that starts with copper (II) oxide and continues with the reduction of the three copper (I) oxides at more negative potentials. The quantified analytical data obtained in this doctoral study on corrosion products formed on copper, together with corresponding data on zinc reported elsewhere, were used as the starting point to develop a computational model, GILDES, that describes the atmospheric corrosion processes involved. GILDES considers the whole interfacial regime and incorporates all known chemical reactions assumed to govern the initial atmospheric corrosion of copper or zinc in the presence of carboxylic acids. The model includes two separate pathways: a proton-induced dissolution of cuprous or zinc ions followed by the formation of either copper (I) oxide or zinc (II) oxide, and a carboxylate-induced dissolution followed by the formation of either copper (II) carboxylate or zinc (II) carboxylate. The model succeeds in predicting the two main phases in the corrosion products and the correct ranking of aggressiveness of the three acids for both copper and zinc.
The ranking has been attributed to differences in the acid dissociation constants and deposition velocities of the carboxylic acids investigated.
135

A proposed model for evaluating the causal relations among metrics in performance measurement systems

Fiterman, Luciano January 2006 (has links)
Performance indicators play a fundamental role in the management of organizations, because they show decision makers the organization's situation and how it stands relative to its objectives. Among the performance measurement systems used in organizations, the combination of financial and non-financial metrics stands out, based on the belief that improvements in non-financial results will lead to improvements in financial results. However, there is no established methodology for testing whether these relationships (cause-and-effect relations) actually exist. The objective of this work was to propose and partially validate a methodology for testing and quantifying the causal relations among performance indicators. The sequence of steps was defined from the literature, drawing on tools from Quality Function Deployment, Policy Deployment, Systems Thinking, and a Tool for Action Plan Selection. The method chosen for partial validation was a case study. The unit of analysis was an organization that already uses financial and non-financial metrics and maintains a historical database. The research used participant observation and structured interviews as sources of evidence, and statistical techniques and written representation were used for data analysis. The results show that the methodology is able to quantify the causal relations among performance metrics, and its application also generated substantial organizational learning. The main contribution of this work is the partially validated conceptual model, which can be used to turn a performance measurement system into a source of information for decision making through the quantification of cause-and-effect relations.
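As a hedged illustration of the kind of statistical quantification described above (not the thesis's exact procedure), the sketch below estimates a lagged correlation between a non-financial metric and a financial one from historical data; the metric names, values, and lag are hypothetical.

```python
import numpy as np

def lagged_correlation(driver, outcome, lag=1):
    """Pearson correlation between a driver metric and an outcome metric `lag` periods later.

    driver, outcome : equal-length 1D arrays of historical metric values (e.g., monthly).
    A strong positive value is consistent with, but does not prove, a causal relation.
    """
    driver, outcome = np.asarray(driver, float), np.asarray(outcome, float)
    if lag > 0:
        driver, outcome = driver[:-lag], outcome[lag:]
    return np.corrcoef(driver, outcome)[0, 1]

# Hypothetical example: customer-satisfaction index vs. revenue two months later
satisfaction = [71, 74, 73, 78, 80, 79, 83, 85]
revenue      = [100, 102, 101, 104, 108, 107, 112, 115]
print(lagged_correlation(satisfaction, revenue, lag=2))
```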
136

Identification and quantification of anthocyanins in the transgenic tomato

Su, Xiaoyu January 1900 (has links)
Master of Science / Food Science Institute / Weiqun Wang / Anthocyanins, a subclass of flavonoids, are natural pigments derived from the phenylpropanoid pathway. Most tomato cultivars found in nature have a very low anthocyanin content, but dark purple tomatoes generated by ectopic co-expression of two transcription factors from snapdragon, Delila (Del) and Rosea1 (Ros1), and of chalcone isomerase (CHI) from onion accumulate high levels of anthocyanins. The aim of this study was to identify and quantify the anthocyanins in these transgenic tomato lines. Seven anthocyanins, including two new ones [malvidin-3-(p-coumaroyl)-rutinoside-5-glucoside and malvidin-3-(feruloyl)-rutinoside-5-glucoside], were identified in the transgenic lines by HPLC-MS. The two most abundant anthocyanins were petunidin 3-(trans-coumaroyl)-rutinoside-5-glucoside and delphinidin 3-(trans-coumaroyl)-rutinoside-5-glucoside, which together account for 85% of the total anthocyanins in whole fruit. In contrast to the wild type, in which anthocyanins were undetectable, Del/Ros1-expressing tomatoes contained total anthocyanins of 4.95±0.42 g/kg dry matter in whole fruit, 5.09±0.62 g/kg dry matter in peel, and 5.56±0.29 g/kg dry matter in flesh, while CHI×Del/Ros1-coexpressing tomatoes contained 9.61±0.71 g/kg dry matter in whole fruit, 29.9±1.64 g/kg dry matter in peel, and 8.65±0.39 g/kg dry matter in flesh. No anthocyanins were detectable in the seeds of any line tested. Enrichment of tomato fruit with new and abundant anthocyanins may provide potential health-promoting benefits.
137

Quantification of striatal dopaminergic uptake in Parkinson's disease: a new multimodal method combining SPECT DaT and MPRAGE

Smart, Karishma Lees 08 April 2016 (has links)
Parkinson's disease (PD) is a neurodegenerative disease that causes degeneration of nigral dopaminergic terminals in the caudate and putamen regions of the striatum, in the basal ganglia. According to current practice, when an unequivocal clinical diagnosis of PD cannot be made, a single-photon emission computed tomography scan using the DaTscan radionuclide (SPECT DaT scan) is ordered. However, the assessment of SPECT DaT scans in the diagnosis of PD depends on the subjective judgment of a radiologist, which can compromise the accuracy of the diagnosis. Furthermore, because research studies generally do not quantify the SPECT DaT scans they use, their conclusions are not based on standardized data. The aim of this thesis is to propose a quantification method for SPECT DaT scans to be employed in diagnostic and research settings. The methodology proposed here will eventually be used in a much larger multimodal imaging project investigating connectivity changes in the brain related to cognitive and affective symptoms in PD patients. Each of the four subjects in this project underwent a SPECT DaT scan and an MPRAGE (Magnetization Prepared Rapid Gradient Echo) scan, an anatomical MRI (magnetic resonance imaging) sequence. The SPECT DaT scans and the MPRAGEs were coregistered, and a voxel-based quantification of the caudate and the putamen in the left and right hemispheres was then performed for every subject. First, the percentages of voxels with intensities exceeding various pericalcarine baselines were calculated. A pericalcarine baseline was used because the pericalcarine gyrus in the occipital lobe has been shown to have little to no dopaminergic activity, particularly on SPECT DaT scans. Next, asymmetry indices (AIs) were calculated for two of the thresholds by taking the ratio of the percentage of supra-threshold voxels in the right hemispheric region to that in the left. Wilcoxon signed-rank tests and bootstrapping analyses were performed on both the caudate and the putamen in all four subjects to determine the significance of any detected asymmetry. The quantification of the data and the AI values revealed asymmetries in voxel intensities between the left and right hemispheres. This asymmetry was consistent with each subject's side of physical symptom onset. According to the bootstrapping analyses, the asymmetry was significant in five of the eight comparisons. In summary, this methodology has the potential to bring greater objectivity to the use of SPECT DaT scans in the diagnosis of PD and in research through its anatomically accurate, voxel-based quantification.
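A minimal sketch of the voxel-based quantification and asymmetry index described above, under assumed array shapes and a hypothetical baseline value; the coregistration and ROI-definition steps used in the thesis are omitted.

```python
import numpy as np

def fraction_above_baseline(roi_voxels, baseline):
    """Fraction of ROI voxel intensities exceeding a pericalcarine baseline."""
    roi_voxels = np.asarray(roi_voxels, dtype=float)
    return float(np.mean(roi_voxels > baseline))

def asymmetry_index(right_roi, left_roi, baseline):
    """Ratio of supra-baseline voxel fractions, right over left hemisphere."""
    return fraction_above_baseline(right_roi, baseline) / fraction_above_baseline(left_roi, baseline)

# Hypothetical intensities for left and right putamen ROIs and a pericalcarine baseline
rng = np.random.default_rng(1)
left_putamen  = rng.normal(3.0, 1.0, size=500)   # stand-in for coregistered SPECT DaT voxels
right_putamen = rng.normal(2.4, 1.0, size=500)   # reduced uptake on the right (illustrative)
print(asymmetry_index(right_putamen, left_putamen, baseline=2.0))
```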
138

Uncertainty quantification of an effective heat transfer coefficient within a numerical model of a bubbling fluidized bed with immersed horizontal tubes

Moulder, Christopher James 08 April 2016 (has links)
This study investigates sources of steady-state computational uncertainty in an effective heat transfer coefficient (HTC) within a non-reacting bubbling fluidized bed with immersed horizontal heat-conducting tubes. The methodical evaluation of this variation, or uncertainty quantification (UQ), is a critical step in the analysis process, and is particularly important when the values of input physical parameters are unknown or experimental data are sparse. While the concept applies broadly to all studies, this application investigates a 2D unit-cell analogue of a bubbling fluidized bed designed for large-scale carbon capture applications. Without adequate characterization of the simulation uncertainties in the HTC, bed operating characteristics, including thermal efficiency, carbon capture efficiency, and sorbent half-life, cannot be well understood. We focus on three primary parameters (the solid-solid coefficient of restitution, the solid-wall coefficient of restitution, and the turbulence model) and consider how their influences vary at different bed solid fractions. This is accomplished via sensitivity analysis and the Bayesian Smoothing Spline (BSS) Analysis of Variance (ANOVA) framework. Results indicate that uncertainties approach 20% at high gas fractions, with the turbulence model accounting for 80% of this variation and the solid-solid coefficient of restitution accounting for the remaining 20%.
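The thesis uses the BSS-ANOVA framework; as a simpler, hedged stand-in, the sketch below computes first-order (main-effect) variance fractions from a hypothetical set of simulation outputs, attributing HTC variance to each input. Column names and values are illustrative, not thesis data.

```python
import numpy as np
import pandas as pd

def main_effect_fractions(df, factors, output):
    """First-order variance fractions Var(E[Y | X_i]) / Var(Y) for each factor.

    df      : DataFrame with one row per simulation run.
    factors : column names of the varied inputs.
    output  : column name of the quantity of interest (e.g., the effective HTC).
    """
    total_var = df[output].var(ddof=0)
    return {f: df.groupby(f)[output].mean().var(ddof=0) / total_var for f in factors}

# Hypothetical full-factorial runs over turbulence model and restitution coefficients
runs = pd.DataFrame({
    "turbulence_model": ["k-eps", "k-eps", "none", "none"] * 2,
    "restitution_ss":   [0.8, 0.9, 0.8, 0.9] * 2,
    "restitution_sw":   [0.8] * 4 + [0.9] * 4,
    "htc":              [310, 318, 265, 270, 312, 320, 268, 274],  # W/m^2/K, illustrative
})
print(main_effect_fractions(runs, ["turbulence_model", "restitution_ss", "restitution_sw"], "htc"))
```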
139

'A miracle from Nairobi': David B. Barrett and the quantification of world Christianity, 1957–1982

Zurlo, Gina 15 December 2017 (has links)
This dissertation analyzes the role of quantification in the history of Christian mission by placing David B. Barrett’s World Christian Encyclopedia (1982) in its historical context. It argues that Barrett’s unique mixture of education, professional background, and geographical location in Africa helped him develop an understanding of world Christianity based on its newly discovered diversity and fragmentation at the end of the British Empire. The Encyclopedia presented a comprehensive quantitative assessment of membership in all branches of the Church and helped shape contemporary understandings of world Christianity. In making explicit connections among world Christianity, mission history, and the social scientific study of religion, this dissertation sheds light on the history of religious data in relationship to world Christianity. This study shows that Barrett was part of a long history of missionaries who produced church-based, scientific scholarship. It illustrates the ubiquity of such scholarship throughout the history of mission, demonstrated through an analysis of missionary quantification from the Jesuits to Barrett, including the Christian roots of American sociology. This analysis contends that American sociology in the 1960s—when Barrett received his Ph.D. in religion from Columbia University—was fundamentally shaped by the history of missionaries who produced social scientific research. The Encyclopedia was conceived, developed, and produced in Africa. Barrett’s location in Nairobi, Kenya, with the Church Missionary Society during the rise of African nationalism and decolonization informed his perspective on world Christianity. Much like the African Independent Churches he studied, Barrett broke off from the missionary establishment and threw his support behind “heretical” African groups. This analysis of Barrett’s experience in Kenya suggests that the growth of African Christianity was fundamental to reshaping definitions of world Christianity. This dissertation contributes to existing scholarship by historically placing the World Christian Encyclopedia in its theological, geographic, political, and social contexts. This study shows that Barrett was the first person to quantify religious adherence of all kinds and to equally represent all of world Christianity in one book. Further, the Encyclopedia indicated that a new era of world Christianity had come, and its center of gravity had moved from white Europe to black Africa.
140

Multi-dimensional data analysis in electron microscopy

Ostasevicius, Tomas January 2017 (has links)
This thesis discusses methods for analyzing large multi-dimensional datasets and their applications. Particular attention is paid to non-linear optimization analyses and to general processing algorithms and frameworks for datasets that are significantly larger than the available computer memory. All newly presented algorithms and frameworks were implemented in the HyperSpy analysis toolbox. A novel Smart Adaptive Multi-dimensional Fitting (SAMFire) algorithm is presented and applied across a range of scanning transmission electron microscope (STEM) experiments. As a result, the Stark effect in quantum disks was mapped in a cathodoluminescence STEM experiment, and fully quantifiable 3D atomic distributions of a complex boron nitride core-shell nanoparticle were reconstructed from an electron energy loss spectroscopy (EELS) tilt series. The EELS analysis also led to the development of two new algorithms for extracting EELS near-edge structure fingerprints from the original dataset. Neither approach relies on standards, is limited to thin or constant-thickness particles, or requires atomic resolution. A combination of the aforementioned fingerprinting techniques and SAMFire allows robust, quantifiable EELS analysis of very large regions of interest. A framework for loading and processing very large datasets, “LazySignal”, was developed and tested on scanning precession electron diffraction (SPED) data. The combination of SAMFire and LazySignal allowed efficient analysis of large diffraction datasets, successfully mapping strain across an extended (ca. 1 μm × 1 μm) region and classifying the strain fields around precipitate needles in an aluminium alloy.
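A minimal sketch of how the HyperSpy toolbox mentioned above can be used for lazy loading and multi-dimensional model fitting; the file name is hypothetical, and the SAMFire calls are an assumption based on HyperSpy's documented API (names may vary between versions).

```python
import hyperspy.api as hs

# Lazily load a dataset larger than the available memory (file name is hypothetical)
s = hs.load("eels_tilt_series.hspy", lazy=True)

# Build a curve-fitting model of the signal dimension and fit it at every probe position
m = s.create_model()
m.multifit()                  # plain pixel-by-pixel non-linear fitting

# Assumed SAMFire usage: the smart adaptive strategy propagates good parameter
# estimates to neighbouring pixels (API names may differ by HyperSpy version)
samf = m.create_samfire(workers=4)
samf.start()
```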
