  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
501

Development and Validation of a Numerical Tool for the Aeromechanical Design of Turbomachinery

Mayorca, María Angélica January 2010 (has links)
In aeromechanical design, one of the major requirements is to operate within High Cycle Fatigue (HCF) margins and away from flutter. The level of dynamic excitation and the risk of HCF can be estimated by performing forced response analyses of blade row interaction forces or Low Engine Order (LEO) excitation mechanisms. Flutter stability, on the other hand, can be assessed by calculating the aerodynamic damping forces due to blade motion. In order to include these analyses as regular practices in an industrial aeromechanical design process, the interaction between the fields of fluid and structural dynamics must be established in a simple yet accurate manner. Effects such as aerodynamic and structural mistuning should also be taken into account, where parametric and probabilistic studies play an important role. The present work presents the development and validation of a numerical tool for aeromechanical design. The tool aims to integrate, in a standard and simple manner, regular aeromechanical analyses such as forced response analysis and aerodynamic damping analysis of bladed disks. The influence of mistuning on forced response and aerodynamic damping is assessed by implementing existing model order reduction techniques to decrease the computational effort and deliver results in an industrially applicable time frame. The synthesis program solves the structure-fluid interaction from inputs of existing Finite Element Modeling (FEM) and Computational Fluid Dynamics (CFD) solvers by including a mapping program that establishes compatibility between the fluid and structure meshes. Blade row interaction harmonic forces and/or blade motion aerodynamic damping forces are inputs from unsteady fluid dynamics solvers, whereas the geometry, mass, and stiffness matrices of a blade alone or a bladed disk sector are inputs from finite element solvers. Structural and aerodynamic damping is also considered.
Structural mistuning is assessed by importing different sectors, and any combination of the full disk model can be achieved using Reduced Order Model (ROM) techniques. Aerodynamic mistuning data can also be imported and its effects on forced response and stability assessed. The tool is designed to allow iterative analysis in a simple manner, making it possible to perform aerodynamically and structurally coupled analyses of industrial bladed disks. A new method for performing aerodynamically coupled forced response and stability analyses that considers the interaction of different mode families has also been implemented. The method is based on determining the aerodynamic matrices by means of least-squares approximations and is here referred to as the Multimode Least Square (MLS) method. The present work includes a description of the program, and its applicability is assessed on a high-pressure-ratio transonic compressor blade and on a simple blisk. / QC 20110324 / Turbopower / AROMA
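The core idea behind a least-squares determination of aerodynamic matrices, as named above, can be illustrated with a generic sketch (this is not the thesis's MLS implementation; all dimensions and data here are synthetic): given sampled blade-motion vectors X and the unsteady aerodynamic forces F they produce, an influence matrix A with F ≈ AX is recovered by solving a least-squares problem.

```python
import numpy as np

# Generic least-squares recovery of an aerodynamic influence matrix A
# from motion samples X and force responses F (synthetic illustration;
# not the thesis's MLS implementation).
rng = np.random.default_rng(0)
n_dof, n_samples = 4, 10

A_true = rng.standard_normal((n_dof, n_dof))    # "unknown" influence matrix
X = rng.standard_normal((n_dof, n_samples))     # sampled blade-motion vectors
F = A_true @ X                                  # corresponding aerodynamic forces

# Minimize ||A X - F||_F over A; transpose into standard lstsq form X^T A^T = F^T
A_fit, *_ = np.linalg.lstsq(X.T, F.T, rcond=None)
A_fit = A_fit.T
```

With more samples than degrees of freedom and noise-free data, the fit recovers A exactly; with noisy data, the least-squares solution gives the best-fit matrix in the Frobenius norm.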
502

Inventory Pinch Algorithms for Gasoline Blend Planning

Castillo, Castillo A Pedro 04 1900 (has links)
<p>Current gasoline blend planning practice is to optimize blend plans via discrete-time multi-period NLP or MINLP models and to schedule blends via interactive simulation. Solutions of multi-period models using a discrete-time representation typically have different blend recipes for each time period. In this work, the concept of an inventory pinch point is introduced and used to construct a new decomposition of the multi-period MINLP problem: at the top level, nonlinear blending problems for periods delimited by the inventory pinch points are solved to optimize multi-grade blend recipes; at the lower level, a fine-grid multi-period MILP model that uses the optimal recipes from the top level is solved to determine how much of each product to blend in each fine-grid period, subject to a minimum threshold blend size. If the MILP is infeasible, the corresponding period between pinch points is subdivided and the recipes are re-optimized.</p> <p>Two algorithms at the top level are examined: a) a multi-period nonlinear model (MPIP) and b) a single-period nonlinear model (SPIP). Case studies show that the MPIP algorithm produces solutions with the same optimal objective function value as the corresponding MINLP model, while the SPIP algorithm computes solutions that are most often within 0.01% of the MINLP solutions. Both algorithms require substantially less computational effort than the corresponding MINLP model. The reduced number of blend recipes makes it easier for the blend scheduler to create a schedule by interactive simulation.</p> / Master of Applied Science (MASc)
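The inventory pinch concept can be sketched in a few lines (a toy illustration with made-up numbers, not the thesis's MPIP/SPIP algorithms): a pinch occurs where projected cumulative inventory touches its lower bound, and such points delimit the intervals within which a single set of blend recipes can be optimized.

```python
# Toy sketch of inventory pinch detection (illustrative only): a pinch
# point is a period where projected cumulative inventory touches its
# floor, delimiting intervals within which one blend recipe can be reused.
def pinch_points(supply, demand, initial_inv, min_inv=0.0):
    inv = initial_inv
    pinches = []
    for t, (s, d) in enumerate(zip(supply, demand)):
        inv += s - d
        if inv <= min_inv + 1e-9:   # inventory touches its floor: a pinch
            pinches.append(t)
    return pinches

print(pinch_points([10, 10, 10, 10], [5, 18, 5, 12], initial_inv=3))  # → [1]
```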
503

DRUG AND CELL–BASED THERAPIES TO REDUCE PATHOLOGICAL REMODELING AND CARDIAC DYSFUNCTION AFTER ACUTE MYOCARDIAL INFARCTION

Sharp III, Thomas E. January 2017 (has links)
Remarkable advances have been made in the treatment of cardiovascular disease (CVD); however, CVD still accounts for the most deaths in industrialized nations. Ischemic heart disease (IHD) can lead to acute coronary syndrome (ACS) (myocardial infarction [MI]). The standard of care is reperfusion therapy followed by pharmacological intervention to attenuate clinical symptoms related to the MI. While survival after MI has dramatically increased with the implementation of reperfusion therapy, these individuals inevitably suffer progressive pathological remodeling, leaving them predisposed to developing heart failure (HF). HF is a clinical syndrome defined as the impairment of the heart's ability to maintain organ perfusion at rest and/or during exertion (i.e., exercise intolerance). Clinically, this is accompanied by dyspnea, pulmonary or splanchnic congestion, and peripheral edema. Physiologically, there is neurohormonal activation through the classical β–adrenergic and PKA–dependent signaling. / Physiology
504

Fluorescence and Diffuse Reflectance Spectroscopy for Margin Analysis in Breast Cancer

Shalaby, Nourhan 15 June 2017 (has links)
This study investigates the possibility of using a time-resolved Fluorescence and Diffuse Reflectance Spectroscopy (tr-FRS) system to define surgical tumour margins in invasive ductal carcinoma of the breast. UV excitation light was used for the fluorescence component, and data were collected over the 370-550 nm range. A broadband source was used for diffuse reflectance collection, and the emitted response was in the 400-800 nm range. Forty matched-pair cases were collected from patients undergoing breast conservation surgery. Histological analysis was performed on each sample to determine the fat and tumour content within each normal and tumour sample, respectively. Statistical analysis was performed on the optical data to reveal biochemical changes in the endogenous fluorophores collagen, reduced nicotinamide adenine dinucleotide (NADH), and flavin adenine dinucleotide (FAD), as well as changes in absorption and scattering properties attributed to variances in absorber concentration and cell density, respectively. Statistically significant differences in collagen, NADH, and FAD lifetimes; collagen, NADH, FAD, and NADH/FAD intensities; diffuse reflectance; and the reduced scattering coefficient were observed between tumour and normal breast samples. These significant factors were used to construct a Principal Component Analysis model, and a binary classification scheme using Soft Independent Modeling of Class Analogy (SIMCA) was used to predict unknown breast samples as either normal or tumour with a specificity of 60% and a sensitivity slightly over 50%. / Thesis / Master of Science (MSc)
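The PCA-based classification step can be sketched generically (synthetic data, not spectroscopy measurements; a simple distance-to-model rule stands in here for the full SIMCA scheme): fit one PCA model per class, then assign a new sample to the class whose subspace reconstructs it with the smaller residual.

```python
import numpy as np

# Minimal per-class PCA classification sketch (SIMCA-like, illustrative
# only): each class gets its own PCA model; a new sample is assigned to
# the class with the smaller reconstruction residual.
def fit_pca(X, n_comp):
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:n_comp]          # class mean and principal directions

def residual(x, model):
    mu, V = model
    z = (x - mu) @ V.T              # scores: projection onto class subspace
    return np.linalg.norm((x - mu) - z @ V)

rng = np.random.default_rng(1)
normal = rng.normal(0.0, 1.0, (30, 5))   # synthetic "normal" feature vectors
tumour = rng.normal(4.0, 1.0, (30, 5))   # synthetic "tumour" feature vectors
models = {"normal": fit_pca(normal, 2), "tumour": fit_pca(tumour, 2)}

x_new = rng.normal(4.0, 1.0, 5)          # an unseen tumour-like sample
label = min(models, key=lambda k: residual(x_new, models[k]))
```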
505

Efficiently Combining Multiband Compression and Improved Contrast-Enhancing Frequency Shaping in Hearing Aids

Ansari, Shahabuddin 07 1900 (has links)
<p>Sensorineural hearing loss imposes two serious hearing deficits on hearing-impaired people: a reduced dynamic range of hearing and reduced frequency selectivity. Psychophysically, these deficits cause a loss of speech audibility and speech intelligibility for a hearing-impaired person. Studies of the impaired cochlea in cats have shown that hearing loss originates from damage to, or complete loss of, inner and outer hair cells. Neurophysiology of the impaired cochlea in cats shows that the tuning curves of the auditory nerve fibers become elevated and broadened. Amplification in hearing aids has been used to restore audibility in hearing-impaired people. Multiband compression has been commercially available in conventional hearing aids to compensate for the reduced dynamic range of hearing. However, little has been achieved in improving the intelligibility of speech for hearing-impaired people. The aim of this thesis is not only to restore speech audibility for a hearing-impaired person, but also to improve speech intelligibility through hearing-aid signal processing. The compensation technique used in this thesis for speech intelligibility is based on the hypothesis that a narrowband response of the auditory nerve fibers to speech signals ensures phonemic discriminability in the central nervous system.</p><p>Miller et al. [1999] proposed contrast-enhancing frequency shaping (CEFS) to compensate for the broadband responses of the fibers to the first and second formants (F1 and F2) of a speech stimulus. Bruce [2004] showed that multiband compression can be combined with CEFS without the two counteracting each other. In Bruce's algorithm, a multiband compressor is serially combined with a time-domain CEFS filter. The MICEFS algorithm, presented herein, is a combination of multiband compression and an improved version of CEFS implemented in the frequency domain.
The frequency-domain implementation of MICEFS improves the time delay of the algorithm by 10 ms compared to the series implementation proposed by Bruce. The total time delay of the MICEFS algorithm is 16 ms, which is still longer than the standard 10 ms delay in hearing aids. The MICEFS algorithm was tested on a computational model of the auditory periphery [Bruce et al., 2003] using a synthetic vowel and a synthetic sentence. The testing paradigm consisted of five conditions: 1) unmodified speech presented to a normal cochlea; 2) speech modified with the half-gain rule presented to an impaired cochlea; 3) CEFS-modified speech presented to the impaired cochlea; 4) speech modified with MICEFS presented to the impaired cochlea; and 5) MICEFS-modified speech, with some noise added in the formant estimation, presented to an impaired cochlea. The spectral enhancement filter used in MICEFS improved the synchrony capture of the fibers to the first three formants of a speech stimulus. MICEFS also restored the correct tonotopic representation in the average discharge rate of the fibers at the first three formants of the speech.</p> / Thesis / Master of Applied Science (MASc)
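The multiband compression component can be illustrated with a minimal static sketch (band edges, threshold, and ratio are arbitrary choices for demonstration; real hearing-aid compressors run sample-by-sample with attack/release dynamics, which this omits):

```python
import numpy as np

# Toy static multiband compressor (illustrative only, not MICEFS):
# split a frame into frequency bands with the FFT, compress each band's
# level above a threshold, and resynthesize.
def compress_frame(frame, fs, edges=(0, 1000, 4000, 8000), ratio=3.0, thresh_db=-30.0):
    spec = np.fft.rfft(frame)
    freqs = np.fft.rfftfreq(len(frame), 1.0 / fs)
    out = spec.copy()
    for lo, hi in zip(edges[:-1], edges[1:]):
        band = (freqs >= lo) & (freqs < hi)
        level_db = 20 * np.log10(np.abs(spec[band]).mean() + 1e-12)
        if level_db > thresh_db:            # above threshold: compress by `ratio`
            gain_db = (thresh_db - level_db) * (1 - 1 / ratio)
            out[band] *= 10 ** (gain_db / 20)
    return np.fft.irfft(out, n=len(frame))

fs = 16000
t = np.arange(512) / fs
# Loud low-frequency tone plus a quiet mid-frequency tone
frame = 0.5 * np.sin(2 * np.pi * 500 * t) + 0.05 * np.sin(2 * np.pi * 3000 * t)
y = compress_frame(frame, fs)
```

The loud band is attenuated more strongly than the quiet one, reducing the dynamic range across bands while preserving the frame length.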
506

Calculations of Reduced Probability For E2 Transitions in Deformed Even-Even Nuclei

Kiang, David Bun I 05 1900 (has links)
The reduced probability of E2 transitions between rotational levels built upon γ-vibrational states was calculated for even-even nuclei. General expressions were derived as functions of the spin of the initial state and a parameter γ10. Branching ratios for special cases were obtained, which compare quite favourably with experiment. / Thesis / Master of Science (MS)
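For reference (this is the standard textbook form, not necessarily the expressions derived in the thesis), E2 branching ratios between members of rotational bands are commonly compared against the Alaga rules, in which the intrinsic matrix element cancels and the ratio reduces to squared Clebsch-Gordan coefficients:

```latex
% Alaga-rule factorization of the reduced E2 transition probability
% (standard reference form; K_i, K_f are the band quantum numbers):
B(E2;\, I_i K_i \to I_f K_f) \;\propto\;
  \bigl\langle I_i\, K_i\; 2\; (K_f - K_i) \bigm| I_f\, K_f \bigr\rangle^{2}
```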
507

Triazole-linked reduced amide isosteres: An approach for the fragment-based drug discovery of anti-Alzheimer's BACE1 inhibitors and NH-assisted Fürst-Plattner opening of cyclohexene oxides

Monceaux, Christopher Jon 14 January 2011 (has links)
In the scope of our BACE1 inhibitor project, we used an originally designed microtiter plate-based screen to discover four triazole-linked reduced amide isosteres that showed modest (single-digit micromolar) BACE1 inhibition. Our ligands were designed based on a very potent (single-digit nanomolar) isophthalamide ligand from Merck. We replaced one of the amide linkages in order to incorporate our triazole and saw a 1000-fold decrease in potency. We then enlisted Molsoft, L.L.C. to compare our ligand to Merck's in silico to account for this discrepancy. They found that the triazole linkage gives rise to a significantly different docking pose in the active site of the BACE1 enzyme, which diminishes its potency relative to the Merck ligand. The ability to control the regio- and stereochemical outcome of organic reactions is an ongoing interest and challenge for synthetic chemists. The pre-association of reacting partners through hydrogen bonding (H-bonding) can often yield products with extremely high stereoselectivity. We were able to show that anilines, owing to their enhanced acidity relative to amines, can serve as substrate-directing moieties in the opening of cyclohexene oxides. We observed that, by judicious choice of conditions, we could control the regiochemical outcome of the reaction. These studies demonstrate that an intramolecular anilino-NH hydrogen bond donor can direct Fürst-Plattner epoxide opening. A unified mechanism for this phenomenon is proposed in this work, consisting of a novel mechanistic route we call "NH-directed Fürst-Plattner." We further studied the opening of cyclohexene oxides by incorporating amide and amide-derivative substituents in both the allylic and homoallylic positions relative to the epoxide moiety. Our attempts to control regioselectivity in the allylic systems were unsuccessful; however, when the directing substituent was in the homoallylic position, we could demonstrate some degree of regioselectivity.
An additional project that the author worked on for approximately one year during his graduate student tenure is not described within this work. In February of 2009 AstraZeneca, Mayo Clinic, and Virginia Tech Intellectual Properties Inc. concomitantly announced that AstraZeneca licensed a portfolio of preclinical Triple Reuptake Inhibitor (TRI) compounds for depression. The lead compound, PRC200, was discovered by a collaborative effort between the Carlier and Richelson (Mayo Clinic Jacksonville) research groups in 1998. The author was tasked to develop backup candidates of PRC200 in order to improve the pharmacokinetics of the lead compound. Due to confidentiality agreements, this work is not reported herein. / Ph. D.
508

An Evaluation of the School Choice Plan in Charlotte-Mecklenburg Schools and its Perceived Effects on Academic Achievement for all Students

Cline, Terry Lee 21 November 2006 (has links)
Does a student's ethnicity prevent equal levels of learning at an equal pace? Are schools required to teach all children effectively, regardless of their socio-economic status, gender, or ethnicity? Educators and researchers have long sought answers to these questions. For years, educators have been looking for ways to teach children in schools that are racially identifiable and have the highest percentages of children on free and reduced-price lunch. School districts that use choice as a way of assigning students are increasing the number of racially identifiable schools. In Charlotte-Mecklenburg Schools, a choice plan was implemented in June 2001. That plan created more schools of poverty within the district. The district also offered additional resources, teacher incentives, and financial assistance as a way to balance the student make-up of the school district and the individual schools at all levels. / Ed. D.
509

Model-based Tests for Standards Evaluation and Biological Assessments

Li, Zhengrong 27 September 2007 (has links)
Implementation of the Clean Water Act requires agencies to monitor aquatic sites on a regular basis and evaluate the quality of these sites. Sites are evaluated individually even though there may be numerous sites within a watershed. In some cases, sampling frequency is inadequate and the evaluation of site quality may have low reliability. This dissertation evaluates model-based testing procedures for the determination of site quality that allow other sites to contribute information to the data from the test site. Test procedures are described for situations that involve multiple measurements from sites within a region, and for single measurements when stressor information is available or when covariates are used to account for individual site differences. Tests based on analysis-of-variance methods are described for fixed effects and random effects models. The proposed model-based tests compare limits (tolerance limits or prediction limits) for the data with the known standard. When the sample size for the test site is small, using model-based tests improves the detection of impaired sites. The effects of sample size, heterogeneity of variance, and similarity between sites are discussed. Reference-based standards and the corresponding evaluation of site quality are also considered. Regression-based tests provide methods for incorporating information from other sites when there is information on stressors or covariates. Extension of some of the methods to multivariate biological observations and stressors is also discussed. Redundancy analysis is used as a graphical method for describing the relationship between biological metrics and stressors. A clustering method for finding stressor-response relationships is presented and illustrated using data from the Mid-Atlantic Highlands. Multivariate elliptical and univariate regions for assessment of site quality are discussed. / Ph. D.
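A prediction-limit test of the kind described above can be sketched as follows (a hypothetical illustration with made-up measurements, not the dissertation's exact procedure): compute a one-sided upper prediction limit for a single future observation from pooled site data, then compare it with the known standard.

```python
import numpy as np
from scipy import stats

# Upper prediction limit for one future observation from a normal sample
# (standard textbook formula); the limit is then compared with a known
# water-quality standard. Data values are invented for illustration.
def upper_prediction_limit(pooled, alpha=0.05):
    n = len(pooled)
    m, s = pooled.mean(), pooled.std(ddof=1)
    t = stats.t.ppf(1 - alpha, df=n - 1)
    return m + t * s * np.sqrt(1 + 1 / n)

pooled = np.array([4.1, 3.8, 4.5, 4.0, 4.3, 3.9, 4.2])  # pooled site data
standard = 6.0                                          # known standard
limit = upper_prediction_limit(pooled)
compliant = limit < standard   # prediction limit falls below the standard
```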
510

Numerical Analysis for Data-Driven Reduced Order Model Closures

Koc, Birgul 05 May 2021 (has links)
This dissertation contains work that addresses both theoretical and numerical aspects of reduced order models (ROMs). In an under-resolved regime, the classical Galerkin reduced order model (G-ROM) fails to yield accurate approximations. Thus, we propose a new ROM, the data-driven variational multiscale ROM (DD-VMS-ROM), built by adding a closure term to the G-ROM, aiming to increase the numerical accuracy of the ROM approximation without decreasing the computational efficiency. The closure term is constructed based on the variational multiscale framework. To model the closure term, we use data-driven modeling: using the available data, we find ROM operators that approximate the closure term. To illustrate the closure term's effect on the ROMs, we numerically compare the DD-VMS-ROM with other standard ROMs. In numerical experiments, we show that the DD-VMS-ROM is significantly more accurate than the standard ROMs. Furthermore, to understand the closure term's physical role, we present a theoretical and numerical investigation of its role in long-time integration. We theoretically prove and numerically show that, under long-time averaging, the closure term transfers energy from the most energetic modes to the least energetic modes. One of the main contributions of this dissertation is providing the numerical analysis of the data-driven closure model, which has not been studied before. At both the theoretical and the numerical level, we investigate which conditions guarantee that a small difference between the data-driven closure model and the full order model (FOM) closure term implies that the approximate solution is close to the FOM solution. In other words, we perform theoretical and numerical investigations to show that the data-driven model is verifiable. Apart from studying the ROM closure problem, we also investigate the setting in which the G-ROM converges optimally.
We explore the ROM error bounds' optimality by considering the difference quotients (DQs). We theoretically prove and numerically illustrate that both the ROM projection error and the ROM error are suboptimal without the DQs, and optimal if the DQs are used. / Doctor of Philosophy / In many realistic applications, obtaining an accurate approximation to a given problem can require a tremendous number of degrees of freedom. Solving these large systems of equations can take days or even weeks on standard computational platforms. Thus, lower-dimensional models, i.e., reduced order models (ROMs), are often used instead. The ROMs are computationally efficient and accurate when the underlying system has dominant and recurrent spatial structures. Our contribution to reduced order modeling is adding a data-driven correction term, which carries important information and yields better ROM approximations. This dissertation's theoretical and numerical results show that the new ROM equipped with a closure term yields more accurate approximations than the standard ROM.
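The common starting point for the G-ROM discussed above, extracting a POD basis from snapshot data and projecting onto it, can be sketched as follows (synthetic snapshots; this is the generic ingredient, not the DD-VMS-ROM closure itself):

```python
import numpy as np

# POD basis extraction via the SVD of a snapshot matrix, then projection
# of the snapshots onto the low-dimensional subspace (generic sketch with
# synthetic data; not the dissertation's DD-VMS-ROM closure).
rng = np.random.default_rng(2)
n, n_snap, r = 100, 30, 5

# Synthetic snapshot matrix with dominant rank-r structure plus small noise
U0 = np.linalg.qr(rng.standard_normal((n, r)))[0]
snapshots = U0 @ rng.standard_normal((r, n_snap)) \
    + 1e-3 * rng.standard_normal((n, n_snap))

U, S, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :r]                          # first r POD modes

# Relative projection error of the snapshots onto the POD subspace
proj = basis @ (basis.T @ snapshots)
rel_err = np.linalg.norm(snapshots - proj) / np.linalg.norm(snapshots)
```

Because the snapshots are nearly rank-r, the relative projection error is on the order of the noise level; in an under-resolved regime (r too small for the true dynamics) this error grows, which is what motivates adding a closure term.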
