211

Measurement of arsenobetaine and arsenocholine in fish tissue by fast atom bombardment mass spectrometry

Zimmerman, Michael L. 08 1900
M.S. / Environmental Science / A technique to measure arsenobetaine and arsenocholine in fish tissue by fast atom bombardment mass spectrometry was developed, with particular attention to quantitative analysis. Experiments were performed that demonstrate analysis of the compounds desorbed directly from thin-layer silica chromatography matrices, quantitative analysis of arsenobetaine in real fish samples, and accurate mass measurement of arsenobetaine in normal FAB/MS using peaks from the glycerol matrix as mass references. Improvements to the technique for quantitative measurement of these important arsenic metabolites are suggested, including optimization of the extraction/isolation procedures and use of isotopically labelled internal standards or surrogates for more accurate measurements.
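The internal-standard approach suggested in the abstract reduces to a simple intensity-ratio calculation. The sketch below illustrates that arithmetic under stated assumptions; the function name, peak intensities, spike amount, and response factor are hypothetical examples, not values from the thesis.

```python
# Illustrative sketch (not from the thesis): quantifying arsenobetaine from a
# FAB/MS spectrum using an isotopically labelled internal standard. All
# numerical values below are invented placeholders.

def quantify_by_internal_standard(analyte_intensity, standard_intensity,
                                  standard_amount_ng, response_factor=1.0):
    """Return the analyte amount (ng) from the analyte/standard intensity ratio.

    response_factor corrects for any difference in ionization efficiency
    between the analyte and the labelled standard (assumed ~1 for an
    isotopologue of the analyte).
    """
    return (analyte_intensity / standard_intensity) * standard_amount_ng / response_factor

# Example: 100 ng of labelled standard spiked into the tissue extract.
amount_ng = quantify_by_internal_standard(analyte_intensity=4.2e4,
                                          standard_intensity=3.5e4,
                                          standard_amount_ng=100.0)
sample_mass_g = 0.50  # wet weight of fish tissue extracted (hypothetical)
print(f"Arsenobetaine: {amount_ng / sample_mass_g:.1f} ng/g wet weight")
```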
212

Development of imaging methods to quantify the laminar microstructure in rat hearts

Hudson, Kristen Kay 15 November 2004
The way in which the myocardium responds to its mechanical environment must be understood in order to develop reasonable treatments for congestive heart failure. The first step toward this understanding is to characterize and quantify the cardiac microstructure in healthy and diseased hearts. Myocardium has a laminar architecture made up of myolaminae, which are sheets of myocytes surrounded by a collagen weave. By enhancing the contrast between the myocytes and the surrounding collagen, the myocardium can be investigated and its laminar structure quantified. Many of the techniques that have been used to view the microstructure of the heart require toxic or caustic chemicals for fixation or staining. In this research, an efficient imaging method was developed that uses polarization microscopy to enhance the contrast between the collagen and myocytes while minimizing the use of harmful chemicals. Collagen is birefringent; therefore, its visibility should be enhanced through polarization microscopy and image processing. The sheet angles were viewed directly by cutting slices of a rat septum perpendicular to the fiber angle. Images at different polarization combinations were taken and a region of interest was selected on the sample. Image processing techniques were used to reduce the intensity variation in the images and to account for the variable gain of the camera. The contrast between the collagen and myocytes was enhanced by comparing the adjusted images to the background and examining the single image this comparison produced. Although the contrast was enhanced, the embedding medium reduced the collagen signal and the enhancement was not as striking as expected.
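The gain-correction and background-comparison steps described above follow the general pattern of a flat-field style normalization. The sketch below shows one plausible version of that pipeline; the array names, dark-frame handling, and percentile stretch are assumptions for illustration, not the thesis's actual processing chain.

```python
# A minimal sketch, assuming a flat-field style correction followed by a
# ratio-to-background comparison; inputs are hypothetical NumPy image arrays.
import numpy as np

def correct_and_contrast(sample_img, background_img, dark_img):
    """Normalize out lamp/camera gain variation, then express the sample as a
    ratio to the background so that birefringent collagen stands out."""
    sample = sample_img.astype(float) - dark_img        # remove camera offset
    background = background_img.astype(float) - dark_img
    background[background == 0] = np.finfo(float).eps   # avoid division by zero
    ratio = sample / background                          # adjusted image vs. background
    # Stretch the ratio to 0..1 for display within the region of interest.
    lo, hi = np.percentile(ratio, (1, 99))
    return np.clip((ratio - lo) / (hi - lo), 0.0, 1.0)

# Usage idea: compute this ratio image for several polarizer/analyzer settings
# and combine them (e.g., per-pixel maximum) to highlight the collagen sheets.
```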
213

Quantitative transportation risk analysis based on available data/databases: decision support tools for hazardous materials transportation

Qiao, Yuanhua 17 September 2007
Historical evidence has shown that incidents involving hazardous materials (HazMat) releases during transportation can lead to severe consequences. The public and agencies such as the Department of Transportation (DOT) show increasing concern about the hazards associated with HazMat transportation. Many hazards can be identified and controlled or eliminated through risk analysis. Transportation Risk Analysis (TRA) is a powerful tool in a HazMat transportation decision support system. It helps in choosing among alternative routes by providing information on the risks associated with each route, and in selecting appropriate risk reduction alternatives by demonstrating the effectiveness of each. Some methodologies have been developed to assess transportation risk; however, most of them are hard for decision or policy makers to employ directly. One major barrier is the mismatch between the available data/databases and the numerical methodologies for TRA. In this work, methodologies to assess transportation risk are developed based on the availability of data or databases, and the match between the available data/databases and numerical TRA methodologies is pursued. Each risk component, including frequency, release scenario, and consequence, is assessed from the available data/databases, and the risk is computed by numerical algorithms step by step across the transportation network. Based on the TRA results, decisions on HazMat transportation can be made appropriately and reasonably. The combination of recent interest in expanding or building new facilities to receive liquefied natural gas (LNG) carriers, along with increased awareness and concern about potential terrorist action, has raised questions about the potential consequences of incidents involving LNG transportation. One of those consequences, rapid phase transition (RPT), is studied in this dissertation. Incidents and experiments involving LNG-water RPT and theoretical analyses of the RPT mechanism are reviewed. Other consequences, such as pool spread and vapor cloud dispersion, are analyzed using the Federal Energy Regulatory Commission (FERC) model.
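Segment-by-segment risk summation is the standard arithmetic behind route comparison in TRA. The following sketch shows that calculation in its simplest form; the accident rates, release probabilities, and consequence figures are invented placeholders, not data or results from the dissertation.

```python
# A hedged sketch of per-segment route risk: expected consequence per shipment
# = sum over segments of (length * accident rate * P(release | accident)
#   * expected consequence per release). All numbers are illustrative.

def route_risk(segments):
    """segments: list of (length_mi, accident_rate_per_mi,
    p_release_given_accident, expected_consequence)."""
    return sum(length * rate * p_release * consequence
               for length, rate, p_release, consequence in segments)

rural_route = [
    # (miles, accidents/mile, P(release|accident), persons exposed per release)
    (120.0, 1.0e-6, 0.08, 40.0),
    (15.0,  3.0e-6, 0.10, 600.0),
]
urban_route = [
    (60.0, 2.5e-6, 0.10, 2500.0),
]
print(f"rural route risk: {route_risk(rural_route):.2e}")
print(f"urban route risk: {route_risk(urban_route):.2e}")
```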
214

The proposed resilience analysis methodology and its application to the SaskWater pumping station

Gao, Fei 14 April 2010
Resilience engineering emerged in the last decade as a new approach to both system design and system safety. One of the first substantive publications on resilience as applied to engineering was Resilience Engineering: Concepts and Precepts [Hollnagel et al. 2006]. Hollnagel, Woods, and Leveson developed the basic concepts behind resilience engineering in order to understand and prevent tragedies such as the Columbia and Challenger accidents and the September 11 terrorist attacks.

At its present stage, resilience engineering has several fundamental problems: (1) there is no appropriate definition of resilience; (2) the differences between resilience and other similar concepts are not clarified; and (3) there is no quantitative method for measuring resilience. These three questions need to be addressed in order to advance resilience engineering from a theoretical concept to an applied science, and they form the foundation of this thesis.

As a first step, a definition of resilience is presented based on the concepts of system function and damage. Then, the differences between resilience and five similar concepts (reliability, robustness, repairing, redundancy, and sustainability) are elaborated. As a last step, a method for quantifying resilience is proposed in the form of a resilience index. This method measures system resilience by analyzing the system's recoverability from two points of view: reconfiguration and replacement of components.

To illustrate the approach to and definitions of resilience, an actual application is considered: a water pumping station operated by SaskWater in Saskatoon, Saskatchewan (the Clarence Booster Station). This pumping station is a complicated system consisting of mechanical, electrical, and chemical subsystems. The resilience of the Clarence Booster Station is analyzed using the proposed definition of resilience and the resilience index.

This thesis is an initial step toward establishing a comprehensive definition (qualitative and quantitative) of resilience. The resilience index defined in this work appears to have potential, but much more scrutiny and refinement must be pursued to ensure that it is truly applicable to more universal engineering applications.
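To make the idea of a recoverability-based index concrete, the sketch below scores each component by whether its function can be recovered through reconfiguration or through replacement within an allowed outage window. This is an illustrative assumption about how such an index could be computed, not the formula defined in the thesis, and the station components and numbers are hypothetical.

```python
# Illustrative resilience index (not the thesis's definition): each component
# contributes the share of system function it carries, counted as recoverable
# if the system can be reconfigured around it or if it can be replaced within
# the allowed outage time.

def resilience_index(components, allowed_outage_hr):
    """components: list of (function_share, recoverable_by_reconfig,
    replacement_time_hr). Returns a value in [0, 1]."""
    index = 0.0
    for share, reconfigurable, replacement_hr in components:
        if reconfigurable:
            index += share                       # recovered by reconfiguration
        elif replacement_hr <= allowed_outage_hr:
            index += share                       # recovered by timely replacement
    return index

station = [
    (0.4, True, 0.0),    # duty pump: standby unit can take over (reconfiguration)
    (0.4, False, 12.0),  # motor control centre: spare on site, ~12 h swap
    (0.2, False, 96.0),  # chemical dosing skid: long-lead replacement
]
print(resilience_index(station, allowed_outage_hr=24.0))  # -> 0.8
```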
215

The Quantitative Genetics of Mate Choice Evolution: Theory and Empiricism

Ratterman, Nicholas 1981- 14 March 2013
The evolution of mate choice remains one of the most controversial topics within evolutionary biology. In particular, the coevolutionary dynamics between ornaments and mating preferences have been extensively studied, but few generalizations have emerged. From a theoretical standpoint, the nature of the genetic covariance built up by the process of mate choice has received considerable attention, though the models still make biologically unrealistic assumptions. Empirically, the difficulty of estimating parameters in the models has hindered our ability to understand what processes are occurring in nature. The goal of this dissertation is therefore to contribute to the field both theoretically and empirically. I begin with a review of the evolution of mate choice and demonstrate how the lack of cross-talk between theoretical and empirical pursuits in studying mate choice has constrained our ability to extract basic principles. The review is followed by a new model of intersexual selection that relaxes some of the critical assumptions inherent in sexual selection theory. Two empirical studies then measure mating preference functions and genetic correlations in a way that can be related back to theory. Finally, I conclude by setting the stage for future endeavors into exploring the evolution of mate choice. The results presented herein demonstrate four things: (i) a lack of communication between theoretical and empirical studies of mate choice; (ii) genetic drift plays a much larger role in preference evolution than previously demonstrated; (iii) genetic correlations other than those explicitly modeled are likely to be important in preference evolution; and (iv) variation in mating preferences can eliminate intersexual selection altogether. From these four findings it can be concluded that a tighter link between theory and empiricism is needed, with a particular emphasis on the importance of measuring individual-level preference functions. Models will benefit from integrating the specific phenotypes measured by empiricists, and experiments will be more useful to theory if particular attention is paid to the exact phenotypes that are measured. Overall, this dissertation is a stepping stone toward a more cohesive and accurate understanding of mate choice evolution.
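An individual-level preference function of the kind emphasized above is usually estimated by regressing a chooser's response on the ornament values it is shown. The sketch below fits a quadratic curve to one such data set; the trait values, responses, and quadratic form are fabricated for illustration and are not the dissertation's actual assay or analysis.

```python
# A minimal sketch of measuring one individual's preference function:
# response = b0 + b1*trait + b2*trait^2, fit by least squares. Data invented.
import numpy as np

ornament = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # male trait values presented
response = np.array([0.1, 0.4, 0.9, 1.0, 0.8, 0.5])   # one female's responsiveness

b2, b1, b0 = np.polyfit(ornament, response, deg=2)    # highest-order coefficient first
peak = -b1 / (2.0 * b2)   # most-preferred trait value if the curve is concave
print(f"preference function: {b0:.2f} + {b1:.2f}*x + {b2:.2f}*x^2, peak at x = {peak:.2f}")
```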
216

Calibration and Model Uncertainty of a Two-Factor Mean-Reverting Diffusion Model for Commodity Prices

Chuah, Jue Jun January 2013
With the development of various derivative instruments and index products, commodities have become a distinct asset class that can offer enhanced diversification benefits to the traditional asset allocation of stocks and bonds. In this thesis, we begin by discussing some of the key properties of commodity markets that distinguish them from bond and stock markets. Then, we consider the informational role of commodity futures markets. Since commodity prices exhibit mean-reverting behaviour, we also review several mean-reversion models commonly used to capture and describe the dynamics of commodity prices. In Chapter 4, we focus on a two-factor mean-reverting model proposed by Hikspoors and Jaimungal, which adds an additional degree of randomness to the long-run mean level. They also suggested a method to extract the implied market prices of risk after estimating both the risk-neutral and real-world parameters from the calibration procedure. Given the usefulness of this model, we are motivated to investigate the robustness of the calibration process by applying the methodology to simulated data. The capability to produce stable and accurate parameter estimates is assessed by selecting various initial guesses for the optimization process. Our results show that the calibration method has considerable difficulty estimating the volatility and correlation parameters of the model. Moreover, we demonstrate that the multiple solutions obtained from the calibration process lead to model uncertainty in extracting the implied market prices of risk. Finally, using historical crude oil data from the same time period, we compare our calibration results with those obtained by Hikspoors and Jaimungal.
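Testing a calibration routine on simulated data, as described above, requires generating paths from a model with known parameters. The sketch below simulates a generic two-factor mean-reverting model of the same broad form (log price reverting to a stochastic long-run level that itself mean-reverts); the Euler discretization, parameter values, and notation are illustrative assumptions rather than the specification or estimates used in the thesis.

```python
# Sketch of simulated data for exercising a calibration routine; all parameter
# values are placeholders, not estimates from the thesis.
import numpy as np

def simulate_two_factor(x0, y0, kappa_x, kappa_y, theta, sigma_x, sigma_y,
                        rho, dt, n_steps, seed=0):
    """Euler scheme for
        dX_t = kappa_x * (Y_t - X_t) dt + sigma_x dW_t   (log spot price)
        dY_t = kappa_y * (theta - Y_t) dt + sigma_y dZ_t (stochastic long-run level)
    with corr(dW, dZ) = rho. Returns arrays X, Y of length n_steps + 1."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    y = np.empty(n_steps + 1)
    x[0], y[0] = x0, y0
    for t in range(n_steps):
        dw = rng.standard_normal()
        dz = rho * dw + np.sqrt(1.0 - rho**2) * rng.standard_normal()
        x[t + 1] = x[t] + kappa_x * (y[t] - x[t]) * dt + sigma_x * np.sqrt(dt) * dw
        y[t + 1] = y[t] + kappa_y * (theta - y[t]) * dt + sigma_y * np.sqrt(dt) * dz
    return x, y

# Example: daily steps over roughly four years with placeholder parameters.
log_spot, long_run = simulate_two_factor(x0=4.0, y0=4.0, kappa_x=2.0, kappa_y=0.5,
                                         theta=4.1, sigma_x=0.4, sigma_y=0.2,
                                         rho=0.3, dt=1/252, n_steps=4*252)
```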
218

An Economic Cycle-based Multi-factor Alpha Model with Application in the Taiwan Market

TSENG, Miao-lien 11 August 2012
This study aims to find an effective linear combination of factors in different economic cycle periods and then constructs two factor-timing multi-factor alpha models, one each for the expansion and contraction periods. We then examine two further effects, the calendar effect and the cross effect. The calendar periods are divided into the first half of the year and the second half of the year, and the cross effect is the combination of the economic cycle and the calendar effect. In addition, this study places different loadings on core and satellite descriptors, so as to examine which descriptors are more important when the portfolio is rebalanced weekly. The empirical results show that the Value factor is effective in expansions and the first half of the year, and that the Size and Earnings Quality factors are effective in contractions and the second half of the year. Moreover, the Price Momentum and Trading Activity factors are effective most of the time. We find that the optimal weight for core descriptors is 0.3 and for satellite descriptors is 0.7. Finally, the information ratios of our models are superior to those of the Standard alpha model of Hsu et al. (2011) and the Market Trend-based alpha model of Wang (2011). Taking the AMCross model as an example, when the tracking error is constrained below 3%, the IR is 1.40, the active return is 3.09%, the tracking error is 2.20%, the turnover rate is 207%, and the transaction costs are 1.2%.
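The headline numbers quoted for the AMCross model follow directly from the definition of the information ratio (active return divided by tracking error), and the reported 0.3/0.7 core/satellite split can be expressed as a simple weighted blend of descriptor scores. The sketch below reproduces that arithmetic; the descriptor names and example scores are hypothetical.

```python
# Reproducing the reported IR and illustrating the core/satellite blend.

def information_ratio(active_return, tracking_error):
    return active_return / tracking_error

print(information_ratio(0.0309, 0.0220))   # ~1.40, matching the AMCross figures

def composite_alpha_score(core_scores, satellite_scores,
                          core_weight=0.3, satellite_weight=0.7):
    """Equal-weight the descriptors inside each group, then blend the groups
    with the reported core/satellite weights."""
    core = sum(core_scores) / len(core_scores)
    satellite = sum(satellite_scores) / len(satellite_scores)
    return core_weight * core + satellite_weight * satellite

# Example: standardized (z-score) descriptor exposures for one stock (invented).
print(composite_alpha_score(core_scores=[0.8, 0.2],         # e.g., value, size
                            satellite_scores=[1.1, -0.3]))  # e.g., momentum, trading activity
```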
219

Risk Assessment and Validation of Building Performance-based Fire Engineering Designs

Wu, Wei-shuo 23 June 2006
With the significant economic growth in Taiwan in recent years, buildings have been constructed taller and with more diversified usage. The Building Code has had difficulty keeping pace with this rapid change, and, compounded at times by design negligence, fire accidents are not uncommon in this country. Past experience shows that casualties occur mainly because of smoke hazards and inadequate time for egress; quantitative assessment of these two items has therefore become increasingly important, and it is the main theme of this study. At the design stage, the local fire code, which is prescriptive in nature, imposes many constraints, especially for buildings with large spaces, atria, or malls. In such cases, performance-based design is often attempted, followed by quantitative risk assessment to validate its effectiveness. In this study, the ABRI Manual for fire risk assessment was applied, followed by the F method for a comparative study. The results indicate that both methods can serve as reliable tools for fire risk assessment and warrant application in engineering projects.
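Quantitative assessment of egress, as described above, typically comes down to checking that the time available before smoke makes conditions untenable exceeds the time occupants need to escape. The sketch below shows a generic check of that kind with invented numbers; it is a textbook-style illustration, not the ABRI Manual or F-method procedure applied in the thesis.

```python
# Generic ASET/RSET comparison: available safe egress time must exceed the
# required safe egress time. All inputs are hypothetical example values.

def rset_seconds(detection_s, alarm_s, pre_movement_s, travel_s, safety_factor=1.5):
    """Required safe egress time with a design safety factor applied."""
    return safety_factor * (detection_s + alarm_s + pre_movement_s + travel_s)

def design_is_acceptable(aset_s, rset_s):
    return aset_s > rset_s

rset = rset_seconds(detection_s=60, alarm_s=15, pre_movement_s=120, travel_s=180)
aset = 700.0   # e.g., a zone-model estimate of time until the smoke layer is untenable
print(rset, design_is_acceptable(aset, rset))   # 562.5 True
```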
