231

Agregatinių imitacinių modelių programinio kodo generavimas ir integravimas su duomenų baze / Source code generation for aggregate simulation models and their integration with database

Munčys, Darius 16 August 2007 (has links)
This thesis analyses the MDE approach to software engineering, as well as OMG's MDA and Microsoft DSL Tools. MS DSL Tools is a powerful extension to MS Visual Studio 2005 for defining domain-specific language metamodels and generating graphical editing environments for working with them. Text templates can then be created to transform a given model directly into source code. The work describes the developed aggregate modelling language metamodel and the text template language used to define model transformations. The research also analyses the possibility of using model transformations to generate code for data acquisition from a database.
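As a rough illustration of the text-template approach described in this abstract (a minimal sketch only, not the DSL Tools templates developed in the thesis; the toy model and names below are hypothetical), template-based code generation can look like this:

    # Minimal sketch of template-based code generation from a model.
    # The "aggregate" model is a plain dict; names are hypothetical.
    from string import Template

    entity_template = Template("CREATE TABLE ${name} (\n${columns}\n);")

    def generate_sql(model):
        # model: {"name": ..., "fields": [(field_name, sql_type), ...]}
        columns = ",\n".join(f"    {f} {t}" for f, t in model["fields"])
        return entity_template.substitute(name=model["name"], columns=columns)

    if __name__ == "__main__":
        aggregate = {"name": "Event", "fields": [("id", "INTEGER"), ("state", "TEXT")]}
        print(generate_sql(aggregate))

The same pattern — walk the model, substitute its properties into a text template — underlies generating either database schemas or simulation source code from a metamodel instance.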
232

Measurement of Carrier Fluid Viscosities for Oil Sand Extraction and Tailings Slurries

Smith, Jessie L Unknown Date
No description available.
233

Effect of Laminar Shear on the Aggregate Structure of Flocculant-dosed Kaolinite Slurries

Vaezi Ghobaeiyeh, Farid Unknown Date
No description available.
234

Seismic Performance of Moment Resisting Frame Members Produced from Lightweight Aggregate Concrete

Allington, Christopher James January 2003 (has links)
A total of 47 lightweight aggregate concrete columns were constructed from four different types of lightweight aggregate and provided with different quantities of transverse reinforcement. The specimens were tested under a monotonically increasing level of compressive axial load. The rate of load application was varied from pseudo-static to the rate of dynamic loading expected during a major seismic excitation. The results from the experimental testing of the column members were used to derive a theoretical stress-strain model to predict the behaviour of lightweight aggregate concrete members under imposed loads. The stress-strain model was derived to predict the response of both lightweight aggregate and conventional weight concretes with compressive strengths up to and including 100 MPa. The model was calibrated against the experimental results obtained in this study and against previously tested lightweight aggregate and conventional weight concrete columns. A series of pseudo-cyclic moment-curvature analyses were undertaken using the derived stress-strain model to predict the behaviour of lightweight aggregate concrete members subjected to axial load and flexure. The results were compared to the confinement requirements in the potential plastic hinge regions of column elements required by the New Zealand Concrete Structures Standard, NZS3101:1995. It was determined that the confinement requirements of NZS3101:1995 could be used to accurately determine the required quantity of transverse reinforcement for lightweight aggregate concrete members with a concrete density greater than 1700 kg/m3. A total of four lightweight aggregate concrete beam-column subassemblies were constructed and tested under reversed cyclic lateral loading. The results from these specimens indicate that the cyclic behaviour of the lightweight aggregate concrete was similar to that of conventional weight concrete. However, the bond capacity between the longitudinal reinforcement and the surrounding concrete was weaker than in previously tested conventional weight concrete members.
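For context, stress-strain models of this kind for confined concrete are commonly written in the form popularized by Mander, Priestley and Park; the expression below is shown only as a representative formulation of such a model, not the specific model derived in this thesis:

    f_c = \frac{f'_{cc}\, x\, r}{r - 1 + x^{r}}, \qquad
    x = \frac{\varepsilon_c}{\varepsilon_{cc}}, \qquad
    r = \frac{E_c}{E_c - E_{sec}}, \qquad
    E_{sec} = \frac{f'_{cc}}{\varepsilon_{cc}}

where f'_{cc} and ε_{cc} are the peak stress and corresponding strain of the confined concrete and E_c is the elastic modulus; the confinement provided by transverse reinforcement enters through f'_{cc} and ε_{cc}.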
235

Reduced Tillage and Living Mulches for Organic Vegetable Production

Hessler, Alex G 01 January 2013 (has links)
Reduced tillage and living mulches are alternative management strategies that can improve soil quality by minimizing disturbance and building soil organic matter. Weed suppression by these two practices alone is often insufficient to avoid crop yield losses, but their performance in an integrated system is not well understood. This project investigated the production of bell pepper (Capsicum annuum L.) and dry bean (Phaseolus vulgaris L.) in conventional tillage and strip tillage with a living mulch of teff (Eragrostis tef Zucc.) and Korean lespedeza (Kummerowia stipulacea Maxim.). Yields of pepper and bean were generally higher under conventional tillage without living mulch. Weed biomass was not influenced by tillage, and was adequately suppressed by teff in a year when conditions for teff growth were favorable. Mowing appeared to suppress weed growth but not living mulch growth. Soil nitrate and ammonium concentration was generally higher under conventional tillage without living mulch. Delaying living mulch seeding by 15 days after crop establishment generally did not affect weed suppression or crop yield. Soil aggregate stability was not consistently affected by tillage or living mulch. Increased mowing frequency, living mulch planting delay, or distance between the crop row and the living mulch may be necessary to achieve acceptable yields with living mulches.
236

Essays on Insurance Economics

Wang, Jinjing 11 August 2015 (has links)
This dissertation addresses how aggregate shocks affect insurance firms' risk management and asset investment decisions, as well as the impact of these decisions on insurance prices and regulation. The first chapter develops a signaling model to examine how insurance firms choose among retention, reinsurance and securitization, especially for catastrophe risks. The second chapter examines the determination of insurance prices in an integrated equilibrium framework where insurers' assets may be subject to both idiosyncratic and aggregate shocks. The third chapter presents an empirical analysis of the hypothesized impacts of internal capital and asset risk on insurance prices as predicted by the results of the second chapter. The last chapter investigates the optimal design of insurance regulation to achieve the Pareto optimal asset and liquidity management by insurers as well as risk sharing between insurers and insurees. Chapter 1 provides a novel explanation for the predominance of retention and reinsurance relative to securitization in catastrophe risk transfer using a signaling model. An insurer's risk transfer choice trades off the lower signaling costs of reinsurance against the additional costs of reinsurance stemming from sources such as reinsurers' market power, their higher cost of capital relative to capital markets, and compensation for their monitoring costs. In equilibrium, the lowest risk insurers choose reinsurance, while intermediate and high risk insurers choose partial and full securitization, respectively. An increase in the loss size increases the average risk of insurers who choose securitization. Consequently, catastrophe risks, which are characterized by low frequency-high severity losses, are only securitized by very high risk insurers. Chapter 2 develops a unified equilibrium model of competitive insurance markets where insurers' assets may be exposed to idiosyncratic and aggregate shocks. We endogenize the relationship between insurance prices and insurers' internal capital, which potentially reconciles the conflicting predictions of previous theories that investigate this relation using partial equilibrium frameworks. Equilibrium effects lead to a non-monotonic, U-shaped relation between insurance price and internal capital: the equilibrium insurance price first decreases with a positive shock to internal capital when it is below a certain threshold level, and then increases with a positive shock to internal capital when it is above the threshold level. We also derive a further testable implication: an increase in asset default risk increases the insurance price and decreases insurance coverage. Chapter 3 studies the property and casualty insurance industry over the period from 1992 to 2012 using aggregate-level NAIC data. We show that the insurance price decreases with an increase in the surplus of insurance firms at the end of the previous year when the surplus is lower than 8.5 billion, and then increases when the surplus is higher than 8.5 billion. Our results support the hypothesis of a U-shaped relationship between internal capital and insurance price, and provide evidence for a positive relationship between asset portfolio risk and insurance price. Chapter 4 studies the effects of aggregate risk on the Pareto optimal asset and liquidity management by insurers as well as risk sharing between insurers and insurees.
When aggregate risk is low, both insurees and insurers hold no liquidity reserves, insurees are fully insured, and insurers bear all aggregate risk. When aggregate risk takes intermediate values, both insurees and insurers still hold no liquidity reserves, but insurees partially share aggregate risk with insurers. When aggregate risk is high, however, it is optimal to hold nonzero liquidity reserves, and insurees partially share aggregate risk with insurers. The efficient asset and liquidity management policies, as well as the aggregate risk allocation, can be implemented through a regulatory intervention policy that combines a minimum liquidity requirement when aggregate risk is high, comprehensive insurance policies contingent ex post on the aggregate state, and reinsurance.
237

Credibility modeling with applications

Khapaeva, Tatiana 16 May 2014 (has links)
The purpose of this thesis is to show how the theory and practice of credibility can benefit statistical modeling. The task was, fundamentally, to derive models that could provide the best estimate of the losses for any given class and also to assess the variability of the losses, both from a class perspective as well as from an aggregate perspective. The model fitting and diagnostic tests will be carried out using standard statistical packages. A case study that predicts the number of deaths due to cancer is considered, utilizing data furnished by the Colorado Department of Public Health and Environment. Several credibility models are used, including Bayesian, Bühlmann and Bühlmann-Straub approaches, which are useful in a wide range of actuarial applications.
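As background on the Bühlmann approach mentioned above, the sketch below computes the classical Bühlmann credibility estimate Z·(class mean) + (1 − Z)·(overall mean) with Z = n/(n + k); it is illustrative only, using hypothetical numbers rather than the Colorado cancer data analysed in the thesis:

    # Minimal sketch of the classical Buhlmann credibility estimator.
    # losses[i][j] = observation j for class i; the numbers are hypothetical.
    import numpy as np

    def buhlmann_estimates(losses):
        losses = np.asarray(losses, dtype=float)
        n_classes, n_obs = losses.shape
        class_means = losses.mean(axis=1)
        overall_mean = class_means.mean()

        # Expected process variance: within-class variance, averaged over classes.
        epv = losses.var(axis=1, ddof=1).mean()
        # Variance of hypothetical means: between-class variance, bias-corrected.
        vhm = max(class_means.var(ddof=1) - epv / n_obs, 0.0)

        k = epv / vhm if vhm > 0 else np.inf   # Buhlmann credibility parameter
        z = n_obs / (n_obs + k)                # credibility factor Z = n / (n + k)
        return z * class_means + (1 - z) * overall_mean

    if __name__ == "__main__":
        sample = [[10, 12, 11], [30, 28, 35], [18, 20, 19]]
        print(buhlmann_estimates(sample))

Each class estimate is pulled toward the overall mean in proportion to how noisy the within-class data are relative to the spread between classes.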
238

An Investigation of the Optimal Sample Size, Relationship between Existing Tests and Performance, and New Recommended Specifications for Flexible Base Courses in Texas

Hewes, Bailey 03 October 2013 (has links)
The purpose of this study was to improve flexible base course performance within the state of Texas while reducing TxDOT’s testing burden. The focus of this study was to revise the current specification with the intent of providing a “performance related” specification while optimizing sample sizes and testing frequencies based on material variability. A literature review yielded information on base course variability within and outside the state of Texas, and on what tests other states, and Canada, are currently using to characterize flexible base performance. A sampling and testing program was conducted at Texas A&M University to define current variability information, and to conduct performance related tests including resilient modulus and permanent deformation. In addition to these data being more current, they are more representative of short-term variability than data obtained from the literature. This “short-term” variability is considered more realistic for what typically occurs during construction operations. A statistical sensitivity analysis (based on the 80th percentile standard deviation) of these data was conducted to determine minimum sample sizes for contractors to qualify for the proposed quality monitoring program (QMP). The required sample sizes for contractors to qualify for the QMP are 20 for gradation, compressive strength, and moisture-density tests, 15 for Atterberg Limits, and 10 for Wet Ball Mill. These sample sizes are based on a minimum 25,000 ton stockpile, or “lot”. After qualifying for the program, if contractors can prove their variability is better than the 80th percentile, they can reduce their testing frequencies. The sample size for TxDOT’s verification testing is 5 samples per lot and will remain at that number regardless of reduced variability. Once qualified for the QMP, a contractor may continue to send material to TxDOT projects until a failing sample disqualifies the contractor from the program. TxDOT does not currently require washed gradations for flexible base. Dry and washed sieve analyses were performed during this study to investigate the need for washed gradations. Statistical comparisons of these data yielded strong evidence that TxDOT should always use a washed method. Significant differences between the washed and dry methods were determined for the percentage of material passing the No. 40 and No. 200 sieves. Since TxDOT already specifies limits on the fraction of material passing the No. 40 sieve, and since this study yielded evidence of that size fraction having a relationship with resilient modulus (performance), it would be beneficial to use a washed sieve analysis and therefore obtain a more accurate reading for that specification. Furthermore, it is suggested that TxDOT require contractors to establish “target” test values and to place 90 percent within limits (90PWL) bands around those target values to control material variability.
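For illustration, a sample-size calculation of the general type described above (estimating a mean to within a tolerance at a given confidence level, using the 80th-percentile standard deviation as a conservative variability estimate) might be sketched as follows; this is an assumption about the form of the calculation, not necessarily the exact procedure used in the study, and all numbers are hypothetical:

    # Sketch of a sample-size calculation based on the 80th-percentile standard deviation.
    # Illustrative only; the tolerance, confidence level, and data are hypothetical.
    import numpy as np
    from scipy.stats import norm

    def required_sample_size(stockpile_stdevs, tolerance, alpha=0.05):
        sigma_80 = np.percentile(stockpile_stdevs, 80)   # conservative variability estimate
        z = norm.ppf(1 - alpha / 2)                      # two-sided critical value
        n = (z * sigma_80 / tolerance) ** 2              # n to estimate the mean within +/- tolerance
        return int(np.ceil(n))

    if __name__ == "__main__":
        # Hypothetical standard deviations of percent passing the No. 40 sieve.
        stdevs = [1.8, 2.1, 2.5, 1.6, 2.9, 2.2]
        print(required_sample_size(stdevs, tolerance=1.5))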
239

Copulas for credit derivative pricing and other applications.

Crane, Glenis Jayne January 2009 (has links)
Copulas are multivariate probability distributions, as well as functions which link marginal distributions to their joint distribution. These functions have been used extensively in finance and more recently in other disciplines, for example hydrology and genetics. This study has two components: (a) the development of copula-based mathematical tools for use in all industries, and (b) the application of distorted copulas in structured finance. In the first part of this study, copula-based conditional expectation formulae are described and applied to small data sets from medicine and hydrology. In the second part of this study we develop a method of improving the estimation of default risk in the context of collateralized debt obligations. Credit risk is a particularly important application of copulas, and given the current global financial crisis, there is great motivation to improve the way these functions are applied. We compose distortion functions with copula functions in order to obtain greater flexibility and accuracy in existing pricing algorithms. We also describe an n-dimensional dynamic copula, which takes into account temporal and spatial changes. / Thesis (Ph.D.) - University of Adelaide, School of Mathematical Sciences, 2009
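As a rough illustration of the copula machinery discussed above, the sketch below simulates correlated defaults with a standard one-factor Gaussian copula; it is not the distorted-copula pricing algorithm developed in the thesis, and all parameters are hypothetical:

    # Minimal sketch: correlated default indicators via a one-factor Gaussian copula.
    # Standard textbook construction; parameters are hypothetical.
    import numpy as np
    from scipy.stats import norm

    def simulate_defaults(n_names=100, n_sims=10_000, rho=0.3, p_default=0.02, seed=0):
        rng = np.random.default_rng(seed)
        # Latent variable X_i = sqrt(rho)*M + sqrt(1-rho)*Z_i (common factor M, idiosyncratic Z_i).
        m = rng.standard_normal((n_sims, 1))
        z = rng.standard_normal((n_sims, n_names))
        x = np.sqrt(rho) * m + np.sqrt(1 - rho) * z
        # Name i defaults when its latent variable falls below the default threshold.
        threshold = norm.ppf(p_default)
        defaults = x < threshold
        return defaults.sum(axis=1)   # portfolio default counts per simulation

    if __name__ == "__main__":
        counts = simulate_defaults()
        print("mean defaults:", counts.mean(), "99th percentile:", np.percentile(counts, 99))

Distorting the copula, as proposed in the thesis, amounts to reshaping the dependence structure that this common-factor construction imposes on the joint default distribution.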
240

Einfluss der Reaktionen verschiedener Zementhauptbestandteile auf den Alkalihaushalt der Porenlösung des Zementsteins / Influence of the reactions of different main cement constituents on the alkali balance of the pore solution of hardened cement paste

Schäfer, Elke. January 2006 (has links)
Techn. Universität Clausthal, Diss., 2004.
