31

Programming and Conceptual Design Using Building Information Modeling

Avila, Mary-Alice 01 January 2009 (has links)
This thesis explores the benefits of using Building Information Modeling (BIM) during the programming and conceptual design phase of a project. The research was based on a case study of the decisions and assumptions made during the design phases of the Center for Science at Cal Poly San Luis Obispo, where the project team used a traditional approach to project plan development. The finding of this study was that the project process would have greatly benefited from utilizing BIM tools and a collaborative team approach in the programming and conceptual design phase. Because decisions made early in a project have enormous implications for aesthetics and cost, the additional analysis of design options afforded by BIM tools would have minimized inaccurate, incomplete and unreliable information, and allowed the design team to work in a more efficient, collaborative manner that carries through all phases of the project.
32

Robustness Against Non-Normality : Evaluating LDA and QDA in Simulated Settings Using Multivariate Non-Normal Distributions

Viktor, Gånheim, Isak, Åslund January 2023 (has links)
Evaluating classifiers in controlled settings is essential for empirical applications, as extensive knowledge of model behaviour is needed for accurate predictions. This thesis investigates the robustness against non-normality of two prominent classifiers, LDA and QDA. Through simulation, errors in leave-one-out cross-validation are compared for data generated by different multivariate distributions, also controlling for covariance structures, class separation and sample sizes. Unexpectedly, the classifiers perform better on data generated by heavy-tailed symmetrical distributions than by the normal distribution. Possible explanations are proposed, but the cause remains unknown. There is a need for further studies investigating more settings as well as mathematical properties to verify and understand these results.
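A minimal sketch of the kind of comparison this abstract describes, assuming scikit-learn and SciPy and using a multivariate t-distribution as the heavy-tailed example; the thesis's actual distributions, covariance structures and sample sizes are not reproduced here.

```python
import numpy as np
from scipy import stats
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                            QuadraticDiscriminantAnalysis)
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

def sample_class(mean, cov, n, df=None):
    """Draw n points from N(mean, cov) or, if df is given, a multivariate t."""
    if df is None:
        return rng.multivariate_normal(mean, cov, size=n)
    return stats.multivariate_t(loc=mean, shape=cov, df=df).rvs(size=n, random_state=0)

n, p = 50, 2
cov = np.eye(p)
sep = 1.5  # class separation along the first coordinate

for df in (None, 3):  # None = normal, 3 = heavy-tailed multivariate t
    X = np.vstack([sample_class(np.zeros(p), cov, n, df),
                   sample_class(np.array([sep] + [0.0] * (p - 1)), cov, n, df)])
    y = np.repeat([0, 1], n)
    for clf in (LinearDiscriminantAnalysis(), QuadraticDiscriminantAnalysis()):
        # Leave-one-out cross-validation error, as in the thesis's simulations.
        acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
        print(f"df={df}, {clf.__class__.__name__}: LOOCV error = {1 - acc:.3f}")
```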
33

The cross in the valley ; the history of the establishment of the Catholic church in the Northern San Joaquin Valley of California up to 1863

Bonta, Robert Eugene 01 January 1963 (has links) (PDF)
This thesis is the story of the development of the Catholic Church in San Joaquin County and the adjacent areas that were served by the pioneer clergymen of Stockton's St. Mary's Church from approximately 1851 to 1863. These first dozen years of Central California Catholicism cover the pastorates of Stockton's first four priests: Fathers Dominic Blaive, Cornelius Delahunty, Robert Maurice, and Joseph Gallagher. These dozen years mark the transition of Stockton from a hectic supply center for the Southern Mines to a stable community whose economy became based upon the agricultural production of its surrounding rural areas. These first four pastors, then, witnessed the development of the early American Catholic Church from its infancy as a mission when Abbe Blaive arrived in Stockton from his native France, to its maturity as a stable and respected community church under the spiritual direction of the American, Father Joseph Gallagher.
34

Assumption-Based Runtime Verification of Finite- and Infinite-State Systems

Tian, Chun 23 November 2022 (has links)
Runtime Verification (RV) is usually considered a lightweight automatic verification technique for the dynamic analysis of systems, where a monitor observes executions produced by a system and analyzes them against a formal specification. If the monitor is synthesized not only from the monitoring specification but also from extra assumptions on the system behavior (typically described by a model such as a transition system), then it may output more precise verdicts or even be predictive; meanwhile, it may no longer be lightweight, since monitoring under assumptions has the same computational complexity as model checking. When suitable assumptions come into play, the monitor may also support partial observability, where non-observable variables in the specification can be inferred from observables, either present or historical ones. Furthermore, the monitors are resettable, i.e. able to evaluate the specification at non-initial times of the executions while keeping memory of the input history. This helps in breaking the monotonicity of monitors, which, after reaching conclusive verdicts, can still change their future outputs by resetting their reference time. The combination of the above three characteristics (assumptions, partial observability and resets) in monitor synthesis is called Assumption-Based Runtime Verification, or ABRV. In this thesis, we give the formalism of the ABRV approach and a group of monitoring algorithms based on specifications expressed in Linear Temporal Logic with both future and past operators, involving Boolean and possibly other types of variables. When all involved variables have finite domains, the monitors can be synthesized as finite-state machines implemented by Binary Decision Diagrams. With infinite-domain variables, the infinite-state monitors are based on satisfiability modulo theories, first-order quantifier elimination and various model checking techniques. In particular, Bounded Model Checking is modified to do its work incrementally for efficiently obtaining inconclusive verdicts before IC3-based model checkers get involved. All the monitoring algorithms in this thesis are implemented in a tool called NuRV. NuRV supports online and offline monitoring, and can also generate standalone monitor code in various programming languages. In particular, monitors can be synthesized as SMV models, whose behavioral correctness and some other properties can be further verified by model checking.
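A toy illustration, in Python, of how a reset breaks monitor monotonicity; this is not NuRV's algorithm or output format, just a two-valued monitor for the safety property G(not error), whose violation verdict would otherwise be permanent.

```python
from enum import Enum

class Verdict(Enum):
    INCONCLUSIVE = "no violation so far"
    FALSE = "conclusive violation"

class SafetyMonitor:
    """Monitors G(not error): once an 'error' observation arrives, the verdict
    becomes FALSE and stays FALSE -- unless the monitor is reset, which restarts
    the evaluation of the property from the current point of the trace."""

    def __init__(self):
        self.verdict = Verdict.INCONCLUSIVE

    def observe(self, error: bool) -> Verdict:
        if error:
            self.verdict = Verdict.FALSE
        return self.verdict

    def reset(self) -> None:
        # Re-evaluate the specification from the current time onward.
        self.verdict = Verdict.INCONCLUSIVE

m = SafetyMonitor()
trace = [False, False, True, False, False]
for i, err in enumerate(trace):
    if i == 3:
        m.reset()  # after the violation, restart evaluation at time 3
    print(i, m.observe(err).value)
```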
35

Evaluations of SWEs and SPH numerical modelling techniques for dam break flows

Pu, Jaan H., Shao, Songdong, Huang, Y., Hussain, Khalid 19 November 2014 (has links)
The standard shallow water equations (SWEs) model is often considered to provide weak solutions to dam-break flows due to the assumptions of its depth-averaged shock-capturing scheme. In this study, an improved SWEs model using a recently proposed Surface Gradient Upwind Method (SGUM) is used to compute dam-break flows in the presence of a triangular hump. The SGUM allows the SWEs model to stably and accurately reproduce the highly complex shock currents caused by the dam-break event, as it improves the treatment of the SWEs numerical source terms, which is particularly crucial for simulating the wet/dry front interface of the dam-break flow. Besides, an Incompressible Smoothed Particle Hydrodynamics (ISPH) modelling technique is also employed in this study to compare with the performance of the SGUM-SWEs model. The SPH method is totally mesh-free and thus can efficiently track the large free-surface deformation. The ISPH approach uses a strictly incompressible two-step semi-implicit solution method. By reproducing a documented experimental dam-break flow, it was demonstrated that both models' simulation results gave good agreement with the experimental data at different measurement locations. However, the ISPH simulations showed a better prediction of the dam-break peak wave building-up time, where its superiority was demonstrated. Furthermore, the ISPH model could also predict more detailed flow surface profiles across the streamwise flow direction, as well as the velocity and pressure structures.
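For reference, a minimal statement of the depth-averaged model the abstract refers to: the one-dimensional shallow water equations in conservation form, with water depth h, depth-averaged velocity u, bed elevation z_b and friction slope S_f. The SGUM-specific treatment of the source terms on the right-hand side is detailed in the paper itself.

```latex
\frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} = 0,
\qquad
\frac{\partial (hu)}{\partial t}
  + \frac{\partial}{\partial x}\!\left(hu^{2} + \tfrac{1}{2} g h^{2}\right)
  = -\,g h \frac{\partial z_b}{\partial x} - g h S_f
```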
36

Analysis of atmospheric influences on ratio thermography for solar tower systems

Englin, Albin January 2022 (has links)
Knowledge of the temperature and emissivity of the receiver is critical for a solar tower power plant, in order to guarantee efficient operation of the thermal receiver on the one hand, while monitoring any degradation of the receiver coating on the other. To make these measurements, a new thermographic system is currently being developed, using a multispectral camera working in the short-wavelength infrared spectrum. This system applies the principle of ratio thermography, using a pair of narrow bandpass filters centered on atmospheric water absorption bands at 1.4 and 1.9 µm to reduce the influence of solar reflections on the measurement signal, which makes it sensitive to atmospheric conditions. In this thesis, a batch simulation approach is used to identify the boundary atmospheric and operating conditions necessary to achieve temperature errors below 2 %, minimizing the influence of solar reflection. Furthermore, the influence of atmospheric parameters on the sensitivity of ratio thermography is analyzed, in particular the validity of the gray body assumption. It is shown that the atmosphere has a critical influence on the measurement accuracy: a humid atmosphere and/or a high zenith angle is necessary for accurate measurements. Furthermore, only receiver temperatures above 450 °C could be measured for the current system configuration, regardless of atmospheric conditions. Assuming negligible solar reflections, the validity of the gray body assumption is shown to be sensitive to the precipitable water vapor. A model-based atmospheric compensation is therefore required to further improve the accuracy of ratio thermography.
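For context, a compact statement of the ratio-thermography principle behind the gray body assumption discussed above (standard radiometry, not the thesis's exact signal model): the signal in each filter band is proportional to the spectral emissivity times the Planck blackbody radiance, so under the gray body assumption ε(λ₁) = ε(λ₂) the emissivity cancels from the band ratio, which then depends on temperature alone.

```latex
R(T) \;=\; \frac{\varepsilon(\lambda_1)\, L_{\mathrm{BB}}(\lambda_1, T)}
                {\varepsilon(\lambda_2)\, L_{\mathrm{BB}}(\lambda_2, T)}
\;\overset{\varepsilon(\lambda_1)=\varepsilon(\lambda_2)}{=}\;
\frac{L_{\mathrm{BB}}(\lambda_1, T)}{L_{\mathrm{BB}}(\lambda_2, T)},
\qquad
L_{\mathrm{BB}}(\lambda, T) \;=\; \frac{2 h c^{2}}{\lambda^{5}}
\left[\exp\!\left(\frac{h c}{\lambda k_B T}\right) - 1\right]^{-1}
```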
37

Some Statistical Aspects of Association Studies in Genetics and Tests of the Hardy-Weinberg Equilibrium

He, Ran 08 October 2007 (has links)
No description available.
38

Strongly-Coupled Conjugate Heat Transfer Investigation of Internal Cooling of Turbine Blades using the Immersed Boundary Method

Oh, Tae Kyung 02 July 2019 (has links)
The present thesis focuses on evaluating a conjugate heat transfer (CHT) simulation in a ribbed cooling passage with a fully developed flow assumption using LES with the immersed boundary method (IBM-LES-CHT). The IBM with the LES model (IBM-LES) and the IBM with the CHT boundary condition (IBM-CHT) frameworks are validated prior to the main simulations by simulating purely convective heat transfer (iso-flux) in the ribbed duct, and a developing laminar boundary layer flow over a two-dimensional flat plate with heat conduction, respectively. For the main conjugate simulations, a ribbed duct geometry with a blockage ratio of 0.3 is simulated at a bulk Reynolds number of 10,000 with a conjugate boundary condition applied to the rib surface. The nominal Biot number is kept at 1, similar to the comparative experiment. As a means to overcome the large time-scale disparity between the fluid and the solid regions, the use of a high artificial solid thermal diffusivity is compared to the physical diffusivity. It is shown that while the diffusivity impacts the instantaneous fluctuations in temperature, heat transfer and Nusselt numbers, it has an insignificantly small effect on the mean Nusselt number. The comparison between the IBM-LES-CHT and iso-flux simulations shows that the iso-flux case predicts higher local Nusselt numbers at the back face of the rib. Furthermore, the local Nusselt number augmentation ratio (EF) predicted by IBM-LES-CHT is compared to a body-fitted grid (BFG) simulation, the experiment and another LES conjugate simulation. Even though there is a mismatch between the IBM-LES-CHT prediction and the other studies at the front face of the rib, the area-averaged EF compares reasonably well in the other regions between the IBM-LES-CHT prediction and the comparative studies. / Master of Science / The present thesis focuses on a computational study of conjugate heat transfer (CHT) in a turbine internal ribbed cooling channel. Plenty of prior research on turbine internal cooling channels has been conducted by considering only the convective heat transfer at the wall, which assumes an iso-flux (constant heat flux) boundary condition at the surface. However, applying an iso-flux condition on the surface is far from the realistic heat transfer mechanism occurring in internal cooling systems. In this work, a conjugate heat transfer analysis of the cooling channel, which considers both the conduction within the solid wall and the convection at the ribbed inner wall surface, is conducted for a more realistic heat transfer coefficient prediction at the inner ribbed wall. For the simulation, the computational mesh is generated by the immersed boundary method (IBM), which eases mesh generation by simply immersing the CAD geometry into the background volume grid. The IBM is combined with the conjugate boundary condition to simulate the internal ribbed cooling channel. The conjugate simulation is compared with the experimental data and another computational study for validation. Even though there are some discrepancies between the IBM simulation and the other comparative studies, the overall results are in good agreement. From the thermal prediction comparison between the iso-flux case and the conjugate case using the IBM, it is found that the heat transfer predicted by the conjugate case differs from the iso-flux case by more than 40 percent at the rib back face.
The present study shows the potential of the IBM framework with the conjugate boundary condition for more complicated geometry, such as full turbine blade model with external and internal cooling system.
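As a reminder of why the nominal Biot number of 1 matters here (the standard definition, not a relation specific to this thesis): with convective coefficient h, characteristic solid thickness L_c and solid thermal conductivity k_s,

```latex
\mathrm{Bi} \;=\; \frac{h\, L_c}{k_s}
```

so Bi ≈ 1 means the conductive resistance of the wall is comparable to the convective resistance of the flow, which is precisely the regime where an iso-flux wall assumption becomes questionable and a conjugate treatment is needed.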
39

Neri di Bicci: A study of three of his patrons' commissions of the Assumption of the Virgin altarpieces with a focus on their choice of an all'antica style

Samples, Kara 12 August 2014 (has links)
This thesis will analyze why three of Neri di Bicci’s patrons—the Spini family of Florence, a nun of the Bridgettine Order of Florence, and Ser Amideo of Santa Maria degli Ughi—desired to commission an altarpiece of the Assumption of the Virgin in an all’antica style. Neri di Bicci’s background as an artist, existing scholarship, and comparisons of older styles of art will also be discussed.
40

Testing the assumptions of the proportional hazards model (Ověřování předpokladů modelu proporcionálního rizika)

Marčiny, Jakub January 2014 (has links)
The Cox proportional hazards model is a standard tool for modelling the effect of covariates on time to event in the presence of censoring. The appropriateness of this model is conditioned on the validity of the proportional hazards assumption. The assumption is explained in the thesis and methods for testing it are described in detail. The tests are implemented in R, including a self-written version of the Lin-Zhang-Davidian test. Their application is illustrated on medical data. The ability of the tests to reveal violations of the proportional hazards assumption is investigated in a simulation study. The results suggest that the highest power is attained by the newly implemented Lin-Zhang-Davidian test in most cases. In contrast, the weighted version of the Lin-Wei-Ying test was found to have inadequate size for small sample sizes.
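A minimal sketch of the kind of proportional hazards check this abstract describes, using Python's lifelines package rather than R (which the thesis uses); the Lin-Zhang-Davidian and Lin-Wei-Ying tests themselves are not shown here, only the Schoenfeld-residual-based test bundled with lifelines.

```python
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi
from lifelines.statistics import proportional_hazard_test

# Example survival data: recidivism times ('week') with censoring indicator ('arrest').
df = load_rossi()

# Fit a Cox proportional hazards model.
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")

# Test the proportional hazards assumption for each covariate
# via scaled Schoenfeld residuals against rank-transformed time.
results = proportional_hazard_test(cph, df, time_transform="rank")
results.print_summary()

# Covariates with small p-values are flagged as violating the assumption.
```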
