  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
531

Non-epithelial bone cysts of the jaw

Dashti, Mahdi January 2020 (has links)
Magister Scientiae - MSc / Aneurysmal Bone Cysts (ABC) and Solitary Bone Cysts (SBC), both non-epithelial cysts of the jaws, are defined as benign lesions of unclear aetiology. There is limited literature available on these two primary non-epithelial cysts of the jaws, especially in African populations. This retrospective study focused on the clinical and radiographic features, as well as the management, of the non-epithelial cysts of the jaws presenting at the University of the Western Cape Oral Health Centre from 1970 to 2018. The aim of this study was to describe the clinical and radiological features of non-epithelial cysts of the jaws that presented at the Departments of Maxillo-Facial and Oral Surgery and of Diagnostics and Radiology at the University of the Western Cape Oral Health Centre, as well as their management and recurrence patterns.
532

Efficient Temporal Reasoning with Uncertainty

Nilsson, Mikael January 2015 (has links)
Automated Planning is an active area within Artificial Intelligence. With the help of computers we can quickly find good plans in complicated problem domains, such as planning for search and rescue after a natural disaster. When planning in realistic domains the exact duration of an action generally cannot be predicted in advance. Temporal planning therefore tends to use upper bounds on durations, with the explicit or implicit assumption that if an action happens to be executed more quickly, the plan will still succeed. However, this assumption is often false. If we finish cooking too early, the dinner will be cold before everyone is at home and can eat. Simple Temporal Networks with Uncertainty (STNUs) allow us to model such situations. An STNU-based planner must verify that the temporal problems it generates are executable, which is captured by the property of dynamic controllability (DC). If a plan is not dynamically controllable, adding actions cannot restore controllability. Therefore a planner should verify after each action addition whether the plan remains DC, and if not, backtrack. Verifying dynamic controllability of a full STNU is computationally intensive. Therefore, incremental DC verification algorithms are needed. We start by discussing two existing algorithms relevant to the thesis. These are the very first DC verification algorithm, called MMV (by Morris, Muscettola and Vidal), and the incremental DC verification algorithm called FastIDC, which is based on MMV. We then show that FastIDC is not sound, sometimes labeling networks as dynamically controllable when they are not. We analyze the algorithm to pinpoint the cause and show how the algorithm can be modified to correctly and efficiently detect uncontrollable networks. In the next part we use insights from this work to re-analyze the MMV algorithm. This algorithm is pseudo-polynomial and was later subsumed by first an O(n⁵) algorithm and then an O(n⁴) algorithm.
We show that the basic techniques used by MMV can in fact be used to create an O(n⁴) algorithm for verifying dynamic controllability, with a new termination criterion based on a deeper analysis of MMV. This means that there is now a comparatively easy way of implementing a highly efficient dynamic controllability verification algorithm. From a theoretical viewpoint, understanding MMV is important since it acts as a building block for all subsequent algorithms that verify dynamic controllability. In our analysis we also discuss a change in MMV which reduces the amount of regression needed in the network substantially. In the final part of the thesis we show that the FastIDC method can result in traversing part of a temporal network multiple times, with constraints slowly tightening towards their final values. As a result of our analysis we then present a new algorithm with an improved traversal strategy that avoids this behavior. The new algorithm, EfficientIDC, has a time complexity which is lower than that of FastIDC. We prove that it is sound and complete.
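The core objects in this abstract can be illustrated with a minimal sketch. A simple temporal network (here without the uncertainty that makes DC verification hard) is a set of difference constraints, and the network is consistent exactly when its distance graph contains no negative cycle, which Bellman-Ford detects. The function name, the cooking numbers, and the edge encoding below are illustrative assumptions, not the thesis's algorithms.

```python
# Minimal sketch: a Simple Temporal Network (without uncertainty) as a
# distance graph. An STN is consistent iff the graph has no negative cycle,
# checked here with Bellman-Ford relaxation. Dynamic controllability of a
# full STNU needs considerably more machinery; this shows only the base case.

def stn_consistent(num_nodes, edges):
    """edges: list of (u, v, w) meaning time(v) - time(u) <= w."""
    dist = [0] * num_nodes  # virtual source at distance 0 to every node
    for _ in range(num_nodes - 1):
        updated = False
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                updated = True
        if not updated:
            break
    # If any edge can still be relaxed, there is a negative cycle.
    return all(dist[u] + w >= dist[v] for u, v, w in edges)

# Cooking example in the spirit of the abstract (numbers invented):
# dinner is ready (node 1) 30-60 min after we start (node 0), and
# guests arrive (node 2) 45-50 min after we start.
edges = [(0, 1, 60), (1, 0, -30),   # 30 <= t1 - t0 <= 60
         (0, 2, 50), (2, 0, -45)]   # 45 <= t2 - t0 <= 50
print(stn_consistent(3, edges))  # → True
```

Tightening a bound until the intervals cannot be satisfied, e.g. adding `(1, 0, -70)`, makes the check return `False`, which is the situation a planner must detect and backtrack from.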
533

Entangled Polynomials

Pallone, Ashley H. 03 June 2021 (has links)
No description available.
534

[en] INFLATION TARGETING WITH A FISCAL TAYLOR RULE / [pt] METAS DE INFLAÇÃO COM UMA REGRA DE TAYLOR FISCAL

EDUARDO HENRIQUE LEITNER 17 September 2020 (has links)
[en] This study proposes and tests an alternative inflation targeting regime which we call the fiscal Taylor rule (FTR). In this regime, the government keeps the nominal interest rate constant and uses the consumption tax rate as an instrument to stabilize inflation and the output gap. We estimate a standard business cycle model on US data from the Great Moderation period (1985-2007) and compare the observed outcomes to those of a counterfactual simulation where we apply the estimated shocks to the same business cycle model, replacing the standard Taylor rule with the FTR. We find that, compared to the standard Taylor rule, the FTR may be capable of providing similar performance in terms of economic stabilization and thus constitutes a theoretically viable policy framework.
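The contrast between the two regimes can be sketched as two reaction functions. The coefficients below (1.5 on inflation, 0.5 on the output gap) are the classic Taylor (1993) values and the FTR signs are illustrative guesses; the thesis estimates its own parameters, which this sketch does not reproduce.

```python
# Hedged sketch of the two policy rules compared in the abstract.
# All coefficients and steady-state values are illustrative.

def taylor_rule(pi, y_gap, r_star=0.02, pi_star=0.02):
    """Standard rule: the nominal interest rate reacts to inflation
    deviations and the output gap."""
    return r_star + pi + 1.5 * (pi - pi_star) + 0.5 * y_gap

def fiscal_taylor_rule(pi, y_gap, tau_star=0.10, pi_star=0.02):
    """FTR sketch: the interest rate is held fixed and the consumption
    tax rate moves instead. A higher tax discourages consumption, so the
    tax rises with inflation and the output gap (signs are illustrative)."""
    return tau_star + 1.5 * (pi - pi_star) + 0.5 * y_gap

# With inflation at 4% and a 1% positive output gap:
print(round(taylor_rule(0.04, 0.01), 4))         # → 0.095 (9.5% interest rate)
print(round(fiscal_taylor_rule(0.04, 0.01), 4))  # → 0.135 (13.5% consumption tax)
```

Both rules lean against the same deviations; they differ only in which instrument absorbs the adjustment, which is the substitution the counterfactual simulation makes.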
535

Artificial Neural Networks for Financial Time Series Prediction

Malas, Dana January 2023 (has links)
Financial market forecasting is a challenging and complex task due to the sensitivity of the market to political, economic, and social factors. However, recent advances in machine learning and computation technology have led to an increased interest in using deep learning for forecasting financial data. On the one hand, the famous efficient market hypothesis states that the market is so efficient that no one can consistently benefit from it, and the random walk theory suggests that asset prices are unpredictable from historical data. On the other hand, previous research has shown that financial time series can be forecasted to some extent using artificial neural networks (ANNs). Despite being a relatively new addition to financial research, less studied than traditional models such as moving averages and linear regression, ANNs have been shown to outperform these traditional models to some extent. Hence, considering the efficient market hypothesis and the random walk theory, there is a knowledge gap on whether neural networks can be used for financial time series prediction. This paper explores the use of ANNs, specifically recurrent neural networks, to predict financial time series data using a long short-term memory (LSTM) network model. The study employs an experimental research strategy to construct and test an LSTM model on financial time series data, with the aim of examining its performance and evaluating it relative to other models and methods. For this evaluation, metrics are computed and the model is compared with a constructed simple moving average (SMA) model as well as models in existing studies. The paper also explores the application and processing of transformed financial data, where it was found that achieving stationarity by data transformation was not necessary for the LSTM model to perform better.
The study also found that the LSTM model outperformed the SMA model when hyperparameters were set to capture long-term dependencies. However, in the short term, the SMA model outperformed the LSTM model.
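The SMA baseline the study compares against can be stated in a few lines: the forecast for the next point is the mean of the last `window` observations. The window size and price series below are invented for illustration; the thesis's actual data and window choices are not reproduced here.

```python
# Minimal sketch of a simple moving average (SMA) forecast baseline.
# The next value is predicted as the mean of the last `window` points.

def sma_forecast(series, window):
    if len(series) < window:
        raise ValueError("series shorter than window")
    return sum(series[-window:]) / window

prices = [100.0, 101.0, 103.0, 102.0, 104.0]  # illustrative closing prices
print(sma_forecast(prices, 3))  # → 103.0, i.e. (103 + 102 + 104) / 3
```

The appeal of this baseline is that it has no trainable parameters at all, so any LSTM advantage over it has to come from learned temporal structure rather than fitting capacity.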
536

STRESS ANALYSIS OF RUBBER BLOCKS UNDER VERTICAL LOADING AND SHEAR LOADING

Suh, Jong Beom 02 October 2007 (has links)
No description available.
537

Full-Scale Shake Table Cyclic Simple Shear Testing of Liquefiable Soil

Jacobs, Jasper Stanford 01 February 2016 (has links) (PDF)
This research consists of full-scale shake table tests to investigate liquefaction of sandy soils. Consideration of the potential and consequences of liquefaction is critical to the performance of any structure built in locations of high seismicity underlain by saturated granular materials, as liquefaction is the leading cause of damage associated with ground failure. In certain cases the financial losses associated with liquefaction can significantly impact the financial future of an entire region. Most liquefaction triggering studies are performed in the field where liquefaction has been previously observed, or in tabletop laboratory testing. The study detailed herein is a controlled laboratory test performed at full scale to allow for the measurement of field-scale index testing before and after cyclic loading. Testing was performed at the Parson's Geotechnical and Earthquake Laboratory at Cal Poly San Luis Obispo on the one-dimensional shake table with a mounted flexible-walled testing apparatus. The testing apparatus, originally constructed for soil-structure interaction experiments utilizing soft clay, was retrofitted for the purpose of studying liquefaction. This research works towards comparing large-scale simple-shear liquefaction testing to small-scale simple-shear liquefaction testing of a #2/16 Monterey sand specimen. The bucket top was modified in order to apply a vertical load to the soil skeleton to replicate overburden soil conditions. Access ports were fitted into the bucket top for instrument cable access and to allow cone penetration testing before and after cyclic loading. A shear-wave generator was created to propagate shear waves into the sample so that embedded accelerometers could measure its small-strain stiffness. Pore-pressure transducers were embedded in the soil sample to capture excess pore water pressure produced during liquefaction.
Displacement transducers were attached to the bucket in order to measure shear strains during cyclic testing and to measure post-liquefaction volumetric deformations. The results of this investigation provide an empirical basis for the behavior of excess pore water pressure production, void redistribution, shear wave velocity, shear strain, and cone penetrometer tip resistance of #2/16 Monterey sand before, during, and after liquefaction in a controlled laboratory environment at full scale.
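Measurements like these are commonly interpreted through the excess pore pressure ratio, r_u = Δu / σ'v0, which approaches 1.0 as the effective stress carried by the soil skeleton vanishes at liquefaction. The sketch below uses this standard definition with invented numbers; it is not data from the thesis.

```python
# Sketch of the standard excess pore pressure ratio used to interpret
# pore-pressure transducer readings in liquefaction testing:
#   r_u = (excess pore pressure) / (initial vertical effective stress)
# r_u near 1.0 indicates the onset of liquefaction. Values illustrative.

def pore_pressure_ratio(excess_pore_pressure_kpa, initial_eff_stress_kpa):
    return excess_pore_pressure_kpa / initial_eff_stress_kpa

print(pore_pressure_ratio(38.0, 40.0))  # → 0.95, approaching liquefaction
print(pore_pressure_ratio(4.0, 40.0))   # → 0.1, far from liquefaction
```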
538

Insatser för lästräning i svensk grundskola : Förekomst av och innehåll i lästräningsinsatser för elever med lässvårigheter. / Reading intervention in Swedish compulsory schools : Content and frequency of reading interventions for students in need of reading support.

Sättlin, Anna-Lotta January 2022 (has links)
This essay aims to examine how reading interventions are used in Swedish elementary schools. The two initial questions examine how many schools implement reading interventions individually and in small groups with students with reading difficulties, and which teaching methods and reading programs are used for these interventions. The third question considers which reasons special needs teachers and special education needs teachers state for choosing methods and programs for students with reading difficulties. To answer these questions, a survey of special needs teachers and special education needs teachers was conducted. The results indicate that 52.1% of the elementary schools in this survey use intensive reading interventions based on phonics. The results also indicate that decoding is the most frequent content in reading interventions and that many different methods and programs, both digital and analogue, are used. Finally, there are different reasons why a program or method has been chosen in the different schools. The most commonly stated reason is that the chosen method has shown good results in the past for students with reading difficulties. Some respondents cited research as a reason for their chosen method, but the majority did not.
539

Three-dimensional modeling of rigid pavement

Beegle, David J. January 1998 (has links)
No description available.
540

Structural Insight into Self-assembly of Coacervate-forming Polyesteramides

Liu, Xinhao 03 August 2022 (has links)
No description available.
