211

On the optimal stopping time of learning

Fedyszak-Koszela, Anna January 2008 (has links)
The goal of this thesis is to study the economics of computational learning. Attention is also paid to applications of computational learning models, especially Valiant's so-called 'probably approximately correct' (PAC) learning model, in econometric situations. Specifically, an economically reasonable stopping-time model of learning is the subject of two attached papers. In the first paper, Paper A, the economics of PAC learning are considered. It is shown how a general form of the optimal stopping time bounds can be achieved using the PAC convergence rates for a 'pessimistic-rational' learner in the most standard binary case of the passive supervised PAC model of finite Vapnik-Chervonenkis (VC) dimension. The second paper, Paper B, states precisely and improves the ideas introduced in Paper A and tests them in a specific and mathematically simple case. Using the maxmin procedure of Gilboa and Schmeidler, the bounds for the stopping time are expressed in terms of the largest expected error of recall, and thus, effectively, in terms of the least expected reward. The problem of locating a real number θ by testing whether xi ≤ θ, with xi drawn from an unknown distribution, is discussed. Examples of the bounds, calculated for a range of term rates, sample costs and rewards/penalties from a recall, are included. The standard econometric situations, such as product promotion, market research, credit risk assessment, and bargaining and tenders, where such bounds could be of interest, are pointed out. These two papers are the essence of this thesis, and form it together with an introduction to the subject of learning. / The goal of this thesis is to study the optimization of learning when there are costs. In particular, I study Valiant's so-called PAC learning model (Probably Approximately Correct), often used in computer science. In two papers I treat the question of how long, from an economic point of view, the learning period should continue.
In the first paper we show how a general form of bounds on the optimal learning period can be obtained using the PAC convergence rate for a 'pessimistically rational' learner (in the most common binary case of the passive PAC learning model with finite VC dimension). In the second paper we deepen and improve the ideas from the first paper and test them in a specific, mathematically simple situation. Using the Gilboa-Schmeidler maxmin procedure, we express the bounds on the optimal learning period as a function of the largest expected error, and thereby as a function of the least expected reward. We discuss the problem of locating a real number θ by testing whether xi ≤ θ, where xi is drawn from an unknown distribution. We also give examples of bounds on the learning period, calculated for a range of discount rates, sample costs and rewards/penalties for a recall, together with some common econometric situations where such bounds are of interest, such as product promotion, market analysis, credit risk assessment and tender negotiation. The thesis consists mainly of these two papers, together with a short introduction to economic, mathematical and computer-science models of learning.
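The PAC convergence rates underlying Paper A can be illustrated with the classic sufficient-sample-size bound for a hypothesis class of finite VC dimension. The sketch below uses a common textbook form of that bound with its usual constants; it is illustrative only and is not the thesis's own stopping-time bound:

```python
import math

def pac_sample_bound(vc_dim, epsilon, delta):
    """A textbook upper bound on the number of samples sufficient for
    (epsilon, delta)-PAC learning a class of VC dimension vc_dim.
    Constants vary by source; this is an illustrative form only."""
    return math.ceil((4 / epsilon) * (vc_dim * math.log2(12 / epsilon)
                                      + math.log2(2 / delta)))

# Locating a threshold theta by tests x_i <= theta: threshold
# classifiers on the real line have VC dimension 1.
m = pac_sample_bound(vc_dim=1, epsilon=0.05, delta=0.05)
print(m)
```

A stopping-time analysis in the spirit of the papers would then weigh the marginal reduction in expected error from one more sample against its cost.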
212

Injuring Eternity: Reflections on Time Bought and Wasted

Richardson, Matt R 01 January 2013 (has links)
Essay.
213

Between the Temporal and the Eternal Falls the Shadow

Moussaoui, Raja January 2011 (has links)
This thesis is composed of a number of fragments, each revealing an aspect of the dynamic, complex and reciprocal relationship that we form with architecture. Architecture is brought to life through the animation of light and shadow, through the construction of atmospheres and rhythms, and through the selection and use of materials which evoke time. We find meaning in architecture when it is able to communicate stories of our past, and when it is able to awaken emotions which are latent within us. Our connection with architecture is formed because we relate to it as a temporal art, and strengthened because it is through temporal and corporeal things that we understand the eternal and the spiritual. It is the existence of atmosphere in a space which seduces us to enter into a prolonged engagement with architecture. Atmosphere lies at the core of our experience of time. We move through architecture, sensing its life through its materiality and its relationship to the environment. During this physical journey we also experience a mental journey, one which is stimulated by the unknown and driven by our imagination. This thesis posits shadow as a principal element of atmosphere, responsible for evoking life, mystery, depth, and seduction in architecture. The unknown world that exists inside each of us, and the world which exists on the far side of time, is always veiled in shadows. The fragments presented in this thesis are organized into two parts. The first group explores this argument in an abstract way, through a number of representational mediums including photography, film, literature, art and drawing. The second part focuses on an existing architecture, the Patient Built Wall at the Centre for Addiction and Mental Health in Toronto. The Patient Built Wall is viewed as having a 'thickness' past its physical form; one that exists in our collective imagination as a result of the layers of history and time embedded within it.
Each work presented in this section attempts to vivify the ‘life’ of the wall, and its ability to communicate meaning to an interested group. An exhibition of this work arranges these fragments spatially so that they can be understood in relation to one another, thereby forming a newly constituted whole. The aim is to create a communicative space which tests principles of light, shadow, reflection, movement and temporality, while exploring the tension that exists between our subjective and shared experience of architecture.
214

The efficiency enhanced after laboratory automation implementation

Lin, Hui-chuan 26 June 2012 (has links)
A Laboratory Automation System (LAS) has been implemented at E-Da hospital since 29 March 2011. We used the result-reporting data of all determination items rather than a single item. We collected 1,826,444 reported data points on laboratory turnaround time (TAT) from E-Da hospital during both the pre-LAS (July 2010 to March 2011) and post-LAS (April to December 2011) periods. All data were classified by month and 18 monthly TAT means were calculated. A two-factor analysis of variance was applied to the means, and we found a significant difference, from 88.8 min to 74.4 min, between pre- and post-LAS at a 95% confidence level. This shows a TAT improvement of 14.4 min (16.2%) after the LAS was implemented. After being classified by month, the data were separated into Stat and Routine sets, and the means of each set were calculated. Two-factor analyses of variance on the separated Stat and Routine sets revealed that the Stat TAT did not change significantly (P=.05) while the Routine TAT, with an improvement of 29.0 min (21.3%), did. This shows that the improvement in Routine TAT contributes to the improvement in overall TAT. This conclusion is supported by the observation that Routine TAT correlated best with the overall TAT (r2=0.99, P=.05) while Stat TAT correlated poorly with it (r2=0.36, P=.05). As for the 90% TAT analysis, it seems that the improvement will be underestimated (10.8 min, 15.3%, rather than 14.4 min, 16.2%) if the 90% TAT is sifted from the original TAT and analyzed. A TAT outlier (tail-size) analysis seems to show a larger improvement, a 47 min (18.5%) reduction, than the 90% TAT analysis. We therefore suggest a total TAT analysis instead of the 90% TAT or outlier analyses. Both Clinical Chemistry (CC) and Immunoassay (IA) TAT changed significantly (P=.05), with improvements of 13.7 min (16.1%) and 18.2 min (14.7%) respectively, when the data were separated into CC and IA sets. This shows that the improvement in TAT after automation was contributed by improvements in both CC and IA TAT.
This is also demonstrated by the strong correlation of both CC and IA TAT with the overall TAT (r2=1.00 and 0.96, P=.05). We also found fewer laboratory errors after automation, and this result was consistent with staff satisfaction; in other words, the hospital staff were much more satisfied with the LAS when fewer laboratory errors were found in the reports.
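The headline arithmetic in the abstract (mean TAT falling from 88.8 min pre-LAS to 74.4 min post-LAS) can be checked directly; this is a minimal recomputation of the reported figures, not the study's ANOVA:

```python
# Reproduce the reported improvement: mean TAT fell from 88.8 min
# (pre-LAS) to 74.4 min (post-LAS).
pre_tat, post_tat = 88.8, 74.4
improvement = pre_tat - post_tat          # absolute gain, minutes
percent = improvement / pre_tat * 100     # relative gain
print(f"{improvement:.1f} min ({percent:.1f}%)")  # 14.4 min (16.2%)
```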
215

Modulation of cell yields and genetic responses of Salmonella fermentation and colonization in the gastrointestinal ecology of avian species

Dunkley, Kingsley Delroy 15 May 2009 (has links)
In these studies we evaluated specific environmental stimuli relevant to Salmonella virulence and physiology in the gastrointestinal tract of chickens. Results from Salmonella growth in steady-state, glucose-limiting continuous culture (CC) indicated that the optimal growth condition was observed between dilution rates (D) of 0.05 h-1 and 0.27 h-1. Cell protein concentrations increased proportionally with an increase in D at each steady state, but beyond D = 0.27 h-1 cell protein concentrations declined as D increased. Genetic responses generally indicated that the lowest D exhibited the highest hilA relative expression; relatively higher expression of hilA was largely observed at low D (low glucose) (0.0125 h-1, 0.025 h-1, 0.05 h-1). Salmonella incubated in CC under different pH shifts demonstrated that cell protein concentration, glucose utilization, ATP yield and acetate:propionate ratios were influenced by an increase in pH (6.14 to 7.41); these parameters increased and decreased consistently with corresponding increases and decreases in pH. Polymerase chain reaction-based denaturing gradient gel electrophoresis showed that the overall amplicon band patterns of hens molted with the alfalfa (ALC+) diet were similar to those of the full-fed (FF+) treatment group. Additionally, the FF+ and ALC+ treatment groups exhibited a higher percentage similarity coefficient (>90%) than the feed-deprived treatment group. Fermentation responses of cecal inocula on feed substrates revealed that alfalfa-based samples yielded consistently higher short-chain fatty acid levels than other feed substrates. Salmonella Enteritidis (SE) colonization of the liver, spleen and ovaries was significantly (P < 0.05) higher in FW+ hens than in the ALC+ and FF+ treatment groups. A nearly 4-log10 reduction in SE colonization was observed for ALC+ hens (log10 1.29) compared to feed-withdrawal (FW+) hens (log10 5.12).
Relative expression of hilA across all treatment groups was significantly (P < 0.05) higher in FW+ than in the FF+ and ALC+ groups. hilA expression in FW+ hens was 3.2-, 4.2-, and 1.9-fold higher on Days 6, 11 and 12 respectively, when compared with ALC+ hens. These results suggest that Salmonella virulence in the gastrointestinal ecology of chickens could be impacted by a combination of low nutrient availability and pH shifts.
216

Dynamic reliability using entry-time approach for maintenance of nuclear power plants

Wang, Shuwen 15 May 2009 (has links)
Entry-time processes are finite-state continuous-time jump processes with transition rates depending only on the two states involved in the transition, the calendar time, and the most recent arrival time, which is termed the entry time. Entry-time processes have the potential to provide a significantly greater range of applicability and flexibility than traditional reliability tools for case studies related to equipment and components in nuclear power plants. In this dissertation, a finite difference approximation of the integrodifferential Chapman-Kolmogorov equations for entry-time processes was developed, and then verified by application to several hypothetical examples that can be solved by alternative means, either (semi-)analytically or via simulation. To demonstrate the applicability of the entry-time model to nuclear power plants in a RIAM-based scenario, the entry-time approach is applied to the maintenance of main generators in nuclear power plants using data from the INPO-EPIX database. In this application, both the reliability and financial performance obtained with the entry-time approach under different maintenance policies are presented and discussed to help the plant management make maintenance decisions. The ability of the EPIX database to provide time-dependent failure rates is demonstrated, and techniques for extracting failure rates for main generators from the database are also discussed.
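The dependence of transition rates on both calendar time t and the entry time s of the current sojourn can be illustrated with a small simulation. This is a hypothetical two-state sketch using Ogata-style thinning with an assumed rate upper bound, not the dissertation's finite-difference scheme or its generator model:

```python
import random

def simulate_entry_time_process(rate, t_end, state0=0, seed=1):
    """Simulate a two-state jump process whose transition rate
    rate(i, j, t, s) depends on the states, the calendar time t, and
    the entry time s of the current sojourn. Uses thinning against a
    crude global rate bound (an illustrative model only)."""
    random.seed(seed)
    t, s, state = 0.0, 0.0, state0
    path = [(t, state)]
    lam_max = 2.0  # assumed upper bound on the total exit rate
    while t < t_end:
        t += random.expovariate(lam_max)   # candidate event time
        if t >= t_end:
            break
        total = sum(rate(state, j, t, s) for j in (0, 1) if j != state)
        if random.random() < total / lam_max:  # accept the jump
            state = 1 - state                  # two-state flip
            s = t                              # reset entry time
            path.append((t, state))
    return path

# Example hazard: aging within a sojourn (rate grows with t - s),
# capped below the thinning bound lam_max.
path = simulate_entry_time_process(
    lambda i, j, t, s: min(0.2 + 0.1 * (t - s), 1.9), t_end=50.0)
print(len(path) - 1, "jumps recorded")
```

Resetting `s` at each accepted jump is exactly what distinguishes an entry-time process from an ordinary time-inhomogeneous Markov chain.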
217

Some applications of wavelets to time series data

Jeong, Jae Sik 15 May 2009 (has links)
The objective of this dissertation is to develop a suitable statistical methodology for parameter estimation in long memory processes. Time series data with complex covariance structure arise in various fields such as finance, computer networks, and econometrics. Many researchers have suggested methodologies defined in different domains: the frequency domain and the time domain. However, many traditional statistical methods do not work well in complicated cases, for example, nonstationary processes. The development of methodologies robust against nonstationarity is the main focus of my dissertation. We suggest a wavelet-based Bayesian method which shares the good properties of both wavelet-based methods and the Bayesian approach. To check the robustness of the method, we consider ARFIMA(0, d, 0) with a linear trend. We also compare the results of the method with those of several existing methods defined in different domains, i.e. time-domain estimators and frequency-domain estimators. In addition, we apply the method to functional magnetic resonance imaging (fMRI) data to find a connection between brain activity and the long memory parameter. Another objective of this dissertation is to develop a wavelet-based denoising technique for high-throughput data with heterogeneous variance noise, especially protein mass spectrometry data. Since the denoising technique depends heavily on the threshold value, it is very important to choose a proper threshold, which involves an estimate of the standard deviation. To this end, we first detect the variance change points and obtain suitable threshold values in each segment. After that, we apply local wavelet thresholding to each segment. For comparison, we consider several existing global thresholding methods.
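The segment-wise thresholding idea can be sketched with a one-level Haar transform and the universal threshold; the Haar wavelet and the toy signal are chosen only to keep the example self-contained, and this is not the dissertation's implementation:

```python
import math

def haar_dwt(x):
    """One-level orthonormal Haar transform; len(x) must be even."""
    a = [(x[i] + x[i + 1]) / math.sqrt(2) for i in range(0, len(x), 2)]
    d = [(x[i] - x[i + 1]) / math.sqrt(2) for i in range(0, len(x), 2)]
    return a, d

def haar_idwt(a, d):
    x = []
    for ai, di in zip(a, d):
        x += [(ai + di) / math.sqrt(2), (ai - di) / math.sqrt(2)]
    return x

def soft(w, t):
    """Soft thresholding: shrink each coefficient toward zero by t."""
    return [math.copysign(max(abs(v) - t, 0.0), v) for v in w]

def denoise_segment(x, sigma):
    """Universal threshold t = sigma * sqrt(2 log n), applied per
    segment so a different sigma can be used on each side of a
    detected variance change point."""
    a, d = haar_dwt(x)
    t = sigma * math.sqrt(2 * math.log(len(x)))
    return haar_idwt(a, soft(d, t))

noisy = [1.0, 1.1, 0.9, 1.0, 5.0, 5.2, 4.8, 5.1]
print([round(v, 2) for v in denoise_segment(noisy, sigma=0.1)])
```

Running `denoise_segment` separately on each variance-homogeneous segment, each with its own sigma estimate, is the local-thresholding step the abstract describes.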
218

Speculative parallelization of partially parallel loops

Dang, Francis Hoai Dinh 15 May 2009 (has links)
Current parallelizing compilers cannot identify a significant fraction of parallelizable loops because they have complex or statically insufficiently defined access patterns. In our previous work, we have speculatively executed a loop as a doall, and applied a fully parallel data dependence test to determine if it had any cross-processor dependences. If the test failed, then the loop was re-executed serially. While this method exploits doall parallelism well, it can cause slowdowns for loops with even one cross-processor flow dependence because we have to re-execute sequentially. Moreover, the existing, partial parallelism of loops is not exploited. We demonstrate a generalization of the speculative doall parallelization technique, called the Recursive LRPD test, that can extract and exploit the maximum available parallelism of any loop and that limits potential slowdowns to the overhead of the run-time dependence test itself. In this thesis, we have presented the base algorithm and an analysis of the different heuristics for its practical application. To reduce the run-time overhead of the Recursive LRPD test, we have implemented on-demand checkpointing and commit, more efficient data dependence analysis and shadow structures, and feedback-guided load balancing. We obtained scalable speedups for loops from Track, Spice, and FMA3D that were not parallelizable by previous speculative parallelization methods.
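The shadow-structure idea behind the speculative test can be sketched as follows: record which iterations read and write each element, then check for cross-iteration flow dependences after the speculative run. This is a simplified sketch in the spirit of the LRPD test, not the authors' implementation (which runs in parallel and also handles privatization and reductions):

```python
def doall_safe(iterations, n_elems):
    """Post-speculation check: iterations is a list of
    (reads, writes) element-index lists, one pair per iteration.
    Returns True if no element written in one iteration is read in a
    different one (i.e. the loop behaved as a doall)."""
    write_iter = [None] * n_elems            # shadow: writing iteration
    read_iters = [set() for _ in range(n_elems)]  # shadow: readers
    for it, (reads, writes) in enumerate(iterations):
        for e in reads:
            read_iters[e].add(it)
        for e in writes:
            write_iter[e] = it
    for e in range(n_elems):
        if write_iter[e] is not None and any(
                r != write_iter[e] for r in read_iters[e]):
            return False                     # cross-iteration dependence
    return True                              # safe as a doall

# Iteration i reads and writes only element i: fully parallel.
print(doall_safe([([0], [0]), ([1], [1]), ([2], [2])], 3))  # True
# Iteration 1 reads what iteration 0 wrote: flow dependence.
print(doall_safe([([], [0]), ([0], [])], 3))                # False
```

The recursive variant described above would re-execute only the iterations after the first detected dependence, rather than the whole loop.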
219

Comparison of the Performance of Different Time Delay Estimation Techniques for Ultrasound Elastography

Sambasubramanian, Srinath 2010 August 1900 (has links)
Elastography is a non-invasive medical imaging modality that is used as a diagnostic tool for the early detection of several pathological changes in soft tissues. Elastography techniques provide the local strain distributions experienced by soft tissues due to compression. The resulting strain images are called "elastograms". In elastography, the local tissue strains are usually estimated as the gradient of local tissue displacement. The local tissue displacements are estimated from the time delays between gated pre- and post-compression echo signals. The quality of the resulting elastograms is highly dependent on the accuracy of these local displacement estimates. While several time delay estimation (TDE) techniques have been proposed for elastography applications, there is a lack of systematic studies that statistically compare the performance of these techniques. This information could prove to be of great importance for improving currently employed elastographic clinical methods. This study investigates the performance of selected time delay estimators for elastography applications. Time delay estimators based on Generalized Cross Correlation (GCC), Sum of Squared Differences (SSD) and Sum of Absolute Differences (SAD) are proposed and implemented. Within the class of GCC algorithms, we further consider: an FFT-based cross correlation algorithm (GCC-FFT), a hybrid time-domain and frequency-domain cross correlation algorithm with prior estimates (GCC-PE) and an algorithm based on the use of the fractional Fourier transform to compute the cross correlation (GCC-FRFT). Image quality factors of the elastograms obtained using the different TDE techniques are analyzed and the results are compared using standard statistical tools. The results of this research suggest that correlation-based techniques outperform SSD and SAD techniques in terms of SNRe, CNRe, dynamic range and robustness.
The sensitivities of GCC-FFT and SSD were statistically similar and statistically higher than those of all other methods. Within the class of GCC methods, there is no statistically significant difference between the SNRe of GCC-FFT, GCC-PE and GCC-FRFT for most of the strain values considered in this study. However, in terms of CNRe, GCC-FFT and GCC-FRFT were significantly better than the other TDE algorithms. Based on these results, it is concluded that correlation-based algorithms are the most effective in obtaining high quality elastograms.
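The core TDE step (scan candidate lags, maximize correlation or minimize a difference cost) can be shown on a toy 1-D signal. This is a minimal sketch of the general idea, assuming an idealized noiseless shifted echo; it is not the thesis code, and real estimators also use windowing and sub-sample interpolation:

```python
def estimate_delay(pre, post, max_lag, cost="cc"):
    """Estimate the shift of `post` relative to `pre` by scanning
    candidate lags, using either cross-correlation ("cc", maximized)
    or mean squared difference ("ssd", minimized)."""
    best_lag, best_score = 0, None
    for lag in range(-max_lag, max_lag + 1):
        pairs = [(pre[i], post[i + lag]) for i in range(len(pre))
                 if 0 <= i + lag < len(post)]       # overlapping region
        if cost == "cc":
            score = sum(a * b for a, b in pairs)
            better = best_score is None or score > best_score
        else:
            score = sum((a - b) ** 2 for a, b in pairs) / len(pairs)
            better = best_score is None or score < best_score
        if better:
            best_lag, best_score = lag, score
    return best_lag

pre = [0, 0, 1, 3, 1, 0, 0, 0]
post = [0, 0, 0, 0, 1, 3, 1, 0]   # `pre` delayed by 2 samples
print(estimate_delay(pre, post, 3, "cc"),
      estimate_delay(pre, post, 3, "ssd"))  # 2 2
```

On clean data both costs recover the true lag; the study's comparison concerns how they behave under noise and signal decorrelation.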
220

Building a Temporal GIS--Applied to Urban Landuse Zoning Change

Chiou, Jia-Cheng 06 September 2003 (has links)
Various activities and phenomena in the environment, such as transportation, land resource exploitation and oceanic oil pollution, can be analyzed and monitored using GIS. These, however, evolve as time elapses. In order to understand the changes and the implicit meaning in this evolution, we can manage and analyze time-stamped geographic data (spatial-temporal data) in a temporal GIS. It is important to set up a spatial-temporal data model before building a temporal GIS, so the focus of this research is the design of a relational spatial-temporal data model. The most common way of storing the time of geographic data in the relational database of a GIS is to stamp each record with time (tuple-timestamping), so this method is used in this research. The objective of this research is the development of a temporal GIS that makes tracking land use zoning change more convenient. The management and change tracking of the land use zoning data of Kaohsiung city is the case study in this research. The ArcView 8.x software is used to develop the temporal GIS, and the spatial and attribute data are exported to a geodatabase within it. The temporal GIS in this research is developed by building spatial-temporal queries with ArcObjects.
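Tuple-timestamping can be sketched with any relational database: each zoning record carries its own valid-time interval, and a snapshot query filters rows whose interval contains the query time. The sketch below uses SQLite for self-containment; the table and column names are illustrative, not taken from the thesis (which works in ArcView/ArcObjects with a geodatabase):

```python
import sqlite3

# Each row is stamped with a valid-time interval [t_from, t_to).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE zoning (
    parcel_id TEXT, zone TEXT, t_from TEXT, t_to TEXT)""")
conn.executemany(
    "INSERT INTO zoning VALUES (?, ?, ?, ?)",
    [("P-001", "residential", "1995-01-01", "2001-06-30"),
     ("P-001", "commercial",  "2001-06-30", "9999-12-31"),
     ("P-002", "industrial",  "1990-01-01", "9999-12-31")])

# Snapshot query: what was each parcel zoned as on 2000-01-01?
rows = conn.execute(
    """SELECT parcel_id, zone FROM zoning
       WHERE t_from <= ? AND ? < t_to ORDER BY parcel_id""",
    ("2000-01-01", "2000-01-01")).fetchall()
print(rows)  # [('P-001', 'residential'), ('P-002', 'industrial')]
```

Tracking a zoning change then amounts to selecting all rows for a parcel ordered by `t_from`, with no geometry duplicated across versions beyond the changed tuples.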
