  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
191

Field verification of stream-aquifer interactions: Colorado School of Mines survey field, Golden, Colorado

Anderman, Evan R. Poeter, Eileen P. January 1993 (has links) (PDF)
Thesis (M. Eng.)--Colorado School of Mines, 1993. / Thesis advisor: Eileen Poeter, Dept. of Geology and Geological Engineering. Includes bibliographical references (leaves 93-97). Also available in print version.
192

The influence of contemporary forest harvesting on summer stream temperatures in headwater streams of Hinkle Creek, Oregon

Kibler, Kelly Maren. January 2008 (has links)
Thesis (M.S.)--Oregon State University, 2008. / Printout. Includes bibliographical references (leaves 83-90). Also available on the World Wide Web.
193

Recruitment and abundance of large woody debris in an Oregon coastal stream system

Long, Barry A. January 1987 (has links)
Thesis (M.S.)--Oregon State University, 1987. / Typescript (photocopy). Includes bibliographical references (leaves 60-65). Also available on the World Wide Web.
194

Contrasting Chemical Response to Experimental Acidification of Five Acid-sensitive Streams

Goss, Heather Vanessa January 2006 (has links) (PDF)
No description available.
195

An investigation into the contribution of the low-level jet (LLJ) to the available wind resource in Missouri

Koleiny, Ali. Fox, Neil I. January 2009 (has links)
The entire thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file; a non-technical public abstract appears in the public.pdf file. Title from PDF of title page (University of Missouri--Columbia, viewed on November 18, 2009). Thesis advisor: Dr. Neil I. Fox. Includes bibliographical references.
196

A comparative study of the simulation of daily streamflow sequences

Thambirajah, Percy Anandarajah January 1973 (has links)
Using three years of daily streamflow and meteorological data from the Similkameen watershed at Princeton, B.C., the model parameters of the existing deterministic UBC Budget Model are evaluated. With these parameters and the available meteorological data, synthetic streamflow sequences are generated for the other seven years for the Similkameen watershed and compared with the actual flows. A separate stochastic model is developed using spectral analysis: the three years of daily flows are decomposed into 30 sub-harmonics, or Fourier coefficients. By interpolating the Fourier coefficients and estimating the anticipated mean annual flows from snowpack data at Blackwall Peak, synthetic traces of the daily streamflow sequences are simulated for the other seven years. A first-order Markovian model represents the random component. A comparative study is then carried out between the actual daily streamflow sequences and those generated by the deterministic UBC Budget Model and the stochastic spectral model. In comparison with the stochastic spectral model, good fits are obtained with the fixed parameters of the UBC Budget Model for the sequence of peaks in the simulated hydrographs of the intervening years. Because the winter melt factor in the UBC Budget Model was assumed constant for this analysis, some errors occur between the actual and generated cumulative volumes. With the deterministic periodic component of the spectral model, agreement between the cumulative volumes is fairly well maintained. Since operational hydrology is not concerned with predicting actual flows, the stochastic spectral model should be judged on its ability to present the designer with a series of synthetic traces likely to occur during the lifetime of a particular project. / Applied Science, Faculty of / Civil Engineering, Department of / Graduate
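The spectral model this abstract describes, a deterministic periodic component built from Fourier harmonics plus a first-order Markov residual, can be sketched in a few lines. The coefficient values, the AR parameter `phi`, and the noise level below are illustrative placeholders, not the thesis's calibrated values:

```python
import numpy as np

def synthetic_daily_flow(fourier_coeffs, mean_annual_flow, phi=0.7,
                         noise_std=1.0, days=365, seed=0):
    """Sketch of a spectral streamflow generator: a periodic component
    from Fourier harmonics, plus a first-order Markov (AR(1)) residual."""
    rng = np.random.default_rng(seed)
    t = np.arange(days)
    # Periodic component: harmonics superposed on the anticipated mean flow.
    periodic = np.full(days, float(mean_annual_flow))
    for k, (a_k, b_k) in enumerate(fourier_coeffs, start=1):
        periodic += a_k * np.cos(2 * np.pi * k * t / days) \
                  + b_k * np.sin(2 * np.pi * k * t / days)
    # Random component: e[t] = phi * e[t-1] + white noise.
    resid = np.zeros(days)
    for i in range(1, days):
        resid[i] = phi * resid[i - 1] + rng.normal(0.0, noise_std)
    return np.clip(periodic + resid, 0.0, None)  # flows cannot be negative

flows = synthetic_daily_flow([(5.0, 2.0), (1.5, -0.5)], mean_annual_flow=20.0)
```

Interpolating the Fourier coefficients between calibration years, as the abstract describes, would amount to varying `fourier_coeffs` and `mean_annual_flow` per simulated year.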
197

Development of a research watershed system and a streamflow prediction model

Kennedy, Gary Franklin January 1969 (has links)
Two independent hydrologic research projects, the development of (1) a research watershed system and (2) a streamflow prediction model, were carried out. The first project was primarily a field instrumentation task involving both the design and the implementation of a system of research watersheds. Two small (50 acre) research watersheds, which may become either representative or experimental in nature, were initiated within the University of British Columbia Research Forest. A larger research watershed system was described which could include the Alouette River Watershed. This system of watersheds, when subjected to more rigorous experimental procedures, should yield valuable management and conservation design criteria for Pacific Coast forested regions. The second project was primarily analytic in nature, employing multiple regression and a digital computer. A computer program was developed which models the snowmelt streamflow of large watersheds in a manner that makes short-term prediction of the streamflow possible. The prediction variables were temperature recorded at a single centrally located station, time, and streamflow recorded at the outlet from the watershed. The model predicted flood flow one to five days in advance of measured streamflow for the Fraser River Watershed (78,000 square miles in area) during the spring runoff periods of 1955 and 1964. The model required calibration at the beginning of each spring runoff period. / Science, Faculty of / Resources, Environment and Sustainability (IRES), Institute for / Graduate
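The regression scheme in this abstract can be caricatured with ordinary least squares. The predictor layout (station temperature, day index, current flow) follows the abstract, but the function names and the one-day lag are assumptions for illustration, not the thesis's actual formulation:

```python
import numpy as np

def fit_flow_predictor(temps, flows, lag=1):
    """Least-squares fit of a linear model predicting streamflow `lag`
    days ahead from temperature at a single station, the day index,
    and the current flow at the watershed outlet."""
    n = len(flows) - lag
    X = np.column_stack([np.ones(n), temps[:n], np.arange(n), flows[:n]])
    coeffs, *_ = np.linalg.lstsq(X, flows[lag:], rcond=None)
    return coeffs

def predict_flow(coeffs, temp, day, flow):
    """One-step-ahead prediction from the fitted coefficients."""
    return coeffs @ np.array([1.0, temp, day, flow])
```

Recalibrating at the start of each runoff season, as the abstract notes, would mean refitting `coeffs` on that season's early observations.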
198

Extending AdaBoost: Varying the Base Learners and Modifying the Weight Calculation

Neves de Souza, Erico January 2014 (has links)
AdaBoost has been considered one of the best classifiers ever developed, but two important problems have not yet been addressed. The first is the dependency on the "weak" learner, and the second is the failure to maintain the performance of learners with small error rates (i.e. "strong" learners). To solve the first problem, this work proposes using a different learner in each iteration, known as AdaBoost Dynamic (AD), thereby ensuring that the performance of the algorithm is almost equal to that of the best "weak" learner executed with AdaBoost.M1. The work then further modifies the procedure to vary the learner in each iteration in order to locate the learner with the smallest error rate on its training data, using the same weight calculation as in the original AdaBoost; this version is known as AdaBoost Dynamic with Exponential Loss (AB-EL). The results were poor, because AdaBoost does not perform well with strong learners, so in this sense the work confirmed previous results. To improve performance, the weight calculation is modified to use the sigmoid function, with the algorithm output being the derivative of that sigmoid, rather than the logistic-regression weight calculation originally used by AdaBoost; this version is known as AdaBoost Dynamic with Logistic Loss (AB-DL). This work presents a convergence proof that the binomial weight calculation works, and shows both theoretically and empirically that this approach improves the results for strong learners. AB-DL also has some disadvantages, such as the cost of searching for the "best" classifier and the reduction in diversity among the classifiers that this search causes. To address these issues, another algorithm is proposed that combines AD's "weak"-learner execution policy with a small modification of AB-DL's weight calculation, called AdaBoost Dynamic with Added Cost (AD-AC). AD-AC also has a theoretical upper bound on its error, and offers a small accuracy improvement over AB-DL and traditional AdaBoost approaches. Lastly, this work adapts AD-AC's weight calculation to the data-stream setting, where classifiers must handle very large data sets (on the order of millions of instances) with limited memory.
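The core "dynamic" idea, evaluating a pool of base learners each round and keeping the one with the smallest weighted error under the standard AdaBoost.M1 exponential-loss update, can be sketched as follows. The stump pool and all function names are illustrative, not the thesis's implementation:

```python
import numpy as np

def stump_factory(feature, threshold):
    """A trivial base learner: predicts +1/-1 by thresholding one feature."""
    def predict(X):
        return np.where(X[:, feature] > threshold, 1, -1)
    return predict

def adaboost_dynamic(X, y, learner_pool, rounds=10):
    """Each round, pick the pool member with the smallest weighted error,
    then apply the standard AdaBoost.M1 (exponential-loss) weight update."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        errs = [(w * (h(X) != y)).sum() for h in learner_pool]
        best = int(np.argmin(errs))
        err = max(errs[best], 1e-12)
        if err >= 0.5:
            break  # no pool member beats chance on this weighting
        alpha = 0.5 * np.log((1 - err) / err)
        pred = learner_pool[best](X)
        w *= np.exp(-alpha * y * pred)   # up-weight misclassified points
        w /= w.sum()
        ensemble.append((alpha, learner_pool[best]))
    return ensemble

def ensemble_predict(ensemble, X):
    score = sum(alpha * h(X) for alpha, h in ensemble)
    return np.where(score >= 0, 1, -1)
```

Swapping the exponential update for a sigmoid-derived one, as in AB-DL, would change only the two weight-update lines.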
199

Incipient motion of boulders in open channel flow

Stols, Kevin January 2018 (has links)
A research report submitted to the Faculty of Engineering and the Built Environment, University of the Witwatersrand, Johannesburg, in partial fulfilment of the requirements for the degree of Master of Science in Engineering. Johannesburg, 2018 / The use of boulders to create habitat heterogeneity is important for aquatic diversity, and being able to predict the stability of a boulder placed in a river will aid in sizing it. Identifying ways to increase that stability could save the costs associated with over-design or with replacement after the boulder washes away. Existing research on incipient motion centres on determining threshold conditions for bed material, or for protruding elements within a bed surface, in relation to factors such as particle shape, particle size, the depth of the particle relative to the flow depth, and channel slope. This research is limited to bed material of a similar size; there is no research on the conditions for incipient motion of elements that are relatively large compared to the bed material they rest on. An idealised flume study was performed to identify the trends that several factors have on boulder stability, and to verify the predictions of a pivoting-analysis model for a spherical boulder. An additional study was performed to obtain drag coefficients suitable for spherical boulders either embedded into the bed material or simply resting on top of it. The results of the drag experiments were mixed: only the results for the non-embedded boulders were suitable to integrate into the model predictions, while drag coefficients for the embedded boulders had to be taken from previously published results. The flume study provided good confirmation of the model predictions, with an average absolute experimental error of 4%. The trends identified in the flume study show that the most effective way to improve a boulder's stability is to embed it into the bed material; this is more effective than increasing the size of the boulder. / MT 2018
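A pivoting analysis of the kind this abstract mentions balances the drag moment against the submerged-weight moment about the downstream contact point. A rough sketch for a spherical boulder follows; the drag coefficient and pivot angle are illustrative defaults, not the thesis's measured values:

```python
import math

def critical_velocity(d, c_d=0.5, rho_s=2650.0, rho=1000.0,
                      pivot_angle_deg=30.0, g=9.81):
    """Rough pivoting-analysis estimate of the flow velocity (m/s) at
    which a spherical boulder of diameter d (m) begins to move: motion
    starts when the drag moment overcomes the submerged-weight moment
    about the downstream pivot."""
    volume = math.pi * d**3 / 6.0
    area = math.pi * d**2 / 4.0           # frontal area exposed to flow
    submerged_weight = (rho_s - rho) * g * volume
    # Incipient motion: 0.5 * rho * c_d * area * u^2 = W' * tan(pivot)
    u_crit = math.sqrt(2.0 * submerged_weight *
                       math.tan(math.radians(pivot_angle_deg)) /
                       (rho * c_d * area))
    return u_crit
```

Because volume scales with d^3 and frontal area with d^2, this balance gives a critical velocity growing only with the square root of diameter, which is consistent with the abstract's finding that embedding (effectively raising the resisting moment) outperforms simply enlarging the boulder.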
200

Efficient Distributed Processing Over Micro-batched Data Streams

Ahmed Abdelhamid (10539053) 07 May 2021 (has links)
<div><div><div><p>Advances in real-world applications require high-throughput processing over large data streams. Micro-batching is a promising computational model to support the needs of these applications. In micro-batching, the processing and batching of the data are interleaved: the incoming data tuples are first buffered as data blocks and then processed collectively using parallel function constructs (e.g., Map-Reduce). The size of a micro-batch is set to guarantee a certain response-time latency that conforms to the application’s service-level agreement. Compared to native tuple-at-a-time data stream processing, micro-batching can sustain higher data rates. However, existing micro-batch stream processing systems lack the load-awareness optimizations necessary to maintain performance and enhance resource utilization. In this thesis, we investigate the micro-batching paradigm and pinpoint some of its design principles that can benefit from further optimization. A new data partitioning scheme termed Prompt is presented that leverages the characteristics of the micro-batch processing model. Prompt enables a balanced input to the batching and processing cycles of the micro-batching model, achieving higher throughput with better resource utilization. Moreover, Prompt+ is proposed to enforce latency by elastically adapting resource consumption to workload changes. More specifically, Prompt+ employs a scheduling strategy that supports elasticity in response to workload changes while avoiding rescheduling bottlenecks. In addition, we envision the use of deep reinforcement learning to efficiently partition data in distributed streaming systems. PartLy demonstrates the use of artificial neural networks to facilitate the learning of efficient partitioning policies that match the dynamic nature of streaming workloads. Finally, all the proposed techniques are abstracted and generalized over three widely used stream processing engines. Experimental results using real and synthetic data sets demonstrate that the proposed techniques are robust against fluctuations in data distribution and arrival rates, and achieve up to 5x improvement in system throughput over state-of-the-art techniques without degradation in latency.</p></div></div></div>
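The batching-then-processing interleave that micro-batching rests on reduces to a few lines. This is a hedged sketch of the general model only, not Prompt's partitioning scheme or any particular engine's API:

```python
def micro_batch(stream, process, batch_size=4):
    """Minimal micro-batching loop: incoming tuples are buffered into
    fixed-size blocks, and each block is then processed collectively
    (a stand-in for a parallel Map-Reduce construct)."""
    buffer = []
    for item in stream:
        buffer.append(item)
        if len(buffer) >= batch_size:
            yield process(buffer)   # process the full block at once
            buffer = []
    if buffer:                      # flush the final partial batch
        yield process(buffer)

# Summing each block of a ten-tuple stream in batches of four:
results = list(micro_batch(range(10), sum, batch_size=4))
# → [6, 22, 17]
```

In a real engine, `batch_size` (or a time-based trigger) is what gets tuned against the service-level latency target, and a load-aware partitioner like Prompt would decide which worker receives each block.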
