  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
281

The developmental course of distance, time, and velocity concepts : a generative connectionist model / Development of distance, time and velocity.

Buckingham, David, 1962- January 1993 (has links)
Two sets of connectionist simulations of children's acquisition of distance (d), time (t), and velocity (v) concepts, using the generative cascade-correlation algorithm (Fahlman & Lebiere, 1990), are reported. Pure-condition simulations represent a situation in which memory demands across the concepts are equal; the limited-memory condition explores the effects of differing memory demands. / It was found that the rules that correlated most highly with network responses during training were consistent with the developmental course of children's concepts (Wilkening, 1981; 1982). Networks integrated the defining dimensions of the concepts first by identity rules (e.g., v = d), then additive rules (e.g., v = d - t), and finally multiplicative rules (e.g., v = d ÷ t). / The results are discussed in terms of similarity to children's development, the effects of memory demands, the contribution of connectionism to cognitive development, and directions for future research. It is argued that cascade-correlation provides an explicit mechanism of developmental change: weight adjustment and hidden unit recruitment.
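The rule diagnosis described above can be sketched without the cascade-correlation machinery itself: generate velocity predictions under each candidate rule and report the one that correlates most highly with a set of responses. A minimal illustration with hypothetical stimuli (not Buckingham's code):

```python
import numpy as np

# Candidate integration rules for velocity judgments, as named in the
# abstract: identity (v = d), additive (v = d - t), multiplicative (v = d/t).
rules = {
    "identity":       lambda d, t: d,
    "additive":       lambda d, t: d - t,
    "multiplicative": lambda d, t: d / t,
}

def best_fitting_rule(d, t, responses):
    """Return the rule whose predictions correlate most highly with the
    observed (or network-generated) responses, plus all correlations."""
    scores = {name: np.corrcoef(rule(d, t), responses)[0, 1]
              for name, rule in rules.items()}
    return max(scores, key=scores.get), scores

# Hypothetical stimuli: a small factorial design of distances and times.
d = np.array([2.0, 4.0, 6.0, 2.0, 4.0, 6.0, 2.0, 4.0, 6.0])
t = np.array([1.0, 1.0, 1.0, 2.0, 2.0, 2.0, 3.0, 3.0, 3.0])

# A mature responder applies v = d / t; the diagnosis should recover that.
name, scores = best_fitting_rule(d, t, d / t)
print(name)  # multiplicative
```

Applied to network outputs at successive training stages, the same comparison traces the identity → additive → multiplicative progression the abstract reports.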
282

Near-contact binary spotting activity : the effect of a common atmosphere / Near contact binary spotting activity

Gritton, Jeffrey A. January 2008 (has links)
In this investigation of near-contact binary stars, the author fit a computer-generated synthetic light-curve model to observations by adjusting various parameters for two near-contact binary pairs, CN Andromedae and TZ Draconis. By fitting asymmetries in the light curves with spotting parameters, the spotting activity of both systems can be determined. From the spotting parameters it is possible to compare the spotting activity of these two near-contact binaries to that of 47 contact binaries (Csizmadia et al., 2004). The author determined that, for both TZ Dra and CN And, spots are located at positions not previously seen in other observations of contact binaries (Hill, 2007). / Department of Physics and Astronomy
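The fitting strategy, adjusting spot parameters until the synthetic curve reproduces the observed asymmetry, can be illustrated with a toy model. This is not the light-curve synthesis code used in the thesis; it stands in a Gaussian dip for the spot and a grid search for the parameter adjustment:

```python
import numpy as np

# Synthetic asymmetric light curve: a hypothetical cool spot depresses the
# flux around one phase. We recover spot longitude and depth by grid search,
# a crude stand-in for adjusting spotting parameters in a synthesis code.
phases = np.linspace(0.0, 1.0, 200)

def spot_model(phase, spot_phase, depth, width=0.08):
    """Baseline flux minus a Gaussian dip centred on the spot longitude."""
    return 1.0 - depth * np.exp(-0.5 * ((phase - spot_phase) / width) ** 2)

rng = np.random.default_rng(3)
observed = spot_model(phases, 0.30, 0.05) + 0.002 * rng.standard_normal(200)

# Grid search over spot phase and depth, minimising squared residuals.
best = min(
    ((sp, d) for sp in np.arange(0.0, 1.0, 0.01)
             for d in np.arange(0.0, 0.1, 0.005)),
    key=lambda p: np.sum((observed - spot_model(phases, *p)) ** 2),
)
print(best)
```

The recovered parameters land on (or next to) the injected spot phase 0.30 and depth 0.05, which is the sense in which light-curve asymmetries constrain spotting activity.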
283

A Monte Carlo simulation of the EEMC detector located in the STAR experiment at RHIC / Monte Carlo simulation of the endcap electromagnetic calorimeter detector located in the solenoidal tracker at RHIC experiment at Relativistic Heavy Ion Collider

McClain, Christopher J. January 2005 (has links)
A Monte Carlo simulation program of the response of the Endcap Electromagnetic Calorimeter (EEMC) and Shower Maximum Detector (SMD) was developed to determine the ability of the detectors to provide γ/π⁰ discrimination and to calculate the effects of crosstalk between readout channels of the multi-anode photomultiplier tubes (MAPMTs). The importance of this discrimination is that it allows a better measurement of the direct-photon asymmetries, which are then used to calculate the gluon contribution to the proton spin structure. These measurements arise from polarized-proton collisions provided by the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory and are detected using the Solenoidal Tracker at RHIC (STAR), which includes the EEMC and SMD. In order to obtain accurate asymmetry measurements, the photons resulting from π⁰ decay must be identified through pion-mass reconstruction to avoid mistaking them for direct photons. This Monte Carlo simulation and reconstruction algorithm successfully identified 60% of the pions from single-pion events and 40% of the pions from two-pion events. The effects of MAPMT crosstalk, as determined by the Monte Carlo, were less than 2% on π⁰ identification and were therefore determined to be insignificant. / Department of Physics and Astronomy
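The pion-mass reconstruction at the heart of the γ/π⁰ discrimination rests on a standard kinematic identity: the invariant mass of a photon pair is m² = 2·E₁·E₂·(1 − cos θ). A minimal sketch (the mass window and energies are illustrative, not the thesis cuts):

```python
import math

PI0_MASS = 134.977  # MeV/c^2, the neutral pion mass

def diphoton_invariant_mass(e1, e2, opening_angle):
    """Invariant mass of a massless photon pair:
    m^2 = 2 * E1 * E2 * (1 - cos(theta))."""
    return math.sqrt(2.0 * e1 * e2 * (1.0 - math.cos(opening_angle)))

def is_pi0_candidate(e1, e2, opening_angle, window=20.0):
    """Tag photon pairs reconstructing near the pi0 mass as decay photons
    rather than direct photons (window in MeV is a hypothetical cut)."""
    return abs(diphoton_invariant_mass(e1, e2, opening_angle) - PI0_MASS) < window

# Symmetric decay of a 1 GeV pi0: each photon carries 500 MeV, and the
# opening angle satisfies cos(theta) = 1 - m^2 / (2 * E1 * E2).
e1 = e2 = 500.0
theta = math.acos(1.0 - PI0_MASS**2 / (2.0 * e1 * e2))
print(is_pi0_candidate(e1, e2, theta))  # True
```

Photon pairs whose reconstructed mass falls in the π⁰ window are removed from the direct-photon sample, which is why the single-pion and two-pion identification efficiencies quoted above matter for the asymmetry measurement.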
284

Use of artificial neural networks for modelling multivariate water quality time series / by Holger Robert Maier.

Maier, Holger R. January 1995 (has links)
Corrigenda attached to back end paper. / Bibliography: p. 526-559. / xxx, 559 p. : ill. ; 30 cm. / Title page, contents and abstract only. The complete thesis in print form is available from the University Library. / This research analyses the suitability of back-propagation artificial neural networks (ANNs) for modelling multivariate water quality time series. The ANNs are successfully applied to two case studies in the River Murray, Australia: the long-term forecasting of salinity and the modelling of blue-green algae. / Thesis (Ph.D.)--University of Adelaide, Dept. of Civil and Environmental Engineering, 1996?
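The core technique, a back-propagation ANN forecasting the next value of a time series from lagged values, can be sketched in a few lines of NumPy. This is a generic illustration, not Maier's network or data; the sine series stands in for a water quality record:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical univariate series standing in for a water quality record;
# lagged values are the network inputs, the next value is the target.
series = np.sin(np.linspace(0, 20, 200)) + 0.05 * rng.standard_normal(200)
lags = 4
X = np.stack([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:].reshape(-1, 1)

# One hidden layer of tanh units, trained by plain back-propagation.
W1 = 0.5 * rng.standard_normal((lags, 8)); b1 = np.zeros(8)
W2 = 0.5 * rng.standard_normal((8, 1));    b2 = np.zeros(1)
lr = 0.05
for epoch in range(3000):
    h = np.tanh(X @ W1 + b1)            # forward pass
    pred = h @ W2 + b2
    err = pred - y                      # gradient of squared error
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)      # back-propagate through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(round(mse, 4))
```

The multivariate case in the thesis extends the input vector with lags of several water quality variables; the training loop is unchanged.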
285

Estimation of the spatio-temporal heterogeneity of rainfall and its importance towards robust catchment simulation, within a hydroinformatic environment

Umakhanthan, Kanagaratnam, Civil & Environmental Engineering, Faculty of Engineering, UNSW January 2002 (has links)
Rainfall is a natural process with a high degree of variability in both space and time. Information on the spatial and temporal variability of rainfall plays an important role in the process of surface runoff generation, and hence is important for a variety of applications in hydrology and water resources management. The spatial variability of rainfall can be substantial even for very small catchments and is an important factor in the reliability of rainfall-runoff simulations. Catchments in urban areas usually are small, and their management problems often require the numerical simulation of catchment processes, and hence consideration of the spatial and temporal variability of rainfall. A need exists, therefore, to analyse the sensitivity of the rainfall-runoff behaviour of catchment modelling systems (CMS) to imperfect knowledge of the rainfall input, in order to judge whether or not they are reliable and robust, especially if they are to be used for operational purposes. The focus of this project has been the development of a methodology for classifying storm events according to their degree of heterogeneity in space and time, and thence the development of a detailed spatial and temporal rainfall model within a hydroinformatic environment utilising real-time data. The improvement in runoff prediction accuracy, and hence the importance of the rainfall input model in runoff prediction, is then demonstrated through the application of a CMS to real storm events of differing variability on catchments of differing scale. The study identified both spatial and temporal semi-variograms, produced by plotting the semi-variance of gauge records against distance and time respectively. These semi-variograms were used to introduce estimators of the degree of heterogeneity of each individual storm event in space and time.
The proposed estimators use ground-based gauge records of real storm events and do not rely on delicate meteorological interpretations. From the investigation of the developed semi-variogram approach, real storm events were categorised as having High Spatial-High Temporal (HS-HT), High Spatial-Low Temporal (HS-LT), Low Spatial-High Temporal (LS-HT), or Low Spatial-Low Temporal variability. A comparatively detailed rainfall distribution model in space and time was developed within a Geographical Information System (GIS); the enhanced representation of rainfall in both space and time was made feasible by the powerful spatial analysis capability of GIS. The basis of this rainfall model is an extension of the model developed by Luk and Ball (1998) through a temporal discretisation of the storm event. From this model, improved estimates of spatially distributed hyetographs with smaller time steps, especially suited to urban catchments, could be obtained. The importance of the detailed space-time rainfall model in improving the robustness of CMS runoff prediction was investigated by comparing error parameters of predictions from a CMS using alternative rainfall models, for events of various degrees of spatio-temporal heterogeneity. It was also appropriate to investigate whether the degree of this improvement depends on the variability of the storm event as assessed by the adopted semi-variogram approach. From the investigations made, it was found that the spline-surface rainfall model, which considered the spatial and temporal variability of rainfall in greater detail than the Thiessen rainfall model, resulted in predicted hydrographs that more closely duplicated the recorded hydrograph for the same parameter set.
The degree of this improvement in the predicted hydrograph was found to depend on the spatial and temporal variability of the storm event as measured by the proposed semi-variogram approach. The analysis is based on forty real events recorded in the Centennial Park Catchment (1.3 km²) and the Upper Parramatta River Catchment (110 km²) in Sydney, Australia. These two case study catchments were selected to ensure that catchment scale effects were incorporated in the conclusions developed during the study.
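The spatial half of the heterogeneity estimator rests on the classical empirical semi-variogram, γ(h) = (1 / 2|N(h)|) · Σ (zᵢ − zⱼ)² over gauge pairs separated by roughly distance h. A minimal sketch with a hypothetical gauge network (the thesis estimators and data are more elaborate):

```python
import numpy as np

def empirical_semivariogram(coords, values, bins):
    """Classical estimator: for each distance bin, half the mean squared
    difference of values over all gauge pairs falling in that bin."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    n = len(values)
    dists, sqdiffs = [], []
    for i in range(n):
        for j in range(i + 1, n):
            dists.append(np.linalg.norm(coords[i] - coords[j]))
            sqdiffs.append((values[i] - values[j]) ** 2)
    dists, sqdiffs = np.array(dists), np.array(sqdiffs)
    gamma = []
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (dists >= lo) & (dists < hi)
        gamma.append(0.5 * sqdiffs[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

# Hypothetical gauge network: two clusters with different storm depths, so
# semi-variance grows with separation, signalling spatial heterogeneity.
coords = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)]
depths = [10.0, 11.0, 10.5, 22.0, 23.0, 21.5]
gamma = empirical_semivariogram(coords, depths, bins=[0.0, 2.0, 10.0])
print(gamma[1] > gamma[0])  # True: distant pairs differ more
```

Replacing distance with inter-record time lag gives the temporal semi-variogram, and the slope of each curve is what the HS/LS and HT/LT classification summarises.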
287

Risk analysis assessment of the influence of geological factors on exploration and mining investment alternatives : development of a microcomputer simulation model

Mallinson, Clyde A January 1987 (has links)
A microcomputer risk analysis model is developed and used in an exploration-delineation simulation to analyse, in particular, the effects of geological factors on exploration and mining investment alternatives. Analysis of the results indicates that geological parameters can have profound effects on such investment alternatives and that the role of the geologist in determining and evaluating the significance of the various geological factors is critical. Simulation examples highlight some of the key geological parameters and show how changes in these parameters influence both the expected mean results and the standard deviations of those means. The risk analysis model provides an ideal means of conveying the importance of the different geological factors on exploration-delineation-mining investment alternatives and may be used as a geological education aid.
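The risk-analysis approach described, drawing geological parameters from probability distributions and reading off the mean and spread of the resulting outcomes, is Monte Carlo simulation. A minimal sketch with entirely hypothetical distributions and costs (the thesis model's parameters are not reproduced here):

```python
import random
import statistics

random.seed(42)

def simulate_project():
    """One Monte Carlo trial of a hypothetical exploration-delineation
    venture: geological factors (discovery, grade, tonnage) are drawn at
    random and combined into a net value in $M. Illustrative numbers only."""
    discovered = random.random() < 0.3          # chance exploration succeeds
    if not discovered:
        return -5.0                             # sunk exploration cost
    grade = random.gauss(2.0, 0.5)              # g/t, geological uncertainty
    tonnage = random.lognormvariate(2.0, 0.4)   # Mt, skewed as deposits are
    revenue = max(grade, 0.0) * tonnage * 1.5   # crude value factor
    return revenue - 5.0 - 10.0                 # exploration + mining capex

trials = [simulate_project() for _ in range(10_000)]
mean = statistics.mean(trials)
stdev = statistics.stdev(trials)
print(f"expected value: {mean:.1f} $M, std dev: {stdev:.1f} $M")
```

Varying one geological distribution at a time and re-running shows how a single parameter shifts both the expected mean and the standard deviation, which is the comparison the simulation examples in the thesis turn on.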
288

CFD Heat Transfer Simulation of the Human Upper Respiratory Tract for Oronasal Breathing Condition

Srinivasan, Raghavan January 2011 (has links)
In this thesis, a three-dimensional heat transfer model of heated airflow through the upper human respiratory tract, consisting of the nasal and oral cavities, trachea, and the first two generations of bronchi, is developed using computational fluid dynamics (CFD) simulation software. Various studies in the literature have investigated the heat and mass transfer characteristics of the upper human respiratory tract; this study focuses on assessing the injury taking place in the upper respiratory tract and identifying acute tissue damage based on the level of exposure. The model considers simultaneous oronasal breathing during the inspiration phase with a high volumetric flow rate of 90 liters/minute and a surrounding air temperature of 100 degrees centigrade. Studying the heat and mass transfer, aerosol deposition, and flow characteristics of the upper human respiratory tract with CFD simulation requires access to a two- or three-dimensional model of the respiratory tract. Depicting an exact model is a complex task, since it involves the prolonged use of imaging devices on the human body. Hence a three-dimensional geometric representation of the human upper respiratory tract is developed, consisting of the nasal cavity, oral cavity, nasopharynx, pharynx, oropharynx, trachea, and the first two generations of the bronchi. The respiratory tract is modeled as circular in cross-section, with varying diameters for the various portions identified in this study; the dimensions are referenced from the literature. Based on these dimensions, a simplified model representing the human upper respiratory tract is generated. This model will be useful in studying flow characteristics and could assist in the treatment of injuries to the human respiratory tract, as well as help optimize drug delivery mechanisms and dosages. 
A methodology is also proposed to measure the characteristic dimensions of the human nasal and oral cavities at the inlet/outlet points, which are classified as internal measurements.
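For a circular-section airway model, the stated breathing condition translates directly into inlet boundary values: mean velocity from the volumetric flow and cross-sectional area, and a Reynolds number from hot-air properties. A back-of-envelope sketch; the 1.8 cm diameter and the air properties at ~100 °C are illustrative assumptions, not the thesis dimensions:

```python
import math

# Approximate air properties near the hot inhaled condition (~100 C).
RHO = 0.95     # kg/m^3, air density
MU = 2.17e-5   # Pa*s, dynamic viscosity

def duct_flow(volumetric_lpm, diameter_m):
    """Mean velocity (m/s) and Reynolds number for flow through a circular
    cross-section, as needed when assigning inlet conditions to a
    simplified circular-section airway model."""
    q = volumetric_lpm / 1000.0 / 60.0       # L/min -> m^3/s
    area = math.pi * diameter_m**2 / 4.0
    v = q / area
    re = RHO * v * diameter_m / MU
    return v, re

# The abstract's 90 L/min oronasal inflow through a hypothetical 1.8 cm
# tracheal section.
v, re = duct_flow(90.0, 0.018)
print(f"mean velocity: {v:.2f} m/s, Reynolds number: {re:.0f}")
```

A Reynolds number in the thousands at the trachea indicates transitional-to-turbulent flow, which is why a CFD treatment rather than a laminar analytical one is needed for this breathing condition.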
289

An Evolutionary Simulation of the Tragedy of the Commons

Oosterhout, Gretchen 01 January 1996 (has links)
In his seminal essay, "The Tragedy of the Commons" (1968), Garrett Hardin argued that unless human population growth is controlled, the tragedy of common resource destruction is inevitable. This research consists of the development of an evolutionary computer model to simulate the Tragedy of the Commons, and social and economic solutions that have been proposed. In the simulations, multiattribute decision models are used to represent the tradeoffs a variety of types of individuals make among economic and social values in an uncertain environment. Individuals in each iteration of the simulation decide whether or not to exploit a common resource that has a stochastic regeneration rate. A genetic algorithm is used to simulate the way the decision makers respond to economic and social payoffs that result from their choices, as the commons responds to their actions over time. Game theory analyses of the commons dilemma are also included that, in contrast to previous analyses of the Tragedy of the Commons, incorporate not only economic attributes, but social and aesthetic attributes as well. These analyses indicate that the games underlying the Tragedy of the Commons may be similar to not only the N-person Prisoner's Dilemma, as is sometimes argued, but also N-person games of Chicken, Benevolent Chicken, and Hero. Population diversity is found to be particularly important to solutions in both the evolutionary simulations and the game theory analyses. The simulations and analyses support the hypothesis that, even if potential solutions that Hardin dismissed as unrealistic in the real world are given an opportunity to work in a simulated computer world, Hardin is right: for any given commons regeneration rate, the ultimate destruction of the commons can be prevented only by draconian economic or political measures, unrealistic rates of technological innovation or changes in social values, or coercive control of population growth.
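The evolutionary core of such a simulation, agents whose harvesting strategies are copied in proportion to payoff while a commons regenerates stochastically, can be sketched compactly. This is only the genetic-algorithm skeleton with hypothetical payoffs, not Oosterhout's multiattribute decision models:

```python
import random

random.seed(1)

def run_commons(generations=200, pop=50, regen=1.05, capacity=1000.0):
    """Each agent carries an exploitation probability (its 'gene').
    Exploiters harvest from a stochastically regenerating commons, and
    high-payoff genes are copied with small Gaussian mutation."""
    genes = [random.random() for _ in range(pop)]
    resource = capacity
    for _ in range(generations):
        payoffs = []
        for g in genes:
            take = min(2.0, resource) if random.random() < g else 0.0
            resource -= take
            payoffs.append(take + 0.5)   # 0.5 = baseline social payoff
        # stochastic regeneration rate, capped at carrying capacity
        rate = random.uniform(regen - 0.05, regen + 0.05)
        resource = min(capacity, resource * rate)
        # fitness-proportional reproduction with mutation
        genes = [min(1.0, max(0.0, random.choices(genes, weights=payoffs)[0]
                              + random.gauss(0.0, 0.02)))
                 for _ in range(pop)]
    return resource, sum(genes) / pop

resource, mean_gene = run_commons()
print(f"remaining commons: {resource:.1f}, mean exploitation gene: {mean_gene:.2f}")
```

Because exploiting always pays more than abstaining in this payoff structure, selection drives the mean exploitation gene upward and the commons downward, the dynamic Hardin predicted and the thesis simulations probe with richer payoffs.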
290

Freeway Control Via Ramp Metering: Development of a Basic Building Block for an On-Ramp, Discrete, Stochastic, Mesoscopic, Simulation Model within a Contextual Systems Approach

Alkadri, Mohamed Yaser 01 January 1991 (has links)
One of the most effective measures of congestion control on freeways has been ramp metering, in which vehicle entry to the freeway is regulated by traffic signals (meters). Meters are run with calibrated influx rates to prevent highway saturation. However, recent observations of some metering sites in San Diego, CA, indicate that metering during peak-hour demand helps freeway flow while sometimes creating considerable traffic back-ups on local streets, transferring congestion problems from the freeway to intersections. Metering problems stem largely from the difficulty of designing an integrated, dynamic metering scheme that responds not only to changing freeway conditions but also to fluctuating demand throughout the ramp network; a scheme whose objective is to maintain adequate freeway throughput while minimizing disproportionate ramp delays and queue overspills onto surface streets. Simulation modeling is a versatile, convenient, relatively inexpensive, and safe systems-analysis tool for evaluating alternative strategies to achieve this objective. The objective of this research was to establish a basic building block for a discrete system simulation model, ONRAMP, based on a stochastic, mesoscopic, queueing approach. ONRAMP models entrance ramp geometry, vehicular generation, platooning and arrivals, queueing activities, meters, and metering rates. The architecture of ONRAMP's molecular unit is designed so that it can, with some model calibration, be duplicated for a number of ramps and, if necessary, integrated into other, larger freeway network models. The SLAM II simulation language is used for the computer implementation. ONRAMP has been developed and partly validated using data from eight ramps on Interstate 8 in San Diego. From a systems perspective, simulation will be short-sighted and problem analysis incomplete unless the other, non-technical metering problems are explored and considered. 
These problems include the impacts of signalizing entrance ramps on the vitality of adjacent intersections, land use and development, the "fair" geographic distribution of meters and metering rates throughout the freeway corridor, public acceptance and enforcement, and the role and influence of the organizations in charge of decision making in this regard. Therefore, an outline of a contextual systems approach to problem analysis is suggested. Benefits and problems of freeway control via ramp metering, both operational short-term and strategic long-term, are discussed in two dimensions: global (freeway) and local (intersection). The results of a pilot study, which includes interviews with field experts and law enforcement officials and a small motorist survey, are presented.
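The queue-overspill mechanism the abstract describes falls out of a very small discrete stochastic model: random arrivals, a meter releasing one vehicle per fixed headway, and a queue absorbing the difference. A minimal single-ramp sketch (the arrival rate and headway are hypothetical; ONRAMP itself also models platooning and ramp geometry in SLAM II):

```python
import random

random.seed(7)

def simulate_ramp(arrival_rate=0.20, metering_headway=5, horizon=3600):
    """Discrete-time, one-second steps over one hour: Bernoulli arrivals
    approximating a Poisson stream (veh/s), a meter releasing one vehicle
    every `metering_headway` seconds, and a queue tracking the excess."""
    queue, max_queue, released = 0, 0, 0
    for t in range(horizon):
        if random.random() < arrival_rate:   # a vehicle joins the ramp
            queue += 1
        if t % metering_headway == 0 and queue > 0:
            queue -= 1                       # meter green: one vehicle enters
            released += 1
        max_queue = max(max_queue, queue)
    return queue, max_queue, released

queue, max_queue, released = simulate_ramp()
print(f"final queue: {queue}, peak queue: {max_queue}, released: {released}")
```

With the demand (0.20 veh/s = 720 veh/h) matched to the metering capacity (one vehicle per 5 s = 720 veh/h), the queue behaves as a random walk and can still grow long, illustrating why a fixed rate tuned only to freeway throughput can spill queues onto surface streets.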
