  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Theoretical foundations of operational research

Bryer, R. A. January 1977 (has links)
The conclusions of Parts One and Two complement and reinforce each other. After outlining the ideals of OR, I set out in Part One to find and scrutinize the philosophical foundations upon which some leading operational researchers have claimed that these ideals could be implemented. In chapters 2, 3, and 4 I argue that adopting (respectively) the positivist, conventionalist, and/or idealist philosophies as the theoretical foundations upon which to build an adequate theory of inquiry for the purposes of OR would force it to abandon its ideals. These philosophies are interpreted as attempts on the part of academic operational researchers to stave off the open-ended ambiguity and anarchy of inquiry which an unqualified interpretation of OR's ideals could engender. These attempts to give substance to the ideals of OR all exert a strong bias against raising questions about the nature of the subject-matter with which OR deals, and it is largely on these grounds that they are rejected in chapter 5, because of the implications this bias has for the ideals of OR. One conclusion of Part One is that OR needs protection from such philosophies, and that a realist-type alternative at least provides this. I conclude by raising the doubt whether philosophy can provide much more to OR. The other major conclusion is that OR needs to understand its subject-matter before it can reasonably hope to implement its ideals. Given the general bias found in Part One against seriously considering the subject-matter of OR, we enter Part Two with some trepidation. Notwithstanding the philosophical bias against it, it is clear that OR must have a conception of the nature of its subject-matter. However, OR's ideals can just as easily be lost by inadequate attention to this task. In Part Two the biases discovered in Part One come home to roost.
The first attempt to provide the ideals of OR with a substance on the basis of which they can be implemented in an objective way turns out to be just that, i.e., metaphysical 'substance' in the guise of a theory of management. We see in chapter 6 that, to the extent that this theory moves beyond merely asserting that management would 'take care' of OR's need for an objective basis, it presupposes a social theory which would show how social systems by their nature (if properly constructed) embody this objectivity. This move is foreshadowed in chapter 3, where we see Kuhn (taken as an exemplar of conventionalist philosophy) finally resorting to this device to prop up his conventionalism against the growing weight of subjectivity under which it threatened to sag into the jaws of positivism. The social theory on which such claims rest is given detailed consideration in chapter 7. There I give serious consideration to the possibility that OR's social theory, if it has one at all, will be developed in reaction to what it sees as the "problem of order", because this problem can be seen as but another way of stating OR's ideals in a specifically social way. Stating OR's ideals in this way orients them directly to at least one aspect of the question of the nature of OR's subject-matter. By employing Durkheim's account of, and solution to, the social problem of order as a basis for comparison with OR (first as a homomorphism and later as an isomorphism), we are able to gain quite a firm grip on OR's social theory (and, hence, its grasp of its subject-matter). We see that this theory, although it provides a justification for OR's theory of management (especially in its modern form), is itself inadequate. The inadequacy arises, most fundamentally, because the theory in question presupposes the very thing that should be in question, namely, the nature of the social collective.
I conclude with a specific illustration of the impact of this theory on the ideal of OR by analysing the inadequate treatment of power and conflict which it allows.
12

The impact of Logo on pre-service elementary teachers' beliefs, knowledge of geometry, and self-regulation of learning /

Mohr, Doris Schipp. January 2005 (has links)
Thesis (Ph.D.)--Indiana University, Dept. of Curriculum and Instruction, Mathematics Education, 2005. / Adviser: Peter Kloosterman.
13

The impact of technology on the pedagogical practice of grade nine applied mathematics teachers.

Ford, John. January 2007 (has links)
Thesis (M.A.)--University of Toronto, 2007. / Source: Masters Abstracts International, Volume: 45-06, page: 2787.
14

The impact of Logo on pre-service elementary teachers' beliefs, knowledge of geometry, and self-regulation of learning

Mohr, Doris Schipp. January 2005 (has links)
Thesis (Ph.D.)--Indiana University, Dept. of Curriculum and Instruction, Mathematics Education, 2005. / Source: Dissertation Abstracts International, Volume: 67-01, Section: A, page: 0123. Adviser: Peter Kloosterman. "Title from dissertation home page (viewed Dec. 11, 2006)."
15

Power and reason : the construction of a mathematics teacher's pedagogical discourse and practice /

Chen, Rong-Ji, January 2006 (has links)
Thesis (Ph.D.)--University of Illinois at Urbana-Champaign, 2006. / Source: Dissertation Abstracts International, Volume: 67-07, Section: A, page: 2499. Adviser: Karen Ferneding. Includes bibliographical references (leaves 192-203). Available on microfilm from ProQuest Information and Learning.
16

Analysis of human underwater undulatory swimming using musculoskeletal modelling

Phillips, Christopher W. G. January 2013 (has links)
Elite swimming is a highly competitive sport: at this professional level, the difference between a podium finish and missing one is measured in fractions of a second. While an improvement in any single performance metric may deliver only a marginal gain, it is through the accumulation of marginal gains that winning margins are created. Quantifying performance in elite sport is therefore fundamental to identifying and implementing improvements. The trade-off between energy expenditure, thrust generated and attained velocity is identified as a key aspect of performance. A review of previous swimming research identified a lack of suitable methods for simultaneously quantifying the energy expenditure, thrust and velocity of a particular swimming technique. The aim of this thesis is to analyse the performance of human underwater undulatory swimming (UUS), which constitutes a significant proportion of a race in multiple events. This encompasses experimentally gathered data and computational musculoskeletal modelling in the analysis and evaluation of UUS technique. This thesis has developed a novel, fully functional musculoskeletal model with which detailed analysis of human UUS can be performed. The experimental and processing methods for two ways of acquiring the athlete's kinematics have also been developed. A model based upon fish locomotion is coupled with the musculoskeletal model to provide the fluid loadings for the simulation. Detailed analysis of two techniques of an elite athlete has demonstrated this process in a case study. Energy expended by the simulated muscles is estimated; combined with the measured velocity and predicted thrust, the propulsive efficiency of each technique is determined.
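The final step described in the abstract — combining estimated muscle energy, measured velocity and predicted thrust into a propulsive efficiency — can be sketched in a few lines. This is an illustrative reconstruction, not code from the thesis; the function name, the efficiency definition (useful propulsive work over total energy expended) and the sample numbers are all assumptions:

```python
def propulsive_efficiency(thrust_n, velocity_ms, energy_j, duration_s):
    """Fraction of expended energy converted to useful propulsive work.

    One common definition over a kick cycle of length duration_s:
        useful work = mean thrust * mean velocity * duration
        efficiency  = useful work / total mechanical energy expended
    """
    useful_work_j = thrust_n * velocity_ms * duration_s
    return useful_work_j / energy_j

# Illustrative numbers only (not measurements from the thesis):
eta = propulsive_efficiency(thrust_n=60.0, velocity_ms=2.0,
                            energy_j=500.0, duration_s=1.0)
# 60 N * 2 m/s * 1 s / 500 J = 0.24
```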
17

Use of spatial models and the MCMC method for investigating the relationship between road traffic pollution and asthma amongst children

Zhang, Yong January 2000 (has links)
This thesis uses two datasets, the NCDS (National Child Development Study) and Bartholomew's digital road map, to investigate the relationship between road traffic pollution and asthma amongst children. A pollution exposure model is developed to provide an indicator of road traffic pollution, and a spatially driven logistic regression model of the risk of asthma occurrence is developed. The relationship between asthma and pollution is tested using this model, and the power of the test is studied. Because the exact spatial location of subjects is uncertain given only a postcode, we consider an errors-in-variables model, otherwise known as a measurement-error model. A general foundation is presented and inference is attempted via three approaches. Compared with models without measurement error, no improvement in log-likelihood is made, so we suggest the error can be omitted. We also take a Bayesian approach to analysing the relationship. A discretized MCMC (Markov chain Monte Carlo) method is developed so that it can be used to estimate parameters and to perform inference on a very complex posterior density function. It extends the simulated tempering method to a 'multi-dimensional temperature' setting. We use this method to implement MCMC on our models, and the improvement in speed is remarkable. A significant effect of road traffic pollution on asthma is not found, but the methodology (spatially driven logistic regression and discretized MCMC) can be applied to other data.
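For readers unfamiliar with MCMC, the family of samplers the abstract builds on can be illustrated with a minimal random-walk Metropolis sampler. This is a generic textbook sketch, not the thesis's discretized, multi-dimensional-temperature method; the target density and all names here are invented for the example:

```python
import math
import random

def metropolis(log_post, x0, n_steps, step=0.5, seed=0):
    """Random-walk Metropolis: draw correlated samples from exp(log_post)."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step)
        lp_new = log_post(proposal)
        # Accept with probability min(1, posterior ratio).
        if math.log(rng.random()) < lp_new - lp:
            x, lp = proposal, lp_new
        samples.append(x)
    return samples

# Sanity check on a standard normal log-density (up to a constant):
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
mean = sum(draws) / len(draws)                          # should be near 0
var = sum((d - mean) ** 2 for d in draws) / len(draws)  # should be near 1
```

Simulated tempering, which the thesis extends, runs such a chain at several "temperatures" (flattened versions of the posterior) and lets the chain move between them, so it can escape isolated modes.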
18

Stochastic network calculus with martingales

Poloczek, Felix January 2016 (has links)
The practicality of the stochastic network calculus (SNC) is often questioned on the grounds that its performance bounds are loose. The reason for this inaccuracy lies in the use of overly elementary tools from probability theory, such as Boole's inequality, which cannot account for correlations and is thus inappropriate for properly modelling arrival flows. In this thesis, we propose an extension of stochastic network calculus that characterizes its main objects, namely arrival and service processes, in terms of martingales. This characterization allows us to overcome the shortcomings of the classical SNC by leveraging Doob's inequality to provide more accurate performance bounds. Additionally, the emerging stochastic network calculus with martingales is quite versatile, in the sense that queueing-related operations like multiplexing and scheduling translate directly into operations on the corresponding martingales. Concretely, the framework is applied to analyze the per-flow delay of various scheduling policies, the performance of random access protocols, and queueing scenarios with a random number of parallel flows. Moreover, we show that our methodology is relevant not only within SNC but also in related queueing systems: in the context of multi-server systems, for example, we provide a martingale-based analysis of fork-join queueing systems and of systems with replications. Throughout, numerical comparisons against simulations show that the martingale bounds obtained with Doob's inequality are not only remarkably accurate, but also improve the standard SNC bounds by several orders of magnitude.
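The gap between Boole's inequality and Doob's maximal inequality that motivates the thesis can be seen on a toy example. The sketch below is illustrative only (it is not the thesis's framework): it bounds the probability that a symmetric ±1 random walk ever exceeds a level within n steps, once via Doob's inequality applied to the exponential submartingale exp(θ·S_k), and once via the union bound over per-step Hoeffding tails:

```python
import math
import random

def doob_bound(n, a):
    # exp(theta*S_k) is a nonnegative submartingale, so Doob's maximal
    # inequality gives, for any theta > 0:
    #   P(max_{k<=n} S_k >= a) <= E[exp(theta*S_n)] / exp(theta*a)
    #                           = cosh(theta)^n * exp(-theta*a).
    # Optimize over theta by a coarse grid search.
    return min(math.cosh(t) ** n * math.exp(-t * a)
               for t in (i / 100.0 for i in range(1, 300)))

def boole_bound(n, a):
    # Union (Boole) bound with a Hoeffding tail for each step k separately.
    return min(1.0, sum(math.exp(-a * a / (2.0 * k)) for k in range(1, n + 1)))

def empirical(n, a, trials=20000, seed=1):
    # Monte Carlo estimate of P(max_{k<=n} S_k >= a).
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = 0
        for _ in range(n):
            s += 1 if rng.random() < 0.5 else -1
            if s >= a:
                hits += 1
                break
    return hits / trials

n, a = 50, 20
# The single maximal bound (Doob) is markedly tighter than the summed
# per-step union bound (Boole), while both dominate the empirical frequency.
```

The same mechanism, with arrival-process martingales in place of the random walk, is what yields the tighter SNC delay and backlog bounds.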
19

Development and application of low Reynolds number turbulence models for air-cooled electronics

Dhinsa, Kulvir Kaur January 2006 (has links)
Semiconductors are at the heart of electronic devices such as computers, mobile phones, avionics systems and telecommunication racks. Power dissipation from semiconductor devices continues to increase due to the growth in the number of transistors on the silicon chip predicted by Moore's Law, and the thermal management techniques used to dissipate this power are becoming more and more challenging to design. Air cooling of electronic components is the preferred method for many designs, where the air flow is characterised as being in the laminar-to-turbulent transitional region. Over the last fifteen years there has been a dramatic take-up of Computational Fluid Dynamics (CFD) technology in the electronics industry to simulate the airflow and temperatures in electronic systems. These codes solve the Reynolds-Averaged Navier-Stokes (RANS) equations for momentum and turbulence. RANS models are popular because they are much quicker to solve than time-dependent models such as Large Eddy Simulation (LES) or Direct Numerical Simulation (DNS). At present the majority of thermal design engineers use the standard k-e model, which is a high Reynolds number model, because there is limited knowledge of the benefit of using low Reynolds number models in the electronics cooling industry. This Ph.D. research investigated and developed low Reynolds number models for use in electronics cooling CFD calculations. Nine turbulence models were implemented and validated in the in-house CFD code PHYSICA: three zero-equation, two one-equation, and four zonal models. All of these models are described in the public literature except the following two, which were developed in this study: AUTO_CAP, a zero-equation model that automates the existing LVEL_CAP model available within the commercial CFD code FLOTHERM; and ke/kl, a zonal model that uses a new approach to blend the k-l model used at the wall with the k-e model used to predict the bulk airflow.
Validation of these turbulence models was undertaken on eight different test cases, including the detailed experimental work undertaken by Meinders. Results show that the ke/kl model provides the most accurate flow predictions. For prediction of temperature there was no clear favourite, probably due to the use of the universal log-law wall function in this study; a generalised wall function may be more appropriate. Results from this research have been disseminated through nine peer-reviewed conference and journal publications, evidence of the interest the topic of this investigation generates amongst electronic packaging engineers.
20

Computational analysis of viscoelastic free surface flows

Edussuriya, Suchitra Samanthi January 2003 (has links)
The demand for increasingly small and lightweight products requires micro-scale components made of materials which are durable and light. Polymers have therefore become a popular choice, since they can be used to produce materials which meet industrial requirements; many of these polymers are viscoelastic fluids. The reduction in the size of components makes physical experimentation difficult and costly, so computational tools are being sought to replace old methods of testing. This research has been concerned with the development of a finite volume algorithm for viscoelastic flow which can be readily applied to real-world applications. A major part of the research involved the implementation of the Oldroyd-B constitutive equations, and associated solution methods, in the 3-D multi-physics software environment PHYSICA+. This provides an unstructured finite volume solution technique for viscoelastic flow. The algorithm is validated using the 4:1 planar contraction and results are reported. The developed viscoelastic algorithm has also been coupled with two interface tracking techniques, one of which includes surface tension effects: the Scalar Equation Algorithm (SEA) and the Level Set Method (LSM). With both techniques the algorithms are able to take into account flow effects from both fluids (i.e. air and polymer) in a two-fluid system. The LSM maintains a sharp interface, overcoming the smearing which generally affects interface tracking techniques on fixed Eulerian grids such as SEA, and enables the curvature of the interface to be calculated accurately enough to implement surface tension effects. The integrated viscoelastic flow solver and free surface algorithm is then illustrated by predicting two industrial flow processes used in the electronic packaging industry.
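The curvature calculation that the Level Set Method makes possible can be sketched briefly. This is a generic finite-difference illustration, not code from the thesis: for a level-set field φ the interface curvature is κ = ∇·(∇φ/|∇φ|), and for the signed distance to a circle of radius R it should evaluate to 1/R on the interface:

```python
import numpy as np

def curvature(phi, h):
    """kappa = div(grad(phi)/|grad(phi)|) on a uniform grid, central differences."""
    px, py = np.gradient(phi, h, h)            # gradient components
    norm = np.sqrt(px ** 2 + py ** 2) + 1e-12  # avoid division by zero
    nx, ny = px / norm, py / norm              # unit normal field
    return np.gradient(nx, h, h)[0] + np.gradient(ny, h, h)[1]

# Signed distance to a unit circle; curvature at radius 1 should be ~1.
h = 0.02
x = np.arange(-2.0, 2.0 + h / 2, h)
X, Y = np.meshgrid(x, x, indexing="ij")
phi = np.sqrt(X ** 2 + Y ** 2) - 1.0
kappa = curvature(phi, h)
i = np.argmin(np.abs(x - 1.0))  # grid point nearest (1, 0) on the interface
j = np.argmin(np.abs(x - 0.0))
```

This accuracy depends on φ staying close to a signed distance function, which is why level-set solvers periodically reinitialize φ; a smeared volume-fraction field (as in SEA-style methods) gives much noisier curvature, and hence noisier surface tension forces.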
