About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
131

Distributed simulation of personal communication service networks

Cismasu, Codrut Octavian. January 2001 (has links)
Parallel and distributed simulation is recognized as a viable method for modeling complex dynamic systems. The main benefits of a parallel rather than a sequential technique are decreased execution time and the distribution of the simulation's memory across a number of processors.

Parallel simulation techniques have been applied successfully to the modeling of cellular communication services. Such efforts focus on defining efficient simulation models, on the study of protocols, or on the performance analysis of resource allocation algorithms. In this thesis, we continue the research in this domain by defining a model that focuses on simulation accuracy. We also test the limits on the size of the application model that can be simulated. Two channel allocation techniques are implemented: the first is based on fixed channel allocation, the second on a technique known as channel segregation. The latter requires that interference data be computed; for this purpose, we describe a brute-force interference calculation algorithm. (Abstract shortened by UMI.)
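The abstract does not reproduce the algorithm, but a brute-force interference calculation of the kind it mentions can be pictured as a pairwise scan over cell sites: any two cells closer than the co-channel reuse distance are recorded as mutually interfering. The sketch below is an illustration under that assumption; the grid, distances, and threshold are hypothetical, not values from the thesis:

```python
import math

def interference_pairs(cells, reuse_distance):
    # Brute-force O(n^2) scan: every pair of cell sites closer than the
    # reuse distance is recorded as a mutually interfering pair.
    pairs = set()
    for i, (xi, yi) in enumerate(cells):
        for j in range(i + 1, len(cells)):
            xj, yj = cells[j]
            if math.hypot(xi - xj, yi - yj) < reuse_distance:
                pairs.add((i, j))
    return pairs

# Hypothetical 3x3 grid of cell sites spaced 1 km apart.
cells = [(float(x), float(y)) for x in range(3) for y in range(3)]
print(interference_pairs(cells, reuse_distance=1.5))
```

A channel-segregation scheme would consult a table like this when deciding whether a cell may acquire a channel already in use nearby.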
132

The development of assignment procedures for semi-automated truck-shovel systems

Lira Bonates, Eduardo Jorge January 1992 (has links)
The purpose of this study is to develop a microcomputer-based dispatching system to assist mine operators in choosing the optimal truck assignment under a given set of conditions. The system is active and semi-automated, and includes procedures that suggest to the dispatcher the best truck allocation with the aim of reaching a specified production objective.

New heuristic dispatching procedures, namely Adjusted Travel Empty and Load Time, Adjusted Cycle Time, Adjusted Cycle and Waiting Times, and Minimize Cycle Time, have been developed to consider the achievement of objectives throughout the working shift. These procedures are tested and compared with the fundamental heuristic procedures: Locked-in, Maximize Truck Use, Maximize Shovel Use, Match Factor, and Priority Number. The heuristic procedures do not attempt to optimize each decision but produce acceptable results most of the time.

To test the new heuristic dispatching procedures, a computer simulation program was designed that resembles the real production process as closely as possible. An advance-clock approach was selected to allow insertion of procedures at any point in the haulage cycle.

It was found that truck dispatching systems offer the potential for significant improvements in productivity, and the new procedures are also able to consider production objectives. However, each mine must evaluate each dispatching procedure independently before a definitive decision is made.
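As a rough illustration of the advance-clock idea (not code from the thesis), the loop below pops the next event, jumps simulated time forward to it, and lets a dispatch rule run at that point in the haulage cycle. The "Minimize Cycle Time" rule shown is one plausible reading of the procedure of that name: send the requesting truck to the shovel with the smallest estimated travel + wait + load time. All names and numbers are hypothetical:

```python
import heapq

def dispatch_min_cycle_time(shovels, now):
    # Hypothetical 'Minimize Cycle Time' rule: pick the shovel minimizing
    # estimated travel time + queue wait + loading time for this truck.
    def est_cycle(s):
        arrive = now + s["travel_time"]
        wait = max(0.0, s["busy_until"] - arrive)
        return s["travel_time"] + wait + s["load_time"]
    return min(shovels, key=est_cycle)

shovels = [
    {"id": "A", "travel_time": 6.0, "load_time": 3.0, "busy_until": 0.0},
    {"id": "B", "travel_time": 4.0, "load_time": 3.0, "busy_until": 9.0},
]
# Advance-clock loop: jump straight to the next event rather than ticking
# in fixed increments, so procedures can run at any point in the cycle.
events = [(0.0, 1), (2.5, 2)]  # (request time, truck id)
heapq.heapify(events)
while events:
    now, truck = heapq.heappop(events)
    s = dispatch_min_cycle_time(shovels, now)
    start = max(now + s["travel_time"], s["busy_until"])
    s["busy_until"] = start + s["load_time"]
    print(f"t={now}: truck {truck} -> shovel {s['id']} (loads at t={start})")
```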
133

Sediment Dynamics in Alluvial Rivers Used as a Resource for Land-Building

Gaweesh, Ahmed Moustafa 07 April 2015 (has links)
There is a dire need to use sediment from alluvial rivers to create and sustain new marshland, barrier islands, and ridges. Coastal Louisiana is a prime example, where wetland loss rates are among the highest nationwide. This study focuses on the sediment dynamics of the Lower Mississippi River, specifically the temporal and spatial variability of sediment concentration and sediment size characteristics. The objectives of this study are: to analyze and quantify the impact of diversion design parameters on the efficiency of sediment capture, to analyze the hydrodynamic and morphological patterns at sand bar borrow areas, and to quantify the spatial and temporal infill patterns of these dredged pits. The investigation was performed using a morphodynamic numerical model (Delft3D). The Louisiana 2012 State Master Plan identified two viable mechanisms to build land: sediment diversions and dedicated dredging. The morphodynamic model was parameterized and validated using historical and recent field observations, then used to examine three parameters hypothesized to govern the sediment capture efficiency of diversions: the alignment angle, the invert elevation, and the diversion size. Diverted sediment loads and the sediment concentration ratio were used to relate the efficiency achieved to the corresponding change in design. Implications of the design choices for construction and efficiency are discussed.

The model was also used to investigate the riverside morphological response to a number of parameters for dredging lateral sand bars. Detailed analyses were carried out for the hydrodynamics at the dredge pit and their implications for morphological development. A sensitivity analysis of hydrodynamic and sediment transport parameters examined the morphological response within the dredge pit. The findings emphasize data-collection requirements and inform recommendations for predictive modeling of dredged sandbar infill. The study concludes with an economic assessment of the impact of land-building mechanisms on the riverside in relation to the waterborne economy.
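The sediment concentration ratio used above has a standard form in diversion studies: the fraction of the river's sediment load captured divided by the fraction of the water diverted. A minimal sketch, with purely illustrative numbers rather than values from the dissertation:

```python
def sediment_water_ratio(qs_div, qs_river, qw_div, qw_river):
    # Concentration ratio for a diversion: sediment captured per unit of
    # water diverted, relative to the river. A value > 1 means the intake
    # draws proportionally more sediment than water.
    return (qs_div / qs_river) / (qw_div / qw_river)

# Illustrative only: a diversion taking 10% of the water but 12% of the
# sand load captures sediment more efficiently than the ambient flow.
print(sediment_water_ratio(qs_div=12.0, qs_river=100.0,
                           qw_div=10.0, qw_river=100.0))  # -> 1.2
```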
134

A validation of the enterprise management engineering approach to knowledge management systems engineering

Mac Garrigle, Ellen F. 18 June 2014 (has links)
Knowledge management is one of the current "buzzwords" gaining popularity on an almost daily basis within the business world. Much attention has been paid to the theory and justification of knowledge management (KM) as an effective business and organizational practice. Much less attention has been paid to the more specific issues of effective implementation of knowledge management, or to the financial benefit that could potentially result from an effective system implementation. As the concept of KM becomes more generally accepted, knowledge management systems (KMS) are becoming more prevalent. A KMS is often treated as simply another information system to be designed, built, and supported by the IT department, and in practice many KM system development efforts are not successful. There is frequently a perception that strict adherence to development processes produces an excessive time lag, rigor, and formality that "disrupt" the desired free flow of knowledge. Professor Michael Stankosky of GWU has posited a more flexible variation of the usual systems engineering (SE) approach, tailored specifically to the KM domain and known as Enterprise Management Engineering© (EME). This approach takes the four major pillars of KM identified by GWU research (Leadership, Organization, Technology, and Learning) and adapts eighteen key SE steps to accommodate the more flexible and imprecise nature of "knowledge".

Anecdotal study of successful KMS developments has shown that many of the more formal processes imposed by systems engineering (such as defining strategic objectives before beginning system development) serve a useful purpose. Consequently, an integrated systems engineering process tailored specifically to the KM domain should lead to more successful implementations of KM systems. If this is so, organizations that have followed some or all of the steps in this process will have designed and deployed more "successful" KMS than organizations that have not. To support and refine this approach, a survey was developed to determine the usage of the 18 steps identified in EME. The results were then analyzed against an objective financial measurement of organizational KM to determine whether a correlation exists. This study is intended to test the efficacy of the EME approach to KM implementation.

For the financial measurement data, the list of organizations studied used a measure of intangible valuation developed by Professor Baruch Lev of NYU called Knowledge Capital Earnings© (KCE): the earnings that a company with good "knowledge" has left over once earnings attributable to tangible financial and physical assets have been subtracted from overall earnings. KCE can then be used to determine the Knowledge Capital (KC) of an organization, which in turn provides two quantitative measures (one relative, one absolute) that can be used to define a successful knowledge company.

For this study, Lev's research from 2001 was updated using more recent financial data. Several of these organizations completed a survey instrument based upon the 18 points of the EME approach, and the results for the 18 steps were compared against each other and against each organization's KC scores. The results show that there is a significant correlation between EME and the relative KC measurement, and select EME steps do correlate significantly with a high KC value. Although this study, being the first validation effort, does not show provable causation, it does demonstrate a quantifiable correlation between EME and successful KM implementation. This in turn should contribute to the slim body of objective knowledge on the design, deployment, and measurement of KM systems.
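Lev's measure can be summarized in two lines: subtract from normalized earnings the return the company would be expected to earn on its tangible assets, then capitalize the residual at a knowledge-capital discount rate. The sketch below uses rates of return often quoted for Lev's method (roughly 7% physical, 4.5% financial, 10.5% knowledge); treat them, and the example figures, as assumptions rather than numbers from this dissertation:

```python
def knowledge_capital(normalized_earnings, physical_assets, financial_assets,
                      r_physical=0.07, r_financial=0.045, r_knowledge=0.105):
    # Knowledge Capital Earnings (KCE): earnings left after removing the
    # expected return on tangible assets; Knowledge Capital (KC) is KCE
    # capitalized at the knowledge discount rate. Rates are illustrative.
    kce = normalized_earnings - (r_physical * physical_assets +
                                 r_financial * financial_assets)
    return kce / r_knowledge

# Made-up example: $900M earnings, $4B physical and $2B financial assets.
print(round(knowledge_capital(900.0, 4000.0, 2000.0), 1))  # -> 5047.6
```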
135

Allocation of Resources to Defend Spatially Distributed Networks Using Game Theoretic Allocations

Kroshl, William M. 30 January 2015 (has links)
This dissertation presents research on the efficient allocation of defense resources to minimize the damage inflicted on a spatially distributed physical network, such as a pipeline, water system, or power distribution system, by an active adversary. The allocation methodology recognizes the fundamental difference between preparing for natural disasters (hurricanes, earthquakes, or accidental system failures) and allocating resources to defend against an opponent who is aware of, and anticipating, the defender's efforts to mitigate the threat.

Conceptualizing the problem as a Stackelberg "leader-follower" game, the defender first places his assets to defend key areas of the network, and the attacker then seeks to inflict the maximum damage possible within the constraints of resources and network structure. The approach uses a combination of integer programming and agent-based modeling to allocate the defensive resources. The criticality of arcs in the network is estimated by a deterministic network interdiction formulation, a maximum-flow linear program (LP), or a combination of both, which then informs an evolutionary agent-based simulation. The simulation determines the allocation of resources for attackers and defenders that results in evolutionarily stable strategies, in which neither side can unilaterally increase its share of victories.

These techniques are demonstrated on several example networks using several different methods of evaluating the value of the nodes, and comparing the evolutionary agent-based results to a more traditional Probabilistic Risk Analysis (PRA) approach. The results show that the agent-based allocation approach yields a greater percentage of defender victories than the PRA-based allocation approach.
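One way to picture the max-flow criticality measure mentioned above (a sketch, not the dissertation's code): compute the network's s-t max flow, then re-solve with each arc removed and rank arcs by how much throughput is lost. The small Edmonds-Karp implementation and example network below are hypothetical:

```python
from collections import deque

def max_flow(cap, s, t):
    # Edmonds-Karp max flow on a dict-of-dicts capacity graph.
    residual = {u: dict(vs) for u, vs in cap.items()}
    for u in list(residual):
        for v in residual[u]:
            residual.setdefault(v, {}).setdefault(u, 0)  # reverse arcs
    flow = 0
    while True:
        parent, q = {s: None}, deque([s])
        while q and t not in parent:       # BFS for an augmenting path
            u = q.popleft()
            for v, c in residual[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= push
            residual[v][u] += push
        flow += push

cap = {"s": {"a": 10, "b": 5}, "a": {"t": 8, "b": 3}, "b": {"t": 7}, "t": {}}
base = max_flow(cap, "s", "t")
for u in cap:                              # rank arcs by flow lost when cut
    for v in cap[u]:
        cut = {x: {y: (0 if (x, y) == (u, v) else c) for y, c in ys.items()}
               for x, ys in cap.items()}
        print(f"arc {u}->{v}: throughput drops {base - max_flow(cut, 's', 't')}")
```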
136

Systems analysis for robotic mining

Mottola, Laura January 1996 (has links)
Mining automation has incrementally progressed from line-of-sight remote operation to teleoperation and automatic control of mobile machines, mainly due to significant advances in underground communication systems. The present trend points towards a robotic mining environment where mobile machinery and stationary equipment will be fully integrated with a mine-wide information system overseeing all aspects of mining via a communication network. The successful design and implementation of the software and hardware components necessary to realize this vision depends on the level of seamless integration achieved. The complexity involved, in terms of systems functionality and coherence, necessitates systems analysis and computer-aided software engineering tools to actively support this integration effort.

Hence, the primary objective of this thesis is to introduce and relate systems analysis concepts and tools to the business of mining. The investigation begins by setting the industrial context of this work with respect to past initiatives and future trends. It discusses different approaches to the design and implementation of mining information systems, reviews the fundamentals of software and information engineering as well as structured and object-oriented analysis and design, and presents a survey of computerized tools for systems analysis. It then applies systems analysis concepts and tools to a high-level, top-down analysis of a Mine Information System and examines a specific mining process in detail. Finally, it compares the applicability of structured versus object-oriented analysis and design methodologies to the complex problem of mining.
137

Designing and implementing memory consistency models for shared-memory multiprocessors

Merali, Shamir January 1996 (has links)
The most commonly assumed memory consistency model for shared-memory multiprocessors is Sequential Consistency (SC), which is a straightforward extension of the sequential programming model and is therefore simple to reason about. However, SC places severe restrictions on the use of high-performance hardware and compiler optimizations. To mitigate the performance limitations of SC, designers have proposed relaxed consistency models that retain the SC programming interface for a restricted class of programs while allowing more aggressive hardware implementations by relaxing restrictions on memory access ordering.

In cache-based systems, cache management is an important issue in the implementation of a consistency model, since the presence of multiple copies of the same location in multiple caches requires that these copies be managed in a way that does not violate the requirements of the consistency model. Aggressive cache management schemes can exploit looser constraints on event ordering by reducing consistency-related cache-coherence traffic.

Location Consistency (LC), a consistency model first presented in (GS93), was designed expressly to minimize the constraints on event ordering in an attempt to improve performance. At the same time, LC presents a formally defined interface that is easy to understand and reason about. In this thesis, we present sufficiency conditions for LC, propose a cache coherence protocol that implements the model, and present a preliminary cost and performance analysis of the protocol.
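A compact way to see why SC is both simple and restrictive is the classic store-buffering litmus test: under SC, every execution is some interleaving that respects each thread's program order, so the outcome r1 = r2 = 0 is impossible, while a relaxed model (or hardware with store buffers) can produce it. The enumeration below is a generic illustration, not tied to this thesis's LC protocol:

```python
from itertools import permutations

# Two threads over shared x and y, both initially 0:
#   Thread 0: x = 1; r1 = y        Thread 1: y = 1; r2 = x
T = ([("store", "x", 1), ("load", "y", "r1")],
     [("store", "y", 1), ("load", "x", "r2")])

def run(schedule):
    # Execute one SC interleaving; per-thread program counters preserve
    # each thread's program order.
    mem, regs, pc = {"x": 0, "y": 0}, {}, [0, 0]
    for tid in schedule:
        op, var, arg = T[tid][pc[tid]]
        if op == "store":
            mem[var] = arg
        else:
            regs[arg] = mem[var]
        pc[tid] += 1
    return regs["r1"], regs["r2"]

outcomes = {run(s) for s in set(permutations([0, 0, 1, 1]))}
print(sorted(outcomes))  # (0, 0) never appears under SC; TSO allows it
```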
138

Syntactic and semantic structures in COCOLOG logic control

Martínez-Mascarúa, Carlos Mario. January 1997 (has links)
The research presented in this thesis is formulated within the Conditional Observer and Controller Logic (COCOLOG) framework. COCOLOG is a family of first order languages and associated theories for the design and implementation of controllers for discrete event systems (DESs).

The opening part of this thesis presents a high-level formulation of COCOLOG called Macro COCOLOG. First, we present the theory of Macro COCOLOG languages, a framework for the enhancement of the original COCOLOG language via definitional constructions. Second, we present the theory of Macro COCOLOG actions, a framework for the enhancement of COCOLOG allowing the utilisation of hierarchically aggregated control actions. Macro COCOLOG is applied to a pair of examples: the control of the motion of a mobile robot and the flow of water through a tank.

The thesis next addresses the possibility of expanding the original COCOLOG theories concerning the fundamental issues of the arithmetic system and the notion of reachability in DESs as expressed in COCOLOG. Specifically, the fundamental nature of the reachability predicate, Rbl(·,·,·), is explored and found to be completely determined by notions axiomatised in subtheories of the original COCOLOG theory. This result effectively reduces the complexity of the proofs originally involving Rbl(·,·,·). Following this line of thought, two sets of Macro languages and associated theories are developed which are shown to be as powerful (in terms of expressiveness and deductive scope) as the original COCOLOG theories and hence, necessarily, as powerful as Markovian fragment COCOLOG theories. A final result along these lines is that the control law itself (originally expressed in a set of extra-logical Conditional Control Rules) can be incorporated into the COCOLOG theories via function symbol definition.

The efficient implementation of COCOLOG controllers motivates the final two chapters of the thesis. A basic result there is that a COCOLOG controller may itself be realized as a DES: for any COCOLOG controller, it is shown that one may generate a finite state machine realizing that controller. This realization can then be used for real-time (i.e., reactive) control. (Abstract shortened by UMI.)
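The closing result, that a COCOLOG controller can be realized as a finite state machine for reactive control, can be pictured with a toy rule table. The tank states, events, and actions below are hypothetical stand-ins for compiled condition-to-action control rules, not the thesis's actual constructions:

```python
# Toy FSM realization of condition -> action control rules for a water
# tank: each (state, event) pair yields a next state and an action.
transitions = {
    ("empty",    "start_fill"):  ("filling",  "open_inlet_valve"),
    ("filling",  "level_high"):  ("full",     "close_inlet_valve"),
    ("full",     "start_drain"): ("draining", "open_drain_valve"),
    ("draining", "level_low"):   ("empty",    "close_drain_valve"),
}

state = "empty"
for event in ["start_fill", "level_high", "start_drain", "level_low"]:
    state, action = transitions[(state, event)]
    print(f"on {event!r}: {action}, now in state {state!r}")
```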
139

Coordinating job release dates with workdays: A job shop application to utilities field service scheduling

Pelkey, Ryan Lawrence 26 February 2014 (has links)
A local utility company processes a variety of jobs each day, including meter reading, service shut-offs, emergency response, and customer service work. For the Company, a specific workflow begins with automated meter reading (AMR) and ends with collections/service shut-offs (CSOs) for accounts with excessively late payments (the AMR-CSO workflow). There are considerable and systemic sources of variability in both the workload and the resource demands of the AMR-CSO workflow, including order arrival, order release schedules, order batch sizing, and maintenance scheduling. This project draws on theory from the job-shop problem to explore possible means of mitigating this variability. We hypothesized that controlling various forms of input variability would reduce downstream workload variability. Using discrete event simulation, we tested a variety of measures to reduce input variability in the workflow. Consistent with other literature, we find that various workload control tactics have limited impact on output measures and system performance. However, we found that the system is much more sensitive to resource capacity variability. One input control tactic, which we call Targeted Release, allowed us to reduce the Company's capacity variability and suggested significantly improved outcomes. These initial results are promising both for the Company and for future investigation of tactics to mitigate resource capacity variability.
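The abstract describes Targeted Release only at a high level; one plausible reading is that each day only enough orders are released to hit a fixed workload target, with the remainder held in a backlog, smoothing the load presented to field crews. A minimal sketch under that assumption (all parameters invented):

```python
import random

def targeted_release(backlog, daily_target):
    # Hypothetical 'Targeted Release': release work in arrival order until
    # the day's workload target would be exceeded; hold the rest.
    released_load, count = 0.0, 0
    while backlog and released_load + backlog[0] <= daily_target:
        released_load += backlog.pop(0)
        count += 1
    return count, released_load

random.seed(1)
backlog = []
for day in range(5):
    # New orders arrive with variable count and size (illustrative numbers).
    backlog += [random.uniform(0.5, 2.0) for _ in range(random.randint(3, 9))]
    n, load = targeted_release(backlog, daily_target=8.0)
    print(f"day {day}: released {n} jobs ({load:.1f} h), backlog {len(backlog)}")
```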
140

Exploring structure-behavior relations in nonlinear dynamic feedback models

Guneralp, Burak, January 2006 (has links)
Thesis (Ph.D.)--University of Illinois at Urbana-Champaign, 2006. Source: Dissertation Abstracts International, Volume: 68-02, Section: B, page: 1282. Adviser: George Gertner. Includes bibliographical references (leaves 162-168). Available on microfilm from ProQuest Information and Learning.
