About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
321

Design and Application of Discrete Explicit Filters for Large Eddy Simulation of Compressible Turbulent Flows

Deconinck, Willem 24 February 2009 (has links)
In the context of Large Eddy Simulation (LES) of turbulent flows, there is a pressing need to compare and evaluate the many proposed subfilter-scale models. Carefully comparing subfilter-scale models, and comparing LES predictions against Direct Numerical Simulation (DNS) results (the latter being helpful for model comparison and validation), requires a "grid-independent" LES capability, and explicit filtering methods offer one means by which this may be achieved. Explicit filtering provides a means of eliminating aliasing errors, allows direct control of commutation errors, and, most importantly, decouples the mesh spacing from the filter width; this coupling is the primary reason it is difficult to compare LES solutions obtained on different grids. This thesis considers the design and assessment of discrete explicit filters and their application to isotropic turbulence prediction.
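The decoupling of filter width from mesh spacing described in this abstract can be sketched in a few lines. This is a hypothetical trapezoidal top-hat stencil, not one of the thesis's designed filters; the point is only that the filter width is a chosen multiple of the grid spacing:

```python
import numpy as np

def explicit_filter(u, width_ratio):
    """Apply a discrete top-hat filter to a periodic 1D field.
    The filter spans width_ratio grid spacings (odd integer), so the
    filter width is set independently of the mesh resolution."""
    n = width_ratio // 2
    w = np.ones(2 * n + 1)
    w[0] = w[-1] = 0.5                      # trapezoidal end weights
    w /= w.sum()
    return sum(wk * np.roll(u, k) for wk, k in zip(w, range(-n, n + 1)))

x = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
u = np.sin(x) + 0.2 * np.sin(8.0 * x)       # large-scale + small-scale mode
u_bar = explicit_filter(u, width_ratio=5)   # filter width = 5 grid spacings
```

The filtered field retains the large-scale mode nearly unchanged while damping the high-wavenumber mode, which is what lets the filter width, rather than the grid, determine the resolved scales.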
322

Image Compression Using Bidirectional DCT to Remove Blocking Artifacts

Faridi, Imran Zafar 12 May 2005 (has links)
The Discrete Cosine Transform (DCT) is a widely used transform in many areas of the current information age. It is used in signal compression applications such as voice recognition and shape recognition, and also in FBI fingerprints, and it is the standard compression system used in the JPEG format. DCT quality deteriorates at low bit rates; the deterioration is due to the blocking artifacts inherent in block DCT. One of the successful attempts to reduce these blocking artifacts was the conversion of Block-DCT into Line-DCT. In this thesis we explore the Line-DCT and introduce a new form of Line-DCT called the Bidirectional-DCT, which retains the properties of Line-DCT while improving computational efficiency. The results obtained in this thesis show a significant reduction in processing time, in both the one-dimensional and two-dimensional DCT, compared with the traditional Block-DCT. The quality analysis also shows that the least mean square error is considerably lower than that of the traditional Block-DCT, a consequence of removing the blocking artifacts. Finally, unlike the traditional Block-DCT, the Bidirectional-DCT enables compression at very low bit rates with very low blocking artifacts.
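As a concrete illustration of the transform underlying all of these variants, a minimal orthonormal 1D DCT-II and its inverse can be written directly from the definition. This is an O(N²) sketch for clarity, not the fast factorization a codec (or the thesis's Bidirectional-DCT) would use:

```python
import numpy as np

def dct(x):
    """Orthonormal 1D DCT-II, computed directly from the definition."""
    N = len(x)
    n = np.arange(N)
    basis = np.cos(np.pi * np.outer(n, n + 0.5) / N)   # basis[k, n]
    X = basis @ x * np.sqrt(2.0 / N)
    X[0] /= np.sqrt(2.0)                               # DC-term scaling
    return X

def idct(X):
    """Inverse transform (DCT-III with matching orthonormal scaling)."""
    N = len(X)
    n = np.arange(N)
    basis = np.cos(np.pi * np.outer(n + 0.5, n) / N)   # basis[n, k]
    Y = X * np.sqrt(2.0 / N)
    Y[0] /= np.sqrt(2.0)
    return basis @ Y

signal = np.cos(np.linspace(0.0, np.pi, 16))  # smooth test signal
coeffs = dct(signal)
coeffs[4:] = 0.0                              # crude compression: keep 4 of 16
approx = idct(coeffs)
```

Because a smooth signal's energy concentrates in the low-order coefficients, discarding the rest still yields a close reconstruction; blocking artifacts arise when this is done independently on adjacent blocks.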
323

Nonlinear Time-Frequency Control Theory with Applications

Liu, Mengkun 1978- 14 March 2013 (has links)
Nonlinear control is an important subject drawing much attention. When a nonlinear system undergoes route-to-chaos, its response remains bounded in the time domain while becoming unstably broadband in the frequency domain. A control scheme applied in either the time domain or the frequency domain alone is insufficient for controlling route-to-chaos, where the response deteriorates in both domains simultaneously. It is necessary to apply nonlinear control in both the time and frequency domains without obscuring or misinterpreting the true dynamics. The objective of the dissertation is to formulate a novel nonlinear control theory that addresses the fundamental characteristics inherent in all nonlinear systems undergoing route-to-chaos, one that requires no linearization or closed-form solution, so that the genuine underlying features of the system being considered are preserved. The theory developed herein is able to identify the dynamic state of the system in real time and restrain the time-varying spectrum from becoming broadband. Applications of the theory are demonstrated using several engineering examples, including the control of a non-stationary Duffing oscillator, a 1-DOF time-delayed milling model, a 2-DOF micro-milling system, unsynchronized chaotic circuits, and a friction-excited vibrating disk. Not subject to the mathematical constraints and assumptions upon which common nonlinear control theories are based and derived, the novel theory has its philosophical basis in simultaneous time-frequency control, on-line system identification, and feedforward adaptive control. It adopts multi-rate control, hence enabling control over nonstationary, nonlinear response with increasing bandwidth, a physical condition that oftentimes defeats contemporary control theories.
The applicability of the theory to complex multi-input-multi-output (MIMO) systems without resorting to mathematical manipulation and extensive computation is demonstrated through the multi-variable control of a micro-milling system. The research is of a broad impact on the control of a wide range of nonlinear and chaotic systems. The implications of the nonlinear time-frequency control theory in cutting, micro-machining, communication security, and the mitigation of friction-induced vibrations are both significant and immediate.
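The quantity this theory monitors, a time-varying spectrum that must be kept from broadening, can be sketched with a sliding-window FFT. This is an illustrative stand-in, not the dissertation's identification scheme, and all names and signals here are hypothetical:

```python
import numpy as np

def sliding_spectrum(x, win, hop):
    """Magnitude spectra of overlapping Hann-windowed frames:
    a crude time-varying spectrum of the signal x."""
    w = np.hanning(win)
    frames = [x[i:i + win] * w for i in range(0, len(x) - win + 1, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1))

def centroid(frame):
    """Spectral centroid (in bin units): tracks where energy sits."""
    bins = np.arange(len(frame))
    return float(np.sum(bins * frame) / np.sum(frame))

fs = 4096
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 50 * t)                  # stationary response
chirp = np.sin(2 * np.pi * 50 * t * (1 + 3 * t))   # spectrum drifts upward

S_tone = sliding_spectrum(tone, 512, 256)
S_chirp = sliding_spectrum(chirp, 512, 256)
```

A controller working in both domains would watch such frames over time: for the stationary tone the spectral centroid barely moves, while for the drifting signal it climbs, the time-domain-bounded but frequency-domain-deteriorating behavior the abstract describes.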
324

Discrete Optimization and Agent-Based Simulation for Regional Evacuation Network Design Problem

Wang, Xinghua 14 March 2013 (has links)
Natural disasters and extreme events are often characterized by their violence and unpredictability, with consequences that in severe cases include devastating physical and ecological damage as well as countless fatalities. In August 2005, Hurricane Katrina hit the Southern coast of the United States, bringing severe weather and storm surges. The brunt of Katrina's force was felt in Louisiana, where damage has been estimated at more than $108 billion and over 1,800 lives were lost. Hurricane Rita followed Katrina in September 2005 and contributed a further $12 billion in damage and 7 fatalities in the coastal communities of Louisiana and Texas. Prior to landfall, residents of New Orleans received a voluntary, and then a mandatory, evacuation order in an attempt to move people out of Hurricane Katrina's predicted destructive path. Consistent with current practice in nearly all states, this evacuation order did not include or convey any information to individuals regarding route selection, shelter availability and assignment, or evacuation timing. This practice leaves the general population free to determine their own routes, destinations, and evacuation times independently. Such freedom often results in inefficient and chaotic utilization of the roadways within an evacuation region, quickly creating bottlenecks along evacuation routes that can slow individual egress and lead to significant and potentially dangerous exposure of the evacuees to the impending storm. One way to assist the over-burdened and over-exposed population during extreme event evacuation is to provide an evacuation strategy that gives specific information on individual route selection, evacuation timing, and shelter destination assignment, derived from effective, strategic pre-planning. For this purpose, we present a mixed integer linear program to devise effective and controlled evacuation networks to be utilized during extreme event egress.
To solve our proposed model, we develop a solution methodology based on Benders Decomposition and test its performance through an experimental design using the Central Texas region as our case study area. We show that our solution methods are efficient for large-scale instances of realistic size and that our methods surpass the size and computational limitations currently imposed by more traditional approaches such as branch-and-cut. To further test our model under conditions of uncertain individual choice/behavior, we create an agent-based simulation capable of modeling varying levels of evacuee compliance to the suggested optimal routes and varying degrees of communication between evacuees and between evacuees and the evacuation authority. By providing evacuees with information on when to evacuate, where to evacuate and how to get to their prescribed destination, we are able to observe significant cost and time increases for our case study evacuation scenarios while reducing the potential exposure of evacuees to the hurricane through more efficient network usage. We provide discussion on scenario performance and show the trade-offs and benefits of alternative batch-time evacuation strategies using global and individual effectiveness measures. Through these experiments and the developed methodology, we are able to further motivate the need for a more coordinated and informative approach to extreme event evacuation.
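The thesis's mixed integer program and Benders scheme are beyond a short sketch, but the kind of per-evacuee guidance such a model produces, a route to an assigned shelter, can be illustrated with a toy shortest-path computation on a hypothetical road network (all node names and travel times invented for illustration):

```python
import heapq

def shortest_route(graph, origin, shelters):
    """Dijkstra's algorithm from an origin to the nearest shelter
    on a directed travel-time graph: node -> [(neighbor, time)]."""
    dist, prev = {origin: 0.0}, {}
    pq = [(0.0, origin)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        if u in shelters:                 # nearest shelter reached
            path = [u]
            while path[-1] != origin:
                path.append(prev[path[-1]])
            return d, path[::-1]
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    return float("inf"), []

# hypothetical evacuation road network
roads = {
    "A": [("B", 2.0), ("C", 5.0)],
    "B": [("D", 2.0)],
    "C": [("D", 1.0)],
    "D": [],
}
```

A call such as `shortest_route(roads, "A", {"D"})` returns the fastest route for one evacuee in isolation; the full network design problem must instead balance many such routes against shared road capacity, which is what requires the MILP and decomposition machinery.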
326

Control of Batch Processes Based on Hierarchical Petri Nets

ONOGI, Katsuaki, KURIMOTO, Hidekazu, HASHIZUME, Susumu, ITO, Takashi, YAJIMA, Tomoyuki 01 November 2004 (has links)
No description available.
327

Evaluating Lean Manufacturing Proposals through Discrete Event Simulation – A Case Study at Alfa Laval

Detjens, Sönke, Flores, Erik January 2013 (has links)
In striving for success in competitive markets, companies often turn to the Lean philosophy. However, for many companies Lean benefits are hard to substantiate, especially when their ventures have met success through traditional manufacturing approaches. Traditional Lean tools analyze current situations or support Lean implementation; production facilities therefore require tools that enhance the evaluation of Lean proposals in such a way that decisions are supported by quantitative data and not merely by gut feeling. This thesis proposes how Discrete Event Simulation may be used as an evaluation tool in production process improvement, to decide which proposal best suits Lean requirements. Theoretical and empirical studies were carried out: a literature review helped define the problem, and a case study was performed at Alfa Laval to investigate, through a holistic approach, how and why this tool provides a solution to the research questions. The case study analysis was substantiated with Discrete Event Simulation models for the evaluation of current- and future-state Lean proposals. The results of this study show that Discrete Event Simulation was not designed as, and does not function as, a Lean-specific tool; its use in Lean assessment requires the organization to understand the principles of Lean and its desired effects. However, traditional static Lean tools such as Value Stream Mapping and dynamic Discrete Event Simulation complement each other in a variety of ways. Discrete Event Simulation is uniquely able to account for process variability and randomness, and both the measurement and the reduction of variability through simulation provide insight for Lean implementation strategies.
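The variability point is the crux of why simulation complements a static tool like Value Stream Mapping: two stations with identical average service times behave very differently once randomness enters. A minimal sketch with hypothetical numbers (single FIFO station, fixed arrival interval):

```python
import random

def mean_flow_time(interarrival, services):
    """Mean flow time (waiting + service) at one FIFO workstation,
    with job i arriving at time i * interarrival."""
    free_at, total = 0.0, 0.0
    for i, s in enumerate(services):
        arrive = i * interarrival
        start = max(arrive, free_at)   # wait if the station is busy
        free_at = start + s
        total += free_at - arrive
    return total / len(services)

random.seed(42)
n, mean_service = 2000, 0.8
deterministic = mean_flow_time(1.0, [mean_service] * n)
variable = mean_flow_time(
    1.0, [random.expovariate(1.0 / mean_service) for _ in range(n)])
```

The workload averages are identical, yet the variable line's mean flow time is strictly worse, because queues build whenever a long service overlaps the next arrival. This queueing effect is invisible to an average-based static map but falls straight out of a discrete event model.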
328

Discrete event modelling and Simulation of an Assembly Line at GKN Driveline Köping AB

Yesilgul, Mustafa, Nasser, Firas January 2013 (has links)
Today’s economic conditions force companies and organizations to work more effectively in their processes, for a variety of reasons. In particular, after the Second World War, changing business perceptions and strong competition between companies brought new terms such as productivity, flexible systems, efficiency, and Lean into the industrial engineering discipline. These terms also raised a new question: how are they achieved? Discrete event simulation has been used as an effective method for answering this question. From this perspective, this project focuses on discrete event simulation and its role in real industrial processes. The main interest of this paper is discrete event simulation, but we also give some detailed information about other types of simulation, such as continuous and discrete rate. The paper consists of several parts. At the beginning, the reader will find theoretical information about simulation itself and the requirements for applying it to real processes. Secondly, we explain the different types of simulation and why we used discrete event simulation, rather than continuous or discrete rate, in our case study. Furthermore, one of the main aims of this research is to inform the reader about how computer support is used as a simulation tool by today’s companies. To this end, a powerful software package, ExtendSim 8, is described in detail, including all the information needed to create discrete event models with it. The case study part presents the results of five months of work, between February and June, at GKN Driveline Köping AB in Sweden. In those five months we analyzed an assembly line, collected data, created a simulation model, held discussions with workers and engineers, and performed tests such as validation and verification.
The reader will find all the information about the production line and the simulation model, and, in the conclusion, the results of the project together with a visualization of the future state. As is discussed repeatedly in the paper, validation is one of the most important steps in a simulation project; therefore, to establish the reliability of our simulation model, different calculations and tests were made. Finally, some results are presented as graphs and tables to give the reader better insight.
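The event-driven mechanism behind tools of this kind can be sketched in a few lines: a priority queue of timestamped events, popped in chronological order. This is a toy single-station version for illustration, not the GKN Driveline model:

```python
import heapq

def simulate_station(arrivals, service):
    """Event-driven simulation of a single FIFO workstation.
    Events are (time, kind) tuples popped in time order; at equal
    times, arrivals sort before departures ("arrive" < "depart")."""
    events = [(t, "arrive") for t in arrivals]
    heapq.heapify(events)
    queue, busy, completions = 0, False, []
    while events:
        t, kind = heapq.heappop(events)
        if kind == "arrive":
            if busy:
                queue += 1                           # join the queue
            else:
                busy = True
                heapq.heappush(events, (t + service, "depart"))
        else:                                        # job finishes
            completions.append(t)
            if queue:
                queue -= 1                           # start next job
                heapq.heappush(events, (t + service, "depart"))
            else:
                busy = False
    return completions
```

For example, three jobs arriving at times 0, 1, and 2 at a station with a 2-time-unit cycle complete at times 2, 4, and 6: later jobs queue behind earlier ones, exactly the behavior one validates against the real line.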
329

Folding and Unfolding

Demaine, Erik January 2001 (has links)
The results of this thesis concern folding of one-dimensional objects in two dimensions: planar linkages. More precisely, a planar linkage consists of a collection of rigid bars (line segments) connected at their endpoints. Foldings of such a linkage must preserve the connections at endpoints, preserve the bar lengths, and (in our context) prevent bars from crossing. The main result of this thesis is that a planar linkage forming a collection of polygonal arcs and cycles can be folded so that all outermost arcs (not enclosed by other cycles) become straight and all outermost cycles become convex. A complementary result of this thesis is that once a cycle becomes convex, it can be folded into any other convex cycle with the same counterclockwise sequence of bar lengths. Together, these results show that the configuration space of all possible foldings of a planar arc or cycle linkage is connected. These results fall into the broader context of folding and unfolding k-dimensional objects in n-dimensional space, k ≤ n. Another contribution of this thesis is a survey of research in this field. The survey revolves around three principal aspects that have received extensive study: linkages in arbitrary dimensions (folding one-dimensional objects in two or more dimensions, including protein folding), paper folding (normally, folding two-dimensional objects in three dimensions), and folding and unfolding polyhedra (two-dimensional objects embedded in three-dimensional space).
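The convex configurations central to this result are easy to test computationally: a closed polygonal cycle is convex exactly when all consecutive turns agree in direction. A small sketch (not from the thesis; vertices are given as coordinate pairs):

```python
def is_convex(poly):
    """Return True if the closed polygon given by a vertex list is
    convex: every consecutive cross product has the same sign
    (zero cross products, i.e. collinear bars, are ignored)."""
    n = len(poly)
    signs = set()
    for i in range(n):
        ax, ay = poly[i]
        bx, by = poly[(i + 1) % n]
        cx, cy = poly[(i + 2) % n]
        cross = (bx - ax) * (cy - by) - (by - ay) * (cx - bx)
        if cross:
            signs.add(cross > 0)
    return len(signs) <= 1
```

A square passes the test; a polygon with a reflex vertex produces cross products of both signs and fails, which is the discrete witness that its cycle has not yet been convexified.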
330

LOAD PREDICTION FOR A MOORED CONICAL DRILLSHIP IN LEVEL UNBROKEN ICE: A DISCRETE ELEMENT AND EXPERIMENTAL INVESTIGATION

Lawrence, Karl Patrick January 2009 (has links)
This thesis is composed of theoretical, experimental, and numerical studies. Part I discusses fundamental challenges of the discrete element method, provides a set of algorithms for addressing them, and presents the performance gains of an improved algorithm on a target computer platform. A new contact detection and force resolution algorithm is presented, based upon (i) the fast common-plane (FCP) algorithm, (ii) the use of axis-aligned bounding boxes (AABBs) for the proximity search, (iii) estimation of the time of collision, and (iv) accurate resolution of contact points. Benchmark simulations indicate that an order-of-magnitude increase in performance is achievable for a relatively small number of elements. A new parallel discrete element algorithm is also presented, which combines the domain decomposition, object-oriented, and perfectly parallel strategies of parallelism to eliminate the drawbacks of the parallel discrete element algorithms put forth by past studies. A significant speed-up in comparison with past studies is observed in trials conducted on a NUMA-based SMP computer. Part II reviews various applications of the discrete element method, with an emphasis on ice-structure interaction. The conical design of the Kulluk drillship is of particular interest, owing to its success in operating in the Beaufort Sea from 1975 to 1993 and its subsequent purchase and recommissioning by Shell in 2006. Three previous experimental studies and a unique set of full-scale data measurements form the basis for comparison in a concurrent experimental and numerical investigation. The results of a model-scale experiment at the NRC-IOT are analyzed and presented, followed by the results of the numerical simulations. A 1:40 scale replica of the Kulluk platform in level ice produces results that are consistent with past experiments and confirm the expected trends, as well as distinct regimes of results depending on the ductile/brittle behavior of the ice.
The numerical setup models the full-scale platform in three dimensions, with a 24-sided rigid conical structure, ice as an elastic-brittle material with plate-bending elements, and platform mooring implemented through a spread-mooring algorithm. Numerical results agree with past results for ice thicknesses of less than 1.2 m, confirming that the initial design goal of the Kulluk was achieved, while still overestimating the loads in comparison with the full-scale data set. Two explanations are presented for the non-conformity of the experimental and numerical predictions to the full-scale data.
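The AABB proximity search mentioned in Part I rests on a constant-time overlap test, which is what makes it usable as a broad-phase filter before exact contact resolution. A minimal sketch (3D boxes given as min/max corner pairs; the example boxes are invented):

```python
def aabb_overlap(a, b):
    """Broad-phase contact test: two axis-aligned bounding boxes
    overlap iff their extents intersect on every coordinate axis."""
    (amin, amax), (bmin, bmax) = a, b
    return all(amin[i] <= bmax[i] and bmin[i] <= amax[i] for i in range(3))

# hypothetical bounding boxes: ((xmin, ymin, zmin), (xmax, ymax, zmax))
ice_floe = ((0.0, 0.0, 0.0), (2.0, 2.0, 1.0))
hull     = ((1.5, 1.5, 0.5), (4.0, 4.0, 3.0))
far_away = ((10.0, 10.0, 10.0), (12.0, 12.0, 11.0))
```

Only pairs whose boxes overlap are passed on to the expensive narrow-phase step (here, the FCP algorithm), which is where the order-of-magnitude performance gains of a good broad phase come from.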
