121

Stochastic Modeling of Orb-Web Capture Mechanics Supports the Importance of Rare Large Prey for Spider Foraging Success and Suggests How Webs Sample Available Biomass

Evans, Samuel C. January 2013 (has links)
No description available.
122

MODELING UNSTEADINESS IN STEADY SIMULATIONS WITH NEURAL NETWORK GENERATED LUMPED DETERMINISTIC SOURCE TERMS

LUKOVIC, BOJAN January 2002 (has links)
No description available.
123

EFFECTS OF HOT STREAK MANAGEMENT ON SURFACE TEMPERATURE OF FIRST-STAGE TURBINE USING LINEARIZED METHOD

GUPTA, VIPUL KUMAR 11 October 2001 (has links)
No description available.
124

The design of a probabilistic engineering economic analysis package for a microcomputer

Puetz, Gilbert H. January 1985 (has links)
No description available.
125

Delay, Stop and Queue Estimation for Uniform and Random Traffic Arrivals at Fixed-Time Signalized Intersections

Kang, Youn-Soo 24 April 2000 (has links)
With the introduction of different forms of adaptive and actuated signal control, there is a need for effective evaluation tools that can capture the intricacies of real-life applications. While the current state-of-the-art analytical procedures provide simple approaches for estimating delay, queue length and stops at signalized intersections, they are limited in scope. Alternatively, several microscopic simulation software packages are currently available for the evaluation of signalized intersections. The objective of this dissertation is fourfold. First, it evaluates the consistency, accuracy, limitations and scope of the alternative analytical models. Second, it evaluates the validity of microscopic simulation results that evolve as an outcome of the car-following relationships. The validity of these models is demonstrated for idealized hypothetical examples where analytical solutions can be derived. Third, the dissertation expands the scope of current analytical models for the evaluation of oversaturated signalized intersections. Finally, the dissertation demonstrates the implications of using analytical models for the evaluation of real-life network and traffic configurations. This dissertation compared the delay estimates from numerous models for an undersaturated and an oversaturated signalized intersection considering uniform and random arrivals, in an attempt to systematically evaluate and demonstrate the assumptions and limitations of different delay estimation approaches. Specifically, the dissertation compared a theoretical vertical queuing analysis model, the queue-based models used in the 1994 and 2000 versions of the Highway Capacity Manual, the queue-based model in the 1995 Canadian Capacity Guide for Signalized Intersections, a theoretical horizontal queuing model derived from shock wave analysis, and the delay estimates produced by the INTEGRATION microscopic traffic simulation software.
The results of the comparisons for uniform arrivals indicated that all delay models produced identical results under such traffic conditions, except for the estimates produced by the INTEGRATION software, which tended to estimate slightly higher delays than the other approaches. For random arrivals, the results of the comparisons indicated that the delay estimates obtained by a micro-simulation model like INTEGRATION were consistent with the delay estimates computed by the analytical approaches. In addition, this dissertation compared the number of stops and the maximum extent of queue estimates using analytical procedures and the INTEGRATION simulation model for both undersaturated and oversaturated signalized intersections to assess their consistency and to analyze their applicability. For the number of stops estimates, it was found that there is a general agreement between the INTEGRATION microscopic simulation model and the analytical models for undersaturated signalized intersections. Both uniform and random arrivals demonstrated consistency between the INTEGRATION model and the analytical procedures; however, at a v/c ratio of 1.0 the analytical models underestimate the number of stops. The research developed an upper limit and a proposed model for estimating the number of vehicle stops for oversaturated conditions. It was demonstrated that the current state-of-the-practice analytical models can provide stop estimates that far exceed the upper bound. On the other hand, the INTEGRATION model was found to be consistent with the upper bound and demonstrated that the number of stops converges to 2.3 as the v/c ratio tends to 2.0. For the maximum extent of queue estimates, the maximum extent of queue predicted from horizontal shock wave analysis was higher than the predictions from vertical deterministic queuing analysis. The horizontal shock wave model predicted a lower maximum extent of queue than the CCG 1995 model.
For oversaturated conditions, the vertical deterministic queuing model underestimated the maximum queue length. It was found that the CCG 1995 predictions were lower than those from the horizontal shock wave model. These differences were attributed to the fact that the CCG 1995 model estimates the remaining residual queue at the end of the evaluation time. A consistency was found between the INTEGRATION model and the horizontal shock wave model predictions with respect to the maximum extent of queue for both undersaturated and oversaturated signalized intersections. Finally, the dissertation analyzed the impact of mixed traffic conditions on the vehicle delay, person delay, and number of vehicle stops at a signalized intersection. The analysis considered approximating the mixed flow by equivalent homogeneous flows using two potential conversion factors. The first of these conversion factors was based on relative vehicle lengths, while the second was based on relative vehicle riderships. The main conclusion of the analysis was that the optimum vehicle equivalency was dependent on the background level of congestion, the transit vehicle demand, and the Measure of Effectiveness (MOE) being considered. Consequently, explicit simulation of mixed flow is required in order to capture the unique vehicle interactions that result from mixed flow. Furthermore, while homogeneous flow approximations might be effective for some demand levels, these approximations are not consistently effective. / Ph. D.
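As a minimal sketch of the kind of computation these models perform (not the dissertation's code; the cycle and demand values below are hypothetical), the uniform-arrival delay from vertical deterministic queuing analysis can be written as:

```python
def uniform_delay(cycle_s: float, green_s: float, vc_ratio: float) -> float:
    """Average uniform delay (s/veh) at a fixed-time signal, from
    vertical deterministic queuing analysis (the HCM's d1 term):
        d1 = 0.5 * C * (1 - g/C)^2 / (1 - min(X, 1) * g/C)
    """
    g_over_c = green_s / cycle_s
    x = min(vc_ratio, 1.0)  # the uniform term caps demand at capacity
    return 0.5 * cycle_s * (1.0 - g_over_c) ** 2 / (1.0 - x * g_over_c)

# Hypothetical intersection: 90 s cycle, 45 s effective green, v/c = 0.8
d = uniform_delay(90.0, 45.0, 0.8)
print(f"uniform delay = {d:.2f} s/veh")  # → uniform delay = 18.75 s/veh
```

Under uniform arrivals all the deterministic models above reduce to this same expression, which is why their estimates coincide in that case.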
126

Designing Security Defenses for Cyber-Physical Systems

Foruhandeh, Mahsa 04 May 2022 (has links)
Legacy cyber-physical systems (CPSs) were designed without considering cybersecurity as a primary design tenet, especially when considering their evolving operating environment. There are many examples of legacy systems, including automotive control, navigation, transportation, and industrial control systems (ICSs), to name a few. To make matters worse, the cost of designing and deploying defenses in existing legacy infrastructure can be overwhelming, as millions or even billions of legacy CPS systems are already in use. This economic angle prevents the use of defenses that are not backward compatible. Moreover, any protection has to operate efficiently in resource-constrained environments that are dynamic in nature. Hence, existing approaches that require expensive additional hardware, propose a new protocol from scratch, or rely on complex numerical operations such as strong cryptographic solutions are less likely to be deployed in practice. In this dissertation, we explore a variety of lightweight solutions for securing different existing CPSs without requiring any modifications to the original system design at the hardware or protocol level. In particular, we use fingerprinting, crowdsourcing, and deterministic models as alternative backwards-compatible defenses for securing vehicles, global positioning system (GPS) receivers, and a class of ICSs called supervisory control and data acquisition (SCADA) systems, respectively. We use fingerprinting to address the deficiencies in automobile cybersecurity from the angle of controller area network (CAN) security. The CAN protocol is the de facto bus standard commonly used in the automotive industry for connecting electronic control units (ECUs) within a vehicle. The broadcast nature of this protocol, along with the lack of authentication or integrity guarantees, creates a foothold for adversaries to perform arbitrary data injection, modification, and impersonation attacks on the ECUs.
We propose SIMPLE, a single-frame-based physical-layer identification scheme for intrusion detection and prevention on such networks. Physical-layer identification, or fingerprinting, is a method that takes advantage of the manufacturing inconsistencies in the hardware components that generate the analog signal for the CPS of interest. It translates the manifestation of these inconsistencies, which appear in the analog signals, into unique features called fingerprints, which can later be used for authentication purposes. Our solution is resilient to variations in ambient temperature and supply voltage, as well as to aging. Next, we use fingerprinting and crowdsourcing in two separate protection approaches, leveraging two different perspectives for securing GPS receivers against spoofing attacks. GPS is the most predominant unauthenticated navigation system. The security issues inherent in civilian GPS are exacerbated by the fact that its design and implementation are public knowledge. To address this problem, we first introduce Spotr, a GPS spoofing detection method based on device fingerprinting that is able to determine the authenticity of signals from their physical-layer similarity to signals known to have originated from GPS satellites. More specifically, we are able to detect spoofing activities and track genuine signals across different times, locations, and propagation effects related to environmental conditions. In a different approach at a higher level, we put forth Crowdsourcing GPS, a complete solution for GPS spoofing detection, recovery, and attacker localization. Crowdsourcing is a method in which multiple entities share their observations of the environment and act together as a whole to make a more accurate or reliable decision on the status of the system. Crowdsourcing has the advantages of lower deployment complexity and distributed cost; however, its functionality depends on the adoption rate among users.
Here, we have two methods for implementing Crowdsourcing GPS. In the first method, the users in the crowd are aware of their approximate distance from other users using Bluetooth. They cross-validate this approximate distance with the GPS-derived distance, and in case of any discrepancy they report ongoing spoofing activities. This method is a strong candidate when the users in the crowd have a sparse distribution. It is also very effective when tackling multiple coordinated adversaries. For the second method, we exploit the angular dispersion of the users with respect to the direction from which the adversarial signal is transmitted. As a result, the users who are not facing the attacker will be safe, because the human body consists mostly of water and absorbs the weak adversarial GPS signal. The safe users help the spoofed users discover that there is an ongoing attack and recover from it. Additionally, the angular information is used to localize the adversary. This method is slightly more complex and shows the best performance in dense areas. It is also designed under the assumption that the spoofing attack is terrestrial. Finally, we propose a tandem IDS to secure SCADA systems. SCADA systems play a critical role in most safety-critical ICS infrastructures. The evolution of communications technology has rendered modern SCADA systems and their connected actuators and sensors vulnerable to malicious attacks on both the physical and application layers. Conventional IDSs built for securing SCADA systems focus on a single layer of the system. With the tandem IDS we break this pattern and propose a strong multi-layer solution that is able to expose a wide range of attacks. To be more specific, the tandem IDS consists of two parts: a traditional network IDS and a shadow replica. We design the shadow replica as a deterministic IDS.
It performs a workflow analysis and ensures that the logical flow of events in the SCADA controller and its connected devices maintains the expected states. Any deviation indicates either malicious activity or a reliability issue. To model the application-level events, we leverage finite state machines (FSMs) to compute the anticipated states of all of the devices. This is feasible because in many existing ICSs the flow of traffic, and the resulting states and actions in the connected devices, have a deterministic nature. Consequently, this leads to a reliable solution that is free of uncertainty. Aside from detecting traditional network attacks, our approach bypasses the attacker in case it succeeds in taking over the devices, and it maintains continuous service if the SCADA controller is compromised. / Doctor of Philosophy / Our lives are entangled with cyber-physical systems (CPSs) on a daily basis. Examples of these systems are vehicles, navigation systems, transportation systems, industrial control systems, etc. CPSs are mostly legacy systems, built with a focus on performance while overlooking security, and they are now used pervasively in our everyday lives. After numerous demonstrations of cyber attacks, the necessity of protecting CPSs from adversarial activities is no longer ambiguous. Many of the advanced cryptographic techniques are far too complex to be implemented in existing CPSs such as cars, satellites, etc. In this dissertation, we attempt to secure such resource-constrained systems using simple, backward-compatible techniques. We design cheap, lightweight solutions with no modifications to the original system. In part of our research, we use fingerprinting as a technique to secure passenger cars from being hacked and GPS receivers from being spoofed. For a brief description of fingerprinting, consider the example of two identical T-shirts with the same size and design.
They will always have subtle differences between them, no matter how hard the tailor tried to make them identical; no two T-shirts are exactly the same. This idea, applied to analog signaling in electronic devices, is called fingerprinting. Here, we fingerprint the small computers inside a car, which enables us to identify these computers and prevent hacking. We also use signal levels to design fingerprints for GPS signals. We use the fingerprints to distinguish counterfeit GPS signals from those that originated from genuine satellites. This summarizes two major contributions of the dissertation. Our earlier contribution to GPS security was effective, but it depended heavily on the underlying hardware, requiring extensive training for each radio receiver it protected. To remove this dependence on training for the specific underlying hardware, we design and implement the next framework using defenses that require only application-layer access. Thus, we propose two methods that leverage crowdsourcing to defend against GPS spoofing attacks and, at the same time, improve localization accuracy for commodity mobile devices. Crowdsourcing is a method where several devices agree to share their information with each other. In this work, GPS users share their location and direction information, and in case of any discrepancy they conclude that they are under attack and cooperate to recover from it. Finally, we shift gears to industrial control systems (ICSs) and propose a novel IDS to protect them against various cyber attacks. Unlike conventional IDSs that focus on one layer of the system, our IDS consists of two main components: a conventional component that exposes traditional attacks, and a second component called a shadow replica. The replica mimics the behavior of the system and compares it with that of the actual system in real time.
In case of any deviation between the two, it detects attacks that target the logical flow of the events in the system. Note that such attacks are more sophisticated and difficult to detect because they do not leave any obvious footprints behind. Upon detection of attacks on the original controller, our replica takes over the responsibilities of the original ICS controller and provides service continuity.
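The cross-validation idea behind the first crowdsourcing method can be sketched as follows. This is an illustrative approximation only, not the dissertation's implementation; the 50 m tolerance, the coordinates, and the function names are hypothetical:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6_371_000.0  # mean Earth radius, m
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def spoofing_suspected(gps_a, gps_b, bluetooth_range_m, tolerance_m=50.0):
    """Flag spoofing when the GPS-derived separation of two users disagrees
    with their Bluetooth-derived range by more than a tolerance
    (threshold value is a hypothetical illustration)."""
    gps_dist = haversine_m(*gps_a, *gps_b)
    return abs(gps_dist - bluetooth_range_m) > tolerance_m

# Spoofed fixes place two users ~1.1 km apart, while Bluetooth
# says they are within ~30 m of each other: the discrepancy is flagged.
print(spoofing_suspected((37.0, -80.0), (37.01, -80.0), 30.0))  # → True
```

In the sparse-crowd setting the abstract describes, each pair of users that comes within Bluetooth range performs this check independently, so a single coordinated spoofer is exposed as soon as any two victims meet.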
127

Service ORiented Computing EnviRonment (SORCER) for Deterministic Global and Stochastic Optimization

Raghunath, Chaitra 13 September 2015 (has links)
With rapid growth in the complexity of large scale engineering systems, the application of multidisciplinary analysis and design optimization (MDO) in the engineering design process has garnered much attention. MDO addresses the challenge of integrating several different disciplines into the design process. Primary challenges of MDO include computational expense and poor scalability. The introduction of a distributed, collaborative computational environment results in better utilization of available computational resources, reducing the time to solution, and enhancing scalability. SORCER, a Java-based network-centric computing platform, enables analyses and design studies in a distributed collaborative computing environment. Two different optimization algorithms widely used in multidisciplinary engineering design, VTDIRECT95 and QNSTOP, are implemented on a SORCER grid. VTDIRECT95, a Fortran 95 implementation of D. R. Jones' algorithm DIRECT, is a highly parallelizable derivative-free deterministic global optimization algorithm. QNSTOP is a parallel quasi-Newton algorithm for stochastic optimization problems. The purpose of integrating VTDIRECT95 and QNSTOP into the SORCER framework is to provide load balancing among computational resources, resulting in a dynamically scalable process. Further, the federated computing paradigm implemented by SORCER manages distributed services in real time, thereby significantly speeding up the design process. Results are included for an aircraft design application. / Master of Science
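The load-balancing idea, farming independent objective evaluations out to whichever workers are free, can be sketched locally. This is only a stand-in: SORCER itself is Java-based and network-centric, and `objective` below is a hypothetical toy function, not an engineering analysis code:

```python
from concurrent.futures import ThreadPoolExecutor

def objective(x):
    """Toy stand-in for an expensive disciplinary analysis; the real
    studies drive VTDIRECT95/QNSTOP against engineering solvers."""
    return sum((xi - 1.0) ** 2 for xi in x)

def evaluate_batch(points, max_workers=4):
    """Evaluate independent candidate points concurrently, mimicking how
    a grid balances objective evaluations across networked services
    (local threads stand in for distributed providers)."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(objective, points))

candidates = [(0.0, 0.0), (1.0, 1.0), (2.0, 0.5)]
print(evaluate_batch(candidates))  # → [2.0, 0.0, 1.25]
```

Both DIRECT-style box subdivision and quasi-Newton sampling generate batches of independent evaluations per iteration, which is what makes this farming-out pattern effective.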
128

Mechanistic Modeling of Biodiesel Production via Heterogeneous Catalysis

Lerkkasemsan, Nuttapol 25 May 2010 (has links)
Biodiesel has emerged as a promising renewable and clean energy alternative to petrodiesel. While biodiesel has traditionally been prepared through homogeneous basic catalysis, heterogeneous acid catalysis has been investigated recently due to its ability to convert cheaper but high free fatty acid content oils, such as waste vegetable oil, while decreasing production cost. In this work, the esterification of free fatty acid over sulfated zirconia and activated acidic alumina in a batch reactor was considered. The models of the reaction over the catalysts were developed in two parts. First, a kinetic study was performed using a deterministic model to develop a suitable kinetic expression; the related parameters were subsequently estimated by numerical techniques. Second, a stochastic model was developed to further confirm the nature of the reaction at the molecular level. The esterification of palmitic acid obeyed the Eley-Rideal mechanism, in which palmitic acid and methanol are adsorbed on the surface for SO₄/ZrO₂-550°C and AcAl₂O₃, respectively. The coefficients of determination of the deterministic model were 0.98, 0.99 and 0.99 for SO₄/ZrO₂-550°C at 40, 60 and 80°C, respectively, and 0.99, 0.98 and 0.96 for AcAl₂O₃ at the same temperatures. The deterministic and stochastic models were in good agreement. / Master of Science
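A minimal sketch of the deterministic part, integrating an Eley-Rideal rate law for a batch reactor, might look like the following. The rate form, rate constants, and Euler step are illustrative assumptions, not the fitted values from this work:

```python
def simulate_esterification(c_acid0, c_meoh0, k, K_ads, dt=0.01, t_end=60.0):
    """Forward-Euler integration of a batch Eley-Rideal rate law in which
    the adsorbed species reacts with the other reactant from solution:
        r = k * K_ads * C_acid * C_meoh / (1 + K_ads * C_acid)
    All parameter values passed in are hypothetical, for illustration only.
    Returns the remaining free fatty acid concentration."""
    c_a, c_m = c_acid0, c_meoh0
    t = 0.0
    while t < t_end:
        r = k * K_ads * c_a * c_m / (1.0 + K_ads * c_a)
        c_a = max(c_a - r * dt, 0.0)  # 1:1 stoichiometry assumed
        c_m = max(c_m - r * dt, 0.0)
        t += dt
    return c_a

# Hypothetical batch: 0.5 M FFA, 5 M methanol (typical excess alcohol)
conv = 1.0 - simulate_esterification(0.5, 5.0, k=0.05, K_ads=2.0) / 0.5
print(f"FFA conversion after 60 min: {conv:.1%}")
```

In a fitting workflow, `k` and `K_ads` would be the parameters estimated numerically against measured concentration-time data, with R² computed per catalyst and temperature as the abstract reports.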
129

Sparse Fast Trigonometric Transforms

Bittens, Sina Vanessa 13 June 2019 (has links)
No description available.
130

Minimizing memory requirements for deterministic test data in embedded testing

Ahlström, Daniel January 2010 (has links)
Embedded and automated tests reduce maintenance costs for embedded systems installed in remote locations. Testing multiple components of an embedded system, connected on a scan chain, using deterministic test patterns stored in the system provides high fault coverage but requires a large system memory. This thesis presents an approach to reduce test data memory requirements through a test controller program, exploiting the observation that a system often contains multiple components of the same type. The program uses deterministic test patterns specific to each component type, stored in system memory, to create fully defined test patterns when needed. By storing deterministic test patterns specific to each component type, the program can reuse the test patterns across multiple tests and several times within the same test. The program also has the ability to test parts of a system without affecting the normal functional operation of the rest of the components in the system and without an increase in test data memory requirements. Two experiments were conducted to determine how much the approach presented in this thesis reduces test data memory requirements. The results show up to a 26.4% reduction of test data memory requirements for the ITC'02 SOC test benchmarks and, on average, a 60% reduction for designs generated to gain statistical data.
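The memory argument can be illustrated with a small sketch (the chain contents and pattern sizes are hypothetical, not the thesis benchmarks): storing patterns per component type rather than per instance makes memory scale with the number of distinct types on the scan chain.

```python
def naive_memory_bits(chain, pattern_bits):
    """Memory when a fully defined pattern set is stored per component instance."""
    return len(chain) * pattern_bits

def shared_memory_bits(chain, pattern_bits):
    """Memory when one deterministic pattern set is stored per component *type*;
    a test controller program expands them into full patterns at run time."""
    return len(set(chain)) * pattern_bits

# Hypothetical scan chain with repeated component types
chain = ["cpu", "uart", "uart", "adc", "adc", "adc"]
saving = 1 - shared_memory_bits(chain, 4096) / naive_memory_bits(chain, 4096)
print(f"memory reduction: {saving:.0%}")  # → memory reduction: 50%
```

The real saving depends on how much type repetition a design exhibits, which is why the reported reductions differ between the ITC'02 benchmarks and the statistically generated designs.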
