61

Design Guidelines for Reducing Redundancy in Relational and XML Data

Kolahi, Solmaz 31 July 2008 (has links)
In this dissertation, we propose new design guidelines to reduce the amount of redundancy that databases carry. We use techniques from information theory to define a measure that evaluates a database design based on the worst possible redundancy carried in the instances. We then continue by revisiting the design problem of relational data with functional dependencies, and measure the lowest price, in terms of redundancy, that has to be paid to guarantee a dependency-preserving normalization for all schemas. We provide a formal justification for the Third Normal Form (3NF) by showing that we can achieve this lowest price by doing a good 3NF normalization. We then study the design problem for XML documents that are views of relational data. We show that we can design a redundancy-free XML representation for some relational schemas while preserving all data dependencies. We present an algorithm for converting a relational schema to such an XML design. We finally study the design problem for XML documents that are stored in relational databases. We look for XML design criteria that ensure a relational storage with low redundancy. First, we characterize XML designs that have a redundancy-free relational storage. Then we propose a restrictive condition for XML functional dependencies that guarantees a low redundancy for data values in the relational storage.
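The redundancy the abstract describes can be illustrated with a minimal sketch (not taken from the dissertation; the schema, functional dependency, and data below are hypothetical): a functional dependency forces a value to be repeated in every matching row, and a 3NF-style decomposition on that dependency stores the value exactly once while preserving the dependency.

```python
# Illustration (hypothetical schema and data): how a functional dependency
# causes redundancy, and how decomposition removes the repetition.

# R(course, room, capacity) with the FD room -> capacity:
rows = [
    ("DB101", "A-12", 80),
    ("OS201", "A-12", 80),   # capacity 80 stored again: redundancy
    ("AI301", "B-07", 40),
]

# A 3NF-style decomposition on the FD room -> capacity:
courses = {(c, r) for (c, r, _) in rows}        # R1(course, room)
capacity = {r: cap for (_, r, cap) in rows}     # R2(room, capacity), key: room

# Each capacity value is now stored once, the FD is preserved as the key
# of R2, and joining R1 with R2 losslessly reconstructs the original rows.
reconstructed = {(c, r, capacity[r]) for (c, r) in courses}
assert reconstructed == set(rows)
```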
62

An Integrative Approach to Reliability Analysis of an IEC 61850 Digital Substation

Zhang, Yan 1988- 14 March 2013 (has links)
In recent years, reliability evaluation of substation automation systems has received significant attention from the research community. With the advent of the smart grid concept, there is a growing trend to integrate more computation and communication technology into power systems. This thesis focuses on the reliability evaluation of modern substation automation systems. Such systems include both physical (current-carrying) devices, such as lines, circuit breakers, and transformers, and cyber devices (Ethernet switches, intelligent electronic devices, and cables), and belong to the broader class of cyber-physical systems. We assume that the substation utilizes the IEC 61850 standard, the dominant standard for substation automation. Focusing on IEC 61850, we discuss the failure modes and analyze their effects on the system. We utilize reliability block diagrams to analyze the reliability of substation components (bay units) and then use the state space approach to study the effects at the substation level. The case study is based on an actual IEC 61850 substation automation system, and different network topologies are considered and compared. Our analysis provides a starting point for evaluating the reliability of the substation and the effects of substation failures on the rest of the power system. Using state space methods, the steady-state probability of each failure effect was calculated for the different bay units. These probabilities can be further used in modeling the composite power system to analyze loss-of-load probabilities.
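The reliability-block-diagram algebra mentioned above can be sketched briefly (the component availabilities below are hypothetical, not from the thesis): components in series must all work, while redundant parallel components fail only if every path fails.

```python
# Sketch (hypothetical availabilities): reliability-block-diagram algebra of
# the kind used to evaluate a bay unit in an IEC 61850 substation.

def series(*avail):
    """Availability of components in series: all must work."""
    p = 1.0
    for a in avail:
        p *= a
    return p

def parallel(*avail):
    """Availability of redundant paths: fails only if every path fails."""
    q = 1.0
    for a in avail:
        q *= (1.0 - a)
    return 1.0 - q

# Hypothetical availabilities: merging unit, Ethernet switch, protection IED.
merging_unit, switch, ied = 0.999, 0.998, 0.995

single = series(merging_unit, switch, ied)
# Duplicating the switch (a redundant network) raises bay availability:
redundant = series(merging_unit, parallel(switch, switch), ied)
assert redundant > single
```

The same series/parallel reductions, applied per bay unit, yield the component-level availabilities that the state space model then combines at the substation level.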
64

A self restoring system on low voltage level

Bergquist, Hampus January 2012 (has links)
Fortum's electric grid in Norra Djurgårdsstaden is a test grid for smart equipment, and Fortum is investigating new techniques and ways to improve the quality of the grid. As part of that research, a "self-restoring system" is being studied with the intention of lowering the number of outages and shortening the time it takes to restore faults. This thesis can be seen as part of the optimization process of the grid in Norra Djurgårdsstaden, in which the benefits of a basic self-restoring system have been investigated at the low voltage level. In the thesis, self-restoring systems are classified into a "basic" and an "advanced" category. The basic self-restoring system cross-connects several feeding paths by cross-connecting different low voltage grids and uses mechanical equipment to switch between cables when a fault occurs in a cable. The advanced self-restoring system uses several feeders and smart grid technology, with equipment and software that communicate and visualize the grid. The difference between the systems is that the advanced system can visualize the grid and tell, at a more detailed level, when and where faults have occurred. The advanced system can also calculate the power available and does not need the same amount of cable for redundancy, because it can instruct users to lower their consumption when an outage has occurred. A decision was made to investigate the technique only at the low voltage level, because a basic system already exists at the medium voltage level in Norra Djurgårdsstaden. Results show that investing in a basic self-restoring system in Norra Djurgårdsstaden would cost about 2 million SEK and lower the total outage time for customers in the area from 45 minutes per customer and year down to about 41 minutes. The decrease is only four minutes per customer and year because faults occurring at higher voltage levels cannot be mitigated by the system; only about 10 % of faults occur at the low voltage level. One conclusion of the thesis is that the reduction in quality costs resulting from the reduced outage time will not be enough to pay back the investment: the system would need to prevent more outage time per customer and year, or customers would need to value reduced outages significantly more.
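The payback reasoning in the conclusion can be made concrete with a back-of-the-envelope sketch. Only the 2 MSEK investment and the 45-to-41-minute reduction come from the abstract; the customer count and the per-minute outage valuation below are hypothetical placeholders.

```python
# Back-of-the-envelope payback sketch. Inputs marked "hypothetical" are
# illustrative assumptions, not figures from the thesis.

investment_sek = 2_000_000                 # from the abstract
saved_min_per_customer_year = 45 - 41      # 4 minutes, from the abstract

customers = 10_000                          # hypothetical
outage_cost_sek_per_customer_min = 2.0      # hypothetical valuation

annual_saving = (customers * saved_min_per_customer_year
                 * outage_cost_sek_per_customer_min)
payback_years = investment_sek / annual_saving
# With these assumptions the payback time is 25 years, illustrating the
# conclusion: either more outage-minutes must be prevented, or customers
# must value avoided outages more highly, for the investment to pay off.
```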
65

Design of the Network Controller with Improving the real-time UDP Packets Reliability

Li, Ang-Lian 24 August 2006 (has links)
Methods of improving the reliability of network packets already exist, some based on software and others on hardware: for example, dual systems, redundancy, or fault-tolerant networks. However, constructing the required mechanisms is costly, and these methods are not reliable for some special network packets, such as UDP packets. Is there a way to raise the reliability of UDP packets by adding only modest software or hardware? The network controller proposed in this thesis to improve the reliability of UDP packets uses two identical packets and transfers them over different network paths to the same destination workstation. This mechanism can survive network accidents caused by a broken network wire or a powered-off network switch, so the destination workstation is not harmed by the loss of the information carried by UDP packets. Moreover, by using dual packets this method can detect network accidents, as a fault-tolerant network does, and send signals to alert the network manager ahead of time. In addition, using the network controller described in this thesis, a network manager or constructor can easily build the RNP topology simply by connecting the RNP-type wires to the network switches or network bridges.
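The dual-packet idea can be sketched as follows (a minimal sketch, not the thesis implementation; the addresses and the 4-byte sequence-number framing are hypothetical): the sender transmits one copy of each datagram per path, and the receiver delivers the first copy and drops the duplicate by sequence number.

```python
# Sketch (hypothetical addresses and framing) of duplicating UDP datagrams
# over two network paths and suppressing the duplicate at the receiver.
import socket
import struct

PATHS = [("192.0.2.10", 5000), ("198.51.100.10", 5000)]  # two routes, hypothetical

def send_dual(sock: socket.socket, seq: int, payload: bytes) -> None:
    """Prefix the payload with a sequence number and send one copy per path."""
    datagram = struct.pack("!I", seq) + payload
    for addr in PATHS:
        sock.sendto(datagram, addr)

def make_receiver():
    """Return a function that delivers each sequence number once."""
    seen = set()
    def receive(datagram: bytes):
        seq = struct.unpack("!I", datagram[:4])[0]
        if seq in seen:          # duplicate arrived via the other path
            return None
        seen.add(seq)
        return datagram[4:]
    return receive

# Receiver-side behavior, exercised without touching the network:
recv = make_receiver()
pkt = struct.pack("!I", 7) + b"hello"
assert recv(pkt) == b"hello"   # first copy delivered
assert recv(pkt) is None       # duplicate suppressed
```

As long as at least one path survives a broken wire or a failed switch, one copy still arrives; a missing duplicate on the other path is itself a signal that the path has failed, which is how the controller can alert the network manager early.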
66

Broadcasting Support in Mobile Ad Hoc Wireless Local Area Networks

Chang, Shu-Ping 01 July 2003 (has links)
Broadcasting is a fundamental primitive in local area networks (LANs). Many data link protocols, for example ARP (Address Resolution Protocol) and IGMP (Internet Group Management Protocol), rely on this LAN primitive. Developing a broadcasting service for mobile ad hoc wireless LANs (WLANs) is challenging, because a mobile ad hoc WLAN is a multi-hop wireless network in which messages may travel along several links from the source to the destination, and because there is no fixed network topology when hosts move. Furthermore, the broadcast nature of a radio channel means that a packet transmitted by a node reaches all of its neighbors; therefore, the total number of transmissions (forward nodes) is generally used as the cost criterion for broadcasting. The problem of finding the minimum number of forward nodes in a static radio network is NP-complete. Most previous work on broadcasting in WLANs has therefore focused on finding approximation approaches, largely in a static rather than a mobile environment. In this paper, we propose a novel distributed protocol for WLANs that significantly reduces or eliminates the communication overhead beyond that needed to maintain the positions of neighboring nodes. The important features of our proposed protocol are its adaptability to dynamic network topology changes and its localized and parameterless behavior. The reduction in communication overhead for the broadcasting operation is measured experimentally. The simulation results show that our protocol not only performs similarly to the approximation approaches in a static network, but also outperforms existing ones in its adaptability to host mobility.
67

A neural network-based sensor validation scheme within aircraft control laws

Stolarik, Brian M. January 2001 (has links)
Thesis (M.S.)--West Virginia University, 2001. / Title from document title page. Document formatted into pages; contains xiii, 117 p. : ill. (some col.). Vita. Includes abstract. Includes bibliographical references (p. 95-96).
68

Geostatistical data integration in complex reservoirs

Elahi Naraghi, Morteza 03 February 2015 (has links)
One of the most challenging issues in reservoir modeling is integrating information coming from different sources at disparate scales and precision. The primary data are borehole measurements, but in most cases these are too sparse to construct accurate reservoir models, so they have to be supplemented with other, secondary data. The secondary data for reservoir modeling can be static, such as seismic data, or dynamic, such as production history, well test data, or time-lapse seismic data. Several algorithms for integrating different types of data have been developed. A novel method for data integration based on the permanence of ratios hypothesis was proposed by Journel in 2002. The premise of the permanence of ratios hypothesis is to assess the information from each data source separately and then merge the information while accounting for the redundancy between the information sources. The redundancy between the information from different sources is accounted for using parameters (tau or nu parameters, Krishnan, 2004). The primary goal of this thesis is to derive a practical expression for the tau parameters and demonstrate the procedure for calibrating these parameters using the available data. This thesis presents two new algorithms for data integration in reservoir modeling that overcome some of the limitations of current methods. We present an extension to the direct sampling based multiple-point statistics method, along with a methodology for integrating secondary soft data in that framework. The algorithm is based on direct pattern search through an ensemble of realizations. We show that the proposed methodology is suitable for modeling complex channelized reservoirs and reduces the uncertainty in production performance through the integration of secondary data.
We subsequently present the permanence of ratios hypothesis for data integration in great detail. We present analytical equations for calculating the redundancy factor for discrete or continuous variable modeling, and show how this factor can be inferred from available data for different scenarios. We implement the method to model a carbonate reservoir in the Gulf of Mexico and show that it performs better than using primary hard and secondary soft data within the traditional geostatistical framework.
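The permanence-of-ratios combination with tau parameters can be sketched compactly (the probabilities below are hypothetical; the distance-ratio form follows Journel's 2002 formulation as summarized in the abstract): each probability is converted to a "distance" (1-p)/p, the distance ratios are combined multiplicatively with tau exponents, and the result is converted back to a probability.

```python
# Sketch (hypothetical inputs) of the tau-model form of the
# permanence-of-ratios hypothesis for merging two information sources.

def tau_model(prior, p_given_b, p_given_c, tau_b=1.0, tau_c=1.0):
    """Combine P(A|B) and P(A|C) into an estimate of P(A|B,C).

    tau_b = tau_c = 1 recovers the plain permanence-of-ratios
    hypothesis; other values account for redundancy between sources.
    """
    d0 = (1 - prior) / prior            # prior distance to event A
    db = (1 - p_given_b) / p_given_b    # distance given source B
    dc = (1 - p_given_c) / p_given_c    # distance given source C
    d = d0 * (db / d0) ** tau_b * (dc / d0) ** tau_c
    return 1.0 / (1.0 + d)

# Two sources that each raise the probability of, say, sand from 0.3 to 0.6:
p = tau_model(0.3, 0.6, 0.6)
assert p > 0.6   # combined evidence is stronger than either source alone
```

A useful sanity check of the formula: a source whose conditional probability equals the prior contributes a distance ratio of 1 and leaves the other source's probability unchanged.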
69

Formulating Evaluation Measures for Structured Document Retrieval using Extended Structural Relevance

Ali, Mir Sadek 06 December 2012 (has links)
Structured document retrieval (SDR) systems minimize the effort users spend to locate relevant information by retrieving sub-documents (i.e., parts of, as opposed to entire, documents) to focus the user's attention on the relevant parts of a retrieved document. SDR search tasks are differentiated by the multiplicity of ways that users prefer to spend effort and gain relevant information in SDR. The sub-document retrieval paradigm has required researchers to undertake costly user studies to validate whether new IR measures, based on gain and effort, accurately capture IR performance. We propose the Extended Structural Relevance (ESR) framework as a way, akin to classical set-based measures, to formulate SDR measures that share the common basis of our proposed pillars of SDR evaluation: relevance, navigation and redundancy. Our experimental results show how ESR provides a flexible way to formulate measures, and addresses the challenge of testing measures across related search tasks by replacing costly user studies with low-cost simulation.
70

Design of Fault Tolerant Control System for Electric Vehicles with Steer-By-Wire and In-Wheel Motors

Hayakawa, Yoshikazu, Ito, Akira 09 1900 (has links)
7th IFAC Symposium on Advances in Automotive Control, The International Federation of Automatic Control, September 4-7, 2013. Tokyo, Japan
