981

A Hilbert Curve-Based Algorithm for Order-Sensitive Moving KNN Queries

Feng, Fei-Chung 11 July 2012 (has links)
Because wireless communication, positioning, and mobile computing technologies are developing quickly, mobile services built on large spatiotemporal databases are becoming practical and important. Mobile service users move within a spatial domain, e.g., a country, and often issue K Nearest Neighbor (kNN) queries to obtain data objects reachable through the spatial database. The central challenge for such services is how to efficiently deliver the data objects of interest to the corresponding mobile users. One variant is the order-sensitive moving kNN (order-sensitive MkNN) query problem, in which the query point is dynamic and unpredictable and the kNN answers must be returned in real time, sorted by distance in ascending order. How to return the kNN answers effectively, incrementally, and correctly is therefore an important issue. Nutanong et al. proposed the V*-kNN algorithm to process the order-sensitive MkNN query. The V*-kNN algorithm uses their V*-diagram algorithm to generate the safe region, and the Incremental Rank Updates (IRU) algorithm to handle the events that occur while the query point crosses the bisectors or the boundary of the safe region. However, the V*-kNN algorithm retrieves NNs with the BF-kNN algorithm, which is not incremental, so the search time increases as the object density increases. Moreover, it does not consider the situation in which multiple objects share the same order, or the situation in which multiple events happen in a single step; these situations may make the kNN answers incorrect. Therefore, in this thesis, we propose the Hilbert curve-based kNN (HC-kNN) algorithm to process the order-sensitive MkNN query. The HC-kNN algorithm can handle the situation in which multiple events happen in a single step, and we propose a new data structure for the kNN answers. Next, we propose the Intersection of Perpendicular Bisectors (IPB) algorithm to handle order update events of the kNN answers; the IPB algorithm handles the situation in which multiple objects share the same order. Finally, based on the Hilbert curve index, we propose the ONHC-kNN algorithm to retrieve NNs incrementally and to generate the safe region. The safe region is not affected as the object density increases, and the safe region of our algorithm is larger than that of the V*-kNN algorithm. Our simulation results show that the HC-kNN algorithm provides better performance than the V*-kNN algorithm.
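The HC-kNN, IPB, and ONHC-kNN algorithms themselves are not reproduced in the abstract, but the underlying idea of a Hilbert-curve index, mapping 2-D positions to a 1-D ordering so that objects close on the curve tend to be close in space, can be sketched as below. This is a minimal, generic illustration; the grid size `n`, the candidate `window`, and the function names are assumptions for the sketch, not the thesis's design, and a real order-sensitive MkNN method adds safe regions and a refinement step to guarantee correct answers.

```python
import bisect
import math

def xy2d(n, x, y):
    """Map integer cell coordinates (x, y) in [0, n) x [0, n) to their distance
    along a Hilbert curve filling an n-by-n grid (n must be a power of two)."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                      # rotate/flip the quadrant
            if rx == 1:
                x, y = n - 1 - x, n - 1 - y
            x, y = y, x
        s //= 2
    return d

def knn_via_hilbert(objects, query, k, n=1024, window=64):
    """kNN sketch: objects and query are integer (x, y) cells in [0, n).
    Objects are sorted once by Hilbert index; a window of curve-neighbors around
    the query's index supplies candidates, which are then ranked by true distance.
    Without a refinement pass the window may miss true neighbors."""
    indexed = sorted((xy2d(n, x, y), (x, y)) for x, y in objects)
    keys = [key for key, _ in indexed]
    pos = bisect.bisect_left(keys, xy2d(n, *query))
    lo, hi = max(0, pos - window), min(len(indexed), pos + window)
    candidates = [pt for _, pt in indexed[lo:hi]]
    return sorted(candidates, key=lambda p: math.dist(p, query))[:k]
```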
982

Human step-length recognition and real-time localization based on embedded systems

Yeh, Jiun-Ying 03 September 2012 (has links)
Along with the development of localization and navigation technologies, the Global Positioning System (GPS) plays an important role in our daily life, but it is confined to outdoor environments. The technology of human localization has been developed in recent years: it uses sensors to determine human movement and measure walking distance, which not only mitigates the problem of GPS out-of-lock but also enables indoor localization. This thesis describes human step-length recognition and real-time localization based on an embedded system. The goal of this system is to develop a gait pattern classification and pedestrian dead reckoning (PDR) method for human localization. Using the information from an Inertial Measurement Unit (IMU) and two force sensors mounted on a shoe, a wireless transmission module sends the sensor data to an embedded platform, where step detection, step-length estimation, and gait pattern recognition are performed. Through coordinate transformation and the zero-velocity update (ZUPT) algorithm, the accumulated velocity error can be corrected, and the dead reckoning method is used to obtain the location. Finally, the human location and gait pattern information is sent to an Android system for remote monitoring.
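As a rough illustration of the pedestrian dead reckoning idea described above (not the thesis's gait classifier, ZUPT correction, or embedded implementation), the sketch below detects steps from accelerometer-magnitude peaks and advances a 2-D position by one estimated step length per step along the estimated heading. The threshold, minimum sample gap, and function names are illustrative assumptions.

```python
import math

def detect_steps(acc_norm, threshold=11.0, min_gap=20):
    """Simple peak-based step detector on the accelerometer magnitude signal.
    threshold (m/s^2) and min_gap (samples) are illustrative values."""
    steps, last = [], -min_gap
    for i in range(1, len(acc_norm) - 1):
        if (acc_norm[i] > threshold and acc_norm[i] >= acc_norm[i - 1]
                and acc_norm[i] >= acc_norm[i + 1] and i - last >= min_gap):
            steps.append(i)
            last = i
    return steps

def dead_reckon(start_xy, headings_rad, step_lengths_m):
    """Pedestrian dead reckoning: advance the position by one step length
    along the heading estimated for each detected step."""
    x, y = start_xy
    track = [(x, y)]
    for theta, length in zip(headings_rad, step_lengths_m):
        x += length * math.cos(theta)
        y += length * math.sin(theta)
        track.append((x, y))
    return track
```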
983

Study The Change Of Blood Enteric Bacterial DNA Load In Patients With Systemic Inflammatory Response Syndrome

Yang, Ming-chieh 12 September 2012 (has links)
Early detection of infection, identification of the microorganism, and correct choice of antibiotics are critical in the management of sepsis. Quantitative real-time polymerase chain reaction (RT-PCR) has the potential to improve the timeliness, sensitivity, and accuracy of pathogen detection. In this study we used this method to measure enteric bacterial counts in the blood of patients with systemic inflammatory response syndrome (SIRS) in the emergency department (ED). The universal primers used in RT-PCR are specific for 23S ribosomal DNA (rDNA) and the wecF gene. The results show that the serum bacterial DNA load of enteric Gram-negative bacilli is higher in SIRS patients with positive culture results from specimens collected within 10 days after presenting to the ED, and in patients surviving less than 28 days. In SIRS patients with shock, in patients fulfilling both the white blood cell count and respiratory criteria of SIRS, and in patients fulfilling both criteria with an Acute Physiology and Chronic Health Evaluation II score of more than 20, the serum bacterial DNA load of enteric Gram-negative bacilli and the 28-day mortality are both higher. These results suggest that bacterial translocation may occur in patients with SIRS and may be related to higher mortality in these patients.
984

Estimating forest structural characteristics with airborne lidar scanning and a near-real-time profiling laser system

Zhao, Kaiguang 15 May 2009 (has links)
LiDAR (Light Detection and Ranging) directly measures canopy vertical structure and provides an effective remote sensing solution for accurate and spatially explicit mapping of forest characteristics, such as canopy height and Leaf Area Index. However, many factors, such as large data volumes and high data-acquisition costs, preclude the operational and practical use of most currently available LiDARs for frequent, large-scale mapping. At the same time, there is a growing need for real-time remote sensing platforms, e.g., to provide timely information for urgent applications. This study aims to develop an airborne profiling LiDAR system, featuring on-the-fly data processing, for near-real-time or real-time forest inventory. The development of such a system involves implementing on-board data processing and analysis as well as building regression-based models to relate LiDAR measurements to forest biophysical parameters. This work established a paradigm for an on-the-fly airborne profiling LiDAR system that inventories regional forest resources in real or near-real time. The system was developed from an existing portable airborne laser system (PALS) previously assembled at NASA by Dr. Ross Nelson. Key issues in automating PALS as an on-the-fly system were addressed, including the design of an archetype for the system workflow, the development of efficient and robust algorithms for automatic data processing and analysis, the development of effective regression models to predict forest biophysical parameters from LiDAR measurements, and the implementation of an integrated software package incorporating all of the above. This work exploited the untapped potential of airborne laser profilers for real-time forest inventory and therefore documents an initial step toward airborne-laser-based, on-the-fly, real-time forest inventory systems. The results demonstrate the utility and effectiveness of airborne scanning or profiling laser systems for remotely measuring forest structural attributes at a range of scales, from the individual tree, plot, and stand up to the regional level. The system not only provides a regional assessment tool, one that can repeatedly and remotely measure hundreds or thousands of square kilometers with little or no analyst interaction or interpretation, but also serves as a paradigm for future efforts to build more advanced airborne laser systems such as real-time laser scanners.
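The abstract mentions regression-based models relating LiDAR measurements to forest biophysical parameters. A minimal sketch of that kind of model is shown below, assuming per-plot LiDAR return heights and field-measured biomass; the choice of height metrics, the variable names, and the use of ordinary least squares are assumptions for illustration, not the models actually built for PALS.

```python
import numpy as np

def lidar_metrics(return_heights):
    """Summarize a plot's LiDAR return heights with a few common height metrics."""
    h = np.asarray(return_heights, dtype=float)
    return np.array([h.mean(), np.percentile(h, 75), np.percentile(h, 95)])

def fit_biomass_model(plots, field_biomass):
    """Ordinary least squares: biomass ~ intercept + LiDAR height metrics."""
    X = np.vstack([lidar_metrics(p) for p in plots])
    X = np.column_stack([np.ones(len(X)), X])        # add intercept column
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(field_biomass, dtype=float), rcond=None)
    return coeffs

def predict_biomass(coeffs, return_heights):
    """Apply the fitted model to a new plot's return heights."""
    x = np.concatenate([[1.0], lidar_metrics(return_heights)])
    return float(x @ coeffs)
```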
985

Development of a Real-Time Detection Strategy for Material Accountancy and Process Monitoring During Nuclear Fuel Reprocessing Using the Urex+3A Method

Goddard, Braden 2009 December 1900 (has links)
Reprocessing nuclear fuel is becoming more viable in the United States due to the anticipated increase in construction of nuclear power plants, the growing stockpile of used nuclear fuel, and a public desire to reduce the amount of this fuel. However, a new reprocessing facility in a non-weapon state must be safeguarded, and new reprocessing facilities in weapon states will likely have safeguards for political and material accountancy reasons. These facilities will have state-of-the-art controls and monitoring methods to safeguard special nuclear materials, as well as to provide real-time monitoring. The focus of this project is to enable the development of a safeguards strategy that uses well-established photon measurement methods to characterize samples from the UREX+3a reprocessing method using a variety of detector types and measurement times. It was determined that the errors from quantitative measurements were too large for traditional safeguards methods; however, a safeguards strategy based on qualitative gamma-ray and neutron measurements is proposed. The gamma-ray detection equipment used in the safeguards strategy could also improve real-time process monitoring in a yet-to-be-built facility. A facility with real-time gamma detection equipment could improve product quality control and provide additional benefits, such as waste volume reduction. In addition to the spectral analyses, Monte Carlo N-Particle (MCNP) simulations determined that there is no noticeable self-shielding for internal pipe diameters of less than 2 inches, indicating that no self-shielding correction factors are needed. Further, it was determined that HPGe N-type detectors would be suitable for a neutron radiation environment. Finally, the gamma-ray spectra for the measured samples were simulated using MCNP, and the model was then extended to predict the responses from an actual UREX+3a reprocessing scenario applied to fuel with a decay time of three years; the 3-year-decayed fuel was more representative of commercially reprocessed fuel than the acquired UREX+3a samples. This research found that the proposed safeguards approach would be best suited as an addition to existing safeguards strategies. Real-time gamma-ray detection for process monitoring would be beneficial to a reprocessing facility and could be done with commercially available detectors.
986

Real-time Water Waves with Wave Particles

Yuksel, Cem 2010 August 1900 (has links)
This dissertation describes the wave particles technique for simulating water surface waves and two-way fluid-object interactions for real-time applications, such as video games. Water exists in various forms in our environment, and it is important to develop the technologies necessary to incorporate all these forms in real-time virtual environments. Handling the behavior of large bodies of water, such as an ocean, lake, or pool, has been computationally expensive with traditional techniques even for offline graphics applications, because of the high resolution requirements of these simulations. A significant portion of the behavior of large bodies of water is the surface wave phenomenon. This dissertation discusses how water surface waves can be simulated efficiently and effectively at real-time frame rates using a simple particle system that we call "wave particles." This approach offers a simple, fast, and unconditionally stable solution to wave simulation. Unlike traditional techniques that simulate the water body (or its surface) as a whole with numerical methods, wave particles merely track the deviations of the surface due to waves, forming an analytical solution. This allows simulation of seemingly infinite water surfaces, like an open ocean. Both the theory and implementation of wave particles are discussed in detail. Two-way interactions of floating objects with water are explained, including the generation of waves due to object interaction and proper simulation of the effect of water on object motion. Timing studies show that the method is scalable, allowing simulation of wave interaction with several hundred objects at real-time rates.
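A toy sketch of the core idea, assuming a cosine-shaped local deviation and ignoring the dissertation's particle subdivision, reflection, and object coupling: each particle carries its own position, direction, amplitude, and speed, and the surface height anywhere is the sum of the deviations of nearby particles, so no grid-based solve is needed. All names and parameters below are assumptions for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class WaveParticle:
    x: float          # current position on the water plane
    y: float
    dx: float         # unit propagation direction
    dy: float
    amplitude: float  # height of the local surface deviation
    radius: float     # support radius of the deviation
    speed: float      # propagation speed

def advance(particles, dt):
    """Move every particle along its propagation direction; no grid solve is involved."""
    for p in particles:
        p.x += p.dx * p.speed * dt
        p.y += p.dy * p.speed * dt

def surface_height(particles, x, y):
    """Surface deviation at (x, y): sum of compact cosine bumps carried by nearby particles."""
    h = 0.0
    for p in particles:
        r = math.hypot(x - p.x, y - p.y)
        if r < p.radius:
            h += 0.5 * p.amplitude * (1.0 + math.cos(math.pi * r / p.radius))
    return h
```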
987

New Insights Into the Role of Equine Infectious Anemia Virus S2 Protein in Disease Expression

Covaleda Salas, Lina M. 2010 May 1900 (has links)
Equine infectious anemia virus (EIAV) is an important animal model for studying the contribution of macrophages to viral persistence during lentiviral infections. EIAV is unique among the lentiviruses in that it causes rapid disease progression, rather than the very slow progression characteristic of other lentiviral infections. The accessory gene S2, unique to EIAV, is an important determinant of viral pathogenesis: a functional S2 gene is required to achieve high-titer viremia and the development of disease in infected horses. Despite its essential role, the mechanisms by which S2 influences EIAV pathogenesis remain elusive. The goal of this research was to gain insight into the role of S2 in pathogenesis. To accomplish this goal we: (i) examined the effects of EIAV and its S2 protein on the regulation of cytokine and chemokine responses in macrophages, (ii) assessed the influence of EIAV infection and the effect of S2 on global gene expression in macrophages, and (iii) identified host cellular proteins that interact with S2 as a starting point for identifying host factors implicated in S2 function. The results of this study provide evidence for a role of S2 in enhancing the proinflammatory cytokine and chemokine response in infected macrophages. Specifically, S2 enhances the expression of IL-1 alpha, IL-1 beta, IL-8, MCP-2, MIP-1 beta, IP-10, and a newly discovered cytokine, IL-34. Involvement of S2 in cytokine and chemokine dysregulation may contribute to disease development by optimizing the host cell environment to promote viral dissemination and replication. Microarray analyses revealed an interesting set of genes differentially expressed upon EIAV infection; the genes affected by EIAV are involved in the immune response, transcription, translation, the cell cycle, and cell survival. Finally, we used the yeast two-hybrid system to identify host cellular proteins that interact with S2, and identified osteosarcoma amplified 9 (OS-9) and proteasome 26S ATPase subunit 3 (PSMC3) as interacting partners of S2. Additional evidence is needed to demonstrate the physiological relevance of these interactions in vivo. In summary, the results of this study contribute to our understanding of the role of S2 in disease expression and allow the formulation of new hypotheses about the potential mechanisms of action of S2 during EIAV infection.
988

On Design and Realization of New Generation Mission-critical Application Systems

Mai, Zhibin 2011 May 1900 (has links)
A mission-critical system typically refers to a project or system whose success is vital to the mission of the underlying organization. The failure or delayed completion of tasks in mission-critical systems may cause severe financial loss or even human casualties. For example, the failure to forecast Hurricane Rita accurately and in a timely manner in September 2005 caused enormous financial loss and several deaths. As such, real-time guarantees and reliability have always been two key foci of mission-critical system design. Many factors affect real-time guarantees and reliability. From the software design perspective, which is the focus of this dissertation, three aspects are most important: how to design a single application to effectively support real-time requirements and improve reliability; how to integrate different applications in a cluster environment to guarantee real-time requirements and improve reliability; and how to effectively coordinate distributed applications to support real-time requirements and improve reliability. Following these three aspects, this dissertation proposes and implements three novel methodologies: real-time component-based single-node application development, real-time workflow-based cluster application integration, and real-time distributed admission control. For ease of understanding, we introduce these three methodologies and implementations in three real-world mission-critical application systems: a single-node mission-critical system, a cluster-environment mission-critical system, and a wide-area-network mission-critical system. We study the full-scale design and implementation of these systems. More specifically: 1) For the single-node system, we introduce a real-time component-based application model and a novel design methodology, and based on them we implement a real-time component-based Enterprise JavaBean (EJB) system. Through component-based design and efficient resource management and scheduling, we show that our model and design methodology can effectively improve system reliability and guarantee real-time requirements. 2) For the system in a cluster environment, we introduce a new application model and a real-time workflow-based application integration methodology, and based on them we implement a data center management system for the Southeastern Universities Research Association (SURA) project. We show that our methodology can greatly simplify the design of such a system and make it easier to meet deadline requirements, while improving system reliability through the reuse of fully tested legacy models. 3) For the system in a wide area network, we narrow our focus to a representative VoIP system and introduce a general distributed real-time VoIP system model, a novel system design methodology, and an implementation. We show that with our new model and architectural design mechanism, we can effectively meet the real-time requirements of Voice over Internet Protocol (VoIP).
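The dissertation's distributed admission control mechanism is not detailed in the abstract; purely as an illustration of the concept, the sketch below shows a single-link, bandwidth-based call admission check of the kind a VoIP gateway might apply. The class name, the utilization cap, and the per-call bandwidth figure in the usage comment are assumptions, not the dissertation's design.

```python
class CallAdmissionController:
    """Minimal bandwidth-based admission check: a new call is admitted only if the
    link's reserved bandwidth plus the call's requirement stays under a utilization cap."""

    def __init__(self, link_capacity_kbps, utilization_cap=0.8):
        self.capacity = link_capacity_kbps
        self.cap = utilization_cap
        self.reserved = 0.0

    def admit(self, call_bw_kbps):
        """Reserve bandwidth for the call if the cap is not exceeded; return the decision."""
        if self.reserved + call_bw_kbps <= self.cap * self.capacity:
            self.reserved += call_bw_kbps
            return True
        return False

    def release(self, call_bw_kbps):
        """Free the bandwidth when the call ends."""
        self.reserved = max(0.0, self.reserved - call_bw_kbps)

# Usage (illustrative numbers): a 10 Mbps link, one call needing ~87 kbps with overhead.
cac = CallAdmissionController(10_000)
accepted = cac.admit(87.0)
```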
989

Control and Optimization of a Compact 6-Degree-of-Freedom Precision Positioner Using Combined Digital Filtering Techniques

Silva Rivas, Jose Christian 2011 December 1900 (has links)
This thesis presents the multivariable controller design and implementation for a high-precision 6-degree-of-freedom (6-DOF) magnetically levitated (maglev) positioner. The positioner is a triangular single-moving-part platen that carries three 3-phase permanent-magnet linear-levitation-motor armatures. The three planar levitation motors not only generate the vertical force to levitate the triangular platen but also control the platen's position in the horizontal plane; all 6-DOF motions are controlled by magnetic forces alone. The positioner moves over a Halbach magnet matrix, using three sets of two-axis Hall-effect sensors to measure the planar motion and three Nanogage laser distance sensors for the vertical motion. However, the Hall-effect sensors and the Nanogage laser distance sensors provide only displacement measurements for the six axes. Since full-state feedback is not available, I designed two Linear Quadratic Gaussian (LQG) multivariable controllers using a recursive discrete-time observer. A discrete hybrid H2/H∞ filter is implemented to obtain optimal estimates of position and orientation, as well as estimates of velocity and angular velocity, for all six axes. In addition, the signals measured by the Hall-effect sensors were analyzed, and several digital filters were tested to optimize the sensor readings and obtain the best possible estimates. One of the multivariable controllers closes the control loop for the three planar DOFs, and the other closes the loop for the vertical motion, both at a sampling frequency of 800 Hz. Experimental results show a position resolution of 1.5 micrometers with position noise of 0.545 micrometers rms in the x- and y-directions, and a resolution of less than 110 nm with position noise of 49.3 nm rms in z.
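The thesis recovers velocity and angular-velocity estimates from displacement-only sensing with a recursive discrete-time observer and a hybrid H2/H∞ filter; the sketch below illustrates the same estimation structure with a plain discrete Kalman-style observer on a single axis. The constant-velocity model and the noise levels q and r are assumptions for illustration, not the thesis's filter design.

```python
import numpy as np

def constant_velocity_model(dt, q=1e-4, r=1e-6):
    """Discrete constant-velocity model for one axis; the state is [position, velocity]
    and only the displacement is measured. q and r are assumed noise levels."""
    A = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
    C = np.array([[1.0, 0.0]])              # displacement-only measurement
    Q = q * np.eye(2)                        # process noise covariance (assumed)
    R = np.array([[r]])                      # measurement noise covariance (assumed)
    return A, C, Q, R

def observer_step(x, P, z, A, C, Q, R):
    """One predict/update cycle of a discrete Kalman-style observer,
    returning updated estimates of position and velocity with their covariance."""
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    K = P_pred @ C.T @ np.linalg.inv(C @ P_pred @ C.T + R)   # observer gain
    x_new = x_pred + K @ (np.atleast_1d(z) - C @ x_pred)
    P_new = (np.eye(2) - K @ C) @ P_pred
    return x_new, P_new
```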
990

A Prescription for Partial Synchrony

Sastry, Srikanth 2011 May 1900 (has links)
Algorithms in message-passing distributed systems often require partial synchrony to tolerate crash failures. Informally, partial synchrony refers to systems in which timing bounds on communication and computation may exist, but knowledge of such bounds is limited. Traditionally, the foundation for the theory of partial synchrony has been real time: a time base measured by counting events external to the system, like the vibrations of cesium atoms or piezoelectric crystals. Unfortunately, algorithms that are correct relative to many real-time-based models of partial synchrony may not behave correctly in empirical distributed systems. For example, a popular set of theoretical models, which we call M_*, assumes (eventual) upper bounds on message delay and relative process speeds, regardless of message size and absolute process speeds. Empirical systems with bounded channel capacity and bandwidth cannot realize such assumptions, either natively or through algorithmic constructions. Consequently, empirical deployment of the many M_*-based algorithms risks anomalous behavior. As a result, we argue that real time is the wrong basis for such a theory. Instead, the appropriate foundation for partial synchrony is fairness: a time base measured by counting events internal to the system, like the steps executed by the processes. By way of example, we redefine the M_* models with fairness-based bounds and provide algorithmic techniques to implement fairness-based M_* models on a significant subset of empirical systems. The proposed techniques use failure detectors (system services that provide hints about process crashes) as intermediaries that preserve the fairness constraints native to empirical systems. In effect, algorithms that are correct in M_* models are now proved correct in such empirical systems as well. Demonstrating our results requires solving three open problems: (1) we propose the first unified mathematical framework, based on Timed I/O Automata, to specify empirical systems, partially synchronous systems, and the algorithms that execute within them; (2) we show that the crash tolerance capabilities of popular distributed systems can be denominated exclusively through fairness constraints; and (3) we specify exemplar system models that identify the set of weakest system models able to implement popular failure detectors.
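For readers unfamiliar with the failure detector abstraction mentioned above, the sketch below shows a generic heartbeat-based detector of the conventional real-time flavor (timeouts measured on a local clock). It only illustrates the abstraction's interface (suspect vs. trust) and is not the dissertation's fairness-preserving construction; the class name, timeout, and backoff values are assumptions.

```python
import time

class HeartbeatFailureDetector:
    """Generic heartbeat-based failure detector sketch: a process is suspected when no
    heartbeat has arrived within its current timeout, and the timeout is relaxed after
    each false suspicion, so suspicions of correct processes eventually stop."""

    def __init__(self, processes, initial_timeout=1.0, backoff=0.5):
        now = time.monotonic()
        self.last_seen = {p: now for p in processes}
        self.timeout = {p: initial_timeout for p in processes}
        self.backoff = backoff

    def on_heartbeat(self, p):
        """Record a heartbeat; if p was wrongly suspected, grow its timeout."""
        if self.suspects(p):
            self.timeout[p] += self.backoff
        self.last_seen[p] = time.monotonic()

    def suspects(self, p):
        """Hint (possibly wrong) that process p has crashed."""
        return time.monotonic() - self.last_seen[p] > self.timeout[p]
```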
