231

User Importance Modelling in Social Information Systems: An Interaction Based Approach

Aggarwal, Anupam December 2009 (has links)
The past few years have seen the rapid rise of all things “social” on the web, from the growth of online social networks like Facebook, to real-time communication services like Twitter, to user-contributed content sites like Flickr and YouTube, to content aggregators like Digg. Beyond these popular Web 2.0 successes, the emergence of Social Information Systems is promising to fundamentally transform what information we encounter and digest, how businesses market and engage with their customers, how universities educate and train a new generation of researchers, how the government investigates terror networks, and even how political regimes interact with their citizenry. Users have moved from being passive consumers of information (via querying or browsing) to becoming active participants in the creation of data and knowledge artifacts, actively sorting, ranking, and annotating other users and artifacts. This fundamental shift to social systems places new demands on providing dependable capabilities for knowing whom to trust and what information to trust, given the open and unregulated nature of these systems. The emergence of large-scale user participation in Social Information Systems suggests the need for the development of user-centric approaches to information quality. As a step in this direction, this research proposes an interaction-based approach for modeling the notion of user importance. The interaction-based model is centered around the uniquely social aspects of these systems, by treating who communicates with whom (an interaction) as a core building block in evaluating user importance. We first study the interaction characteristics of Twitter, one of the most buzzworthy recent Social Web successes, examining the usage statistics, growth patterns, and user interaction behavior of over 2 million participants on Twitter. We believe this is the first large-scale study of dynamic interactions on a real-world Social Information System. Based on the analysis of the interaction structure of Twitter, the second contribution of this thesis research is an exploration of approaches for measuring user importance. As part of this exploration, we study several different approaches that build on the inherent interaction-based framework of Social Information Systems. We explore this model through an experimental study over an interaction graph consisting of 800,000 nodes and about 1.9 million interaction edges. The user importance modeling approaches that we present can be applied to any Social Information System in which interactions between users can be monitored.
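Editor's note: the abstract does not specify the scoring algorithm itself. As a rough illustration of how an interaction-based importance measure can be computed, the minimal Python sketch below runs a PageRank-style iteration over a directed interaction graph (who communicates with whom). The function names and parameters here are hypothetical, not the thesis's actual method.

from collections import defaultdict

def interaction_importance(interactions, damping=0.85, iterations=50):
    """Rank users by a PageRank-style score over a directed interaction graph.

    `interactions` is an iterable of (sender, receiver) pairs, e.g. one pair per
    observed @-reply. This is only an illustrative importance measure; the thesis
    explores several interaction-based models.
    """
    out_links = defaultdict(set)
    users = set()
    for sender, receiver in interactions:
        out_links[sender].add(receiver)
        users.update((sender, receiver))

    n = len(users)
    score = {u: 1.0 / n for u in users}
    for _ in range(iterations):
        new_score = {u: (1.0 - damping) / n for u in users}
        for sender, receivers in out_links.items():
            share = damping * score[sender] / len(receivers)
            for receiver in receivers:
                new_score[receiver] += share  # importance flows along interactions
        # Mass held by users who never interact outward is spread uniformly.
        dangling = damping * sum(score[u] for u in users if u not in out_links)
        for u in users:
            new_score[u] += dangling / n
        score = new_score
    return score

# Example: C is addressed by both A and B, so C ranks highest.
print(interaction_importance([("A", "B"), ("A", "C"), ("B", "C")]))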
232

Effects of Various Random Sources on Surface-Generated Ambient Noise

Shih, Guo-Fong 02 August 2004 (has links)
Ambient noise generated by surface random processes is the primary contribution to the noise-field energy in the intermediate frequency band, and thus is important in many applications of underwater sound. In this study, the noise field is analyzed with respect to the effects of the random source spectrum, the waveguide structure of the water column, and seabed stratification upon the noise-field intensity as well as its spatial correlation. Based upon a noise-generation model due to continuous random sources, incorporating several analytical models for seabed stratification, a formulation is derived to facilitate the numerical implementation, and a variety of results are generated and analyzed. This study considers the noise field generated by surface random processes in an oceanic environment with a sediment layer possessing a continuously varying density and sound-speed profile. This model closely resembles the oceanic waveguide environment and therefore enables the simulation of surface noise generation. Many results of the noise field were generated, including the noise intensity distribution and the vertical and horizontal correlations. It is demonstrated that the noise intensity may be affected by the stratification mainly through the continuous spectrum, in that the continuous spectrum is as important as the normal modes in the present analysis. Moreover, the results for the correlations show that the noise field in the horizontal direction becomes more coherent when the noise sources are more correlated, while in the vertical direction the trend reverses. The horizontal correlations of the noise field due to surface random sources with a non-isotropic power spectrum, such as the non-isotropic Gaussian and Pierson-Moskowitz spectra, were generated and analyzed.
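Editor's note: the Pierson-Moskowitz source spectrum named above has a standard closed form for a fully developed sea. The short sketch below evaluates it; the wind speed and frequencies are chosen purely for illustration, and the thesis's actual source-spectrum parameterization is not stated in the abstract.

import math

def pierson_moskowitz(omega, wind_speed):
    """Pierson-Moskowitz frequency spectrum S(omega) [m^2 s] for a fully developed
    sea; `wind_speed` is the wind speed at 19.5 m height [m/s]."""
    g = 9.81                    # gravitational acceleration [m/s^2]
    alpha, beta = 8.1e-3, 0.74  # empirical Pierson-Moskowitz constants
    return (alpha * g ** 2 / omega ** 5) * math.exp(-beta * (g / (wind_speed * omega)) ** 4)

# Illustrative values only: spectrum levels at a few angular frequencies for a 10 m/s wind.
for w in (0.7, 0.9, 1.2):
    print(f"omega = {w:.1f} rad/s -> S = {pierson_moskowitz(w, 10.0):.3g} m^2 s")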
233

Design and Implementation of a High-Performance Memory Generator

Lee, Wan-Ping 18 August 2004 (has links)
The SRAM memory generator in this thesis is divided into four parts: row decoder, storage cell, column decoder, and sense amplifier & write controller. The row decoder is designed using pass-transistor logic, offering better area and regularity compared with conventional NAND-based decoders. Two different column decoders, a tree structure and a NOR-based predecoder, are provided in the current version. Although only SRAM is implemented in this thesis, the memory generator platform is complete with all the necessary models required in the embedded design. In the future, other memories, such as cache, shift register, FIFO, stacks, ROM, register files, and content addressable memory, can be integrated into this memory generator platform.
234

THE OPTICAL ALIGNMENT OF A PHASE KEY IN RANDOM PHASE ENCODED VOLUME HOLOGRAPHIC STORAGE SYSTEM BY USING A HOLOGRAPHIC CORRELATOR

Kao, Hung-Jei 26 June 2006 (has links)
Phase key, which uses optical encoding techniques for system security, plays an important role in optical storage, optical communication, and optical display. It employs a random phase generator with a volume hologram for optical encoding. The advantages of using phase keys for optical communication are that (1) they are hard to duplicate and (2) they require sensitive alignment to decode the desired signal. Thus, phase keys ensure the security of the optical system. However, the accompanying challenge of using a phase key is the difficulty of alignment by users. In this paper, we propose a method for optical alignment of the phase key in a random phase encoded volume holographic storage system. In this method, a holographic correlator is applied to help the optical alignment of the phase key. It has been shown that the desired signal from the random phase encoded volume holographic storage system can be retrieved easily with high security.
235

The study of the phase transition from first-order to second-order in the two dimensional Potts model due to random applied fields

Huang, Shih-Yuan 17 July 2003 (has links)
In this paper, we study the nature of the phase transition of the two-dimensional six-state Potts model under an external random magnetic field. The six-state Potts model exhibits a temperature-dependent first-order phase transition. When the external random field is applied, the nature of the transition can be altered from first-order to second-order. By employing the Monte Carlo simulation method, we inspected the energy histogram and the Binder parameter of the six-state Potts model under the external random magnetic field. According to our analyses, the evidence reveals that the phase transition does not change until the external magnetic field is greater than 0.02.
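Editor's note: the abstract does not give the simulation details. The sketch below is a minimal Metropolis Monte Carlo run of a two-dimensional six-state Potts model with a quenched random field that favors one randomly chosen state per site; lattice size, temperature, field strength, and sweep count are illustrative only, not the values used in the thesis.

import math
import random

def potts_random_field_energy(L=16, q=6, T=0.8, h=0.02, sweeps=200, seed=1):
    """Metropolis sampling of a 2D q-state Potts model (coupling J = 1) with a
    quenched random field of strength h favoring one random state per site.
    Returns the final energy per site; all parameters are illustrative."""
    rng = random.Random(seed)
    spins = [[rng.randrange(q) for _ in range(L)] for _ in range(L)]
    preferred = [[rng.randrange(q) for _ in range(L)] for _ in range(L)]  # quenched disorder

    def local_energy(i, j, s):
        # Energy of site (i, j) in state s: Potts coupling to its four neighbors
        # plus the random-field term (periodic boundary conditions).
        e = 0.0
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if spins[(i + di) % L][(j + dj) % L] == s:
                e -= 1.0
        if preferred[i][j] == s:
            e -= h
        return e

    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            old, new = spins[i][j], rng.randrange(q)
            dE = local_energy(i, j, new) - local_energy(i, j, old)
            if dE <= 0.0 or rng.random() < math.exp(-dE / T):
                spins[i][j] = new

    total = 0.0
    for i in range(L):
        for j in range(L):
            s = spins[i][j]
            if spins[(i + 1) % L][j] == s:  # count each bond once (right and down)
                total -= 1.0
            if spins[i][(j + 1) % L] == s:
                total -= 1.0
            if preferred[i][j] == s:
                total -= h
    return total / (L * L)

print("energy per site:", potts_random_field_energy())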
236

Analysis of beacon triangulation in random graphs

Kakarlapudi, Geetha 17 February 2005 (has links)
Our research focuses on the problem of finding nearby peers in the Internet. We focus on one particular approach, Beacon Triangulation, which is widely used to solve the peer-finding problem. Beacon Triangulation is based on the relative distances of nodes to some special nodes called beacons. The scheme gives an error when a new node that wishes to join the network has the same relative distance to two or more nodes. One of the reasons for the error is that two or more nodes have the same distance vectors. As a part of our research work, we derive the conditions to ensure the uniqueness of distance vectors in any network, given the shortest path distribution of nodes in that network. We verify our analytical results for G(n, p) graphs and the Internet. We also derive other conditions under which the error in the Beacon Triangulation scheme reduces to zero. We compare the Beacon Triangulation scheme to another well-known distance estimation scheme known as Global Network Positioning (GNP).
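Editor's note: to make the distance-vector idea concrete, the sketch below computes each node's hop-count vector to a set of beacons in a G(n, p) graph and reports how many nodes share a vector with another node, which is exactly the ambiguity that causes the Beacon Triangulation error. Graph size, edge probability, and beacon choice are arbitrary illustrations, not values from the thesis.

import random
from collections import deque

def bfs_distances(adj, source):
    """Hop-count distances from `source` to every reachable node in an undirected graph."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def beacon_vectors(adj, beacons):
    """Distance vector (hop counts to each beacon) for every node; nodes sharing a
    vector are exactly the ones Beacon Triangulation cannot tell apart."""
    per_beacon = [bfs_distances(adj, b) for b in beacons]
    return {u: tuple(d.get(u, float("inf")) for d in per_beacon) for u in adj}

def gnp_graph(n, p, seed=0):
    """Erdos-Renyi G(n, p) random graph as an adjacency dict."""
    rng = random.Random(seed)
    adj = {u: set() for u in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].add(v)
                adj[v].add(u)
    return adj

# Fraction of nodes whose distance vector is not unique, for 3 beacons in G(200, 0.05).
adj = gnp_graph(200, 0.05)
vectors = beacon_vectors(adj, beacons=[0, 1, 2])
counts = {}
for vec in vectors.values():
    counts[vec] = counts.get(vec, 0) + 1
ambiguous = sum(c for c in counts.values() if c > 1)
print(f"{ambiguous / len(vectors):.1%} of nodes share a distance vector with another node")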
237

Capacity dynamics of feed-forward, flow-matching networks exposed to random disruptions

Savachkin, Aliaksei 30 October 2006 (has links)
While lean manufacturing has greatly improved the efficiency of production operations, it has left US enterprises in an increasingly risky environment. Causes of manufacturing disruptions continue to multiply, and today, seemingly minor disruptions can cause cascading sequences of capacity losses. Historically, enterprises have lacked viable tools for addressing operational volatility. As a result, each year US companies forfeit billions of dollars to unpredictable capacity disruptions and insurance premiums. In this dissertation we develop a number of stochastic models that capture the dynamics of capacity disruptions in complex multi-tier flow-matching feed-forward networks (FFN). In particular, we relax basic structural assumptions of FFN, introduce random propagation times, study the impact of inventory buffers on propagation times, and make initial efforts to model random network topology. These stochastic models are central to future methodologies supporting strategic risk management and enterprise network design.
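Editor's note: as a toy illustration of how inventory buffers delay disruption propagation, the sketch below models a serial (single-path) feed-forward chain, Monte Carlo-samples upstream outage durations, and reports the expected outage time seen at each downstream tier. This is only a hedged sketch of the general idea; the dissertation's FFN models, propagation-time distributions, and network topologies are far richer than this.

import random

def downstream_outages(duration, buffer_coverage):
    """Outage time experienced at each downstream tier of a serial chain when an
    upstream disruption of length `duration` occurs; `buffer_coverage[k]` is the
    inventory coverage time (same time units) held in front of tier k."""
    outages = []
    remaining = duration
    for coverage in buffer_coverage:
        remaining = max(0.0, remaining - coverage)  # the buffer absorbs part of the outage
        outages.append(remaining)
    return outages

def expected_downstream_outages(buffer_coverage, mean_duration=5.0, trials=20000, seed=0):
    """Monte Carlo estimate of the expected outage per tier, assuming (purely as an
    illustration) exponentially distributed upstream disruption durations."""
    rng = random.Random(seed)
    totals = [0.0] * len(buffer_coverage)
    for _ in range(trials):
        d = rng.expovariate(1.0 / mean_duration)
        for k, outage in enumerate(downstream_outages(d, buffer_coverage)):
            totals[k] += outage
    return [t / trials for t in totals]

# Three downstream tiers holding 1, 2, and 4 time units of inventory coverage.
print(expected_downstream_outages([1.0, 2.0, 4.0]))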
238

Effects of Waveguide Properties on Surface-Generated Ambient Noise: Simulation and Analysis

Lin, Yi-wei 29 August 2008 (has links)
Ambient noise generated by surface random processes is the primary contribution to the noise-field energy in the intermediate frequency band, and thus is important in many applications of underwater sound. In this study, the noise field is analyzed with respect to the effects of the random source spectrum, the waveguide structure of the water column, and seabed stratification upon the noise-field intensity as well as its spatial correlation. Based upon a noise-generation model due to continuous random sources, incorporating several analytical models for seabed stratification, a formulation is derived to facilitate the numerical implementation, and a variety of results are generated and analyzed. This study considers the noise field generated by surface waves in an oceanic environment with a sediment layer possessing a constant density and sound-speed profile. This model closely resembles the oceanic waveguide environment and therefore enables the simulation of surface noise generation. Many results of the noise field were generated, including the noise intensity distribution and the vertical and horizontal correlations. It is demonstrated that the noise intensity may be affected by the stratification mainly through the continuous spectrum, in that the continuous spectrum is as important as the normal modes in the present analysis. Moreover, the results for the correlations show that the noise field in the horizontal direction becomes more coherent when the noise sources are more correlated, while in the vertical direction the trend reverses. The horizontal correlations of the noise field due to surface random sources with a non-isotropic power spectrum, such as the non-isotropic Gaussian, were generated and analyzed.
239

Design and Implementation of the OFDM Demodulator for DVB-T and the Random Number Generator

Huang, Jian-ming 15 October 2008 (has links)
Digital Video Broadcasting - Terrestrial (DVB-T) is one of the major standards for the fixed reception of digital television services, and the orthogonal frequency division multiplexing (OFDM) demodulator is a critical module of DVB-T receivers. With the remarkable advance of VLSI (very large scale integration) circuits, the SOC (system-on-a-chip) integration of the DVB-T receiver is an inevitable evolution. Considering the integration of the mixed-signal circuits, the main issues to address are the frequency synthesis and the calibration of the mixed-signal circuits. Hence, this thesis proposes an OFDM demodulator and discusses the design issues that emerge from the SOC integration. The proposed OFDM demodulator is composed of four blocks: time synchronization, frequency synchronization, 2K/8K mode FFT (fast Fourier transform), and channel estimation. The demodulator utilizes the pilot signals embedded in OFDM symbols to estimate the frequency offset and the channel response. In addition, the demodulator uses the cyclic prefix of an OFDM symbol to find the correct starting position of the symbol, so that the payload data of an OFDM symbol can be passed to the 2K/8K FFT for further processing. To meet the demand for a low-noise frequency signal, we propose a direct digital frequency synthesizer (DDFS) based on the quadruple angle approximation. According to the proposed trigonometric 2nd-order quadruple angle approximation, the DDFS can produce a high-resolution, low-phase-noise digital sinusoid without any ROM (read only memory). Digital calibration is an effective scheme to protect ADCs (analog-to-digital converters) from the interference of noise. A random number generator (RNG) is an essential component of the calibration circuitry. However, the realization of the RNG is an important but long-ignored issue. This thesis proposes an RNG based on a chaotic system wherein the coefficients of the system are dynamically changed to attain an ideal random bit stream with a flat power spectral density.
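Editor's note: the abstract describes the RNG only as a chaotic system whose coefficients are dynamically changed. The sketch below illustrates the general idea in software with a logistic map whose parameter is dithered over time; the map, threshold, and dithering rule are assumptions for illustration, not the circuit proposed in the thesis.

def chaotic_bits(n_bits, x0=0.123456, r0=3.99):
    """Generate a bit stream from a logistic map x -> r*x*(1-x) whose parameter r
    is perturbed over time, loosely mimicking a chaotic RNG with dynamically
    changed coefficients. Software sketch only; the thesis targets a hardware RNG."""
    bits = []
    x, r = x0, r0
    for i in range(n_bits):
        x = r * x * (1.0 - x)              # chaotic logistic-map iteration
        bits.append(1 if x >= 0.5 else 0)  # threshold to extract one bit
        # Slowly dither the coefficient while keeping the map in its chaotic regime.
        r = 3.9 + 0.0999 * ((i * 0.001 + x) % 1.0)
    return bits

stream = chaotic_bits(10000)
print("ones fraction:", sum(stream) / len(stream))  # should be near 0.5 for a balanced stream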
240

An energy efficient cache design using spin torque transfer (STT) RAM

Rasquinha, Mitchelle 23 August 2011 (has links)
The advent of many-core architectures has coincided with the energy- and power-limited design of modern processors. Projections for main memory clearly show a widening of the processor-memory gap. Increasing cache capacity to help reduce this gap will lead to increased energy and area usage and, given the small growth in die size, will impede the performance scaling that has accompanied Moore's Law to date. Among the dominant sources of energy consumption is the on-chip memory hierarchy, specifically the L2 cache and the Last Level Cache (LLC). This work explores the use of a novel non-volatile memory technology, Spin Torque Transfer RAM (STT RAM), for the design of the L2/LLC caches. While STT RAM is a promising memory technology, it has some limitations, particularly in terms of write energy and write latencies. The main objective of this thesis is to use a novel cell design for a non-volatile 1T1MTJ cell and demonstrate its use at the L2 and LLC cache levels with architectural optimizations that maximize energy reduction. The proposed cache hierarchy dissipates significantly less energy (both leakage and dynamic) and uses less area in comparison to conventional SRAM-based cache designs.
