351

Velocity Fluctuations and Extreme Events in Microscopic Traffic Data

Piepel, Moritz 06 December 2022 (has links)
Vehicle velocity distributions are of utmost relevance for the efficiency, safety, and sustainability of road traffic. Yet, due to technical limitations, they are often empirically analyzed using spatiotemporal averages. Here, we instead study a novel set of microscopic traffic data from Dresden comprising 346 million data points with a resolution of one vehicle from 145 detector sites, with a particular focus on extreme events and distribution tails. By fitting q-exponential and Generalized Extreme Value distributions to the right flank of the empirical velocity distributions, we establish that their tails universally exhibit a power-law behavior with similar decay exponents. We also find that q-exponentials are best suited to model the vast extent of speed limit violations in the data. Furthermore, combining velocity and time headway distributions, we obtain estimates for free flow velocities that always exceed average velocities and sometimes even significantly exceed speed limits. Likewise, congestion effects are found to play a very minor, almost negligible role in traffic flow at the detector sites. These results provide insights into the current state of traffic in Dresden, hinting toward potentially necessary policy amendments regarding road design, speed limits, and speeding prosecution. They also reveal the potential and limitations of the data set at hand and thereby lay the groundwork for further, more detailed traffic analyses.
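The abstract does not spell out the fitting procedure; as a rough illustration, the Python sketch below fits a q-exponential form to the empirical survival function of speeds above a threshold, on synthetic data and with an assumed threshold and starting values. For q > 1 this form decays as a power law in the tail.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative sketch only (synthetic data; not the thesis's actual procedure):
# fit a q-exponential form to the empirical survival function of the right flank.
def q_exponential(x, q, x0):
    return (1.0 + (q - 1.0) * x / x0) ** (1.0 / (1.0 - q))

rng = np.random.default_rng(0)
speeds = 30.0 + 30.0 * rng.pareto(4.0, 50_000)   # synthetic stand-in for vehicle speeds, km/h

threshold = 50.0                                 # e.g. an urban speed limit (assumed)
tail = np.sort(speeds[speeds > threshold]) - threshold
surv = 1.0 - np.arange(1, tail.size + 1) / (tail.size + 1)   # empirical survival function

(q_hat, x0_hat), _ = curve_fit(q_exponential, tail, surv, p0=(1.3, 10.0),
                               bounds=([1.0001, 0.1], [3.0, 1000.0]))
print(f"fitted q = {q_hat:.3f}, scale = {x0_hat:.2f} km/h")
```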
352

Dependence of Set, Reset and Breakdown Voltages of a MIM Resistive Memory Device on the Input Voltage Waveform

Ghosh, Gargi 27 May 2015 (has links)
Owing to its excellent scaling potential, low power consumption, high switching speed, and good retention and endurance properties, Resistive Random Access Memory (RRAM) is one of the prime candidates to supplant current Nonvolatile Memory (NVM) based on the floating gate (FG) MOSFET transistor, which is at the end of its scaling capability. The RRAM technology comprises two subcategories: 1) the resistive phase change memory (PCM), which has been very recently deployed commercially, and 2) the filamentary conductive bridge RAM (CBRAM), which holds the promise of even better scaling potential, lower power consumption, and faster access times. This thesis focuses on several aspects of the CBRAM technology. CBRAM devices rely on nanoionic transport and chemo-physical reactions to create filamentary conductive paths across a dielectric sandwiched between two metal electrodes. These nano-size filaments can be formed and ruptured reliably and repeatedly by application of appropriate voltages. Although there exists a large body of literature on this topic, many aspects of the CBRAM mechanisms are still poorly understood. In the next paragraph, the aspects of CBRAM studied in this thesis are spelled out in more detail. The CBRAM cell is not only an attractive candidate for a memory cell but also a good implementation of a new circuit element, called the memristor, postulated by Leon Chua. Basically, a memristor is a resistor with a memory. Such an element holds the promise of mimicking the switching of neurons and synapses in the human brain, which is much more efficient than the von Neumann computer architecture with its current CMOS logic technology. Memristive circuitry may lead to much more powerful neural computers in the future. In the course of the research undertaken in this thesis, many memristive properties of the resistive cells have been found and used in models to describe the behavior of the resistive switching devices. The research performed in this study also has an immediate commercial application. Currently, the semiconductor industry is faced with the so-called latency scaling dilemma. In the past, the bottleneck for signal propagation was the time delay of the transistor. Today, transistors have become so fast that the bottleneck for signal propagation is now the RC time delay of the interconnecting metal lines. Scaling drives both the resistance and the parasitic capacitance of the metal lines to very high values. In this context, one observes that resistive switching memory does not require a Si substrate. It is therefore an excellent candidate for implementation as an on-chip memory above the logic circuits in the CMOS back-end, thus making the signal paths between logic and memory extremely short. In the framework of a Semiconductor Research Corporation (SRC) project with Intel Corporation, this thesis investigated the breakdown and resistive switching properties of currently deployed low-k interlayer dielectrics to understand the mechanisms and potential of different material choices for realizing an RRAM memory to be implemented in the back-end of a CMOS process flow. / Master of Science
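To make the "resistor with a memory" idea concrete, here is a toy memristor sketch in Python in the spirit of the linear ion-drift model; all parameter values are illustrative and this is not the CBRAM device model developed in the thesis.

```python
import numpy as np

# Toy "resistor with a memory": the resistance depends on an internal state w in
# [0, 1] that integrates the applied current (linear ion-drift style model).
# Parameter values are illustrative only, not fitted to any real device.
R_on, R_off = 100.0, 16_000.0       # limiting resistances, ohms
k = 1e4                             # lumped drift coefficient, chosen so the state visibly switches
dt = 1e-5                           # time step, s
t = np.arange(0.0, 0.1, dt)
v = np.sin(2.0 * np.pi * 50.0 * t)  # 1 V, 50 Hz sinusoidal drive

w, current = 0.1, []
for vk in v:
    R = R_on * w + R_off * (1.0 - w)               # state-dependent resistance
    i = vk / R
    w = min(max(w + k * R_on * i * dt, 0.0), 1.0)  # state drifts with the charge passed
    current.append(i)

# Plotting current against v traces the pinched hysteresis loop that is the
# signature of memristive behavior.
```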
353

Faraday Rotation Distributions from Stellar Magnetism in Wind-Blown Bubbles.

Ignace, Richard, Pingel, N. 01 March 2013 (has links) (PDF)
Faraday rotation is a valuable tool for detecting magnetic fields. Here, the technique is considered in relation to wind-blown bubbles. In the context of spherical winds with azimuthal or split monopole stellar magnetic field geometries, we derive maps of the distribution of position angle (P.A.) rotation of linearly polarized radiation across projected bubbles. We show that the morphology of maps for split monopole fields are distinct from those produced by the toroidal field topology; however, the toroidal case is the one most likely to be detectable because of its slower decline in field strength with distance from the star. We also consider the important case of a bubble with a spherical sub-volume that is field-free to approximate crudely a “swept-up” wind interaction between a fast wind (or possibly a supernova ejecta shell) overtaking a slower magnetized wind from a prior state of stellar evolution. With an azimuthal field, the resultant P.A. map displays two arc-like features of opposite rotation measure, similar to observations of the supernova remnant G296.5+10.0. We illustrate how P.A. maps can be used to disentangle Faraday rotation contributions made by the interstellar medium versus the bubble. Although our models involve simplifying assumptions, their consideration leads to a number of general robust conclusions for use in the analysis of radio mapping data sets.
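For a sense of the quantities involved, the Python sketch below evaluates the Faraday rotation integral RM = 0.81 ∫ n_e B_∥ dl along a single sight line for assumed density and field profiles and converts it to a position angle rotation at a given wavelength; the profiles are placeholders, not the wind models of the paper.

```python
import numpy as np

# Single-sightline Faraday rotation: RM [rad m^-2] = 0.81 * integral of
# n_e [cm^-3] * B_parallel [microgauss] dl [pc]; PA rotation = RM * lambda^2.
# The density and field profiles below are assumptions for illustration only.
l = np.linspace(0.0, 2.0, 2000)      # path length through the bubble, pc
dl = l[1] - l[0]
n_e = 1.0 * np.exp(-l)               # assumed electron density, cm^-3
B_par = 5.0 / (1.0 + l) ** 2         # assumed line-of-sight field, microgauss

rm = 0.81 * np.sum(n_e * B_par) * dl # rotation measure, rad m^-2
lam = 0.06                           # observing wavelength, m (6 cm)
print(f"RM = {rm:.2f} rad/m^2, PA rotation = {np.degrees(rm * lam ** 2):.3f} deg")
```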
354

An Analytical Model for On-Chip Interconnects in Multimedia Embedded Systems

Wu, Y., Min, Geyong, Zhu, D., Yang, L.T. January 2013 (has links)
The traffic pattern has a significant impact on the performance of a network-on-chip. Many recent studies have shown that multimedia applications can be supported in on-chip interconnects. Driven by the motivation of evaluating on-chip interconnects in multimedia embedded systems, a new analytical model is proposed to investigate the performance of the fat-tree based on-chip interconnection network under bursty multimedia traffic and nonuniform message destinations. Extensive simulation experiments are conducted to validate the accuracy of the model, which is then adopted as a cost-efficient tool to investigate the effects of bursty multimedia traffic with nonuniform destinations on the network performance.
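The abstract does not give the model's equations; as a generic illustration of why burstiness matters (not the paper's fat-tree model), the Python sketch below uses Kingman's G/G/1 approximation to show how higher arrival variability inflates queueing delay at a single router port. The numeric values are assumptions.

```python
# Generic single-queue illustration (not the paper's analytical model): Kingman's
# G/G/1 approximation shows how arrival burstiness inflates the mean waiting time.
def kingman_wait(util, ca2, cs2, mean_service):
    """W ~ rho/(1 - rho) * (Ca^2 + Cs^2)/2 * E[S]."""
    return (util / (1.0 - util)) * ((ca2 + cs2) / 2.0) * mean_service

mean_service = 4.0                  # mean service time per message, in cycles (assumed)
for ca2 in (1.0, 4.0, 16.0):        # Ca^2 = 1 ~ Poisson arrivals; larger = burstier
    w = kingman_wait(util=0.7, ca2=ca2, cs2=1.0, mean_service=mean_service)
    print(f"Ca^2 = {ca2:>4}: mean wait ~ {w:.1f} cycles")
```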
355

A Frequency Selective Bolometer Camera for Measuring Millimeter Spectral Energy Distributions

Logan, Daniel William 01 May 2009 (has links)
Bolometers are the most sensitive detectors for measuring millimeter and submillimeter wavelength astrophysical signals. Cameras composed of arrays of bolometers have already made significant contributions to the field of astronomy. A challenge for bolometer cameras is obtaining observations at multiple wavelengths. Traditionally, observing in multiple bands requires a partial disassembly of the instrument to replace bandpass filters, a task which prevents immediate spectral interrogation of a source. More complex cameras have been constructed to observe in several bands using beam splitters and dichroic filters, but the added complexity leads to physically larger instruments with reduced efficiencies. The SPEctral Energy Distribution camera (SPEED) is a new type of bolometer camera designed to efficiently observe in multiple wavebands without the need for excess bandpass filters and beam splitters. SPEED is a ground-based millimeter-wave bolometer camera designed to observe at 2.1, 1.3, 1.1, and 0.85 mm simultaneously. SPEED makes use of a new type of bolometer, the frequency selective bolometer (FSB), to observe all of the wavebands within each of the camera's four pixels. FSBs incorporate frequency selective dipole surfaces as absorbing elements, allowing each detector to absorb a single, narrow band of radiation and pass all other radiation with low loss. Each FSB also contains a superconducting transition-edge sensor (TES) that acts as a sensitive thermistor for measuring the temperature of the FSB. This thesis describes the development of the SPEED camera and FSB detectors. The design of the detectors used in the instrument is described, as is the general optical performance of frequency selective dipole surfaces. Laboratory results on both the optical and thermal properties of millimeter-wave FSBs are also presented. The SPEED instrument and its components are highlighted, and the optical design of the optics which couple SPEED to the Heinrich Hertz Telescope is given. This thesis concludes with an introduction to the jiggle mapping data analysis of bolometer instruments like SPEED.
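For a rough sense of the thermal readout principle described above (a TES thermistor sensing the absorber temperature), the Python sketch below evaluates a single-pole bolometer thermal model; the power, conductance, and heat-capacity values are assumptions, not SPEED's detector parameters.

```python
# Minimal bolometer thermal model: absorbed power raises the absorber
# temperature above the bath by dT = P/G, and the detector responds with a
# thermal time constant tau = C/G. All values below are assumptions.
P_abs = 5e-12   # absorbed optical power, W (5 pW)
G = 50e-12      # thermal conductance to the bath, W/K (50 pW/K)
C = 1e-12       # heat capacity of the absorber, J/K (1 pJ/K)

delta_T = P_abs / G   # steady-state temperature rise read out by the TES
tau = C / G           # thermal time constant
print(f"dT = {delta_T * 1e3:.0f} mK, tau = {tau * 1e3:.0f} ms")
```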
356

Revisiting parts of the verb in Southern Nambikwara: Towards a definition of subjectivity as a grammatical category

Roosvall, Emilia January 2022 (has links)
This study investigates verb-final morphemes in Southern Nambikwara — a polysynthetic language spoken in the Mato Grosso region of southwestern Brazil. The verb-final morphemes -wa2 and -ɾa2/-la2 have previously been described as denoting an aspect distinction between imperfectivity and perfectivity (da Silva 2021; B. Kroeker 1982; M. Kroeker 2001; Lowe 1999). However, upon closer examination, this description appears flawed. By revisiting the data found in the previous literature as well as analyzing new first-hand data and the responses to a questionnaire, this study aims to describe the function and meaning of -wa2 and -ɾa2/-la2 with a focus on patterns of co-distribution with other verbal categories. The results suggest that -wa2 and -ɾa2/-la2 are markers of subjectivity and non-subjectivity, respectively. The present definition of subjectivity is based on Du Bois (2007) and Nuyts (2001), who emphasize that the distinctive factor between these categories is whether knowledge or evidence, and by extension the conclusion drawn from them, is exclusive to the speaker (or the speech act participants) or shared by a larger group. In addition, the shared or exclusive responsibility for an utterance as well as its sequential context are also of importance to the marking of subjectivity. In conclusion, the distribution of -wa2 and -ɾa2/-la2 varies with many grammatical categories, such as tense, person, mood/aspect, engagement, and polarity. The subjectivity marker -wa2 encodes exclusive knowledge and epistemic responsibility, while the non-subjectivity marker -ɾa2/-la2 denotes its counterpart, shared knowledge and shared epistemic responsibility.
357

Calculations of Neutron Emission in the Thermal Neutron Fission of U235

Brubaker, Calvin David 10 1900 (has links)
No abstract provided. / Thesis / Master of Science (MSc) / Scope and contents: The probability of fission as a function of primary fragment velocities has been obtained by removing the neutron emission and instrumental dispersions from the velocities determined by Stein with time-of-flight techniques for the thermal neutron fission of U235. Each velocity was increased by 0.69% to make the average kinetic energy per fission agree with the calorimetric value of 167.1 MeV. Excitation energy distributions were obtained by using the primary fragment masses given by Cameron and assuming that the most probable charge distribution for a given mass ratio is that which leads to the greatest energy release. Evaporation theory was used to determine the number of prompt neutrons emitted. When the excitation energy is divided equally between the fragments and a nuclear temperature of 0.59 MeV is used, the average number of neutrons emitted is 2.95 per fission.
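As a crude check on the evaporation-theory logic summarized above (not Brubaker's actual calculation), each evaporated neutron removes roughly its separation energy plus about 2T of kinetic energy, so the mean multiplicity is approximately the excitation energy divided by that cost. The separation energy and total excitation energy below are assumed values.

```python
# Crude evaporation estimate of the mean prompt-neutron multiplicity:
#   nu ~ E_excitation / (S_n + 2*T)
# S_n and E_exc_total are assumptions; T is the temperature quoted in the abstract.
S_n = 5.5           # representative neutron separation energy in a fragment, MeV (assumed)
T = 0.59            # nuclear temperature, MeV (from the abstract)
E_exc_total = 20.0  # assumed total fragment excitation energy per fission, MeV

nu = E_exc_total / (S_n + 2.0 * T)
print(f"estimated prompt neutrons per fission ~ {nu:.2f}")
```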
358

Wireless Network Dimensioning and Provisioning for Ultra-reliable Communication: Modeling and Analysis

Gomes Santos Goncalves, Andre Vinicius 28 November 2023 (has links)
A key distinction between today's and tomorrow's wireless networks is the appetite for reliability to enable emerging mission-critical services such as ultra-reliable low-latency communication (URLLC) and hyper-reliable low-latency communication (HRLLC), the staple mission-critical services in IMT-2020 (5G) and IMT-2030 (6G), for which reliable and resilient communication is a must. However, achieving ultra-reliable communication is challenging because of these services' stringent reliability and latency requirements and the stochastic nature of wireless networks. A natural way of increasing reliability and reducing latency is to provision additional network resources to compensate for uncertainty in wireless networks caused by fading, interference, mobility, and time-varying network load, among others. Thus, an important step to enable mission-critical services is to identify and quantify what it takes to support ultra-reliable communication in mobile networks -- a process often referred to as dimensioning. This dissertation focuses on resource dimensioning, notably spectrum, for ultra-reliable wireless communication. This dissertation proposes a set of methods for spectrum dimensioning based on concepts from risk analysis, extreme value theory, and meta distributions. These methods reveal that each "nine" in reliability (e.g., five-nines in 99.999%) roughly translates into an order of magnitude increase in the required bandwidth. In ultra-reliability regimes, the required bandwidth can be on the order of tens of gigahertz, far beyond what is typically available in today's networks, making it challenging to provision resources for ultra-reliable communication. Accordingly, this dissertation also investigates alternative approaches to provide resources to enable ultra-reliable communication services in mobile networks. In particular, this dissertation considers multi-operator network sharing and multi-connectivity as alternatives to make additional network resources available to enhance network reliability and proposes multi-operator connectivity sharing, which combines multi-operator network sharing with multi-connectivity. Our studies, based on simulations, real-world data analysis, and mathematical models, suggest that multi-operator connectivity sharing -- in which mobiles multi-connect to base stations of operators in a sharing arrangement -- can reduce the required bandwidth significantly because underlying operators tend to exhibit characteristics attractive to reliability, such as complementary coverage during periods of impaired connectivity, facilitating the support for ultra-reliable communication in future mobile networks. / Doctor of Philosophy / A key distinction between today's and tomorrow's wireless networks is the appetite for reliability to enable emerging mission-critical services in 5G and 6G, for which ultra-reliable communication is a must. However, achieving ultra-reliable communication is challenging because of these services' stringent reliability and latency requirements and the stochastic nature of wireless networks. Reliability often comes at the cost of additional network resources to compensate for uncertainty in wireless networks. Thus, an important step to enable ultra-reliable communication is to identify and quantify what it takes to support mission-critical services in mobile networks -- a process often denoted as dimensioning.
This dissertation focuses on spectrum dimensioning and proposes a set of methods to identify suitable spectrum bands and required bandwidth for ultra-reliable communication. These methods reveal that the spectrum needs for ultra-reliable communication can be beyond what is typically available in today's networks, making it challenging to provide adequate resources to support ultra-reliable communication services in mobile networks. Alternatively, we propose multi-operator connectivity sharing: mobiles simultaneously connect to multiple base stations of different operators. Our studies suggest that multi-operator connectivity sharing can reduce the spectrum needs in ultra-reliability regimes significantly, being an attractive alternative to enable ultra-reliable communication in future mobile networks.
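The order-of-magnitude-per-nine relationship can be reproduced with a back-of-the-envelope outage-capacity argument under Rayleigh fading, sketched below in Python; the target rate and mean SNR are assumptions, and this is not one of the dissertation's dimensioning methods.

```python
import numpy as np

# Back-of-the-envelope spectrum dimensioning under Rayleigh fading: the
# eps-outage SNR is mean_SNR * (-ln(1 - eps)) ~ mean_SNR * eps for small eps,
# so the bandwidth needed for a target rate grows roughly tenfold per "nine".
R_target = 10e6                 # target rate, bit/s (assumption)
snr_mean = 10 ** (10.0 / 10.0)  # mean SNR of 10 dB (assumption)

for nines in range(3, 8):
    eps = 10.0 ** (-nines)                  # outage probability (e.g. 1e-5 = five nines)
    snr_eps = -snr_mean * np.log1p(-eps)    # eps-outage SNR
    bandwidth = R_target / np.log2(1.0 + snr_eps)
    print(f"{nines} nines: ~{bandwidth / 1e9:6.2f} GHz")
```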
359

New Visualization Techniques for Multi-Dimensional Variables in Complex Physical Domains

Vickery, Rhonda J 13 December 2003 (has links)
This work presents the new Synthesized Cell Texture (SCT) algorithm for visualizing related multiple scalar value fields within the same 3D space. The SCT method is particularly well suited to scalar quantities that could be represented in the physical domain as size-fractionated particles, such as in the study of sedimentation, atmospheric aerosols, or precipitation. There are two components to this contribution. First, a Scaling and Distribution (SAD) algorithm provides a means of specifying a multi-scalar field in terms of a maximum cell resolution (or density of represented values). This information is used to scale the multi-scalar field values for each 3D cell to the maximum values found throughout the data set, and then to randomly distribute those values as particles varying in number, size, color, and opacity within a 2D cell slice. This approach facilitates viewing of closely spaced layers commonly found in sigma-coordinate grids. The SAD algorithm can be applied regardless of how the particles are rendered. The second contribution provides the Synthesized Cell Texture (SCT) algorithm to render the multi-scalar values. In this approach, a texture is synthesized from the location information computed by the SAD algorithm, which is then applied to each cell as a 2D slice within the volume. The SCT method trades off computation time (to synthesize the texture) and texture memory against the number of geometric primitives that must be sent through the graphics pipeline of the host system. Analysis results from a user study demonstrate the effectiveness of the algorithm as a browsing method for multiple related scalar fields. The interactive rendering performance of the SCT method is compared with two common basic particle representations: flat-shaded color-mapped OpenGL points and quadrilaterals. Frame rate statistics show the SCT method to be up to 44 times faster, depending on the volume to be displayed and the host system. The SCT method has been successfully applied to oceanographic sedimentation data, and can be applied to other problem domains as well. Future enhancements include the extension to time-varying data and parallelization of the texture synthesis component to reduce startup time.
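A minimal Python sketch of the scaling-and-distribution step as described above is given below; the cell values, data-set maxima, and cell resolution are made-up examples, and this is not the thesis's implementation.

```python
import numpy as np

# Sketch of the SAD step: scale one cell's values for several particle size
# classes against the data-set maxima, convert to particle counts up to the
# maximum cell resolution, and scatter them at random positions in the 2D slice.
rng = np.random.default_rng(1)

cell_values = np.array([0.80, 0.30, 0.05])   # one cell, three size classes (assumed)
dataset_max = np.array([1.00, 0.50, 0.10])   # maxima over the whole data set (assumed)
max_particles = 200                          # "maximum cell resolution" (assumed)

counts = np.rint(cell_values / dataset_max * max_particles).astype(int)
slices = [rng.uniform(0.0, 1.0, size=(n, 2)) for n in counts]  # (x, y) positions per class

for cls, pts in enumerate(slices):
    print(f"size class {cls}: {len(pts)} particles placed in the cell slice")
```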
360

Changes in Trajectories of Foraging Agents Under Spatial Learning

Mirmiran, Camille 28 November 2022 (has links)
The goal of this thesis is to identify differences and consistencies in the trajectories taken by foraging agents before and after they have learned the location of a target. The challenge is that these agents do not go directly towards the target after learning and keep a certain amount of randomness in their paths. We use different versions of discrete curvature and head angle as tools in this analysis. We also build models of foraging agents using stochastic processes with data-supported parameters.
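For concreteness, a small Python sketch of two such trajectory measures is given below, using standard definitions (the signed turning angle between consecutive steps and the Menger curvature of three consecutive points); these are assumed stand-ins, not necessarily the exact definitions used in the thesis.

```python
import numpy as np

def head_angles(path):
    """Signed turning (head) angle at each interior point of a 2D path."""
    d = np.diff(path, axis=0)
    angles = np.arctan2(d[:, 1], d[:, 0])
    return np.diff(np.unwrap(angles))

def menger_curvature(p1, p2, p3):
    """Discrete curvature of three consecutive points: 4 * area / (a * b * c)."""
    a = np.linalg.norm(p2 - p1)
    b = np.linalg.norm(p3 - p2)
    c = np.linalg.norm(p3 - p1)
    area2 = abs((p2[0] - p1[0]) * (p3[1] - p1[1]) - (p2[1] - p1[1]) * (p3[0] - p1[0]))
    return 2.0 * area2 / (a * b * c) if a * b * c > 0 else 0.0

path = np.array([[0.0, 0.0], [1.0, 0.2], [2.0, 0.1], [2.5, 1.0], [3.0, 2.2]])
print("turning angles:", head_angles(path))
print("curvatures:", [menger_curvature(*path[i:i + 3]) for i in range(len(path) - 2)])
```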
