351

Techniques For Low Power Motion Estimation In Video Encoders

Gupte, Ajit D 06 1900 (has links) (PDF)
This thesis looks at hardware algorithms that help reduce dynamic power dissipation in video encoder applications. The computational complexity of motion estimation and the data traffic between external memory and the video processing engine are the two main causes of large power dissipation in video encoders. While motion estimation may consume 50% to 70% of total video encoder power, the power dissipated in external memory such as DDR SDRAM can be of the order of 40% of total system power. Reducing power dissipation in video encoders is important for improving the battery life of mobile devices such as smartphones and digital camcorders. We propose hardware algorithms that extract only the important features in the video data, reducing the complexity of computation, communication and storage and thereby reducing average power dissipation. We apply this concept to design hardware algorithms that optimize motion estimation matching complexity, as well as reference frame storage and access from external memory. In addition, we develop techniques to reduce the searching complexity of motion estimation.

First, we explore a set of adaptive algorithms that reduce the average power dissipated by motion estimation. We propose that by taking macro-block level features of the video data into account, the average matching complexity of motion estimation, in terms of the number of computations in real-time hardwired video encoders, can be significantly reduced compared with traditional hardwired implementations, which are designed to handle the most demanding data sets. Features of the current macro-block, such as pixel variance and Hadamard transform coefficients, are analyzed and used to adapt the matching complexity. The macro-block is partitioned based on these features to obtain sub-block sums, which are used for matching operations. Thus, simple macro-blocks without many features can be matched with far fewer computations than macro-blocks with complex features, reducing average power dissipation.

Apart from optimizing the matching operation, optimizing the search operation is a powerful way to reduce motion estimation complexity. We propose novel search optimization techniques including (1) a center-biased search order and (2) skipping unlikely search positions, both applied in the context of real-time hardware implementation. The proposed search optimization techniques take into account, and are compatible with, the reference data access pattern from memory as required by the hardware algorithm. We demonstrate that the matching and searching optimization techniques together achieve nearly a 65% reduction in power dissipation due to motion estimation, without any significant degradation in motion estimation quality.

A key to low power dissipation in video encoders is minimizing the data traffic between external memory devices such as DDR SDRAM and the video processor. External memory power can be as high as 50% of the total power budget in a multimedia system. Beyond power dissipation, the amount of memory traffic has a significant impact on system cost: large memory traffic necessitates high-speed external memories, a high-speed on-chip interconnect, and more parallel I/Os to increase memory throughput, all of which raise system cost. We explore a lossy, scalar quantization based reference frame compression technique that significantly reduces the reference data traffic from external memory. In this scheme, the quantization is adapted to the pixel range within each block being compressed. We show that the error introduced by the scalar quantization is bounded and can be represented with fewer bits than the original pixel. The proposed reference frame compression scheme uses this property to minimize motion compensation related traffic, improving the efficiency of the compression scheme. The scheme maintains a fixed compression ratio, and the size of the quantization error is also kept constant, enabling easy storage and retrieval of reference data. The impact of using a lossy reference on motion estimation quality is negligible. As a result of the reduced DDR traffic, DDR power is reduced significantly, while the power dissipated by the additional reference frame compression hardware is very small by comparison. A 24% reduction in peak DDR bandwidth and a 23% net reduction in average DDR power are achieved; for video sequences with larger motion, the bandwidth reduction is even higher (close to 40%) and the power reduction is close to 30%.
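The range-adaptive scalar quantization idea can be sketched as follows. This is a minimal illustration, not the thesis's hardware design: the block size, bit depth, and payload layout are chosen arbitrarily here. The key property it demonstrates is that the quantization step adapts to the pixel range of each block, so the reconstruction error stays bounded while every block compresses to a fixed-size payload.

```python
import numpy as np

def compress_block(block, bits=4):
    """Range-adaptive scalar quantization of one pixel block.

    Each pixel is mapped to a `bits`-bit code relative to the block's
    minimum; the step size adapts to the pixel range within the block,
    so the reconstruction error is bounded by half a step.
    """
    lo = int(block.min())
    rng = int(block.max()) - lo
    levels = (1 << bits) - 1
    step = max(rng / levels, 1e-9)      # guard against flat (zero-range) blocks
    codes = np.round((block - lo) / step).astype(np.uint8)
    return lo, step, codes              # fixed-size payload: min, step, k-bit codes

def decompress_block(lo, step, codes):
    """Reconstruct pixels from the quantized payload."""
    return np.clip(np.round(codes * step + lo), 0, 255).astype(np.uint8)

# The quantization error is bounded by half the (range-adaptive) step.
gen = np.random.default_rng(0)
block = gen.integers(0, 256, size=(4, 4)).astype(np.uint8)
lo, step, codes = compress_block(block, bits=4)
recon = decompress_block(lo, step, codes)
assert np.abs(recon.astype(int) - block.astype(int)).max() <= step / 2 + 1
```

Because the payload size per block is fixed, a block's compressed address in memory is computable directly from its coordinates, which is what makes random access to the compressed reference frame cheap.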
352

Recording Bass-Cabinet: Microphone Choice and Microphone Placement

Carmona Velazquez, Diana January 2023 (has links)
Research on microphone choice and placement has previously been conducted for a variety of instruments, such as drums, vocals, and guitar. In comparison, however, very little research has addressed the bass-cabinet. Drawing on the methodologies used in previous research on electric guitar and snare drum, the optimal microphone choice and placement for a 4x10” bass-cabinet were investigated. In a listening test, pre-recorded basslines were mixed with other instruments, since a bass is more commonly heard in a mix of instruments than in isolation. To keep the topic from becoming too broad, the investigation focused on the rock genre. Both qualitative and quantitative data were collected in a set of listening tests in which the subjects rated their most and least preferred basslines and were asked to describe their preferences in terms of different characteristics of the stimuli. An ANOVA test provided evidence of a statistically significant preference for one microphone model at a 15 cm distance from the bass-cabinet, in the bass-with-band category.
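For illustration, a one-way ANOVA of the kind described can be run with SciPy's `f_oneway`. The ratings below are invented placeholder data standing in for three microphone/distance conditions; they are not results from the study.

```python
from scipy.stats import f_oneway

# Hypothetical preference ratings for three microphone/distance conditions
# in the bass-with-band category (invented data, for illustration only).
mic_a_15cm = [7, 8, 6, 7, 8, 7]
mic_b_15cm = [4, 5, 4, 3, 5, 4]
mic_c_15cm = [5, 6, 5, 6, 5, 6]

# One-way ANOVA: does mean preference differ across conditions?
f_stat, p_value = f_oneway(mic_a_15cm, mic_b_15cm, mic_c_15cm)
if p_value < 0.05:
    print(f"Significant difference between conditions "
          f"(F={f_stat:.2f}, p={p_value:.4f})")
```

A significant F-statistic only indicates that at least one condition differs; identifying *which* microphone is preferred requires a post-hoc comparison.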
353

Extracting Command Signals From Peripheral Nerve Recordings

Wodlinger, Brian January 2010 (has links)
No description available.
354

GENRE AS ADMINISTERED SOCIAL AND RHETORICAL ACTIVITY: THE DEVELOPMENT OF FAMILY SERVICE CASE RECORDING

McFadden, James J. 25 September 2002 (has links)
No description available.
355

Markerless Motion Capture and Analysis System to Enhance Exercise Professional Effectiveness: Preliminary Study

Hanson, Andrew Todd January 2016 (has links)
No description available.
356

Legal analysis of fair dealing relating to music works in the digital environment

Groenewald, Louise 11 1900 (has links)
Many people might think that downloading music without paying for it is not a big issue. Copyright owners disagree with this kind of reasoning because, to them, music is intellectual property with substantial commercial value. Copyright law is the primary form of protection for intellectual property and is based on the fundamental principle that copyright works cannot be reproduced without the express consent of the copyright owner. During the late 1990s, however, new technology made it possible for millions of people to download music from the Internet without the express consent of copyright owners. The mere act of downloading songs illegally violates the exclusive right of the copyright owner to reproduce the work. It has also created problems within copyright law that were not foreseeable in the 17th century when the Statute of Anne was enacted. In law there is always an exception to the rule, and it is no different with copyright law. Although copyright owners have the exclusive right to reproduce their work, the general public has been granted exceptions to make fair dealing of copyright works for private or personal use, purposes of research, private study, criticism, review, or for reporting current events in a magazine, newspaper or periodical, by broadcasting, or by using the work in a cinematograph film, by virtue of s12 of the Copyright Act 98 of 1978. However, the list of exceptions supra may be changed and/or extended, provided that it remains in line with the international conventions and agreements to which South Africa is a party. The three-step test is inter alia provided for in Art. 9(2) of the Berne Convention (Paris Text of 1971) and permits exceptions to the reproduction right of the copyright owner: 1) in certain special cases; 2) that do not conflict with the normal exploitation of the work; and 3) that do not unreasonably prejudice the legitimate interests of the author/rights holder.
S17 provides that certain subsections of s12 shall apply mutatis mutandis to sound recordings. However, s12(1)(a) is not one of the subsections mentioned in s17, which means that fair dealing in sound recordings for purposes of research or private study, or for personal or private use, is NOT permitted. Fair dealing, however, is neither absolute nor an easy doctrine to interpret. The legal interpretation and application of fair dealing has been fraught with complexity since the English courts first dealt with fair abridgement of literary works in the 17th and 18th centuries, and this complexity has been compounded even further by new technology, especially in relation to music works. The legal uncertainty of fair dealing with regard to music works is the reason this comparative research has been undertaken in the jurisdictions of South Africa, the United Kingdom, Australia and the United States. Hopefully it will shed more light on the doctrine and lift the veil of confusion. / Jurisprudence / LLM
357

Software Decommutation and Integration

Guadiana, Juan, Benitez, Jesus, Pasillas, Roger 10 1900 (has links)
ITC/USA 2005 Conference Proceedings / The Forty-First Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2005 / Riviera Hotel & Convention Center, Las Vegas, Nevada / The Telemetry Data Center (TDC) at White Sands Missile Range (WSMR), New Mexico recently targeted analog best source selectors for replacement, along with their associated signal handling equipment. The commercial selectors available offered no better performance, so TDC engineers circulated a "White Paper" on real-time correlation based compositing. Within two years a Correlating Source Selector (CSS) was fielded successfully. The CSS's bridging feature unexpectedly opened the door to a ubiquitous software decommutator (decom) that has catalyzed a complete "make-over" of the entire TDC architecture. Hardware and software interaction in a decom is different with the CSS. While performing its correlation tasks, the CSS can provide raw data over TCP/IP directly to the end application. The CSS places the data in computer-friendly, frame-aligned form, so decommutation may be performed in software. The converse is similarly simple: a data file may be transferred to the CSS for commutation into PCM. This paper describes the morphing of software decommutation into a commodity, integrated into each end device, be it a graphics display, disk, or chart recorder. The result is an interesting consolidation that spawns a new, functionally integrated Telemetry Data Center (iTDC). This integrated Display Decom (iDD) concept has been demonstrated on Apple G5 RISC computers.
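Because the CSS delivers data already frame aligned, software decommutation reduces to slicing fixed-length frames out of the byte stream and unpacking words at known offsets, with no sync-pattern search. The sketch below assumes a hypothetical 64-byte frame format with invented parameter names and offsets; it is not the TDC's actual implementation.

```python
import struct

FRAME_LEN = 64          # bytes per minor frame (hypothetical format)
WORD_OFFSETS = {        # byte offset of each big-endian 16-bit parameter
    "altitude": 4,      # names and offsets are illustrative only
    "airspeed": 6,
}

def decom_frames(stream, frame_len=FRAME_LEN):
    """Yield parameter dicts from a frame-aligned byte stream.

    `stream` is any object with a socket-style recv(n) method, e.g. a
    TCP connection carrying the CSS's frame-aligned output.
    """
    buf = b""
    while True:
        chunk = stream.recv(4096)
        if not chunk:                       # peer closed the connection
            break
        buf += chunk
        while len(buf) >= frame_len:        # slice off complete frames
            frame, buf = buf[:frame_len], buf[frame_len:]
            yield {name: struct.unpack_from(">H", frame, off)[0]
                   for name, off in WORD_OFFSETS.items()}
```

With decommutation this cheap, each end device (display, disk, chart recorder) can embed its own decom rather than depending on shared decommutation hardware, which is the consolidation the paper describes.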
358

Telemetry Recorders and Disruptive Technologies

Kortick, David 10 1900 (has links)
ITC/USA 2009 Conference Proceedings / The Forty-Fifth Annual International Telemetering Conference and Technical Exhibition / October 26-29, 2009 / Riviera Hotel & Convention Center, Las Vegas, Nevada / Telemetry data recorders are not immune to the effects that a number of disruptive technologies have had on the telemetry industry. Data recorder designs today make use of data buses, storage types and graphical user interfaces that are constantly evolving based on advances in personal computer and consumer electronics technologies. Many of these recorders use embedded designs that integrate disruptive technologies such as PCI Express for real-time data and signal processing, SATA interfaces for data storage, and touchscreen technologies to provide an intuitive operator interface. Solid state drives also play a larger role in the latest recorder designs. This paper will explore the effects of these technologies on the latest telemetry recorders in terms of the benefits to users, cost of implementation, obsolescence management, and integration considerations. The implications of early adoption of disruptive technologies will also be reviewed.
359

A ROADMAP TO STANDARDIZING THE IRIG 106 CHAPTER 10 COMPLIANT DATA FILTERING AND OVERWRITING SOFTWARE PROCESS

Berard, Alfredo, Manning, Dennis, Kim, Jeong Min 10 1900 (has links)
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / High speed digital recorders have revolutionized the way Major Range and Test Facility Bases collect instrumentation data. One challenge facing these organizations is the need for a validated process for separating specific data channels and/or data from multiplexed recordings. Several organizations within Eglin Air Force Base have joined forces to establish the requirements for, and validate, a software process compliant with the IRIG-106 Chapter 10 Digital Recording Standard (which defines allowable media access, data packetization, and error control mechanisms). This paper describes a roadmap for standardizing the process used to produce this software, the Data Overwriting and Filtering Application (DOFA).
360

AATIS AND CAIS DATA RECORDING

Gaddis, William R. Jr, Sandland, Sawn 10 1900 (has links)
International Telemetering Conference Proceedings / October 25-28, 1993 / Riviera Hotel and Convention Center, Las Vegas, Nevada / DOD flight test centers need affordable, small-format, flight-qualified digital instrumentation recording solutions to support existing and future flight testing. The Advanced Airborne Test Instrumentation System (AATIS) is today's primary data acquisition system at the Air Force Flight Test Center (AFFTC). Digital Recorder (DR) 1995 is planned to provide full support for AATIS output capabilities and to satisfy initial recording requirements for the Common Airborne Instrumentation System (CAIS). The follow-on to the AATIS, the CAIS is a tri-service development intended to satisfy future DOD flight test data acquisition requirements. DR 2000 is planned as the future recording solution for CAIS and will be able to fully satisfy the 50 Mbps recording requirement. In the development of DR 1995 and DR 2000, commonality and interoperability have emerged as significant issues. This paper presents an overview of these recording solutions and examines commonality and interoperability issues.
