31

A postmodern approach to postmodernism: a survey and evaluation of contemporary evangelical responses to postmodernism

Blumenstock, James A. January 2001 (has links)
Thesis (Th. M.)--Calvin Theological Seminary, 2001. / Abstract. Includes bibliographical references (leaves 101-102).
34

Digital watermarking of still images

Ahmed, Kamal Ali January 2013 (has links)
This thesis presents novel research on copyright protection of greyscale and colour digital images. New blind frequency-domain watermarking algorithms using one-dimensional and two-dimensional Walsh coding were developed, with handwritten signatures and mobile phone numbers used as watermarks. Eight algorithms were developed based on the DCT with 1D and 2D Walsh coding; all embed the watermark in the low-frequency coefficients of the 8 × 8 DCT blocks, and a shuffle process is used to increase robustness against cropping attacks. All algorithms are blind, since they do not require the original image for extraction; all cause minimal distortion to the host images, and the watermark is invisible. For RGB colour images, the watermark is embedded in the green channel. The Walsh-coded watermark is inserted several times via the shuffling process to improve its robustness, and the effect of changing the Walsh lengths and the scaling strength of the watermark on robustness and image quality was studied. All algorithms were examined on several greyscale and colour images of size 512 × 512 and further tested on images of other sizes. The fidelity of the images was assessed using the peak signal-to-noise ratio (PSNR), the structural similarity index measure (SSIM), normalized correlation (NC) and the StirMark benchmark tools, with several scaling factors considered. Comparisons against methods of embedding without coding showed the superiority of the new algorithms: the results show that 1D and 2D Walsh coding with DCT blocks offers a significant improvement in robustness against JPEG compression and some other image-processing operations compared with uncoded embedding, and the schemes achieve significant robustness compared to conventional non-coded watermarking methods. The new algorithms offer a strong trade-off between the perceptual distortion caused by embedding and robustness against certain attacks, and could offer significant advantages to the digital watermarking field and additional benefits to the copyright protection industry.
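The abstract describes the embedding pipeline concretely enough to sketch. Below is a minimal, hedged illustration of one such variant, assuming additive insertion of 1D Walsh-coded bits into a low-frequency coefficient of each shuffled 8 × 8 DCT block of the green channel; the function name, the (1,1) coefficient position, the strength alpha and the Walsh length are illustrative assumptions, not the thesis's exact parameters.

```python
# Sketch only: 1-D Walsh coding of watermark bits, embedded additively into
# one low-frequency DCT coefficient per shuffled 8x8 block of the green channel.
# alpha, the (1,1) coefficient position and walsh_len are assumed values.
import numpy as np
from scipy.fft import dctn, idctn
from scipy.linalg import hadamard

def embed_watermark(green, bits, walsh_len=8, alpha=4.0, seed=42):
    """Blind additive embedding; 'green' is the image's green channel."""
    walsh = hadamard(walsh_len)                  # rows are +/-1 Walsh/Hadamard codes
    chips = np.concatenate([walsh[1] * (1 if b else -1) for b in bits])
    h, w = green.shape
    blocks = [(r, c) for r in range(0, h - 7, 8) for c in range(0, w - 7, 8)]
    np.random.default_rng(seed).shuffle(blocks)  # shuffle spreads the mark (crop robustness)
    out = green.astype(float).copy()
    for chip, (r, c) in zip(chips, blocks):      # one coded chip per block
        blk = dctn(out[r:r+8, c:c+8], norm='ortho')
        blk[1, 1] += alpha * chip                # perturb a low-frequency coefficient
        out[r:r+8, c:c+8] = idctn(blk, norm='ortho')
    return out
```

Extraction can remain blind, as the abstract claims: regenerate the same shuffled block order from the seed, read the (1,1) DCT coefficient of each marked block, and correlate each run of walsh_len values against the known Walsh row; the sign of each correlation recovers one bit, with no reference to the original image.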
35

An exploration of how jazz improvisation is taught

Griffin, Timothy Joel 01 April 2020 (has links)
The purpose of this study was to explore how master jazz pedagogues and artist-level jazz musicians used pedagogical content knowledge to sequence their instructional methods when teaching jazz improvisation. Pedagogical content knowledge served as the theoretical framework for this study. To gain insight into how they used their knowledge when teaching jazz improvisation, I first sought to explore how they learned to improvise. An overarching research question, "How did the participants learn to improvise in jazz?", helped me contextualize how they learned content and pedagogy when they began to improvise. The following questions then guided my investigation into how the participants used their pedagogical and content knowledge when they taught jazz improvisation: (1) How, if at all, did the participants' curriculum knowledge influence their approaches to teaching jazz improvisation? (2) How, if at all, did the participants' pedagogical knowledge influence their approaches to teaching jazz improvisation? (3) How, if at all, did the participants' content knowledge influence their approaches to teaching jazz improvisation? Both the artist-level musicians and the master jazz pedagogues subscribed to an organic mode of teaching jazz improvisation rather than the one-size-fits-all approach that many published jazz materials espouse. Most of the participants did not use an established curriculum; instead, they relied on their knowledge of their students and on their own content knowledge (what they know and how they learned it) to determine best teaching practices. Based on the pedagogical content knowledge they provided, I devised a model for teaching jazz improvisation to undergraduate students, organized as an eight-semester (four-year) sequence of pedagogy and content. For each academic year, I describe what I learned from the participants and how this pedagogical content knowledge can be used to help students learn to improvise in jazz. I then present a two-semester outline (one academic year) demonstrating how the pedagogical principles and content knowledge shared by the participants can be sequenced. Each participant taught their students based on their own content knowledge and their knowledge of their students. To teach jazz and jazz improvisation, preservice teachers need more than casual experience with jazz pedagogy and should seek to increase their own content knowledge of jazz through both formal and informal educational opportunities. The scope of this study was limited to world-renowned jazz musicians and educators who taught at the university level, and it considered only the perspectives of jazz educators. Additional studies could focus on active school music teachers who identify as jazz educators, or could examine students' perspectives on how they learn pedagogy and content and how they use and retain this knowledge when improvising. Keywords: jazz, jazz pedagogy, jazz improvisation, pedagogical content knowledge, jazz education
36

Energy Aware Signal Processing and Transmission for System Condition Monitoring

Kadrolkar, Abhijit 01 January 2010 (has links) (PDF)
The operational life of wireless-sensor-network-based distributed sensing systems is limited by the energy provided by a portable battery pack. Owing to the inherently resource-constrained nature of wireless sensor networks and nodes, a major research thrust in this field is the search for energy-aware methods of operation. Communication is among the most energy-intensive operations on a wireless device; it is therefore the focus of our efforts to develop an energy-aware method of communication and to introduce a degree of reconfigurability that ensures autonomous operation of such devices. Given this background, three research tasks were identified and investigated.

1) Devising an energy-efficient method of communication in a framework of reconfigurable operation: Prior research established that the energy consumed during communication depends on the number of bits transmitted (and received), and a novel method of data compression was designed to exploit this dependence. The method uses the time-limited, orthonormal Walsh functions as basis functions for representing signals, and the L2 norm of this representation is used to further compress them. By Parseval's relation, the square of the L2 norm equals the energy content of a signal, which makes it possible to use the L2 norm as a control knob that optimizes the number of terms required to represent a signal. The time-limited nature of the Walsh functions was leveraged to inject dynamic behaviour into the coding method: it allows decomposition of finite time segments without attendant limitations, such as loss of resolution, that are inherent to derived discrete transforms like the discrete Fourier transform or the discrete-time Fourier transform. This decomposition over successive finite time segments, coupled with operation of the control knob on every segment, yields a dynamic scaling technique. The amount of data to be transmitted is based on the magnitudes of the coefficients of decomposition of each time segment, realizing a variable word-length coding method. Because those coefficients represent features present in successive time segments, the dynamic coding method can identify evolving changes or events in the quantity being sensed and react to them as they unfold; in other words, the coding algorithm imparts a reconfigurable, event-driven character to data transmission and its associated energy consumption. Performance evaluation of this method via simulations on machine-generated (bearing vibration) and biometric (electrocardiogram) signals shows it to be a viable method for energy-aware communication.

2) Developing a framework for reconfigurable triggering: A framework for completely autonomous triggering of the coding method was developed. This is achieved by estimating correlations of the signal with the representative Walsh functions; the correlation coefficient of a signal segment with a Walsh function indicates the amount of energy localized by that function. This information is used to autonomously tune the above-mentioned control knob, or, more precisely, the degree of thresholding used in compression. Evaluation of this framework on bearing vibration and electrocardiogram signals has shown results consistent with those of the earlier simulations.

3) Devising a computationally compact method of feature classification: A method of investigating time-series measurements of dynamic systems in order to classify features buried in the measurements was investigated. The approach discretizes time-series measurements into strings of pre-defined symbols; these strings are transforms of the original measurements and represent the system dynamics. A method of statistically analyzing the symbol strings is presented, and its efficacy is studied through representative simulations and experimental investigation of vibration signals recorded from a rolling-element bearing. The method is computationally compact because it obviates the need for local signal-processing tasks like denoising, detrending and amplification. Results indicate that the method can effectively classify deteriorating machine health, changing operating conditions and evolving defects.

In addition to these major foci, another research task was the design and implementation of a wireless network testbed. This testbed consists of a network of netbooks connected wirelessly, and it was used for experimental verification of the variable word-length coding method.
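The first task is concrete enough to illustrate. The sketch below is a simplified, assumption-laden reading of the method rather than the thesis's implementation: each finite time segment is projected onto an orthonormal Walsh basis, and Parseval's relation (the segment energy, the sum of x[t]^2, equals the sum of the squared coefficients) serves as the control knob, keeping the fewest, largest coefficients that capture a chosen fraction of the segment's energy. The 95% threshold is an illustrative value.

```python
# Sketch only: per-segment Walsh decomposition with an L2-norm (energy) control
# knob. The energy_keep threshold is an assumed value, not the thesis's tuning.
import numpy as np
from scipy.linalg import hadamard

def encode_segment(segment, energy_keep=0.95):
    """Return (indices, values): a variable word-length code for one segment."""
    n = len(segment)                         # segment length must be a power of 2
    W = hadamard(n) / np.sqrt(n)             # orthonormal Walsh/Hadamard basis
    coeffs = W @ segment                     # decomposition of this time segment
    order = np.argsort(coeffs**2)[::-1]      # most energetic coefficients first
    cum = np.cumsum(coeffs[order]**2)        # Parseval: cum[-1] == sum(segment**2)
    k = int(np.searchsorted(cum, energy_keep * cum[-1])) + 1
    return order[:k], coeffs[order[:k]]      # word length k varies per segment

def decode_segment(indices, values, n):
    """Reconstruct a segment from its transmitted (index, value) pairs."""
    W = hadamard(n) / np.sqrt(n)
    coeffs = np.zeros(n)
    coeffs[indices] = values
    return W @ coeffs                        # W is symmetric and orthonormal
```

A quiescent segment compresses to a handful of terms, while an evolving event (a bearing defect impulse, say) raises the coefficient magnitudes and the word length k grows, which is the event-driven, reconfigurable behaviour the abstract describes.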
37

A Chorus Line: Does It Abide By Rules Established By Actors' Equity Association For The Audition Process?

Hardin, Mark 01 January 2006 (has links)
I have been cast as "Bobby" in A Chorus Line at Orlando Broadway Dinner Theatre in Orlando and will use this opportunity as my thesis role. As part of my thesis defense, I will combine an analysis of the character of "Bobby" in A Chorus Line with an assessment of Actors' Equity Association's audition policies from 1970 to the present, and investigate whether the audition held in the show abides by the policies established by AEA for Broadway calls. "Bobby" has an interesting arc of development: he gives the director what he (the director) does not want, yet is still cast in the fictitious Broadway show. Why he would choose to stray from the director's instructions is an interesting question that demands study. To facilitate my research on the character (aside from script and score analysis), I will interview Thommie Walsh (for whom the role was written and who was the original "Bobby" on Broadway) as well as other men who have played the role, to gain insights into the character that will enhance my performance. Mr. Walsh will also elaborate on his real-life relationship with Michael Bennett and how it compares and contrasts with the relationship between "Bobby" and "Zach." I will also interview as many of the original cast members as possible (namely Baayork Lee) to gather memories and anecdotal evidence from the original production. A Chorus Line captures the one element all performers experience: the audition. The audition process has changed over the years, and I will focus on the development of protocol from the early 1970s (when A Chorus Line takes place) to the present. I will explore how the process has evolved and what contribution, if any, A Chorus Line made to that evolution. The show has become so much a part of the musical theatre vernacular that a historical exploration of audition procedures would also clarify how the work itself was structured. Were actors subjected to that intense style of audition on a huge stage in the early 1970s? Are they still today? My research will trace the history and rules governing auditions, performers and staff as delineated by Actors' Equity Association, and will include a comparison of Equity auditions with the variety of non-Equity auditions. Other sources will include rulebooks from AEA and interviews with dancers (past and present), AEA staff and Patrick Quinn, President of AEA.
38

Design and Implementation of a High-Speed Inverse Walsh Transform Apparatus

Mikhail, Samia R. 05 1900 (has links)
In this thesis, a high-speed inverse Walsh transform apparatus was designed and built which sums the sixteen most dominant coefficients over the time-base period; the transform includes a maximum of 64 terms. The Walsh function generator operates at clock rates up to 10 MHz, producing 64 different sequency terms with accurate timing and hazard-free operation. A synchronizing pulse produced by the circuit marks the beginning of the Walsh transform period. The final adder stage limits the speed of the apparatus to a 1 MHz square wave. The instrument was applied to reconstruct one line of an actual video signal. / Thesis / Master of Engineering (MEngr)
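As a rough software analogue of what the hardware computes, the hedged sketch below reconstructs a 64-sample signal from only its sixteen most dominant Walsh coefficients; the demo waveform is invented, and scipy's hadamard uses natural (Hadamard) ordering rather than the sequency ordering a hardware Walsh generator typically produces.

```python
# Sketch only: inverse Walsh transform summing the 16 most dominant of 64
# terms, in software. The test waveform stands in for one video line.
import numpy as np
from scipy.linalg import hadamard

n, kept = 64, 16
W = hadamard(n) / np.sqrt(n)                 # orthonormal Walsh basis (natural order)

t = np.arange(n)
line = np.sin(2 * np.pi * t / n) + 0.3 * (t >= 32)    # stand-in for a video line

coeffs = W @ line                            # forward Walsh transform
dominant = np.argsort(np.abs(coeffs))[::-1][:kept]    # 16 largest-magnitude terms
mask = np.zeros(n, dtype=bool)
mask[dominant] = True
approx = W @ (coeffs * mask)                 # inverse transform over kept terms only

rms = np.sqrt(np.mean((line - approx) ** 2))
print(f"RMS error keeping {kept} of {n} Walsh terms: {rms:.4f}")
```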
39

An examination of the history and effect of American sex offense laws and offender registration

Shabat-Love, David 01 May 2012 (has links)
America's sex offense statutes and cases are among the most controversial areas of modern law, both for the extreme sensitivity of their subject matter and for the scope and application of those laws. This thesis is an analysis and overview of both the subjective and objective issues posed by the current state of those laws. The subjective portion explores the development of current laws and the diverse attendant legal issues, such as over-broadness and excessive or misdirected effect, as compared with the legislative and public intent that directly led to these laws. Additionally, a more objective study of their efficacy was conducted using data on offense rates by locality. These data were procured from the United States Census and Bureau of Justice Statistics, which provided national averages such as the overall violent crime rate, and from Florida Department of Law Enforcement statistics, supplemented with additional data from other academic sources. Both the subjective conclusion and the interpretation of the objective data indicate that, while the rate of sex offenses has fallen in recent decades, this decline is part of the overall reduction in all violent offenses, and that the extreme stance of modern sex offense laws has arguably produced a net negative: it creates a class of individuals, ostracized from all but other sex offenders, who are virtually incapable of supporting themselves or, at times, even of finding legal habitation post-release. With little to no chance of a productive life, there is a strong possibility of recidivism and little incentive to avoid re-offending.
40

Sex Offender Policy and Practice: Comparing the SORNA Tier Classification System and Static-99 Risk Levels

Ticknor, Bobbie 10 October 2014 (has links)
No description available.
