  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
561

Identifikace aerodynamických charakteristik atmosférického letadla z výsledků letových měření / Aerodynamic Characteristics Identification of Atmospheric Airplane from Flight Measurement Results

Zikmund, Pavel January 2013 (has links)
The thesis deals with the identification of aerodynamic characteristics from flight measurements. The topic belongs to flight mechanics, specifically handling qualities. The first, theoretical part describes three identification methods: the equation error method, the output error method, and the filter error method. A mathematical model of the airplane is defined and restricted to motion with 3 degrees of freedom. A simulation of flight measurement is also introduced to validate the identification software. The practical part focuses on the preparation, execution, and evaluation of the experiment. The airplane VUT 700 Specto was chosen to carry out the flight tests. After the first measurement flights with a combustion engine, the airplane was modified into the electrically powered VUT 700e Specto. The data record from the on-board measurement unit was supplemented with telemetry data from the autopilot and the remote control system. Flight tests were carried out in the autopilot's stabilised mode in symmetric flight. The results were compared with the results of an analytical analysis and with parameter estimates from the DATCOM+ software.
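In its simplest linear form, the equation error method described above reduces to a least-squares regression of a measured response on measured states. A minimal sketch of that idea (hypothetical pitching-moment coefficients and simulated signals, not the thesis's actual model or data):

```python
import numpy as np

# Equation-error identification, simplest linear case:
# pitching-moment equation  q_dot = M_alpha*alpha + M_q*q + M_de*delta_e.
# Stack measured states into a regressor matrix and solve by least squares.

rng = np.random.default_rng(0)
n = 500
alpha = rng.normal(0.0, 0.05, n)      # angle of attack [rad]
q = rng.normal(0.0, 0.10, n)          # pitch rate [rad/s]
delta_e = rng.normal(0.0, 0.08, n)    # elevator deflection [rad]

true_params = np.array([-4.5, -2.0, -6.0])  # hypothetical M_alpha, M_q, M_de
X = np.column_stack([alpha, q, delta_e])
q_dot = X @ true_params + rng.normal(0.0, 0.01, n)  # "measured" q_dot with noise

est = np.linalg.lstsq(X, q_dot, rcond=None)[0]
print(est)  # estimates close to true_params
```

The output error and filter error methods generalize this by iteratively matching the simulated model output (or a Kalman-filtered state estimate) to the measurements, which handles measurement and process noise more rigorously.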
562

Error Awareness and Apathy in Moderate-to-Severe Traumatic Brain Injury

Logan, Dustin Michael 01 June 2014 (has links) (PDF)
Moderate-to-severe traumatic brain injury (M/S TBI) is a growing public health concern with significant impact on the cognitive functioning of survivors. Cognitive control and deficits in awareness have been linked to poor recovery and rehabilitation outcomes. One way to research cognitive control is through awareness of errors, using electroencephalography and event-related potentials (ERPs). Both the error-related negativity (ERN) and the post-error positivity (Pe) components of the ERP are linked to error awareness and cognitive control processes. Attentional capacity and levels of apathy influence error awareness in those with M/S TBI, and there are strong links between awareness, attention, and apathy. However, limited research has examined the role of attention, awareness, and apathy using electrophysiological indices of error awareness to further understand cognitive control in an M/S TBI sample. The current study sought to elucidate the role of apathy in error awareness in those with M/S TBI. Participants included 75 neurologically healthy controls (divided randomly into two control groups) and 24 individuals with M/S TBI. All participants completed self-report measures of mood, apathy, and executive functioning, as well as a brief neuropsychological battery to measure attention and cognitive ability. To measure awareness, participants completed the error awareness task (EAT), a modified Stroop go/no-go task, signaling awareness of errors committed on the previous trial. Over time, the M/S TBI group's accuracy decreased while their error awareness improved or held steady relative to controls. There were no significant between-group differences in ERN and Pe amplitudes. Levels of apathy in the M/S TBI group were included in three multiple regression analyses predicting the proportion of unaware errors, ERN amplitude, and Pe amplitude. Apathy was predictive of error awareness, although not in the predicted direction.
Major analyses were replicated using two distinct control groups to determine potential sample effects, and results were consistent when comparing either control group to the M/S TBI group. Findings show variable levels of awareness and accuracy over time for those with M/S TBI when compared to controls: awareness of errors improved as they happened, but participants were unable to regulate performance sufficiently to improve accuracy. Apathy plays a role in error awareness, though not in the predicted direction. The study provides support for the role of attentional impairments in error awareness and encourages future studies to look for varying levels of performance within a given task when studying populations with elevated levels of apathy and attentional deficits.
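The ERP components discussed here (ERN, Pe) are obtained by averaging many EEG epochs time-locked to response errors, which suppresses background activity not phase-locked to the error. A toy illustration of that averaging step (synthetic signal and noise levels chosen for illustration, not the study's recordings):

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 250                           # sampling rate [Hz]
t = np.arange(-0.2, 0.6, 1 / fs)   # epoch window around the response [s]

# Hypothetical ERN: a negative deflection peaking ~80 ms after the error.
ern = -5.0 * np.exp(-((t - 0.08) ** 2) / (2 * 0.03 ** 2))

# 100 error trials: the same ERN buried in large background EEG noise.
epochs = ern + rng.normal(0.0, 20.0, (100, t.size))

erp = epochs.mean(axis=0)          # averaging attenuates noise by ~1/sqrt(100)
peak_uv = erp[(t > 0.0) & (t < 0.2)].min()
print(round(peak_uv, 1))           # a clear negative deflection emerges
```

Single trials here have a signal-to-noise ratio far below 1; only the average reveals the component, which is why ERP studies need many artifact-free error trials per participant.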
563

The Efficacy of Dynamic Written Corrective Feedback on Intermediate-high ESL Learners' Writing Accuracy

Lee, Soon Yeun 28 November 2009 (has links) (PDF)
This study investigated the efficacy of dynamic written corrective feedback (DWCF) on intermediate-high students' writing accuracy when compared to a traditional grammar instruction approach. DWCF is an innovative written corrective feedback method that requires a multifaceted process and interaction between the teacher and the students in order to help the students improve their writing accuracy. The central principle of DWCF is that feedback should be manageable, meaningful, timely, and constant. The research question was raised based on the positive effects of DWCF found in advanced-low and advanced-mid proficiency level students (Evans et al., in press; Evans, Hartshorn, & Strong-Krause, 2009; Hartshorn, 2008; Hartshorn et al., in press). Similar to previous studies, this study attempted to examine the effectiveness of DWCF in terms of proficiency level. It further explored students' perspectives and attitudes towards DWCF. Two groups of ESL students participated in this study: a control group (n=18) that was taught using a traditional grammar instruction method, and a treatment group (n=35) that was taught using a DWCF approach. The findings in this study revealed that both methods improved the intermediate-high students' linguistic accuracy in writing. However, the findings of this study suggest that the instruction utilizing DWCF is preferable to traditional grammar instruction when it comes to improving intermediate-high students' writing accuracy for two reasons: first, DWCF was slightly more effective than the traditional grammar instruction used, and second, students strongly preferred the instruction using DWCF to traditional grammar instruction. The findings of this study further validate other work suggesting the positive effects found in advanced proficiency levels. This study indicates that ESL learners benefit from manageable, meaningful, timely, and constant error feedback in improving their linguistic accuracy in writing. 
Furthermore, this study suggests the desirability of applying DWCF to other contexts.
564

The Econometrics of Piecewise Linear Budget Constraints With Skewed Error Distributions: An Application To Housing Demand In The Presence Of Capital Gains Taxation

Yan, Zheng 14 August 1999 (has links)
This paper examines the extent to which thin markets in conjunction with tax induced kinks in the budget constraint cause consumer demand to be skewed. To illustrate the principles I focus on the demand for owner-occupied housing. Housing units are indivisible and heterogeneous while tastes for housing are at least partly idiosyncratic, causing housing markets to be thin. In addition, prior to 1998, capital gains tax provisions introduced a sharp kink in the budget constraint of existing owner-occupiers in search of a new home: previous homeowners under age 55 paid no capital gains tax if they bought up, but were subject to capital gains tax if they bought down. I first characterize the economic conditions under which households err on the up or down side when choosing a home in the presence of a thin market and a kinked budget constraint. I then specify an empirical model that takes such effects into account. Results based on Monte Carlo experiments indicate that failing to allow for skewness in the demand for housing leads to biased estimates of the elasticities of demand when such skewness is actually present. In addition, estimates based on American Housing Survey data suggest that such bias is substantial: controlling for skewness reduces the price elasticity of demand among previous owner-occupiers from 1.6 to 0.3. Moreover, 58% of previous homeowners err on the up side while only 42% err on the down side. Thus, housing demand is skewed. / Ph. D.
565

An analysis of a relationship between Remuneration and Labour Productivity in South Africa / Johannes Tshepiso Tsoku

Tsoku, Johannes Tshepiso January 2014 (has links)
This study analyses the relationship between remuneration (the real wage) and labour productivity in South Africa at the macroeconomic level, using time series and econometric techniques. The results show significant evidence of a structural break in 1990. The break appears to have affected the employment level and subsequently fed through into employees' remuneration and productivity. A long-run cointegrating relationship was found between remuneration and labour productivity for the period 1990 to 2011: in the long run, a 1% increase in labour productivity is associated with an approximately 1.98% rise in remuneration. The coefficient of the error correction term in the labour productivity equation is large, indicating rapid adjustment of labour productivity to equilibrium. However, remuneration does not Granger-cause labour productivity, nor vice versa. / Thesis (M.Com. (Statistics))--North-West University, Mafikeng Campus, 2014
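The long-run elasticity reported above comes from a cointegrating regression in logs: regressing log remuneration on log productivity, the slope is the elasticity. A stylised sketch of that estimation step (simulated series with a built-in elasticity of 1.98, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(2)
T = 200

# Nonstationary (random-walk-with-drift) log productivity, and log remuneration
# cointegrated with it:  ln(W_t) = beta * ln(P_t) + stationary error, beta = 1.98.
ln_p = np.cumsum(rng.normal(0.01, 0.02, T))
ln_w = 1.98 * ln_p + rng.normal(0.0, 0.01, T)

# OLS slope of the cointegrating regression = long-run elasticity estimate.
X = np.column_stack([np.ones(T), ln_p])
beta_hat = np.linalg.lstsq(X, ln_w, rcond=None)[0][1]
print(round(beta_hat, 2))  # close to the true elasticity of 1.98
```

In the full Engle-Granger procedure, the lagged residual from this regression then enters a short-run equation as the error correction term, whose coefficient measures the speed of adjustment back to the long-run equilibrium.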
566

JOINT SOURCE/CHANNEL CODING FOR TRANSMISSION OF MULTIPLE SOURCES

Wu, Zhenyu, Bilgin, Ali, Marcellin, Michael W. 10 1900 (has links)
ITC/USA 2005 Conference Proceedings / The Forty-First Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2005 / Riviera Hotel & Convention Center, Las Vegas, Nevada / A practical joint source/channel coding algorithm is proposed for the transmission of multiple images and videos to reduce the overall reconstructed source distortion at the receiver within a given total bit rate. It is demonstrated that joint coding of multiple sources with such an objective achieves both improved distortion performance and reduced quality variation. Experimental results based on multiple images and video sequences support this conclusion.
567

Decoding and Turbo Equalization for LDPC Codes Based on Nonlinear Programming

Iltis, Ronald A. 10 1900 (has links)
ITC/USA 2010 Conference Proceedings / The Forty-Sixth Annual International Telemetering Conference and Technical Exhibition / October 25-28, 2010 / Town and Country Resort & Convention Center, San Diego, California / Decoding and Turbo Equalization (TEQ) algorithms based on the Sum-Product Algorithm (SPA) are well established for LDPC codes. However, there is increasing interest in linear and nonlinear programming (NLP)-based decoders, which may offer computational and performance advantages over the SPA. We present NLP decoders and Turbo equalizers based on an Augmented Lagrangian formulation of the decoding problem. The decoders update estimates of both the Lagrange multipliers and the transmitted codeword while solving an approximate quadratic programming problem. Simulation results show that the NLP decoder performance is intermediate between the SPA and bit-flipping algorithms. The NLP may thus be attractive in some applications, as it eliminates the tanh/atanh computations in the SPA.
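For context on the bit-flipping baseline mentioned above, here is a minimal Gallager-style bit-flipping decoder over a toy parity-check matrix (illustrative only; the paper's augmented-Lagrangian NLP decoder is considerably more involved):

```python
import numpy as np

# Toy parity-check matrix H (rows = checks, columns = bits), (7,4) Hamming code.
H = np.array([
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
])

def bit_flip_decode(y, H, max_iters=20):
    """Repeatedly flip the bit involved in the most unsatisfied checks."""
    y = y.copy()
    for _ in range(max_iters):
        syndrome = (H @ y) % 2          # which parity checks fail
        if not syndrome.any():
            return y                    # valid codeword reached
        # Count, per bit, how many failing checks it participates in.
        fail_counts = syndrome @ H
        y[np.argmax(fail_counts)] ^= 1  # flip the worst offender
    return y

codeword = np.zeros(7, dtype=int)       # the all-zeros word is always a codeword
received = codeword.copy()
received[3] = 1                         # inject a single bit error
decoded = bit_flip_decode(received, H)
print(decoded)  # [0 0 0 0 0 0 0] — the error is corrected
```

Bit flipping uses only hard decisions and integer counts, which is why it is cheap but weaker than the SPA; the NLP decoders in this paper sit between the two by solving a continuous relaxation of the same parity constraints.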
568

ENHANCING THE PCM/FM LINK - WITHOUT THE MATH

Fewer, Colm, Wilmot, Sinbad 10 1900 (has links)
ITC/USA 2007 Conference Proceedings / The Forty-Third Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2007 / Riviera Hotel & Convention Center, Las Vegas, Nevada / Since the 1970s, PCM/FM has been the dominant modulation scheme used for RF telemetry. However, tighter spectrum availability and increasing data rates mean that more advanced transmission methods are required to keep pace with industry demands. ARTM Tier-I and Tier-II are examples of how the PCM/FM link can be enhanced, but these techniques require a significant increase in the complexity of the receiver/detector for optimal recovery. This paper focuses on a quantitative approach to improving the rate and quality of data using existing PCM/FM links. In particular, ACRA CONTROL and BAE SYSTEMS set themselves the goal of revisiting the pre-modulation filter, diversity combiner and bit-sync. By implementing programmable adaptive hardware, it was possible to explore the various tradeoffs offered by modified pulse shapes and spectral occupancy, inclusion of forward error correction, and smart source selection. This paper looks at the improvements achieved at each phase of the evaluation.
569

Fraud in Scots law

Reid, Dot January 2013 (has links)
This thesis seeks to provide a deeper understanding of the Scots law of fraud. Adopting a method that is both historical and doctrinal, it provides a critical analysis of the current understanding of fraud and argues for an approach that is more consistent with Scotland’s legal history which, in turn, was profoundly influenced by a much older tradition of European legal thought. It begins by exploring the historical scope of fraud in both a criminal and civil context with specific focus on questions of definition and the extent to which “fraud” was used in the broader sense of activities not involving deceit. A detailed analysis is given of the widespread concept of presumptive fraud by means of which Scots law was able to provide a remedy for unfair or unwarrantable behaviour without any requirement for a deceitful intention and for misstatements made unintentionally. The argument is made that presumptive fraud was a mechanism for delivering substantive justice and that its conceptual roots lie in an Aristotelian understanding of justice as equality. A comparison is made between the scholastic doctrine of restitution, which was developed by Thomas Aquinas as the outworking of the Aristotelian virtue of justice, and the scheme of Scots law created in the Institutions of the Law of Scotland by Viscount Stair (1619-1695), who is said to be the founding father of Scots law. It is suggested that the religious and philosophical conditions which existed in seventeenth century Scotland were particularly fertile soil for scholastic legal ideas which conceptualised law within a moral and religious framework. The second half of the thesis undertakes a doctrinal analysis of fraud in three parts. First, the complex relationship between fraud, error and misrepresentation is examined and the case is made that misrepresentation, whether intentional or unintentional, sits more comfortably in the law of fraud than in the law of error. 
Secondly, modern legal literature is critically assessed and the dominant modern narrative – that error induced by misrepresentation is a native concept in Scots law – is questioned. Thirdly, a new taxonomy of fraud is proposed which distinguishes between primary and secondary fraud. The operation of secondary fraud (which amounts to “participation” in the primary fraud of another and therefore involves three-party situations) is explored through the application of two familiar legal maxims: the “fraud” principle (that no one should be enriched through the fraud of another) and the good faith purchaser for value. In the context of secondary fraud, it is argued that the criteria for its operation - mala fides and a gratuitous transaction - are both core components of the older concept of presumptive fraud. The thesis comes full circle as it is suggested that while the broader equitable definition of fraud, rooted in equality, may have disappeared in the context of primary fraud, secondary fraud retains it.
570

HYPERSPECTRAL IMAGE COMPRESSION

Hallidy, William H., Jr., Doerr, Michael 10 1900 (has links)
International Telemetering Conference Proceedings / October 25-28, 1999 / Riviera Hotel and Convention Center, Las Vegas, Nevada / Systems & Processes Engineering Corporation (SPEC) compared compression and decompression algorithms and developed optimal forms of lossless and lossy compression for hyperspectral data. We examined the relationship between compression-induced distortion and additive noise, determined the effect of errors on the compressed data, and showed that the data could separate targets from clutter after more than 50:1 compression.
