111

Assessing Resolution Tradeoffs Of Remote Sensing Data Via Classification Accuracy Cubes For Sensor Selection And Design

Johnson, Darrell Wesley 13 May 2006 (has links)
In order to aid federal agencies and private companies in the ever-growing problem of invasive species target detection, an investigation has been conducted into classification accuracy data cubes for use in determining spectral, spatial, and temporal sensor resolution requirements. The data cube is produced by a purpose-built automated target recognition (ATR) system that begins with "ideal" hyperspectral data and then reduces and combines spectral and spatial resolutions. The reduced data is subjected to testing methods using the Best Spectral Bands (BSB) and All Spectral Bands (ASB) approaches and to classification methods using nearest mean (NM), nearest neighbor (NN), and maximum likelihood (ML) classifiers. The effectiveness of the system is tested via two target-nontarget case studies: terrestrial Cogongrass (Imperata cylindrica)-Johnsongrass (Sorghum halepense), and aquatic Water Hyacinth (Eichhornia crassipes)-American Lotus (Nelumbo lutea). Results reveal the effects, or trade-offs, of spectral-spatial-temporal resolution combinations on the ability of an ATR system to accurately detect the target invasive species. For example, in the aquatic vegetation case study, overall classification accuracies of around 90% or higher can be obtained during the month of August for spectral resolutions of 80–1000 nm FWHM and target abundances of 70–100% per pixel. Furthermore, the ATR system demonstrates that the resulting resolution cubes can be readily used to design or select cost-effective sensors for invasive species target detection, since lower-resolution combinations may be acceptable while still yielding satisfactory classification accuracy.
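The nearest mean (NM) classifier named in the abstract has a very simple form: each pixel is assigned to the class whose mean training spectrum is closest. The sketch below is a generic illustration of that idea, not the thesis's actual ATR implementation; the function names and array layout are assumptions for the example:

```python
import numpy as np

def nearest_mean_classify(train_pixels, train_labels, test_pixels):
    """Assign each test pixel to the class with the closest mean spectrum.

    train_pixels: (n, bands) array of training spectra
    train_labels: (n,) array of class ids
    test_pixels:  (m, bands) array of spectra to classify
    """
    classes = np.unique(train_labels)
    # Mean spectrum per class: the class "signature"
    means = np.stack([train_pixels[train_labels == c].mean(axis=0)
                      for c in classes])
    # Euclidean distance from every test pixel to every class mean
    d = np.linalg.norm(test_pixels[:, None, :] - means[None, :, :], axis=2)
    return classes[d.argmin(axis=1)]

def overall_accuracy(pred, truth):
    """Fraction of correctly classified pixels (overall accuracy)."""
    return (pred == truth).mean()
```

The overall classification accuracies quoted in the abstract (e.g. ~90%) correspond to `overall_accuracy` computed on labelled test pixels at each spectral-spatial-temporal resolution combination.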
112

Effect of machine vision based traffic data collection accuracy on traffic noise

Nadella, Sunita January 2002 (has links)
No description available.
113

A Correlational Study of the Relationships Between Syntactical Accuracy, Lexical Accuracy and the Quality of Turkish EFL Student Writing

Gonulal, Talip 22 June 2012 (has links)
No description available.
114

Quality Assessment of Spatial Data: Positional Uncertainties of the National Shoreline Data of Sweden

Hast, Isak January 2014 (has links)
This study investigates the planimetric (x, y) positional accuracy of the National Shoreline (NSL) data, produced in collaboration between the Swedish mapping agency Lantmäteriet and the Swedish Maritime Administration (SMA). Because of the compound nature of shorelines, such data is afflicted by substantial positional uncertainty, while the positional accuracy requirements of NSL data are high. An apparent problem is that Lantmäteriet does not measure the positional accuracy of NSL in accordance with the NSL data product specification. In addition, there is currently little understanding of the latent positional changes of shorelines over time, which directly influence the accuracy of NSL. Therefore, in line with the two specific aims of this study, an accuracy assessment technique is first applied to measure the positional accuracy of NSL; second, positional changes of NSL over time are analysed. The study provides an overview of potential problems and future prospects of NSL, which Lantmäteriet can use to improve the quality assurance of the data. Two line-based NSL data sets within the NSL classified regions of Sweden are selected, and their positional uncertainties are investigated using two distinct methodologies. First, an accuracy assessment method is applied and accuracy metrics based on the root-mean-square error (RMSE) are derived. The metrics are checked against the specification and standard accuracy tolerances; comparison of the calculated RMSE metrics with these tolerances indicates that the tested data meets the required accuracy. Second, positional changes of the NSL data are measured using a proposed space-time analysis technique. The results of the analysis reveal significant discrepancies between the two areas investigated, indicating that one of the test areas is subject to much greater positional change over time. 
The accuracy assessment method used in this study has a number of apparent constraints, one of which is the potential presence of bias in the derived accuracy metrics. Given these restrictions, the preferred method for assessing the positional accuracy of NSL is visual inspection against aerial photographs. Regarding the result of the space-time analysis, one important conclusion can be drawn: the time-dependent positional discrepancies between the two areas indicate that Swedish coastlines are affected by divergent degrees of positional change over time. Lantmäteriet should therefore consider updating NSL data at different intervals, depending on the prevailing regional changes, in order to assure the specified positional accuracy of the entire NSL data structure.
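The RMSE metric used in the assessment has a standard form: for a set of matched check points, the squared coordinate differences between the tested data and the reference are averaged and square-rooted, per axis and combined. The following is a minimal sketch of that computation; the function name and tuple layout are illustrative assumptions, not taken from the thesis:

```python
import math

def planimetric_rmse(measured, reference):
    """Root-mean-square error of planimetric (x, y) positions.

    measured, reference: equal-length lists of (x, y) tuples for
    matched check points, in the same coordinate units.
    Returns (rmse_x, rmse_y, rmse_xy).
    """
    n = len(measured)
    # Sum of squared differences per axis
    sq_dx = sum((mx - rx) ** 2 for (mx, _), (rx, _) in zip(measured, reference))
    sq_dy = sum((my - ry) ** 2 for (_, my), (_, ry) in zip(measured, reference))
    rmse_x = math.sqrt(sq_dx / n)
    rmse_y = math.sqrt(sq_dy / n)
    # Combined horizontal (planimetric) RMSE
    return rmse_x, rmse_y, math.sqrt(rmse_x ** 2 + rmse_y ** 2)
```

The resulting `rmse_xy` value is what would be checked against the specification's accuracy tolerance.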
115

Results for Precise GPS Trajectography Computation without Base Station

Morán, Guillermo Martinez 10 1900 (has links)
ITC/USA 2015 Conference Proceedings / The Fifty-First Annual International Telemetering Conference and Technical Exhibition / October 26-29, 2015 / Bally's Hotel & Convention Center, Las Vegas, NV / Differential GPS post-processing for precise trajectography computation has been widely used since the early 1990s. Until recently, installing a GPS receiver at a well-known position (a base station) was mandatory. The operating range from this base station varies from 50 km up to 100 km, depending on the accuracy required, which dictates whether a single- or dual-frequency GPS technique is needed. Nowadays, the huge number of GPS base stations continuously logging data worldwide has made it possible to greatly improve the error models. Using these precise models, centimeter-level accuracy in GPS trajectography can be achieved with a single GPS receiver, with no restriction on the range to a base station. This technique is called Precise Point Positioning (PPP). Performance results for PPP obtained from a real campaign of ten flights are presented.
116

Improving data quality : data consistency, deduplication, currency and accuracy

Yu, Wenyuan January 2013 (has links)
Data quality is one of the key problems in data management. An unprecedented amount of data has been accumulated and has become a valuable asset of an organization. The value of the data relies greatly on its quality. However, data is often dirty in real life. It may be inconsistent, duplicated, stale, inaccurate or incomplete, which can reduce its usability and increase the cost of businesses. Consequently the need to improve data quality arises, which comprises five central issues: data consistency, data deduplication, data currency, data accuracy and information completeness. This thesis presents the results of our work on the first four of these issues: data consistency, deduplication, currency and accuracy. The first part of the thesis investigates incremental verification of data consistency in distributed data. Given a distributed database D, a set S of conditional functional dependencies (CFDs), the set V of violations of the CFDs in D, and updates ΔD to D, the problem is to find, with minimum data shipment, the changes ΔV to V in response to ΔD. Although the problems are intractable, we show that they are bounded: there exist algorithms to detect errors such that their computational cost and data shipment are both linear in the size of ΔD and ΔV, independent of the size of the database D. Such incremental algorithms are provided for both vertically and horizontally partitioned data, and we show that the algorithms are optimal. The second part of the thesis studies the interaction between record matching and data repairing. Record matching, the main technique underlying data deduplication, aims to identify tuples that refer to the same real-world object, and repairing is to make a database consistent by fixing errors in the data using constraints. These are treated as separate processes in most data cleaning systems, based on heuristic solutions. 
However, our studies show that repairing can effectively help us identify matches, and vice versa. To capture the interaction, a uniform framework that seamlessly unifies repairing and matching operations is proposed to clean a database based on integrity constraints, matching rules and master data. The third part of the thesis presents our study of finding certain fixes that are absolutely correct for data repairing. Data repairing methods based on integrity constraints are normally heuristic, and they may not find certain fixes. Worse still, they may even introduce new errors when attempting to repair the data, which may not work well when repairing critical data such as medical records, in which a seemingly minor error often has disastrous consequences. We propose a framework and an algorithm to find certain fixes, based on master data, a class of editing rules and user interactions. A prototype system is also developed. The fourth part of the thesis introduces inferring data currency and consistency for conflict resolution, where data currency aims to identify the current values of entities, and conflict resolution is to combine tuples that pertain to the same real-world entity into a single tuple and resolve conflicts, which is also an important issue for data deduplication. We show that data currency and consistency help each other in resolving conflicts. We study a number of associated fundamental problems, and develop an approach for conflict resolution by inferring data currency and consistency. The last part of the thesis reports our study of data accuracy on the longstanding relative accuracy problem which is to determine, given tuples t1 and t2 that refer to the same entity e, whether t1[A] is more accurate than t2[A], i.e., t1[A] is closer to the true value of the A attribute of e than t2[A]. 
We introduce a class of accuracy rules and an inference system with a chase procedure to deduce relative accuracy, and study the related fundamental problems. We also propose a framework and algorithms for inferring accurate values with user interaction.
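To make the notion of a CFD violation from the first part concrete: a conditional functional dependency is a functional dependency that holds only on tuples matching a pattern, e.g. "for tuples with country = UK, equal zip codes imply equal cities". The sketch below finds the violating tuple groups in a batch fashion; it is an illustrative simplification (single-constant pattern, non-incremental, single machine), not the thesis's incremental distributed algorithm, and all names are invented for the example:

```python
from collections import defaultdict

def cfd_violations(rows, condition, lhs, rhs):
    """Find groups of rows violating a simplified CFD.

    rows:      list of dicts (attribute -> value)
    condition: dict of attribute -> constant selecting the tuples
               the dependency applies to (a simplified pattern tableau)
    lhs, rhs:  attribute names such that, within the condition,
               equal lhs values must imply equal rhs values.
    Returns a list of (lhs_value, row_indices) for each violated group.
    """
    groups = defaultdict(set)
    for i, row in enumerate(rows):
        if all(row[a] == v for a, v in condition.items()):
            groups[row[lhs]].add((i, row[rhs]))
    violations = []
    for key, members in groups.items():
        # More than one distinct rhs value in a group means a violation
        if len({rhs_val for _, rhs_val in members}) > 1:
            violations.append((key, sorted(i for i, _ in members)))
    return violations
```

The incremental problem studied in the thesis is then: given updates ΔD, maintain this violation set V with cost independent of the size of the full database.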
117

Non-orthogonal spin-adaptation and application to coupled cluster up to quadruple excitations

Matthews, Devin Alexander 24 October 2014 (has links)
The theory of non-orthogonal spin-adaptation for closed-shell molecular systems is presented, with an emphasis on application to the coupled cluster family of electronic structure methods. To aid in the derivation of efficient and compact working equations, a new diagrammatic interpretation of the Goldstone diagrams is derived which only requires a small number of the many distinct diagrams and which directly produces equations in a factored form in terms of “spin-summed” tensor elements. This diagrammatic interpretation is applied to coupled cluster methods with quadruple excitations (CCSDTQ), including coupled cluster with a perturbative correction for quadruple excitations (CCSDT(Q)), and to CCSDTQ gradients and properties. The advantages of the non-orthogonal spin-adaptation with respect to simplification and factorization of the working equations and to efficient implementation are presented and discussed. Additionally, specific optimizations of the implementation for often-overlooked issues such as tensor transposition, disk access, and removal of redundant and/or unnecessary operations are detailed. The resulting algorithm is implemented for the CCSDTQ and CCSDT(Q) methods and compared to existing codes, where a one to two order-of-magnitude improvement in efficiency is observed. The new implementation is also used for calculations on several larger molecular systems to illustrate the scalability of the method.
118

The Early Detection of Depression from Social Networking Sites

Holleran, Shannon January 2010 (has links)
Depression has a high prevalence among college students. Because it is a highly private (i.e., experiential) and socially stigmatized mental illness, it often goes undetected in daily life. The basic research question behind this line of research is how students' postings on their social networking websites can be used for the early detection of depression. The current research investigates how well depression can be gauged from MySpace profiles (Study 1) and Facebook profiles (Studies 2 and 3). Across studies, the results reveal that depression can be assessed with a moderate degree of accuracy. In addition, Study 3 presents evidence that viewing "mini-blogs" allows for similar levels of accuracy compared to viewing an entire profile, and that the degree to which a person is extraverted or censors information about themselves (e.g., Impression Management, Public Self-Consciousness) influences the degree of accuracy. Overall, the results speak to the idea that social networking sites can be a cost-effective and clinically relevant tool for detecting depression.
119

My Spider-Sense Needs Calibrating: Anticipated Reactions to Spider Stimuli Poorly Predict Initial Responding

Burger, Sarah Beth January 2012 (has links)
The present study attempted to answer two general questions: (1) what is the relation between expected and actual reactions to a spider in individuals afraid of spiders? and (2) are inaccurate expectancies updated on the basis of experience? Behavioral and cognitive-behavioral learning models of fear, treatment protocols developed in relation to these, and recent findings from our laboratory necessitated answers to two additional questions: (3) does the expectation accuracy of individuals who meet DSM-IV criteria for diagnosis with a specific phobia differ from that of individuals who are fearful but do not meet criteria? and (4) does expectation accuracy vary as a function of context? Two final questions were obvious: (5) do the actual reactions of individuals who meet criteria for diagnosis differ predictably from those of fearful individuals? and (6) do reactions vary contextually? Student participants reported and tested a series of trial-specific expectancies about their reactions to a live, mechanical, or virtual tarantula over seven trials. Participants then completed three final trials in the presence of a live tarantula. Participants poorly anticipated the quality and intensity of their initial reactions, but expectation accuracy increased quickly. No clear tendencies for over- or under-prediction emerged. Participants updated expectancies in relation to prior trial expectation accuracy, either increasing or decreasing their predicted reactions relative to the original expectancy. Participants who met criteria for diagnosis with a specific phobia consistently anticipated and reported more intense reactions than did those who were fearful, but diagnostic status was not predictive of expectation accuracy. Participants in the live and virtual spider groups reported similar levels of fear that were greater than those in the mechanical spider group. 
Participants in the virtual spider group more readily reduced the distance maintained between themselves and the spider stimulus than did those in the live or mechanical spider groups. Expectation accuracy did not vary contextually. Results are discussed in light of the theoretical models presented, with findings lending greater support to behavioral models of fear learning relative to cognitive models that postulate a substantial role for conscious processing and appraisal in specific fear. Practical recommendations are made to researchers and clinicians based on present findings.
120

Effects of keyboard layout on children's performance and interaction with computers

Roussos, Petros January 1992 (has links)
No description available.
