171

Examining Accuracy: Drönare och drönarangrepp: retorik, praktik och historia [Drones and Drone Strikes: Rhetoric, Practice and History]

Elvander, Adam January 2014 (has links)
The military conflicts of the early 21st century have seen the introduction and rise of a new military technology: the armed drone. With the United States acting as the driving force behind this technological advancement, the U.S. Air Force and the CIA have made drones their weapon of choice for pursuing suspected terrorists and insurgents in various remote locations. American military leaders and policy makers assert that the armed drone’s high level of accuracy makes it the best available weapons platform for this task. However, new research shows that the use of drones may result in more civilian casualties than previously thought, and may in fact be more fallible than conventional aircraft in this respect. This paper examines this discrepancy between rhetoric and practice, and attempts to find potential causes for it in the development and early use of the first armed drone, the MQ-1 Predator. The paper cites statements from President Barack Obama and CIA director John Brennan and contrasts them with a recent research report on drone-caused civilian casualties, as well as examples of drone strikes in which the wrong targets were struck. The analysis of the development and early use of the Predator drone draws comparisons to Donald MacKenzie’s account of the development of accuracy for Cold War-era intercontinental ballistic missiles, applying the science and technology concepts he uses to the case of the armed drone. The paper concludes with the argument that the accuracy of the early armed drones was fundamentally misunderstood or overestimated by U.S. leaders, and that there are circumstances in the development history of the system that may have contributed to this inconsistency.
172

Increasing sales forecast accuracy with technique adoption in the forecasting process

Orrebrant, Richard, Hill, Adam January 2014 (has links)
Purpose – The purpose of this thesis is to investigate how to increase sales forecast accuracy. Methodology – To fulfil the purpose, a case study was conducted. To collect data from the case study, the authors performed interviews and gathered documents. The empirical data was then analysed and compared with the theoretical framework. Result – The result shows that inaccuracies in forecasts are not necessarily caused by the forecasting technique but can result from an unorganized forecasting process and an inefficient information flow. The result further shows that it is important to review the information flow not only within the company but in the supply chain as a whole to improve a forecast’s accuracy. The result also shows that time series can generate more accurate sales forecasts than qualitative techniques alone. It is, however, necessary to use a qualitative technique when creating time series: because time series take only time and sales history into account, expertise regarding consumer behaviour, promotion activity, and so on is also needed. It is also crucial to use qualitative techniques when selecting a time series technique to achieve higher sales forecast accuracy. Personal expertise and experience are needed to judge whether there is enough sales history, how much sales fluctuate, and whether there will be any seasonality in the forecast. If companies gain knowledge about the benefits of each technique, the combination can improve the forecasting process and increase the accuracy of the sales forecast. Conclusions – This thesis, with support from a case study, shows how time series and qualitative techniques can be combined to achieve higher accuracy. Companies that want to achieve higher accuracy need to know how the different techniques work and what needs to be taken into account when creating a sales forecast. It is also important to understand the benefits of a well-designed forecasting process, which requires improving the information flow both within the company and across the supply chain. Research limitations – Because there are several different techniques for creating a sales forecast, the authors could have involved more techniques in the investigation. The thesis could also have used multiple case study objects to increase its external validity.
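As a rough illustration of the kind of combination discussed in this abstract (a statistical time-series forecast adjusted by qualitative expert judgment, with accuracy tracked explicitly), the following Python sketch is illustrative only; the functions, sales figures, and the 10% promotion uplift are assumptions, not taken from the case study.

```python
# Illustrative sketch: combining a simple time-series technique with a
# qualitative adjustment, and measuring forecast accuracy with MAPE.
# Names and numbers are hypothetical, not drawn from the thesis.

def exponential_smoothing(history, alpha=0.3):
    """Simple exponential smoothing: returns the one-step-ahead forecast."""
    level = history[0]
    for y in history[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

def qualitative_adjustment(base_forecast, promotion_uplift=0.0):
    """Expert judgment layered on top of the statistical forecast,
    e.g. an expected uplift from a planned promotion."""
    return base_forecast * (1 + promotion_uplift)

def mape(actuals, forecasts):
    """Mean absolute percentage error over matched periods."""
    return 100 * sum(abs(a - f) / a for a, f in zip(actuals, forecasts)) / len(actuals)

sales_history = [120, 132, 129, 140, 151, 148, 160]   # hypothetical monthly sales
base = exponential_smoothing(sales_history)
adjusted = qualitative_adjustment(base, promotion_uplift=0.10)  # expert expects +10% from a promotion
print(f"statistical forecast: {base:.1f}, adjusted forecast: {adjusted:.1f}")

# MAPE of a naive (previous-period) baseline over the same history, for comparison
print(f"naive baseline MAPE: {mape(sales_history[1:], sales_history[:-1]):.1f}%")
```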
173

The Epistemology of Measurement: A Model-based Account

Tal, Eran 07 January 2013 (has links)
Measurement is an indispensable part of physical science as well as of commerce, industry, and daily life. Measuring activities appear unproblematic when performed with familiar instruments such as thermometers and clocks, but a closer examination reveals a host of epistemological questions, including: 1. How is it possible to tell whether an instrument measures the quantity it is intended to? 2. What do claims to measurement accuracy amount to, and how might such claims be justified? 3. When is disagreement among instruments a sign of error, and when does it imply that instruments measure different quantities? Currently, these questions are almost completely ignored by philosophers of science, who view them as methodological concerns to be settled by scientists. This dissertation shows that these questions are not only philosophically worthy, but that their exploration has the potential to challenge fundamental assumptions in philosophy of science, including the distinction between measurement and prediction. The thesis outlines a model-based epistemology of physical measurement and uses it to address the questions above. To measure, I argue, is to estimate the value of a parameter in an idealized model of a physical process. Such estimation involves inference from the final state (‘indication’) of a process to the value range of a parameter (‘outcome’) in light of theoretical and statistical assumptions. Idealizations are necessary preconditions for the possibility of justifying such inferences. Similarly, claims to accuracy, error and quantity individuation can only be adjudicated against the background of an idealized representation of the measurement process. Chapters 1-3 develop this framework and use it to analyze the inferential structure of standardization procedures performed by contemporary standardization bureaus. Standardizing time, for example, is a matter of constructing idealized models of multiple atomic clocks in a way that allows consistent estimates of duration to be inferred from clock indications. Chapter 4 shows that calibration is a special sort of modeling activity, i.e. the activity of constructing and testing models of measurement processes. Contrary to contemporary philosophical views, the accuracy of measurement outcomes is properly evaluated by comparing model predictions to each other, rather than by comparing observations.
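The following sketch is not from the dissertation, but it illustrates the model-based picture the abstract describes: a measurement outcome (a temperature value range) is inferred from an indication (a sensor voltage) through an idealized linear model, and the claim to accuracy rests on the model’s parameters and their uncertainties. The model form, the numbers, and the GUM-style first-order uncertainty propagation are assumptions chosen for illustration.

```python
# Illustrative sketch: inferring a measurement outcome (a value range for
# temperature) from an instrument indication (a sensor voltage) via an
# idealized linear model with stated uncertainties. All values are hypothetical.

import math

# Idealized model: voltage = a + b * temperature, with calibration estimates
a, u_a = 0.100, 0.002      # offset [V] and its standard uncertainty
b, u_b = 0.010, 0.0001     # sensitivity [V/degC] and its standard uncertainty
v, u_v = 0.352, 0.001      # observed indication [V] and its uncertainty

# Invert the idealized model to estimate the parameter of interest
t = (v - a) / b

# First-order (GUM-style) propagation of uncertainty through the model
dt_dv, dt_da, dt_db = 1 / b, -1 / b, -(v - a) / b**2
u_t = math.sqrt((dt_dv * u_v)**2 + (dt_da * u_a)**2 + (dt_db * u_b)**2)

print(f"outcome: {t:.2f} degC +/- {u_t:.2f} degC (coverage factor k=1)")
```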
175

Rejection Sensitivity, Information Processing Deficits, Attachment Style and Empathic Accuracy in Violent Relationships

Laurance Robillard Unknown Date (has links)
Relationship violence is a serious social problem. Given the prevalence and detrimental effects of relationship violence, much research has been undertaken to investigate the various risk factors that may be associated with this type of violence. In the present research, I examined the interrelationships among several correlates of violence (including rejection sensitivity, cognitive biases, decoding deficits and attachment style) in order to understand what differentiates physically abusive from non-abusive individuals. Hence, the current program of studies examined aggressive behaviours between partners with a focus on risk factors for violent behaviour in men and women and, in particular, on the role of rejection sensitivity in physically aggressive behaviour. In order to examine these constructs, the thesis includes six chapters. Following a review of the literature, a rationale was provided for the creation of an amended measure of rejection sensitivity, as Downey and Feldman’s (1996) Rejection Sensitivity Questionnaire was not suitable for the purposes of the current thesis. Hence, a series of validation studies was conducted in Chapter 2 to test and develop a revised measure of rejection sensitivity that would be applicable to a wider range of intimate relationships (dating, cohabiting and married) and contexts. The study reported in Chapter 3 investigated the role of rejection sensitivity, hostile attributions and attachment patterns in the etiology of intimate partner violence. This study provided preliminary support for insecure attachment and negative attributions as the link between expectations of rejection and intimate partner violence, with a stronger link for male-perpetrated violence compared to female-perpetrated violence. Consistent with the marital violence literature, when mediator and moderator relationships existed, these occurred predominantly in married relationships (as opposed to dating or de facto relationships). The studies reported in Chapters 4 and 5 built on the foundations of Chapter 3 by incorporating two constructs, the ‘overattribution bias’ and empathic accuracy, into the investigation of the associations between rejection sensitivity and violence. Specifically, the study reported in Chapter 4 examined the decoding deficits and inferential biases of maritally-violent and maritally-violent rejection-sensitive men when interpreting their own partner’s messages whilst engaging in a laboratory-based decoding task. Overall, results showed that maritally-violent partner rejection-sensitive men were less accurate than were maritally non-violent partner rejection-sensitive men when interpreting their wives’ positive messages and more accurate when interpreting their wives’ negative messages. Likewise, maritally-violent rejection-sensitive men displayed an inferential bias to perceive their wives’ messages as being more negative, critical and rejecting in intent than did maritally non-violent rejection-sensitive men. In addition, maritally-violent men as a group were less accurate for their own partner’s positive and neutral messages than were maritally non-violent men and more accurate for their own wives’ negative messages than were maritally non-violent men. Finally, maritally-violent men tended to attribute their wives’ messages as being significantly more negative, critical and rejecting in intent than did maritally non-violent men.
Overall, the data suggested that both rejection sensitivity and marital violence were key factors associated with married men’s decoding problems and biased interpretation of their own wives’ messages. In extending the previous findings, the study reported in Chapter 5 examined the decoding accuracy and inferential biases of both maritally-violent and maritally-violent rejection-sensitive men and women in relation to female strangers’ messages. There were no differences between maritally-violent rejection-sensitive women and maritally non-violent rejection-sensitive women on decoding deficits and inferential biases for female strangers. However, there was a trend for maritally-violent women to be more negatively biased than were maritally non-violent women when interpreting female strangers’ messages. Additionally, in contrast to the findings of Chapter 4, the data pointed to independent relationships among rejection sensitivity, violence and married men’s decoding deficits and biases for female strangers’ messages. In particular, there were no differences in decoding deficits or inferential biases between maritally-violent rejection-sensitive and maritally non-violent rejection-sensitive men when decoding female strangers’ messages. Instead, the data revealed that maritally-violent men were poor decoders of female strangers’ positive messages compared to maritally non-violent men and maritally-violent women. In relation to negative messages, maritally-violent men were more accurate for female strangers’ negative messages than were maritally non-violent men; indeed, maritally-violent men had the highest decoding accuracy for negative messages. Maritally-violent men also tended to attribute female strangers’ messages as being significantly more negative, critical and rejecting in intent than did maritally non-violent men and maritally-violent women. Finally, the results showed that maritally-violent rejection-sensitive men’s decoding deficits and biases were relationship-specific, whereas maritally-violent men’s decoding deficits and cognitive biases were global deficits that extended to women other than the men’s wives. Implications of the findings were discussed, as well as the strengths and limitations of the study. The discussion concludes with implications for theory and practice and suggestions for future research.
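As a purely illustrative aside (not the scoring procedure used in these studies), decoding accuracy and an over-attribution-style negativity bias in a message-decoding task of this general kind could be quantified along the following lines; the categories and data below are hypothetical.

```python
# Illustrative sketch only: one simple way to score decoding accuracy and a
# negativity bias from intended vs. perceived message valence. The scheme and
# data are hypothetical, not the study's method or results.

intended  = ["positive", "neutral", "negative", "positive", "negative"]  # sender's intent
perceived = ["negative", "neutral", "negative", "positive", "negative"]  # receiver's reading

# Decoding accuracy: proportion of messages whose intent was read correctly
accuracy = sum(i == p for i, p in zip(intended, perceived)) / len(intended)

# Negativity (over-attribution) bias: share of messages not intended as
# negative that were nevertheless perceived as negative
non_negative = [(i, p) for i, p in zip(intended, perceived) if i != "negative"]
neg_bias = sum(p == "negative" for _, p in non_negative) / len(non_negative)

print(f"decoding accuracy: {accuracy:.2f}, negativity bias: {neg_bias:.2f}")
```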
176

Dimensional Changes of Investment Cast H13 Tool Steel : Measurement and Numerical Modelling

Morwood, Gregory David Unknown Date (has links)
The recent development of prototyping systems that can produce patterns for investment casting significantly faster than traditional techniques has raised interest in the use of casting as a method to produce tooling for downstream prototype testing. However, the accuracy of the casting process remains a major obstacle to the use of these tools. Simultaneous development of numerical modelling techniques suggests that it will be possible to predict casting contraction and distortion. If this were possible, corrections could be made before castings are produced, resulting in time and cost savings, as well as potential improvements in accuracy. Before these models can be applied, there is a need for both material property data and experimental data with which to validate the numerical models. The aims of this work are to: 1) develop further understanding of the processes in investment casting that contribute to the dimensional changes and variability; and 2) develop the required data for numerical modelling and apply this to simulate the dimensional changes in investment casting. An apparatus was designed to measure the dimensional and thermal history of investment castings with displacement transducers and thermocouples. Casting dimensions were also accurately measured to determine the final contraction of nominally unconstrained and thermally constrained castings. Numerical simulations of the temperatures, stress and distortion were compared with the experimental results and provide a detailed explanation of the processes involved. Data for these simulations were developed using a combination of direct measurement and iterative inverse modelling.
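The iterative inverse modelling mentioned above can be pictured with a minimal sketch: a forward model predicts casting dimensions from a candidate material parameter, and that parameter is adjusted until the predictions match the measurements. This is an illustration only; the one-parameter linear-contraction model, the SciPy least-squares fit, and all numbers are assumptions, not the thesis’s actual model or data.

```python
# Illustrative sketch of an iterative inverse-modelling loop: adjust a single
# material parameter (a linear contraction coefficient) until a simple forward
# model reproduces measured casting dimensions. Forward model and data are hypothetical.

from scipy.optimize import least_squares

nominal_lengths = [50.0, 100.0, 150.0]          # pattern dimensions [mm]
measured_lengths = [49.15, 98.30, 147.40]       # measured casting dimensions [mm]
delta_T = 1500.0                                # assumed cooling range [K]

def forward_model(alpha, lengths):
    """Very simple forward model: free linear contraction on cooling."""
    return [L * (1 - alpha * delta_T) for L in lengths]

def residuals(params):
    (alpha,) = params
    predicted = forward_model(alpha, nominal_lengths)
    return [p - m for p, m in zip(predicted, measured_lengths)]

fit = least_squares(residuals, x0=[1e-5])
print(f"estimated contraction coefficient: {fit.x[0]:.2e} 1/K")
```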
177

Advances in Modeling of Physical Systems Using Explicitly Correlated Gaussian Functions

Kirnosov, Nikita January 2015 (has links)
In this dissertation recent advances in modeling various atomic and molecular systems with quantum mechanical calculations employing explicitly correlated Gaussian functions are presented. The author has utilized multiple approaches and considered a number of approximations to develop optimal calculation frameworks. Electronic and muonic molecules and atoms have been considered. A number of unique calculations have been performed and some novel and interesting results, including high accuracy description of the charge asymmetry in the heteronuclear systems and lifetimes of rotationless vibrational levels of diatomic molecules, have been generated.
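For orientation, the general form of an explicitly correlated Gaussian basis function used in this kind of calculation is sketched below in standard literature notation; the parametrization shown is an assumption for illustration and is not quoted from the dissertation.

```latex
% General form of an explicitly correlated Gaussian (ECG) basis function for an
% N-particle system; a_i^{(k)} and b_{ij}^{(k)} are nonlinear variational
% parameters, r_i are particle coordinates and r_ij inter-particle distances.
\phi_k(\mathbf{r}) \;=\; \exp\!\Big[-\sum_{i=1}^{N} a^{(k)}_{i}\, r_i^{2}
                   \;-\; \sum_{i<j} b^{(k)}_{ij}\, r_{ij}^{2}\Big]
% Equivalently, in matrix form, \phi_k = \exp[-\mathbf{r}^{\mathsf T}(A_k \otimes I_3)\,\mathbf{r}]
% with A_k symmetric and positive definite; the r_ij^2 terms carry the explicit
% inter-particle correlation.
```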
178

Khamapirad radiologic criteria as a predictor of pneumonia's bacterial etiology

Bustamante Heinsohn, Diego Victor 11 1900 (has links)
Peer reviewed
179

Three-dimensional analysis of mandibular landmarks, planes and shape, and the symphyseal changes associated with growth and orthodontic treatment

Deller, Cecilia Mercedes 25 October 2017 (has links)
OBJECTIVE: To test the reliability of 3D mandibular landmarks, reference planes and surfaces and assess their correlation with conventional 2D cephalometric measurements, and to analyze changes in the three-dimensional shape of the symphysis due to growth and orthodontic treatment. METHODS: This was a retrospective analysis of CBCTs of healthy orthodontic patients. A total of 32 subjects were included (16 males and 16 females), with mean ages of 10.6 ± 1.5 years before treatment and 15.0 ± 0.9 years after treatment; the mean follow-up time was 4.3 years. Subjects were free of craniofacial anomalies and had no observable pathology on panoramic radiographs. Fifteen subjects had CVM 1 and 17 subjects had CVM 2 before orthodontic treatment; all subjects had CVM 5 after orthodontic treatment. In the first phase, 3D mandibular landmarks were digitized, and planes and landmarks were constructed and compared with conventional 2D mandibular measurements. In the second phase, mandibles were isolated by removing surrounding structures. Pearson correlation and paired t-tests were performed to test for correlation and differences between 2D and 3D measurements, respectively. Statistical analysis was performed using SAS 9.4 software. MorphoJ software (Version 2.0, www.flywings.org.uk) was used for symphysis shape analysis, and Discriminant Function Analysis (DFA) between pre-treatment and post-treatment was used for statistical analysis of the symphysis. RESULTS: We found statistically significant positive correlations between 2D and 3D pre-treatment ramus height (P = 0.01), post-treatment ramus height (P < 0.0001), pre-treatment corpus length (P = 0.0003), post-treatment corpus length (P = 0.04), pre-treatment gonial angle (P < 0.0001), and post-treatment gonial angle (P = 0.05). We also found statistically significant differences in 2D ramus height (P = 0.001), 3D ramus height (P = 0.002), 2D corpus length (P < 0.01), and 3D corpus length (P < 0.01). Comparing symphysis shape between pre-treatment and post-treatment, we found no statistically significant difference (P = 0.99). CONCLUSION: These results demonstrated statistically significant positive correlations between certain 2D and 3D measurements, and pre-treatment and post-treatment differences in 2D and 3D measurements showed consistent results. Symphysis shapes do break out as distinctly separate groups, but the differences between the means are small.
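A minimal sketch of the statistical comparison described above: Pearson correlation for the association between matched 2D and 3D values, and a paired t-test for a systematic difference between them. The measurement values below are hypothetical and only illustrate the analysis, which the study itself ran in SAS.

```python
# Illustrative sketch: Pearson correlation and a paired t-test between matched
# 2D and 3D measurements. Data are hypothetical, not the study's measurements.

from scipy.stats import pearsonr, ttest_rel

ramus_height_2d = [41.2, 44.8, 39.5, 47.1, 43.0, 45.6]   # mm, hypothetical
ramus_height_3d = [42.0, 45.5, 40.1, 48.0, 43.8, 46.2]   # mm, hypothetical

r, p_corr = pearsonr(ramus_height_2d, ramus_height_3d)    # association between methods
t, p_diff = ttest_rel(ramus_height_2d, ramus_height_3d)   # systematic difference

print(f"Pearson r = {r:.2f} (p = {p_corr:.3f}); paired t = {t:.2f} (p = {p_diff:.3f})")
```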
180

REMOTE SENSING BASED DETECTION OF FORESTED WETLANDS: AN EVALUATION OF LIDAR, AERIAL IMAGERY, AND THEIR DATA FUSION

Suiter, Ashley E. 01 May 2015 (has links)
Multi-spectral imagery provides a robust and low-cost dataset for assessing wetland extent and quality over broad regions and is frequently used for wetland inventories. However, in forested wetlands, hydrology is obscured by the tree canopy, making it difficult to detect with multi-spectral imagery alone. Because of this, classification of forested wetlands often includes greater errors than that of other wetland types. Elevation and terrain derivatives have been shown to be useful for modelling wetland hydrology, but few studies have addressed the use of LiDAR intensity data for detecting hydrology in forested wetlands. Because the LiDAR signal tends to be attenuated by water, this research proposed the fusion of LiDAR intensity data with LiDAR elevation, terrain data, and aerial imagery for the detection of forested wetland hydrology. We examined the utility of LiDAR intensity data and determined whether the fusion of LiDAR-derived data with multi-spectral imagery increased the accuracy of forested wetland classification compared with a classification performed with multi-spectral imagery alone. Four classifications were performed: Classification A – All Imagery, Classification B – All LiDAR, Classification C – LiDAR without Intensity, and Classification D – Fusion of All Data. These classifications were performed using random forest, and each resulted in a 3-foot resolution thematic raster of forested upland and forested wetland locations in Vermilion County, Illinois. The accuracies of these classifications were compared using the Kappa Coefficient of Agreement, and importance statistics produced within the random forest classifier were evaluated in order to understand the contribution of individual datasets. Classification D, which used the fusion of LiDAR and multi-spectral imagery as input variables, had moderate to strong agreement between reference data and classification results. The classification performed using all the LiDAR data and its derivatives (intensity, elevation, slope, aspect, curvatures, and Topographic Wetness Index) was the most accurate classification, with a Kappa of 78.04%, indicating moderate to strong agreement, whereas the classification performed with LiDAR derivatives but without intensity data had less agreement than would be expected by chance, indicating that intensity data contributed significantly to the accuracy of the all-LiDAR classification.
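A compact sketch of the general workflow described above: a random forest classifier trained on fused LiDAR and imagery predictors, map agreement summarized with Cohen's kappa, and per-variable importances inspected. It uses scikit-learn on synthetic stand-in data; the feature list and values are assumptions, not the study's inputs.

```python
# Illustrative sketch: random forest classification of wetland vs. upland pixels
# from fused predictors, evaluated with Cohen's kappa, plus variable importances.
# Features and data are hypothetical stand-ins, not the study's datasets.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import cohen_kappa_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
features = ["intensity", "elevation", "slope", "aspect", "curvature", "twi", "red", "nir"]
X = rng.normal(size=(1000, len(features)))                  # stand-in for per-pixel predictors
y = (X[:, 0] + 0.5 * X[:, 5] + rng.normal(scale=0.5, size=1000) > 0).astype(int)  # 1 = wetland

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)

kappa = cohen_kappa_score(y_test, rf.predict(X_test))       # agreement beyond chance
print(f"Cohen's kappa: {kappa:.2f}")
for name, importance in sorted(zip(features, rf.feature_importances_), key=lambda t: -t[1]):
    print(f"{name:>10s}: {importance:.3f}")
```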
