About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
701

Automatic Features Identification with Infrared Thermography in Fever Screening

Surabhi, Vijaykumar 12 January 2012 (has links)
The goal of this thesis is to develop an algorithm to process infrared images and achieve automatic identification of moving subjects with fever. The identification is based on two main features: the distinction between the geometry of a human face and other objects in the field of view of the camera, and the temperature of the radiating object. Infrared thermography is a remote sensing technique used to measure temperatures based on emitted infrared radiation. Applications include fever screening in major public places such as airports and hospitals. The currently accepted practice of screening requires people to stand in a line while temperature measurements are carried out one person at a time. However, in the case of mass screening of moving people, the accuracy of the measurements is still under investigation. An algorithm consisting of image processing to threshold objects based on temperature, template matching, and hypothesis testing is proposed to achieve automatic identification of fever subjects. The algorithm was first tested on training data to obtain a threshold value (used to discriminate between face and non-face shapes) corresponding to a false detection rate of 5%, which in turn corresponds to an 85% probability of detection under the Neyman-Pearson criterion. Testing the algorithm on several simulated and experimental images (which reflect relevant scenarios characterizing crowded places) shows that it can be beneficially implemented to introduce automation in the process of detecting moving subjects with fever.
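A minimal sketch of the three-stage pipeline the abstract describes (temperature thresholding, template matching, threshold-based decision), using only NumPy. The fever cutoff, skin-temperature floor, face template, and match threshold below are hypothetical placeholders; in the thesis the decision threshold is calibrated on training data for a 5% false-detection rate.

import numpy as np

FEVER_TEMP_C = 38.0     # assumed fever cutoff, not the thesis's value
MATCH_THRESHOLD = 0.72  # placeholder; the thesis calibrates this on training
                        # data for a 5% false-detection rate (Neyman-Pearson)
SKIN_TEMP_C = 30.0      # assumed lower bound for skin-like pixels

def normalized_cross_correlation(patch, template):
    """Similarity between an image patch and the face-shape template."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.linalg.norm(p) * np.linalg.norm(t)
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def screen_frame(thermal_frame, face_template):
    """Return (row, col) positions flagged as feverish faces in one IR frame."""
    th, tw = face_template.shape
    hot_mask = thermal_frame >= SKIN_TEMP_C               # 1. temperature thresholding
    detections = []
    for r in range(thermal_frame.shape[0] - th + 1):
        for c in range(thermal_frame.shape[1] - tw + 1):
            if not hot_mask[r:r + th, c:c + tw].any():
                continue
            patch = thermal_frame[r:r + th, c:c + tw]
            score = normalized_cross_correlation(patch, face_template)    # 2. template matching
            if score >= MATCH_THRESHOLD and patch.max() >= FEVER_TEMP_C:  # 3. decision rule
                detections.append((r, c))
    return detections

# Tiny synthetic usage: a face-shaped hot blob in an otherwise cool frame.
template = np.zeros((8, 6)); template[1:7, 1:5] = 1.0
frame = np.full((32, 32), 22.0)
frame[11:17, 13:17] = 38.5                                # hypothetical feverish subject
print(screen_frame(frame, template))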
702

Deserting Gender: A Feminist Rhetorical Approach to Vietnam War Novels

Womack, Anne-Marie May 2011 (has links)
Female characters and references to femininity throughout American war literature disrupt discursive and biological divisions of the masculine and feminine. In examining gender and war literature over the twentieth century, I propose an alternative genealogy of American war literature in which narratives since the end of the nineteenth century initiate two related patterns of gender representation that Vietnam War literature dramatically expands: they critique aggression, camaraderie, and heroism, rejecting these traditional sites of masculinity through desertion narratives, and they harness sentimentality, domesticity, motherhood, and penetration, embracing these traditional sites of femininity in ways that disrupt gender norms. By examining these sites of cross-gender identification through psychoanalytic, rhetorical, and feminist methods, I argue that narratives by Stephen Crane, Ernest Hemingway, Kurt Vonnegut, Tim O'Brien, Stephen Wright, and Larry Heinemann reveal the power of contemporary redefinitions of gender by absorbing feminist discourse into the performance of masculinity.
704

Essays in Industrial Organization and Econometrics

Blevins, Jason Ryan January 2010 (has links)
This dissertation consists of three chapters relating to identification and inference in dynamic microeconometric models, including dynamic discrete games with many players, dynamic games with discrete and continuous choices, and semiparametric binary choice and duration panel data models.

The first chapter provides a framework for estimating large-scale dynamic discrete choice models (both single- and multi-agent models) in continuous time. The advantage of working in continuous time is that state changes occur sequentially, rather than simultaneously, avoiding a substantial curse of dimensionality that arises in multi-agent settings. Eliminating this computational bottleneck is the key to providing a seamless link between estimating the model and performing post-estimation counterfactuals. While recently developed two-step estimation techniques have made it possible to estimate large-scale problems, solving for equilibria remains computationally challenging. In many cases, the models that applied researchers estimate do not match the models that are then used to perform counterfactuals. By modeling decisions in continuous time, we are able to take advantage of the recent advances in estimation while preserving a tight link between estimation and policy experiments. We also consider estimation in situations with imperfectly sampled data, such as when we do not observe the decision not to move, or when data is aggregated over time, such as when only discrete-time data are available at regularly spaced intervals. We illustrate the power of our framework using several large-scale Monte Carlo experiments.

The second chapter considers semiparametric panel data binary choice and duration models with fixed effects. Such models are point identified when at least one regressor has full support on the real line. It is common in practice, however, to have only discrete or continuous, but possibly bounded, regressors. We focus on identification, estimation, and inference for the identified set in such cases, when the parameters of interest may only be partially identified. We develop a set of general results for criterion-function-based estimation and inference in partially identified models which can be applied to both regular and irregular models. We apply our general results first to a fixed effects binary choice panel data model where we obtain a sharp characterization of the identified set and propose a consistent set estimator, establishing its rate of convergence under different conditions. Rates arbitrarily close to n^(-1/3) are possible when a continuous, but possibly bounded, regressor is present. When all regressors are discrete the estimates converge arbitrarily fast to the identified set. We also propose a subsampling-based procedure for constructing confidence regions in the models we consider. Finally, we carry out a series of Monte Carlo experiments to illustrate and evaluate the proposed procedures. We also consider extensions to other fixed effects panel data models such as binary choice models with lagged dependent variables and duration models.

The third chapter considers nonparametric identification of dynamic games of incomplete information in which players make both discrete and continuous choices. Such models are commonly used in applied work in industrial organization where, for example, firms make discrete entry and exit decisions followed by continuous investment decisions. We first review existing identification results for single-agent dynamic discrete choice models before turning to single-agent models with an additional continuous choice variable and finally to multi-agent models with both discrete and continuous choices. We provide conditions for nonparametric identification of the utility function in both cases.
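The computational point of the first chapter, that in continuous time state changes arrive one at a time so joint action profiles never need to be enumerated, can be illustrated with a toy simulation of competing exponential clocks. This is a generic sketch with made-up rates and a made-up decision rule, not the dissertation's model or estimator.

import numpy as np

rng = np.random.default_rng(0)

N_FIRMS = 5          # hypothetical number of players
MOVE_RATE = 1.0      # assumed Poisson rate of decision opportunities per firm
DEMAND_RATE = 0.5    # assumed rate of exogenous demand shocks
T_MAX = 100.0

def simulate():
    """Simulate a continuous-time dynamic game: events arrive one at a time
    via competing exponential clocks, so state changes are sequential."""
    t = 0.0
    active = np.zeros(N_FIRMS, dtype=bool)   # entry/exit state of each firm
    demand = 1.0
    history = []
    rates = np.append(np.full(N_FIRMS, MOVE_RATE), DEMAND_RATE)
    total_rate = rates.sum()
    while t < T_MAX:
        t += rng.exponential(1.0 / total_rate)          # waiting time to next event
        event = rng.choice(N_FIRMS + 1, p=rates / total_rate)
        if event < N_FIRMS:
            # A single firm gets a decision opportunity; toy rule:
            # stay/enter when demand is high relative to competition.
            n_competitors = active.sum() - active[event]
            active[event] = demand > 0.3 * n_competitors
        else:
            demand = rng.exponential(1.0)               # exogenous demand jump
        history.append((t, active.copy(), demand))
    return history

hist = simulate()
print(f"{len(hist)} sequential events, final number of active firms = {hist[-1][1].sum()}")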
705

A Study of Exploring the Identification Relationships between Consumers and Companies

Lai, Ssu-chi 13 September 2012 (has links)
For companies, the cost of acquiring a new customer is several times the cost of retaining an existing one. Furthermore, the purchasing power of an existing customer is usually greater than that of a new one. Therefore, more and more companies attempt to establish long-term relationships with customers in order to encourage repeat patronage and other beneficial behaviors. The concept of consumer-company identification (C-C identification) has been used to characterize the deep, meaningful, and long-term relationships between consumers and companies. This research analyzes the antecedents and consequences of C-C identification. We suggest that high identity distinctiveness and identity prestige may lead to high C-C identification, and that high C-C identification should enhance consumers' repurchase intention and word-of-mouth (WOM). A questionnaire survey was sent to consumers with recent shopping experience in Taiwanese department stores, yielding 294 valid responses. We analyze the results using structural equation modeling (SEM) and find that identity distinctiveness and identity prestige each have a positive relationship with C-C identification, and that C-C identification has a positive effect on repurchase intention and WOM. Finally, the managerial implications of this study may help corporate managers refine their marketing strategies or even improve their firms' performance.
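A sketch of how the hypothesized paths (identity distinctiveness and prestige to C-C identification, and C-C identification to repurchase intention and WOM) could be estimated with SEM in Python. The study does not name its software; semopy is used here purely as an illustration, and the synthetic data, coefficients, and column names are assumptions standing in for the actual survey responses.

import numpy as np
import pandas as pd
from semopy import Model   # assumes the semopy SEM package is available

rng = np.random.default_rng(4)

# Synthetic stand-in for the 294 survey responses (columns are construct scores).
n_obs = 294
distinctiveness = rng.normal(size=n_obs)
prestige = rng.normal(size=n_obs)
cc_identification = 0.5 * distinctiveness + 0.4 * prestige + rng.normal(scale=0.7, size=n_obs)
data = pd.DataFrame({
    "identity_distinctiveness": distinctiveness,
    "identity_prestige": prestige,
    "cc_identification": cc_identification,
    "repurchase_intention": 0.6 * cc_identification + rng.normal(scale=0.8, size=n_obs),
    "word_of_mouth": 0.5 * cc_identification + rng.normal(scale=0.8, size=n_obs),
})

# Hypothesized paths: distinctiveness, prestige -> C-C identification -> repurchase, WOM.
MODEL_DESC = """
cc_identification ~ identity_distinctiveness + identity_prestige
repurchase_intention ~ cc_identification
word_of_mouth ~ cc_identification
"""

model = Model(MODEL_DESC)
model.fit(data)            # maximum-likelihood estimation of the path coefficients
print(model.inspect())     # estimates, standard errors, p-values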
706

Subsurface Flow Management and Real-Time Production Optimization using Model Predictive Control

Lopez, Thomas Jai December 2011 (has links)
One of the key challenges in the oil and gas industry is to best manage reservoirs under different conditions, constrained by production rates based on various economic scenarios, in order to meet energy demands and maximize profit. To address these challenges, a transformation in how "real-time" data are utilized has to be brought to bear, moving from static decision making to dynamic, data-driven management of production in conjunction with real-time risk assessment. The use of modern methods of computational modeling and simulation may be the only means to account for the two major tasks involved in this paradigm shift: (1) large-scale computations; and (2) efficient utilization of the deluge of data streams. Recently, history matching and optimization were brought together in the oil industry into an integrated and more structured approach called optimal closed-loop reservoir management. Closed-loop control algorithms have already been applied extensively in other engineering fields, including aerospace, mechanical, electrical and chemical engineering. However, their applications to porous media flow, such as oil and gas recovery, aquifer management, bio-landfill optimization, and CO2 sequestration, have been minimal, because the large-scale nature of these problems generates models that are too complex for controller design and real-time implementation; for the same reason, their applicability to realistic fields remains an open question. Basically, three sources of high dimensionality can be identified in the underlying reservoir models: the size of the parameter space, the size of the state space, and the number of scenarios or realizations necessary to account for uncertainty. In this work we address the problem of high dimensionality by focusing on mitigating the size of the state-space models by means of model-order reduction techniques in a systems framework. We show how one can obtain accurate reduced-order models which are amenable to fast implementations in the closed-loop framework. The research focuses on System Identification (System-ID) (Jansen, 2009) and Model Predictive Control (MPC) (Gildin, 2008) to serve this purpose. A mathematical treatment of System-ID and MPC as applied to reservoir simulation is presented. Linear MPC is studied on two specific reservoir models after generating low-order reservoir models using System-ID methods. All comparisons are based on a set of realistic simulations using the commercial reservoir simulator Eclipse. With effective improvements in oil recovery and reductions in water production for both cases considered, we reinforce the case for implementing MPC and System-ID toward the ultimate goal of "real-time" production optimization.
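A compact sketch of the closed-loop idea described above: a low-order linear model (standing in for a System-ID or model-order-reduced reservoir model) controlled by receding-horizon MPC with input constraints. The model matrices, bounds, horizon, and production target are made up, and cvxpy is used only to illustrate the constrained quadratic program solved at each step; this is not the thesis's actual workflow or simulator coupling.

import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)

# Hypothetical reduced-order linear model x_{k+1} = A x_k + B u_k, y_k = C x_k,
# standing in for a reduced reservoir model (values are made up).
n, m = 4, 2
A = 0.9 * np.eye(n) + 0.01 * rng.standard_normal((n, n))
B = rng.standard_normal((n, m))
C = rng.standard_normal((1, n))        # single output, e.g. an oil production rate

HORIZON = 10
U_MAX = 1.0                            # assumed injection/production rate bound
Y_REF = 5.0                            # assumed production target

def mpc_step(x0):
    """Solve one receding-horizon problem and return the first input move."""
    x = cp.Variable((n, HORIZON + 1))
    u = cp.Variable((m, HORIZON))
    cost = 0
    constraints = [x[:, 0] == x0]
    for k in range(HORIZON):
        cost += cp.sum_squares(C @ x[:, k + 1] - Y_REF) + 0.01 * cp.sum_squares(u[:, k])
        constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                        cp.abs(u[:, k]) <= U_MAX]
    cp.Problem(cp.Minimize(cost), constraints).solve()
    return u.value[:, 0]

# Closed loop: apply the first optimal move, advance the plant, repeat.
x = np.zeros(n)
for _ in range(20):
    u0 = mpc_step(x)
    x = A @ x + B @ u0
print("final output:", (C @ x).item())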
707

Combined analytical and experimental approaches to dynamic component stress prediction

Chierichetti, Maria 28 June 2012 (has links)
In modern times, the ability to investigate the aeroelastic behavior of dynamic components on rotorcraft has become essential for the prediction of their useful fatigue life. At the same time, the aeroelastic modeling of a rotorcraft is particularly complex and costly. Inaccuracies in numerical predictions are mostly due to imprecision in the structural modeling, the presence of structural degradation, or limited information on the aerodynamic loads. The integration of experimental measurements on dynamic components such as rotor blades has the potential to improve fatigue estimation, augment the knowledge of the dynamic behavior, and inform numerical models. The objective of this research is the development of a combined numerical and experimental approach, named the Confluence Algorithm, that accurately predicts the response of dynamic components with a limited set of experimental data. The integration of experimental measurements into a numerical algorithm enables the continuous and accurate tracking of the dynamic strain and stress fields. The Confluence Algorithm systematically updates the numerical model of the external loads and the mass and stiffness distributions to improve the representation and extrapolation of the experimental data, and to extract information on the response of the system at non-measured locations. The capabilities of this algorithm are first verified in a numerical framework and with well-controlled lab experiments. Numerical results from a comprehensive UH-60A multibody model are then compared with available experimental data. These analyses demonstrate that the Confluence Algorithm improves the accuracy of numerical predictions of the dynamic response of systems characterized by periodic behavior, even in the presence of non-linearities. The algorithm enables the use of simplified models that are corrected through experimental data to achieve accurate tracking of the system.
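The following is not the Confluence Algorithm itself, but a minimal numerical illustration of its underlying idea under simplified, assumed conditions: sparse measurements are used to correct a model quantity (here a single load scale factor on a small static stiffness model), after which the updated model predicts the response at locations where no sensors are placed.

import numpy as np

rng = np.random.default_rng(2)

# Hypothetical static model K x = f: a small tridiagonal stiffness matrix
# standing in for a structural model; the "true" load differs from the nominal one.
n_dof = 6
K = (np.diag([4.0] * n_dof)
     + np.diag([-1.0] * (n_dof - 1), 1)
     + np.diag([-1.0] * (n_dof - 1), -1))
f_nominal = np.ones(n_dof)
f_true = 1.3 * f_nominal                      # unknown in practice

measured_dofs = [1, 3]                        # assumed sparse sensor locations
x_true = np.linalg.solve(K, f_true)
measurements = x_true[measured_dofs] + 1e-4 * rng.standard_normal(len(measured_dofs))

# Update step: find the load scale that best reproduces the measurements,
# in the spirit of correcting the external-load model from sparse sensors.
x_nominal = np.linalg.solve(K, f_nominal)
G = x_nominal[measured_dofs]                  # sensitivity of measured DOFs to the scale
a_hat, *_ = np.linalg.lstsq(G[:, None], measurements, rcond=None)
scale = float(a_hat[0])
x_updated = scale * x_nominal                 # response now tracked at ALL DOFs

print("estimated load scale:", scale)
print("max error at non-measured DOFs:", np.abs(x_updated - x_true)[[0, 2, 4, 5]].max())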
708

Model and System Inversion with Applications in Nonlinear System Identification and Control

Markusson, Ola January 2001 (has links)
No description available.
709

Regression on Manifolds with Implications for System Identification

Ohlsson, Henrik January 2008 (has links)
The trend today is to use many inexpensive sensors instead of a few expensive ones, since the same accuracy can generally be obtained by fusing several dependent measurements. It also follows that the robustness against failing sensors is improved. As a result, the need for high-dimensional regression techniques is increasing.

As measurements are dependent, the regressors will be constrained to some manifold. There is then a representation of the regressors, of the same dimension as the manifold, containing all predictive information. Since the manifold is commonly unknown, this representation has to be estimated using data. For this, manifold learning can be utilized. Having found a representation of the manifold-constrained regressors, this low-dimensional representation can be used in an ordinary regression algorithm to find a prediction of the output. This has further been developed in the Weight Determination by Manifold Regularization (WDMR) approach.

In most regression problems, prior information can improve prediction results. This is also true for high-dimensional regression problems. Research on including physical prior knowledge in high-dimensional regression, i.e., gray-box high-dimensional regression, has been rather limited, however. We explore the possibilities to include prior knowledge in high-dimensional manifold-constrained regression by means of regularization. The result will be called gray-box WDMR. In gray-box WDMR we have the possibility to restrict ourselves to predictions which are physically plausible. This is done by incorporating dynamical models for how the regressors evolve on the manifold.
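A generic sketch of manifold-constrained regression in the spirit described above (not the WDMR or gray-box WDMR algorithms themselves): learn a low-dimensional representation of regressors that lie on a manifold, then run an ordinary regressor on that representation. The data-generating process, the Isomap settings, and the nearest-neighbour regressor are illustrative assumptions.

import numpy as np
from sklearn.manifold import Isomap
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Hypothetical high-dimensional regressors constrained to a 1-D manifold:
# twenty "sensors" all driven by a single latent variable theta.
theta = rng.uniform(0, np.pi, 500)
X = np.column_stack([np.cos(k * theta) for k in range(1, 21)])  # 20-D observations
y = np.sin(theta)                                               # output to predict

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# 1. Learn a low-dimensional representation of the manifold-constrained regressors.
embedding = Isomap(n_components=1, n_neighbors=10)
Z_tr = embedding.fit_transform(X_tr)
Z_te = embedding.transform(X_te)

# 2. Ordinary (here: nearest-neighbour) regression in the low-dimensional coordinates.
reg = KNeighborsRegressor(n_neighbors=5).fit(Z_tr, y_tr)
print("test RMSE:", float(np.sqrt(np.mean((reg.predict(Z_te) - y_te) ** 2))))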
710

Data-based Harmonic Source Identification

Erfanian Mazin, Hooman Unknown Date
No description available.
