
Control of EGR and VGT for emission control and pumping work minimization in diesel engines

Wahlström, Johan January 2006 (has links)
Legislators steadily increase the demands for lowered emissions from heavy-duty vehicles. To meet these demands it is necessary to integrate technologies like Exhaust Gas Recirculation (EGR) and Variable Geometry Turbochargers (VGT) together with advanced control systems. A control structure with PID controllers and selectors is proposed and investigated for coordinated control of EGR valve and VGT position in heavy-duty diesel engines. The main control goals are to fulfill the legislated emission levels, to reduce the fuel consumption, and to ensure safe operation of the turbocharger. These goals are achieved through regulation of the normalized oxygen/fuel ratio and the intake manifold EGR fraction. These are chosen as the main performance variables because they are more strongly coupled to the emissions than manifold pressure or air mass flow, which makes it easy to adjust set-points depending on, e.g., measured emissions during an emission calibration process. In addition, a mechanism for fuel-efficient operation is incorporated in the structure; this is achieved by minimizing the pumping work. To design a successful control structure, a mean value model of a diesel engine is developed and validated. The intended applications of the model are system analysis, simulation, and development of model-based control systems. Model equations and tuning methods for the model parameters are described for each subsystem. Static and dynamic validations of the entire model show mean relative errors that are less than 12%. Based on a system analysis of the model, a key characteristic of the control structure is that the oxygen/fuel ratio is controlled by the EGR valve and the EGR fraction by the VGT position, in order to handle a sign reversal in the system from VGT to oxygen/fuel ratio. For efficient calibration, an automatic controller tuning method is developed.
The controller objectives are captured in a cost function, which is evaluated using a method that chooses representative transients. The performance is evaluated on the European Transient Cycle. It is demonstrated how the weights in the cost function influence behavior, and that the tuning method is important for improving the control performance compared to using only a standard method. It is also demonstrated that the controller structure performs well with respect to all control objectives. In combination with its efficient tuning, the controller structure thus fulfills all requirements for successful application. / Report code: LiU-TEK-LIC-2006:52.
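The PID-plus-selector idea described in this abstract can be sketched in a few lines. Everything below is a hypothetical illustration: the gains, set-points, dictionary keys, and the particular max-selector used as a turbocharger-protection branch are assumptions made for the sketch, not the thesis's validated structure or engine model.

```python
class PID:
    """Discrete PI controller with output saturation and anti-windup."""
    def __init__(self, kp, ki, dt, u_min=0.0, u_max=1.0):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.u_min, self.u_max = u_min, u_max
        self.i = 0.0  # integrator state

    def step(self, error):
        u = self.kp * error + self.i
        u_sat = min(max(u, self.u_min), self.u_max)
        # Anti-windup: when saturated, integrate only if the error
        # would drive the output back toward the unsaturated region.
        if u == u_sat or (u > u_sat) == (error < 0):
            self.i += self.ki * self.dt * error
        return u_sat

def controller_step(lam_o, x_egr, n_t, sp, pids):
    """One sample of a coordinated EGR/VGT structure: oxygen/fuel ratio
    (lam_o) -> EGR-valve opening, EGR fraction (x_egr) -> VGT opening,
    with a max-selector that forces the VGT open when the turbocharger
    speed n_t exceeds its limit (hypothetical protection branch)."""
    u_egr = pids["egr"].step(lam_o - sp["lam_o"])
    u_vgt_track = pids["vgt"].step(sp["x_egr"] - x_egr)
    u_vgt_limit = pids["nt"].step(n_t - sp["nt_max"])  # active only above limit
    return u_egr, max(u_vgt_track, u_vgt_limit)
```

Here a larger `u_vgt` is read as a more open turbine; the max-selector then implements an override that prioritizes turbocharger safety over EGR-fraction tracking, which is the general role selectors play in the structure.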

The Performance of Technical Analysis : A case study in Chinese domestic A share

Geng, Haoming, Wang, Cheng January 2010 (has links)
In this thesis, we conduct a case study applying simple technical trading rules to the Chinese stock market. The rules we test are moving average rules and trading range breakout rules. The stock indices we test are the SSE A (Shanghai A) and SZSE A (Shenzhen A) shares, which are limited to Chinese domestic traders. Our trading rule framework is mainly from Brock, Lakonishok & LeBaron (1992), which includes the most basic technical trading rules and covers various period lengths; we add the 25-day moving average to this framework. We obtained our data from DataStream; the data are the daily closing prices of the two indices mentioned above.

We compared the mean return and Sharpe ratio with buy-and-hold. We further calculated break-even transaction costs to test whether the technical trading rules can still add wealth for investors after adjusting for transaction costs. Our results show that most technical trading rules perform better than buy-and-hold. VMA rules perform better than FMA and TRB rules, and short periods (25 and 50 days) perform better than longer ones. On mean return, our data violated the assumptions of parametric statistical tests, so we performed non-parametric tests: all trading rules except FMA (1, 25, 0) were statistically significant at the 95% level against buy-and-hold, and all trading rules produced higher Sharpe ratios than buy-and-hold. On transaction costs, seven trading rules on SSE A performed worse than buy-and-hold; all the other rules provided positive break-even transaction costs. Across all trading rules, both stock markets offered positive break-even transaction costs, 0.436% for SSE A and 1.369% for SZSE A, both higher than the maximum transaction costs an investor actually bears.
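As a concrete illustration of the rule family tested in this thesis, a variable-length moving average (VMA) rule can be sketched as follows. The helper names and the toy price series in the example are invented, and the details (signal conventions, return accounting) are simplified relative to the Brock, Lakonishok and LeBaron specification.

```python
def sma(prices, n):
    """Simple n-day moving average; None until n closes are available."""
    return [None if i + 1 < n else sum(prices[i + 1 - n:i + 1]) / n
            for i in range(len(prices))]

def vma_signals(prices, short_n, long_n):
    """VMA rule: +1 (buy) when the short MA is above the long MA,
    -1 (sell) when below, 0 while the long MA is still undefined."""
    s, l = sma(prices, short_n), sma(prices, long_n)
    return [0 if l[i] is None else (1 if s[i] > l[i] else -1)
            for i in range(len(prices))]

def mean_rule_return(prices, signals):
    """Mean one-day return earned by holding the previous day's signal."""
    held = [signals[i - 1] * (prices[i] - prices[i - 1]) / prices[i - 1]
            for i in range(1, len(prices)) if signals[i - 1] != 0]
    return sum(held) / len(held) if held else 0.0
```

A break-even transaction cost, in this spirit, is the per-trade cost at which the rule's return net of costs falls back to the buy-and-hold return.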

Pooling Data from Similar Randomized Clinical Trials Comparing Latanoprost with Timolol; Medical Results and Statistical Aspects

Hedman, Katarina January 2003 (has links)
Two different principles were studied. First, statistical analysis techniques were used to obtain medical results from a patient population. Second, the patient population was used to study the statistical analysis techniques.

Medical conclusions: latanoprost and timolol treatment showed a statistically significant and clinically useful mean IOP reduction in a typical worldwide clinical trial population. Latanoprost reduced the IOP 1.6 mm Hg more than timolol. The IOP reduction was maintained with timolol and slightly strengthened with latanoprost up to 6 months of treatment. The mean IOP reduction was maintained during 2 years of latanoprost treatment. The overall risk of withdrawal due to insufficient IOP reduction with latanoprost was 8%.

The statistical methodological issues are of a general and recurring character in trial design for IOP reduction: should statistical hypothesis testing be based on the mean intraocular pressure (IOP) or on the proportion of patients who reach a specific IOP level? Should the estimate of the IOP or IOP reduction be based on single eyes, on the mean of bilaterally eligible and identically treated eyes, or on the difference between an eye with active treatment and a placebo-treated contralateral eye? And is the mean of replicated recordings useful? Statistical methodological conclusions: the most effective response variable varies with the selected patient population. Therefore, the trial design process should include a comparison of the variability, test power, and required sample size for the possible response variables in a sample of the target population. At a minimum, such a statistical assessment should be performed.
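The recommended design-stage comparison can be illustrated with a standard normal-approximation sample size calculation. Only the 1.6 mm Hg difference comes from the abstract; the candidate response variables' standard deviations below are invented for illustration, so the resulting sample sizes are not those of the actual trials.

```python
import math
from statistics import NormalDist

def n_per_group(delta, sd, alpha=0.05, power=0.8):
    """Per-group sample size for a two-sample comparison of means
    (normal approximation): n = 2*((z_{1-a/2} + z_power) * sd / delta)^2."""
    z = NormalDist().inv_cdf
    return math.ceil(2 * ((z(1 - alpha / 2) + z(power)) * sd / delta) ** 2)

# Hypothetical candidate response variables and assumed SDs (mm Hg):
candidates = {
    "single eye": 3.5,
    "mean of both eligible eyes": 2.8,  # averaging two eyes reduces variance
    "mean of replicated recordings": 3.0,
}
for name, sd in candidates.items():
    print(f"{name}: n = {n_per_group(1.6, sd)} per group")
```

The less variable response variable needs fewer patients for the same power, which is exactly the comparison the abstract argues should precede the choice of response variable.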

Nonparametric statistical inference for dependent censored data

El Ghouch, Anouar 05 October 2007 (has links)
A frequent problem in practical survival data analysis is censoring. A censored observation occurs when observation of the event time (duration or survival time) is prevented by the occurrence of an earlier competing event (the censoring time). Censoring may have different causes: the loss of some subjects under study, the end of the follow-up period, drop-out, the termination of the study, or the limited sensitivity of a measurement instrument. The literature on censored data focuses on the i.i.d. case. However, in many real applications the data are collected sequentially in time or space, and the assumption of independence then does not hold. Here we give only some typical examples from the literature involving correlated data that are subject to censoring. In clinical trials it frequently happens that patients from the same hospital have correlated survival times due to unmeasured variables such as the quality of the hospital equipment. Censored correlated data are also a common problem in environmental and spatial (geographical or ecological) statistics: due to the process used in the data sampling procedure, e.g. the analytical equipment, only measurements that exceed some threshold, such as the method detection limit or the instrumental detection limit, can be included in the data analysis. Many other examples can be found in fields like econometrics and financial statistics; observations on the duration of unemployment, for example, may be right censored and are typically correlated. When the data are not independent and are subject to censoring, estimation and inference become more challenging mathematical problems with a wide area of applications. In this context, we propose some new and flexible tools based on a nonparametric approach.
More precisely, allowing dependence between individuals, our main contributions to this domain concern the following aspects. First, we develop more suitable confidence intervals for a general class of functionals of a survival distribution via the empirical likelihood method. Second, we study the problem of conditional mean estimation using the local linear technique. Third, we develop and study a new estimator of the conditional quantile function, also based on the local linear method. For each proposed method, asymptotic results such as consistency and asymptotic normality are derived, and the finite sample performance is evaluated in a simulation study.
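For background, the basic nonparametric tool for right-censored (independent) durations is the product-limit estimator of Kaplan and Meier; a minimal sketch with invented toy data is given below. The dissertation's own contributions (empirical likelihood intervals and local linear conditional mean and quantile estimators under dependence) build on estimators of this kind but are not reproduced here.

```python
def kaplan_meier(times, events):
    """Product-limit estimator: return [(t, S_hat(t))] at each event time.
    events[i] = 1 if the duration was observed, 0 if right censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e == 1)  # events at t
        m = sum(1 for tt, e in data if tt == t)             # leaving risk set
        if d > 0:
            s *= 1.0 - d / n_at_risk
            curve.append((t, s))
        n_at_risk -= m
        i += m
    return curve
```

With durations [1, 2, 2, 3] where the observation at time 2 with event flag 0 is censored, the survival estimate steps down only at the observed event times, and the censored subject still contributes to the risk set before time 2.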

Effects of interference on carrier tracking in fading and symbol synchronization

Emad, Amin 11 1900 (has links)
Synchronization is an essential part of every digital communication receiver. While in bandpass coherent transmission frequency and phase synchronization play a very important role in reliable transmission, symbol timing recovery is a necessary part of every baseband and bandpass coherent receiver. This dissertation deals with the problem of synchronization in the presence of fading and interference. First, the performance of an automatic frequency control (AFC) loop is investigated using two parameters, the average switching rate and the mean time to loss of lock. These parameters are derived in closed form or as integral-form formulas for different scenarios of modulated and unmodulated signals in different fading channels when one interference signal is present at the input of the AFC. The results are then generalized to the noisy fading scenario, and it is shown that in the Rayleigh fading case the performance of the AFC becomes better when the desired signal is noisier. In the second part, the problem of symbol timing recovery is investigated in systems subject to intersymbol interference, and the non-data-aided maximum likelihood synchronizer is derived for these channels. Then, a new simple bound on the performance of synchronizers is derived and compared to previously known lower bounds. It is shown that while this lower bound addresses the shortcomings of the well-known modified Cramér-Rao bound at small values of signal-to-noise ratio, it is much easier to compute than another well-known bound, the detection theory bound. / Communications
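As a toy illustration of what an AFC loop does, the sketch below tracks the frequency of a clean complex exponential with a cross-product (phase-difference) discriminator. This is a noiseless, unmodulated, non-fading special case with an invented loop gain, far simpler than the modulated fading-plus-interference scenarios analyzed in the dissertation.

```python
import cmath
import math

def afc_track(freq, n_samples=2000, gain=0.05):
    """First-order AFC: mix the input down with an NCO, estimate the
    residual frequency from the phase increment between consecutive
    samples, and feed it back into the estimate (cycles/sample)."""
    f_hat, nco_phase = 0.0, 0.0
    prev = None
    for n in range(n_samples):
        x = cmath.exp(2j * math.pi * freq * n)   # clean input tone
        v = x * cmath.exp(-1j * nco_phase)       # mixed-down signal
        if prev is not None:
            # Cross-product discriminator: residual frequency estimate.
            e = cmath.phase(v * prev.conjugate()) / (2 * math.pi)
            f_hat += gain * e
        prev = v
        nco_phase += 2 * math.pi * f_hat
    return f_hat
```

In this idealized loop the estimate converges geometrically to the true frequency; the dissertation's average switching rate and mean time to loss of lock characterize how such loops behave once fading, noise, and interference perturb the discriminator.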

Queer Threats and Abject Desires in Four Films from New American Cinema

Gay, Christian 10 August 2009 (has links)
This dissertation is an in-depth critical analysis of four American films made during the 1970s, with emphasis placed on the films' construction of gender and sexuality. It draws on the tradition of queer film criticism presented in the writings of such theorists as Barbara Creed, Alexander Doty, Richard Dyer, Vito Russo, and Eve Kosofsky Sedgwick. Taking a queer perspective, these film readings explore how particular works implement queer codes and foster a sexually ambiguous world on film. While not typically included in discussions of Queer Cinema or New American Cinema, these four films, Martin Scorsese's Mean Streets (1973), Francis Ford Coppola's The Conversation (1974), Steven Spielberg's Jaws (1975), and Stanley Kubrick's The Shining (1980), exhibit a family resemblance and, as a cycle, are products of a particular period of American cinematic experimentation. A detailed scene-by-scene analysis is undertaken in order to bring to light queer moments in the films and queer concerns of the films' makers. Raising questions about how the camera constructs character identities in these films, this study reflects the ways queer perspectives inflect filmmaking from this era.

Object Tracking System With Seamless Object Handover Between Stationary And Moving Camera Modes

Emeksiz, Deniz 01 November 2012 (has links) (PDF)
As the number of surveillance cameras and mobile platforms with cameras increases, automated detection and tracking of objects on these systems gain importance. There are various tracking methods designed for stationary or moving cameras. For stationary cameras, correspondence-based tracking methods combined with background subtraction have various advantages, such as enabling detection of object entry and exit in a scene, and they provide robust tracking while the camera is static. However, they fail when the camera is moving. Conversely, histogram-based methods such as mean shift enable object tracking in moving-camera cases, but with mean shift an object's entry and exit cannot be detected automatically, which means each new object requires manual initialization. In this thesis, we propose a dual-mode object tracking system which combines the benefits of correspondence-based tracking and mean shift tracking. For each frame, a reliability measure based on the background update rate is calculated. The interquartile range is used to find outliers in this measure, and camera movement is thereby detected. If the camera is stationary, correspondence-based tracking is used; when the camera is moving, the system switches to the mean shift tracking mode until the reliability of correspondence-based tracking is again sufficient according to the reliability measure. The results demonstrate that, in stationary-camera mode, new objects can be detected automatically by correspondence-based tracking along with background subtraction. When the camera starts to move, the generation of false objects by correspondence-based tracking is prevented by switching to the mean shift tracking mode and handing over the correct bounding boxes in a seamless operation that enables continuous tracking.
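The IQR-based switching test described in this abstract can be sketched as follows. The window contents, the 1.5 x IQR fence, and the one-sided test are conventional outlier-detection assumptions made for the sketch, not necessarily the thesis's exact parameters.

```python
from statistics import quantiles

def camera_moving(history, current, k=1.5):
    """Flag the current frame as 'camera moving' when its reliability
    measure (based on the background update rate) falls below the lower
    IQR fence of recent stationary-mode frames."""
    q1, _, q3 = quantiles(history, n=4)  # quartiles of the recent window
    return current < q1 - k * (q3 - q1)
```

In the dual-mode system, this kind of predicate would gate the switch from correspondence-based tracking to mean shift tracking, and the switch back once the measure returns inside the fence.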
