371

Structural Health Monitoring With Emphasis On Computer Vision, Damage Indices, And Statistical Analysis

Zaurin, Ricardo 01 January 2009
Structural Health Monitoring (SHM) is the sensing and analysis of a structure to detect abnormal behavior, damage and deterioration during regular operations as well as under extreme loadings. SHM is designed to provide objective information for decision-making on safety and serviceability. This research focuses on the SHM of bridges by developing and integrating novel methods and techniques using sensor networks, computer vision, modeling for damage indices and statistical approaches. Effective use of traffic video synchronized with sensor measurements for decision-making is demonstrated. First, some of the computer vision methods and how they can be used for bridge monitoring are presented, along with the most common issues and some practical solutions. Second, a conceptual damage index (Unit Influence Line, UIL) is formulated using synchronized computer images and sensor data for tracking the structural response under various load conditions. Third, a new index, Nd, is formulated and demonstrated to more effectively identify, localize and quantify damage. Commonly observed damage conditions on real bridges are simulated on a laboratory model to demonstrate the computer vision method, the UIL and the new index. This new method and index, which are based on outlier detection from the UIL population, can handle large sets of monitoring data very effectively. The methods and techniques are demonstrated on the laboratory model for damage detection, and all damage scenarios are identified successfully. Finally, the application of the proposed methods on a real-life structure with an existing monitoring system is presented. It is shown that these methods can be used efficiently for applications such as damage detection and load rating for decision-making. The results from this monitoring project on a movable bridge are presented along with conclusions and recommendations for future work.
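The outlier-detection idea behind a UIL-based damage index can be sketched in general terms. The thesis's actual formulation of Nd is not reproduced here; the baseline influence lines, the candidate vectors and the z-score statistic below are all illustrative assumptions:

```python
import math

def uil_outlier_index(baseline, candidate):
    """Score a candidate unit-influence-line (UIL) vector against a
    population of UILs from the undamaged structure: mean absolute
    z-score across positions (an illustrative stand-in for Nd)."""
    n = len(candidate)
    scores = []
    for i in range(n):
        col = [b[i] for b in baseline]
        mu = sum(col) / len(col)
        var = sum((x - mu) ** 2 for x in col) / (len(col) - 1)
        sd = math.sqrt(var) or 1e-12   # guard against a constant column
        scores.append(abs(candidate[i] - mu) / sd)
    return sum(scores) / n

# Healthy baseline: three nearly identical influence lines (made up).
healthy = [[0.0, 1.0, 2.0, 1.0, 0.0],
           [0.0, 1.1, 2.1, 1.0, 0.0],
           [0.0, 0.9, 1.9, 0.9, 0.1]]
ok = uil_outlier_index(healthy, [0.0, 1.0, 2.0, 1.0, 0.0])
damaged = uil_outlier_index(healthy, [0.0, 1.5, 3.0, 1.6, 0.0])
```

A response consistent with the baseline population scores near zero, while a distorted influence line stands out, which is the sense in which an outlier statistic over the UIL population localizes damage.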
372

Performance Modelling and Evaluation of Active Queue Management Techniques in Communication Networks. The development and performance evaluation of some new active queue management methods for internet congestion control based on fuzzy logic and random early detection using discrete-time queueing analysis and simulation.

Abdel-Jaber, Hussein F. January 2009
Since the field of computer networks has rapidly grown in the last two decades, congestion control of traffic loads within networks has become a high priority. Congestion occurs in network routers when the number of incoming packets exceeds the available network resources, such as buffer space and bandwidth allocation. This may result in a poor network performance with reference to average packet queueing delay, packet loss rate and throughput. To enhance the performance when the network becomes congested, several different active queue management (AQM) methods have been proposed and some of these are discussed in this thesis. Specifically, these AQM methods are surveyed in detail and their strengths and limitations are highlighted. A comparison is conducted between five known AQM methods, Random Early Detection (RED), Gentle Random Early Detection (GRED), Adaptive Random Early Detection (ARED), Dynamic Random Early Drop (DRED) and BLUE, based on several performance measures, including mean queue length, throughput, average queueing delay, overflow packet loss probability, packet dropping probability and the total of overflow loss and dropping probabilities for packets, with the aim of identifying which AQM method gives the most satisfactory results of the performance measures. This thesis presents a new AQM approach based on the RED algorithm that determines and controls the congested router buffers in an early stage. This approach is called Dynamic RED (REDD), which stabilises the average queue length between minimum and maximum threshold positions at a certain level called the target level to prevent building up the queues in the router buffers. A comparison is made between the proposed REDD, RED and ARED approaches regarding the above performance measures. Moreover, three methods based on RED and fuzzy logic are proposed to control the congested router buffers incipiently. 
These methods are named REDD1, REDD2, and REDD3 and their performances are also compared with RED using the above performance measures to identify which method achieves the most satisfactory results. Furthermore, a set of discrete-time queue analytical models are developed based on the following approaches: RED, GRED, DRED and BLUE, to detect the congestion at router buffers in an early stage. The proposed analytical models use the instantaneous queue length as a congestion measure to capture short term changes in the input and prevent packet loss due to overflow. The proposed analytical models are experimentally compared with their corresponding AQM simulations with reference to the above performance measures to identify which approach gives the most satisfactory results. The simulations for RED, GRED, ARED, DRED, BLUE, REDD, REDD1, REDD2 and REDD3 are run ten times, each time with a different seed, and the results of each run are used to obtain mean values, variance, standard deviation and 95% confidence intervals. The performance measures are calculated based on data collected only after the system has reached a steady state. After extensive experimentation, the results show that the proposed REDD, REDD1, REDD2 and REDD3 algorithms and some of the proposed analytical models such as the DRED-Alpha, RED and GRED models offer somewhat better mean queue length and average queueing delay results than those achieved by RED and its variants when the values of packet arrival probability are greater than the value of packet departure probability, i.e. in a congestion situation. This suggests that when traffic is largely of a non-bursty nature, instantaneous queue length might be a better congestion measure to use than the average queue length as in the more traditional models.
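For context, the core of the classic RED algorithm that these variants build on can be sketched as follows. The EWMA weight and threshold values here are illustrative, not those used in the thesis; note that the thesis's analytical models substitute the instantaneous queue length for the EWMA average:

```python
def ewma_avg(avg, q, w=0.002):
    """RED's congestion measure: exponentially weighted moving average
    of the instantaneous queue length q."""
    return (1 - w) * avg + w * q

def red_drop_probability(avg, min_th, max_th, max_p):
    """RED's marking probability: zero below min_th, a linear ramp up to
    max_p between the thresholds, and certain drop at or above max_th."""
    if avg < min_th:
        return 0.0
    if avg >= max_th:
        return 1.0
    return max_p * (avg - min_th) / (max_th - min_th)

# Halfway between thresholds, the drop probability is half of max_p.
p = red_drop_probability(avg=10, min_th=5, max_th=15, max_p=0.1)
```

The stabilization idea attributed above to REDD then amounts to steering this average toward a target level between min_th and max_th rather than letting it wander across the whole ramp.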
373

Detection of driver sleepiness during daylight and darkness

Eklind, Johanna, Meyerson, Amanda January 2023
Driver sleepiness is a serious problem worldwide. It is of interest to develop reliable sleepiness detection systems to implement in vehicles, and such a system can use both physiological data and driver performance data. The reasons for driver sleepiness can be many; an interesting factor to consider is the light condition of the environment, specifically daylight and darkness. Daylight and darkness have been shown to affect human sleepiness in general, and it is therefore important to investigate their effect on driver sleepiness independent of other factors. This thesis aimed to investigate whether light condition is a parameter that should be considered when developing a sleepiness detection system in a vehicle. This was done by investigating whether the course of sleepiness would be affected by daylight and darkness, and whether adding light condition information as a parameter to a classification model improved the performance of the sleepiness classification. To achieve this, the study was based on data collected from driving simulator tests conducted by the Swedish National Road and Transport Research Institute (VTI). Test subjects drove in simulated daylight and darkness during both daytime while rested and nighttime while sleep-deprived. An exploratory and statistical analysis was conducted of several sleepiness indicators extracted from physiological data and simulator data. Three different classification models were implemented. The indicators pointed to a higher level of driver sleepiness during night compared to day, as well as an increase with time on task. However, no clear trends pointed to daylight and darkness having affected the sleepiness of the driver. The classification models showed a marginal improvement when including light condition as a feature, but not large enough to draw any specific conclusion regarding the effect.
The conclusion was that an effect of daylight and darkness on the course of driver sleepiness could not be seen in this thesis. Adding light and dark as a feature did not significantly improve the classification models' performances. In summary, further investigations of the effect of daylight and darkness in relation to driver sleepiness are needed.
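The feature-augmentation step described above, appending a binary light-condition flag to each sample's feature vector, can be sketched with a toy classifier. The thesis's actual models, indicators and data are not reproduced here; the nearest-centroid classifier, feature names and numbers below are invented for illustration:

```python
def nearest_centroid_fit(X, y):
    """Per-class mean vectors: a minimal stand-in for a sleepiness
    classifier (the thesis used three different, unspecified models)."""
    cents = {}
    for c in sorted(set(y)):
        rows = [x for x, lab in zip(X, y) if lab == c]
        cents[c] = [sum(col) / len(rows) for col in zip(*rows)]
    return cents

def nearest_centroid_predict(cents, x):
    def d2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(cents, key=lambda c: d2(cents[c], x))

# Toy samples: [blink_duration, lane_deviation] plus a 0/1 light flag
# (1 = darkness) appended as the extra feature under test.
X = [[0.2, 0.1, 0], [0.3, 0.2, 0], [0.6, 0.5, 1], [0.7, 0.6, 1]]
y = ["alert", "alert", "sleepy", "sleepy"]
model = nearest_centroid_fit(X, y)
pred = nearest_centroid_predict(model, [0.65, 0.55, 1])
```

Comparing classification accuracy with and without the appended flag, on held-out data, is the kind of test the thesis used to judge whether light condition carries useful signal.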
374

Noise and Hotel Revenue Management in Simulation-based Optimization

Dalcastagnè, Manuel 14 October 2021
Several exact and approximate dynamic programming formulations have already been proposed to solve hotel revenue management (RM) problems. To obtain tractable solutions, these methods are often bound by simplifying assumptions which prevent their application to large and dynamic complex systems. This dissertation introduces HotelSimu, a flexible simulation-based optimization approach for hotel RM, and investigates possible approaches to increase the efficiency of black-box optimization methods in the presence of noise. HotelSimu employs black-box optimization and stochastic simulation to find the dynamic pricing policy which is expected to maximize the revenue of a given hotel in a certain period of time. However, the simulation output is noisy, and different solutions should be compared in a statistically significant manner. Various black-box heuristics based on variations of random local search are investigated and integrated with statistical analysis techniques in order to manage the optimization budget efficiently.
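The combination of random local search with a statistical comparison of noisy evaluations can be sketched as follows. This is not HotelSimu itself: the revenue function, the two-standard-error acceptance rule and all parameter values are invented for illustration:

```python
import random
import statistics

def noisy_revenue(price, rng):
    """Stand-in for a stochastic hotel simulation: an (assumed) true
    revenue curve peaking at price 100, plus Gaussian noise."""
    true = -((price - 100.0) ** 2) / 10.0 + 1000.0
    return true + rng.gauss(0, 5.0)

def mean_better(a, b):
    """Accept a move only if the new mean beats the old by more than
    roughly two standard errors of the difference, a crude guard
    against accepting noise instead of improvement."""
    se = (statistics.variance(a) / len(a)
          + statistics.variance(b) / len(b)) ** 0.5
    return statistics.mean(b) - statistics.mean(a) > 2 * se

def local_search(start, steps=200, reps=10, seed=1):
    """Random local search with replicated evaluations per candidate."""
    rng = random.Random(seed)
    best = start
    best_samples = [noisy_revenue(best, rng) for _ in range(reps)]
    for _ in range(steps):
        cand = best + rng.uniform(-5, 5)
        cand_samples = [noisy_revenue(cand, rng) for _ in range(reps)]
        if mean_better(best_samples, cand_samples):
            best, best_samples = cand, cand_samples
    return best

price = local_search(start=80.0)
```

The replication count `reps` is exactly the kind of optimization-budget knob the abstract refers to: more replicates per candidate sharpen each comparison but leave budget for fewer candidates.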
375

Development of a geovisual analytics environment using parallel coordinates with applications to tropical cyclone trend analysis

Steed, Chad A 13 December 2008
A global transformation is being fueled by unprecedented growth in the quality, quantity, and number of different parameters in environmental data through the convergence of several technological advances in data collection and modeling. Although these data hold great potential for helping us understand many complex and, in some cases, life-threatening environmental processes, our ability to generate such data is far outpacing our ability to analyze it. In particular, conventional environmental data analysis tools are inadequate for coping with the size and complexity of these data. As a result, users are forced to reduce the problem in order to adapt to the capabilities of the tools. To overcome these limitations, we must complement the power of computational methods with human knowledge, flexible thinking, imagination, and our capacity for insight by developing visual analysis tools that distill information into the actionable criteria needed for enhanced decision support. In light of these challenges, we have integrated automated statistical analysis capabilities with a highly interactive, multivariate visualization interface to produce a promising approach for visual environmental data analysis. By combining advanced interaction techniques such as dynamic axis scaling, conjunctive parallel coordinates, statistical indicators, and aerial perspective shading, we provide an enhanced variant of the classical parallel coordinates plot. Furthermore, the system facilitates statistical processes such as stepwise linear regression and correlation analysis to assist in the identification and quantification of the most significant predictors for a particular dependent variable. These capabilities are combined into a unique geovisual analytics system that is demonstrated via a pedagogical case study and three North Atlantic tropical cyclone climate studies using a systematic workflow.
In addition to revealing several significant associations between environmental observations and tropical cyclone activity, this research corroborates the notion that enhanced parallel coordinates coupled with statistical analysis can be used for more effective knowledge discovery and confirmation in complex, real-world data sets.
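The correlation-analysis step mentioned above, ranking candidate predictors of a dependent variable by correlation strength, can be sketched in a few lines. The predictor names and all numbers below are invented; they are not the thesis's data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient, the building block of the
    correlation-analysis step for screening predictors."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Rank hypothetical predictors of seasonal storm activity by |r|.
activity = [4, 7, 6, 10, 12]
predictors = {
    "sst": [26.1, 26.8, 26.5, 27.4, 27.9],   # tracks activity closely
    "shear": [12.0, 11.0, 14.0, 9.0, 8.0],   # inversely related
    "noise": [3.0, 1.0, 4.0, 1.5, 3.5],      # essentially unrelated
}
ranked = sorted(predictors,
                key=lambda k: -abs(pearson(predictors[k], activity)))
```

In a system like the one described, such a ranking would decide which variables earn an axis in the parallel coordinates display or enter a stepwise regression first.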
376

Evaluation of the critical parameters and polymeric coat performance in compressed multiparticulate systems

Benhadia, Abrehem M.A. January 2019
Compression of coated pellets is a practical alternative to capsule filling. The current practice is to add cushioning agents to minimize the stress on the coated pellets. Cushioning agents, however, add bulkiness and reduce the overall drug loading capacity. In this study, we investigated the performance of compressed coated pellets with no cushioning agent to evaluate the feasibility of predicting the coat behaviour using thermo-mechanical and rheological analysis techniques. Different coating formulations were made of ethyl cellulose (EC) as the coating polymer, and two different kinds of additives were incorporated into the polymeric coating solution. Triethyl citrate (TEC) and polyethylene glycol 400 (PEG 400) were used as plasticizers at different levels (10%, 20%, 30%) in the coating formulations. Thermal, mechanical and rheological measurements of the coating film formulations were performed to investigate the effect of the plasticizers. Thermogravimetric analysis (TGA) results showed higher residual moisture content in films plasticised with PEG 400 compared to their TEC counterparts. Differential scanning calorimetry (DSC), dynamic mechanical analysis (DMA) and a parallel plate shear rheometer (PPSR) were used to study the influence of the level and type of plasticisers incorporated in the coating film formulation on the performance of the coating film. In this study, both DSC and DMA were used to investigate the Tg of each film coating formulation in order to evaluate the effect of the additives. In general, DMA results for the Tg value of the films were 10-20% higher than those measured by DSC. Furthermore, clamp size and oscillation frequency have an influence on the evaluation of Tg. Complex viscosity measurements for the different coating film formulations revealed that the shear thinning gradient changes with temperature and with plasticiser type and concentration. The complex viscosity values from both DMA and PPSR exhibit power law behaviour.
The rheological moduli were indirectly affected by the level of plasticiser. There was a discrepancy between the complex viscosity results obtained from DMA and PPSR at similar temperatures, but they follow the same trend. The non-plasticized polymer showed complex viscosity values about 10 times higher when measured by DMA than by PPSR. The difference was smaller in plasticized films but was not consistent; therefore, a consistent coefficient to correlate the DMA and PPSR results could not be accurately determined. Coated pellets were compressed and key process parameters were evaluated. The results revealed that the coating thickness has a significant effect on the release profile of the final products: increasing the coating film thickness decreased the percentage released. The compression force had less influence on the drug release profile, while the dwell time had very little effect on the percentage release from the final products. The optimum release profile was obtained at a coating level of 5.5% w/w and a compression force of 4700 N. In conclusion, the elasticity of the plasticised EC films in this study meant that the internal stress was not dissipated during compression within the dwell time range used in this experiment. Increasing the thickness was therefore necessary to enhance the strength of the film and avoid cracking. The mechanical and rheological profiling was therefore helpful to understand the behaviour of the coated pellets and predict the film properties at the various steps of the coating and compression process (i.e., various shear rate regimes). An experimental design approach to studying the key process and formulation parameters helped identify the optimum values for the process.
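The power law behaviour of complex viscosity noted above is conventionally modelled as eta = K * omega^m and fitted by linear regression in log-log space. The sketch below uses synthetic shear-thinning data, not the thesis's measurements:

```python
import math

def fit_power_law(omega, eta):
    """Fit eta = K * omega**m by least squares on log(eta) vs log(omega),
    the standard way to extract power-law parameters from rheometry data."""
    lx = [math.log(w) for w in omega]
    ly = [math.log(e) for e in eta]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
         / sum((x - mx) ** 2 for x in lx))
    K = math.exp(my - m * mx)
    return K, m

# Synthetic shear-thinning data, eta = 1000 * omega**-0.4, recovered exactly.
omega = [0.1, 1.0, 10.0, 100.0]
eta = [1000 * w ** -0.4 for w in omega]
K, m = fit_power_law(omega, eta)
```

A negative exponent m corresponds to shear thinning; comparing fitted K and m across plasticiser types and levels is one way to quantify the gradient changes the abstract describes.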
377

Retail Location Analysis: A Case Study of Burger King & McDonald’s in Portage & Summit Counties, Ohio

Niti, Duggal 28 November 2007
No description available.
378

Environment Change: An Analysis of College Football Operations

Yazawa, Daigo 07 August 2014
No description available.
379

Consider the Big Picture: A Quantitative Analysis of Readability and the Novel Genre, 1800-1922

Pruitt, Marie 18 July 2022
No description available.
380

Surgical Skills and Ergonomics Evaluation for Laparoscopic Surgery Training

Kyaw, Thu Zar 10 1900
Training and ergonomics evaluation for laparoscopic surgery is an important tool for the assessment of trainees. Timely and objective assessment helps surgeons improve hand dexterity and movement precision, and perform surgery in an ergonomic manner. Traditionally, skill is evaluated by expert surgeons observing trainees, but this approach is both expensive and subjective. The approach proposed by this research employs an Ascension 3DGuidance trakSTAR system that captures the positions and orientations of hand and laparoscopic tool trajectories. Recorded trajectories are automatically analysed to extract meaningful feedback for training evaluation using statistical and machine learning methods.
The data are acquired while a subject performs a standardized task such as peg transfer or suturing. The system records laparoscopic instrument positions; hand, forearm and elbow trajectories; and wrist angles. We propose several metrics that attempt to objectively quantify the skill level or ergonomics of the procedure. The metrics for surgical skills are based on surgical instrument tip trajectories, whereas the ergonomics metric uses wrist angles. These metrics have been developed using statistical and machine learning methods.
The metrics have been experimentally evaluated using a population of seven first-year postgraduate urology residents, one general surgery resident, and eight fourth-year postgraduate urology residents and fellows. The machine learning approach discriminated correctly between experts and novices in 73% of cases. Applied to ergonomics data, the machine learning approach correctly discriminates between experts and novices in 88% of cases for the peg transfer task and 75% for the suturing task.
We also propose a method to derive a competency-based score using either statistical or machine learning derived metrics.
Initial experimental data show that the proposed methods discriminate between the skills and ergonomics of expert and novice surgeons. The proposed system can be a valuable tool for research and training evaluation in laparoscopic surgery. / Master of Applied Science (MASc)
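One of the simplest trajectory-based metrics of the kind described above is total tool-tip path length, on which experts typically score lower than novices for the same task. This is an illustrative example, not necessarily one of the thesis's metrics, and the trajectories below are invented:

```python
import math

def path_length(traj):
    """Total tool-tip path length over a recorded trajectory of 3-D
    points: shorter, more direct motion is read as greater economy."""
    return sum(math.dist(a, b) for a, b in zip(traj, traj[1:]))

# A direct path versus a wandering one between the same endpoints.
expert = [(0, 0, 0), (1, 0, 0), (2, 0, 0)]
novice = [(0, 0, 0), (1, 1, 0), (2, 0, 0)]
```

Metrics like this, computed per task, are the kind of features a statistical or machine learning classifier can then combine into an expert-versus-novice discrimination.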
