  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
81

Annihilation of cardiac alternans by electric and mechano-electric feedback (MEF) in a cardiac tissue

Deshpande, Dipen Unknown Date
No description available.
82

A model-based approach to nonlinear networked control systems

Liu, Xi Unknown Date
No description available.
83

High Resolution Clinical Model-Based Assessment of Insulin Sensitivity

Lotz, Thomas Friedhelm January 2007 (has links)
Type 2 diabetes has reached epidemic proportions worldwide. The resulting increase in chronic and costly diabetes-related complications has potentially catastrophic implications for healthcare systems, economies and societies as a whole. One of the key pathological factors leading to type 2 diabetes is insulin resistance (IR), which is the reduced or impaired ability of the body to make use of available insulin to maintain normal blood glucose levels. Diagnosis of developing IR is possible up to 10 years before the diagnosis of type 2 diabetes, providing an invaluable opportunity to intervene and prevent or delay the onset of the disease. However, an accurate, yet simple, test to provide widespread, clinically feasible early diagnosis of IR is not yet available. Current clinically practicable tests cannot yield more than a crude surrogate metric that allows only a threshold-based assessment of an underlying disorder, and thus delay its diagnosis. This thesis develops, analyses and pilots a model-based insulin sensitivity test that is simple, short, physiological and cost-efficient, and thus useful in a practical clinical setting for wider clinical screening. The method incorporates physiological knowledge and modelling of glucose, insulin and C-peptide kinetics and their pharmacodynamics. The clinical protocol is designed to produce data from a dynamic perturbation of the metabolic system that enables a unique, physiologically valid assessment of metabolic status. A combination of a priori information and a convex integral-based identification method guarantees a unique, robust and automated identification of model parameters. In addition to a high-resolution insulin sensitivity metric, the test also yields a clinically valuable and accurate assessment of pancreatic function, which is also a good indicator of the progression of the metabolic defect. The combination of these two diagnostic metrics allows a clinical assessment of a more complete picture of the overall metabolic dysfunction. This outcome can assist the clinician in providing an earlier and much improved diagnosis of insulin resistance and metabolic status, and thus more optimised treatment options. Test protocol accuracy is first evaluated in Monte Carlo simulations and subsequently in a clinical pilot study. Both validations yield comparable results in repeatability and robustness. Repeatability and resolution of the test metrics are very high, particularly when compared to current clinical standard surrogate fasting or oral glucose tolerance assessments. Additionally, the model-based insulin sensitivity metric is shown to be highly correlated with the highly complex, research-focused gold standard euglycaemic clamp test. Various reduced-sample and shortened protocols are also proposed to enable effective application of the test in a wider range of clinical and laboratory settings. Overall, test time can be as short as 30 minutes with no compromise in diagnostic performance. A suite of tests is thus created and made available to match varying clinical and research requirements in terms of accuracy, intensity and cost. Comparison between metrics obtained from all protocols is possible, as they measure the same underlying effects with identical model-based assumptions. Finally, the proposed insulin sensitivity test in all its forms is well suited for clinical use. The diagnostic value of the test can assist clinical diagnosis, improve treatment, and provide higher resolution and earlier diagnosis than existing clinical and research standards. High-risk populations can therefore be diagnosed much earlier and the onset of complications delayed. The net result will thus improve overall healthcare, reduce costs and save lives.
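For illustration of the integral-based identification idea mentioned in the abstract, here is a minimal Python sketch. It assumes a simple one-compartment glucose model in which insulin sensitivity SI enters linearly, dG/dt = -p_G(G - G_E) - SI·G·Q, so that integrating the equation over the test turns identification of SI into a linear least-squares problem. The model form, parameter values, and function names are illustrative assumptions, not the protocol or model of the thesis.

```python
import numpy as np

def identify_si_integral(t, G, Q, p_G=0.01, G_E=4.5):
    """Integral-based least-squares fit of insulin sensitivity SI for an
    assumed minimal glucose model:
        dG/dt = -p_G * (G - G_E) - SI * G * Q
    Integrating from t[0] to each sample time gives a relation that is linear
    in SI, so SI follows from ordinary least squares with no iterative search.
    t: sample times [min]; G: glucose [mmol/L]; Q: insulin concentration [mU/L].
    """
    t, G, Q = (np.asarray(v, dtype=float) for v in (t, G, Q))
    # Cumulative trapezoidal integrals of the measured signals
    int_G = np.concatenate(([0.0], np.cumsum(np.diff(t) * (G[1:] + G[:-1]) / 2)))
    int_GQ = np.concatenate(([0.0], np.cumsum(np.diff(t) * (G[1:] * Q[1:] + G[:-1] * Q[:-1]) / 2)))
    # Integrated model: G(tk) - G(t0) + p_G * (int_G - G_E * (tk - t0)) = -SI * int_GQ
    y = (G - G[0]) + p_G * (int_G - G_E * (t - t[0]))
    A = -int_GQ.reshape(-1, 1)
    SI, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(SI[0])
```

Because the unknown appears linearly in the integrated equation, the fit is convex and needs no starting guess, which is the practical appeal of integral-based identification.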
84

Agent and model-based simulation framework for deep space navigation analysis and design

Anzalone, Evan John 27 August 2014 (has links)
As the number of spacecraft in simultaneous operation continues to grow, there is an increased dependency on ground-based navigation support. The current baseline system for deep space navigation utilizes Earth-based radiometric tracking, which requires long-duration, often global, observations to perform orbit determination and generate a state update. The age, complexity, and high utilization of the assets that make up the Deep Space Network (DSN) pose a risk to spacecraft navigation performance. With increasingly complex mission operations, such as automated asteroid rendezvous or pinpoint planetary landing, the need for high-accuracy and autonomous navigation capability is further reinforced. The Network-Based Navigation (NNAV) method developed in this research takes advantage of the growing inter-spacecraft communication network infrastructure to allow for autonomous state measurement. By embedding navigation headers into the data packets transmitted between nodes in the communication network, it is possible to provide an additional source of navigation capability. Simulation results indicate that as NNAV is implemented across the deep space network, the state estimation capability continues to improve, providing an embedded navigation network. To analyze the capabilities of NNAV, an analysis and simulation framework is designed that integrates navigation and communication analysis. Model-Based Systems Engineering (MBSE) and Agent-Based Modeling (ABM) techniques are utilized to foster a modular, expandable, and robust framework. This research has developed the Space Navigation Analysis and Performance Evaluation (SNAPE) framework, which allows for design, analysis, and optimization of deep space navigation and communication architectures. SNAPE captures high-level performance requirements and bridges them to specific functional requirements of the analytical implementation. The SNAPE framework is implemented in a representative prototype environment using the Python language and verified using industry-standard packages. The capability of SNAPE is validated through a series of example test cases. These analyses focus on the contribution of specific state measurements to state estimation performance and demonstrate the core analytic functionality of the framework. Specific cases analyze the effects of initial error and measurement uncertainty on state estimation performance. The timing and frequency of state measurements are also investigated, showing that frequent state measurements are needed to minimize navigation errors. The dependence of navigation accuracy on timing stability and accuracy is also demonstrated. These test cases capture the functionality of the tool and validate its performance. The SNAPE framework is utilized to capture and analyze NNAV, both conceptually and analytically. Multiple evaluation cases are presented that focus on the Mars Science Laboratory's (MSL) Martian transfer mission phase. These evaluation cases validate NNAV and provide concrete evidence of its operational capability for this particular application. Improvements to onboard state estimation performance and reduced reliance on Earth-based assets are demonstrated through simulation of the MSL spacecraft utilizing NNAV processes and embedded packets within a limited network containing the DSN and MRO. From the demonstrated state estimation performance, NNAV is shown to be a capable and viable method of deep space navigation. Through its implementation as a state augmentation method, the concept integrates with traditional measurements and reduces the dependence on Earth-based updates. Future development of this concept focuses on a growing network of assets and spacecraft, allowing for improved operational flexibility and accuracy in spacecraft state estimation and for a growing solar-system-wide navigation network.
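As a hedged illustration of how a navigation header embedded in a data packet could feed a spacecraft's state estimate, the sketch below performs a single extended-Kalman-filter update using a range measurement to a node whose position arrives in the packet. The state layout, noise level, and function name are assumptions for illustration and are not taken from the SNAPE framework.

```python
import numpy as np

def range_measurement_update(x, P, z_range, node_pos, sigma_range=10.0):
    """One EKF update of a spacecraft state estimate from a range measurement
    to a network node whose position was carried in a packet navigation header.
    x: state estimate [px, py, pz, vx, vy, vz] (km, km/s)
    P: 6x6 state covariance
    z_range: measured range to the node (km)
    node_pos: node position taken from the packet header (km)
    """
    rel = x[:3] - node_pos
    pred_range = np.linalg.norm(rel)
    # Jacobian of the range measurement with respect to the state
    H = np.zeros((1, 6))
    H[0, :3] = rel / pred_range
    R = np.array([[sigma_range ** 2]])
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + (K * (z_range - pred_range)).ravel()
    P_new = (np.eye(6) - K @ H) @ P
    return x_new, P_new
```

Repeating this update for every packet received from different nodes is the basic mechanism by which such an embedded network could tighten onboard state knowledge between ground-based tracking passes.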
85

Development and Evaluation of Model-Based Misfire Detection Algorithm

Therén, Linus January 2014 (has links)
This report presents the development of a misfire detection algorithm for onboard diagnostics on a spark-ignited combustion engine. The work is based on a previously developed model-based detection algorithm, created to meet more stringent future legislation and reduce the cost of calibration. In the existing approach a simplified engine model is used to estimate the torque from the flywheel angular velocity, and the algorithm can detect misfires in various conditions. The main contribution of this work is the further development of the misfire detection algorithm, with a focus on improving the handling of disturbances and variations between different vehicles. The resulting detection algorithm can be automatically calibrated with training data and manages disturbances such as manufacturing errors on the flywheel and torsional vibrations in the crankshaft occurring after a misfire. Furthermore, a robustness analysis with different engine configurations is carried out, and the algorithm is evaluated with the Kullback-Leibler divergence, related to the diagnosis requirements. In the validation, data from vehicles with four-cylinder engines are used, and the algorithm shows good performance with few false alarms and missed detections.
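To make the role of the Kullback-Leibler divergence concrete, here is a small hedged sketch: it fits Gaussian distributions to a misfire-detection test quantity under fault-free and misfire data and returns the KL divergence between them, a separability measure that relates directly to achievable false-alarm and missed-detection rates. The Gaussian assumption and variable names are illustrative, not the exact evaluation used in the report.

```python
import numpy as np

def kl_divergence_gaussian(mu0, var0, mu1, var1):
    """KL divergence D( N(mu0, var0) || N(mu1, var1) ) between two 1-D Gaussians."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def misfire_separability(t_no_fault, t_misfire):
    """Fit Gaussians to the detection test quantity under both classes and
    return the KL divergence as a measure of detectability: larger values
    mean fewer missed detections for a given false-alarm rate."""
    mu0, var0 = np.mean(t_no_fault), np.var(t_no_fault)
    mu1, var1 = np.mean(t_misfire), np.var(t_misfire)
    return kl_divergence_gaussian(mu1, var1, mu0, var0)
```

Comparing this separability measure across engine configurations and calibration data sets is one way such a metric can be tied back to the diagnosis requirements mentioned in the abstract.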
86

Developing a computational model of the pilot's best possible expectation of aircraft state given vestibular and visual cues

Onur, Can 12 January 2015 (has links)
Loss of Control (LOC) accidents are a major threat to aviation and contribute the highest risk of fatalities among all aviation accidents. The major contributor to LOC accidents is pilot spatial disorientation (SD), which accounts for roughly 32% of all LOC accidents. A pilot experiences SD during flight when the pilot's expectation of the aircraft's state deviates from reality. This deviation results from a number of underlying mechanisms, such as distraction, failure to monitor flight instruments, and vestibular illusions. Previous researchers have developed computational models to understand those mechanisms. However, these models are limited in scope as they do not model the pilot's knowledge of the aircraft dynamics. This research proposes a novel model to predict the best possible pilot expectation of the aircraft state given vestibular and visual cues. The proposed model uses a Model-Based Observer (MBO) as the infrastructure needed to establish an "expert pilot". Expert pilots are known to form an internal model of the operated system through training and experience, which allows the expert to generate better internal expectations of the system states. Pilots' internal expectations are enhanced by the presence of information fed through the pilots' sensory systems. Thus, the proposed model integrates the pilot's knowledge of the system dynamics (i.e., an aircraft model) with a continuous vestibular sensory model and a discrete visual-sampling sensory model. The computational model serves to investigate the underlying mechanisms of SD during flight and provide a quantitative analysis tool to support flight deck and countermeasure designs.
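A minimal, hypothetical sketch of a Model-Based Observer of the kind described above: the pilot's internal aircraft model is propagated in time, corrected continuously by vestibular cues and intermittently by discrete visual samples of the instruments. The linear model, observer gains, and sampling scheme are assumptions for illustration, not the thesis model.

```python
import numpy as np

def pilot_expectation(A, B, C_vest, C_vis, L_vest, L_vis,
                      u_seq, y_vest_seq, y_vis_seq, dt=0.01, visual_every=100):
    """Model-Based Observer sketch of a pilot's expectation of aircraft state.
    The internal model (A, B) is propagated each step, corrected continuously
    by vestibular cues y_vest and, every `visual_every` steps, by a discrete
    visual sample y_vis of the instruments.
    """
    n = A.shape[0]
    x_hat = np.zeros(n)           # pilot's internal expectation of the state
    history = []
    for k, u in enumerate(u_seq):
        # Continuous part: internal model + vestibular correction (Euler step)
        innov_vest = y_vest_seq[k] - C_vest @ x_hat
        x_hat = x_hat + dt * (A @ x_hat + B @ u + L_vest @ innov_vest)
        # Discrete part: occasional visual sampling of the flight instruments
        if k % visual_every == 0:
            x_hat = x_hat + L_vis @ (y_vis_seq[k] - C_vis @ x_hat)
        history.append(x_hat.copy())
    return np.array(history)
```

The gap between this expectation and the true aircraft state is the quantity such a model would use to study spatial disorientation, e.g. when visual sampling is infrequent or vestibular cues are misleading.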
87

A model-based approach to nonlinear networked control systems

Liu, Xi 11 1900 (has links)
This thesis is concerned with the analysis and control design of nonlinear networked control systems (NCSs). Ignoring the network connection and cascading the actuators, plant and sensors together, a sampled-data system is obtained. The stabilization problem of nonlinear sampled-data systems is considered under a low-measurement-rate constraint. Dual-rate control schemes, based on the emulation design and discrete-time design approaches respectively, are proposed that utilize a numerical integration model to approximately predict the current state of the plant. It is shown that, using the dual-rate control schemes, the input-to-state stability property is preserved for the closed-loop sampled-data system in a practical sense. On the other hand, the networked realization of nonlinear control systems is studied, and a model-based control scheme is proposed as a solution to reduce the network traffic and, consequently, to attain higher performance. The NCSs are modeled as continuous-time systems and as sampled-data systems, respectively. Under the proposed control scheme, a tradeoff between satisfactory control performance and reduced network traffic can be achieved. It is shown that by using estimated values generated by the plant model, instead of the true values of the plant, a significant saving in required bandwidth is achieved, which makes it possible to stabilize the plant even under slow network conditions.
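The model-based scheme described above can be illustrated with a small hedged sketch: a plant model runs on the controller side and supplies state estimates between network transmissions, and the model state is reset to the measured state whenever a packet arrives. The scalar plant, gains, and update period below are illustrative assumptions only.

```python
import numpy as np

def simulate_model_based_ncs(steps=500, dt=0.01, network_period=20,
                             a=1.0, a_hat=0.95, b=1.0, k_gain=3.0):
    """Model-based networked control of a scalar plant x' = a*x + b*u.
    The controller uses the model x_hat' = a_hat*x_hat + b*u between network
    updates, and x_hat is reset to the measured state every `network_period`
    steps, so most control actions require no network traffic.
    """
    x, x_hat = 1.0, 1.0
    xs = []
    for k in range(steps):
        u = -k_gain * x_hat                            # control from the model state
        x = x + dt * (a * x + b * u)                   # true plant (Euler step)
        x_hat = x_hat + dt * (a_hat * x_hat + b * u)   # controller-side plant model
        if k % network_period == 0:
            x_hat = x                                  # measurement arrives over the network
        xs.append(x)
    return np.array(xs)
```

The point of the sketch is the tradeoff the abstract describes: the closer the model (a_hat) is to the plant (a), the less often measurements need to cross the network for the closed loop to remain stable.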
88

An Evaluation of Model-Based Testing for an Industrial Train Control Software

Suli, Sidorela January 2018 (has links)
Currently, the increasing complexity of software and short release cycles are becoming a challenge for testing software in an efficient and effective way. Traditionally, tests are created manually by engineers and then executed, automatically or manually, on the actual software. Manually creating test cases is a time-consuming effort. For the last couple of decades, researchers have proposed ways to improve this process by automating parts of the testing steps. One of these approaches that has gained a lot of attention is Model-Based Testing (MBT). MBT has been suggested as a way of automatically creating tests at a lower cost. Nonetheless, it is not well studied how MBT is actually applied in industrial contexts and how these tests compare to manually written ones. This is particularly true for industrial control software such as that found in the train domain, where strict requirements on testing are in place. In this thesis, we investigate the literature and review related case studies on the use and evaluation of MBT in industry. We perform a case study to evaluate MBT on a train control management system provided by Bombardier Transportation. We use the Conformiq Creator MBT tool to create models of the functional requirements of a master controller function and to generate test cases. We report the results of the modeling approach as well as a comparison between the automatic test cases created by Conformiq Creator and the manual test cases written by industrial engineers at Bombardier Transportation, using the following metrics: test coverage and time spent on testing. The results of this comparison suggest that the test coverage of MBT is higher and its test cases are more detailed than those of manual testing. Our results are not conclusive with regard to the cost of using MBT, mainly because this depends on the testing scenario and how testing is performed. We show that MBT is a suitable approach for modeling the functional requirements of a realistic industrial control software function. In this thesis work, we focus on system-level testing. As future work, applying MBT on lower levels of testing could be a promising way forward. In addition, the transformation of these test cases into executable test scripts, and the problems this may raise, need to be investigated further.
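As a small, hypothetical illustration of the MBT workflow (not of the Conformiq Creator toolchain itself), the sketch below encodes a toy behavioural model as a state machine and generates abstract test cases that cover every transition; each generated event sequence would then be concretised into an executable test script. The model, its states, and its event names are invented for illustration.

```python
from collections import deque

# Toy model of a master-controller-like function: states and labelled transitions.
MODEL = {
    "Idle":     {"power_on": "Standby"},
    "Standby":  {"lever_forward": "Traction", "lever_back": "Braking", "power_off": "Idle"},
    "Traction": {"lever_neutral": "Standby"},
    "Braking":  {"lever_neutral": "Standby"},
}

def generate_transition_cover(model, start="Idle"):
    """Generate abstract test cases (event sequences from the start state)
    until every transition in the model is exercised at least once.
    Assumes every state is reachable from the start state."""
    uncovered = {(s, e) for s, trans in model.items() for e in trans}
    tests = []
    while uncovered:
        # Breadth-first search for the shortest path to a state with an uncovered transition
        queue, seen = deque([(start, [])]), {start}
        state, path = start, []
        while queue:
            state, path = queue.popleft()
            if any((state, e) in uncovered for e in model[state]):
                break
            for e, nxt in model[state].items():
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, path + [e]))
        # Greedily extend the path through uncovered transitions
        while any((state, e) in uncovered for e in model[state]):
            e = next(e for e in model[state] if (state, e) in uncovered)
            uncovered.discard((state, e))
            path.append(e)
            state = model[state][e]
        tests.append(path)
    return tests

print(generate_transition_cover(MODEL))  # e.g. one sequence covering all six transitions
```

Coverage criteria like this (transition or requirement coverage on the model) are what make the test-coverage comparison between generated and hand-written suites possible.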
89

Handling domain knowledge in system design models. An ontology based approach.

Hacid, Kahina 06 March 2018 (has links) (PDF)
Complex system models are designed in heterogeneous domains, and this heterogeneity is rarely considered explicitly in the description and validation processes. Moreover, these systems usually involve several domain experts and several design models corresponding to different analyses (views) of the same system. However, no explicit information is given regarding the characteristics of either the domain or the performed system analyses. In this thesis, we propose a general framework offering, first, the formalization of domain knowledge using ontologies and, second, the capability to strengthen design models by making explicit references to the domain knowledge formalized in these ontologies. The framework also provides resources for making the features of an analysis explicit by formalizing them within models qualified as "points of view". We have set up two deployments of our approach: one based on Model Driven Engineering (MDE) and one based on formal methods using proof and refinement. The framework has been validated on several non-trivial case studies drawn from systems engineering.
90

Comprehensive Model-Based Design and Analysis Approach for Thermal Management Systems in Hybridized Vehicles

January 2017 (has links)
This research effort focuses on thermal management system (TMS) design for a high-performance Plug-in Hybrid Electric Vehicle (PHEV). The thermal performance of various components in an electrified powertrain is investigated using a 3D finite difference model of the complete vehicle system, including inherently temperature-sensitive components. The components include the electric motor (EM), power electronics, Energy Storage System (ESS), and Internal Combustion Engine (ICE). A model-based design approach is utilized, in which experimental work and simulation are integrated. After defining heat sources and heat sinks within the powertrain system, temporal and spatial boundary conditions were extracted experimentally to facilitate the 3D simulation under different road-load scenarios. Material properties, surface conditions, and environmental factors were defined for the geometrical surface mesh representation of the system. While the finite-differencing code handles heat transfer via conduction and radiation, all convective heat transfer modes within the powertrain are defined using fluid nodes and fluid streams. Conclusions are drawn by correlating experimental results with the output of the thermal model. The outcome of this research effort is a 3D thermal performance predictive tool that can be used to evaluate the design of advanced thermal management systems for alternative powertrains in early design/concept stages of the development process. For future work, it is recommended that a full validation of the 3D thermal model be completed. Subsequently, design improvements can be made to the TMS. Possible improvements include analysis and evaluation of shielding for the catalytic converter, exhaust manifold, and power electronics, as well as substituting materials with better thermal performance in other temperature-sensitive components, where applicable. The result of these design improvements would be an effective TMS for a high-performance PHEV. / Dissertation/Thesis / Masters Thesis Engineering 2017
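To make the finite-difference treatment concrete, a minimal sketch follows: one explicit 3D conduction step on a uniform grid, with a lumped convective loss applied at the outer faces. The grid, material properties, and boundary handling are illustrative assumptions and are far simpler than the vehicle-level model described above.

```python
import numpy as np

def step_conduction_3d(T, dt, dx, alpha, h_conv=0.0, T_amb=300.0, rho_cp=2.4e6):
    """One explicit finite-difference step of 3-D heat conduction,
        dT/dt = alpha * laplacian(T),
    with a simple convective loss h_conv*(T - T_amb) applied at the outer faces.
    T: temperature field [K] on a uniform grid with spacing dx [m].
    Stability of the explicit scheme requires dt <= dx**2 / (6 * alpha).
    """
    lap = np.zeros_like(T)
    lap[1:-1, 1:-1, 1:-1] = (
        T[2:, 1:-1, 1:-1] + T[:-2, 1:-1, 1:-1] +
        T[1:-1, 2:, 1:-1] + T[1:-1, :-2, 1:-1] +
        T[1:-1, 1:-1, 2:] + T[1:-1, 1:-1, :-2] -
        6.0 * T[1:-1, 1:-1, 1:-1]
    ) / dx ** 2
    T_new = T + dt * alpha * lap
    # Lumped convective loss on the outer cells (edge/corner cells are adjusted
    # more than once in this simplified sketch)
    for face in (T_new[0], T_new[-1], T_new[:, 0], T_new[:, -1],
                 T_new[:, :, 0], T_new[:, :, -1]):
        face -= dt * h_conv * (face - T_amb) / (rho_cp * dx)
    return T_new
```

Repeating such steps over component meshes, with measured boundary conditions and added radiation and fluid-stream terms, is the general pattern a vehicle-level thermal model of this kind builds on.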
