281 |
Mode switch for component-based multi-mode systemsYin, Hang January 2012 (has links)
Component-based software engineering is becoming a prominent solution to the development of complex embedded systems. Since it allows a system to be built from reusable and independently developed components, component-based development substantially facilitates the development of a complex embedded system and allows its complexity to be better managed. Meanwhile, partitioning system behavior into multiple operational modes is also an effective approach to reducing system complexity. Combining the component-based approach with the multi-mode approach, we get a component-based multi-mode system, for which a key issue is its mode switch handling. The mode switch of such a system corresponds to the joint mode switches of many hierarchically organized components. Such a mode switch is not trivial, as it amounts to coordinating the mode switches of different components that are independently developed. Since most existing approaches to mode switch handling assume that mode switch is a global event of the entire system, they cannot be easily applied to component-based multi-mode systems, where the mode switches of both the system and its individual components must be considered, and where components cannot be assumed to have global knowledge of the system. In this thesis, we present a mechanism---the Mode Switch Logic (MSL)---which provides an effective solution to mode switch in component-based multi-mode systems. MSL enables a multi-mode system to be developed in a component-based manner, including (1) a mode-aware component model proposed to suit the multi-mode context; (2) a mode mapping mechanism for the seamless composition of multi-mode components and their mode switch guidance; (3) a mode switch runtime mechanism which coordinates the mode switches of all related components so that the mode switch can be correctly and efficiently performed at the system level; and (4) a timing analysis for mode switches realized by MSL.
All the essential elements of MSL are additionally demonstrated by a case study. / ARROWS
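The mode mapping idea described above can be sketched in a few lines. All names here are illustrative inventions, not MSL's actual constructs: a composite component maps each of its own modes onto modes of its children, and a switch propagates down the hierarchy.

```python
# Hypothetical sketch of hierarchical mode-switch coordination (component and
# mode names are invented for illustration, not taken from MSL itself).

class Component:
    def __init__(self, name, mode):
        self.name = name
        self.mode = mode
        self.children = []          # sub-components
        self.mode_mapping = {}      # own mode -> {child name: child mode}

    def add_child(self, child, mapping):
        """mapping: {parent_mode: child_mode} for this child."""
        self.children.append(child)
        for parent_mode, child_mode in mapping.items():
            self.mode_mapping.setdefault(parent_mode, {})[child.name] = child_mode

    def switch_mode(self, new_mode):
        """Switch this component, then propagate to children via the mapping."""
        self.mode = new_mode
        for child in self.children:
            target = self.mode_mapping.get(new_mode, {}).get(child.name)
            if target is not None and target != child.mode:
                child.switch_mode(target)

# Example: a controller whose "degraded" mode forces its sensor into "low_rate".
ctrl = Component("controller", "normal")
sensor = Component("sensor", "high_rate")
ctrl.add_child(sensor, {"normal": "high_rate", "degraded": "low_rate"})
ctrl.switch_mode("degraded")
```

The point of the mode mapping is that the sensor never needs global knowledge: it only reacts to the mode its parent maps onto it.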
|
282 |
Zásada superficies solo cedit / Principle of superficies solo ceditSabaková, Ivana January 2017 (has links)
PRINCIPLE OF SUPERFICIES SOLO CEDIT This thesis focuses on the superficies solo cedit principle, which can be translated as "the surface yields to the ground". The principle originated in Roman law and has been carried into the legal systems of most European democratic countries to this day. According to this principle, everything firmly attached to the ground or grown on the land is a component part of the land. Plants and structures are therefore considered typical component parts of a tract of land. The owner of a tract of land simultaneously and automatically owns its component parts, and all dispositions regarding the tract of land must include them, as component parts are not separate objects of law. This thesis deals with the evolution and regulation of the superficies solo cedit principle in Roman law, as well as in the legal orders that have applied in the Czech territory up to the present. The main focus is on the analysis and comparison of previous legal regulations and the currently valid civil law, represented by Act No. 89/2012 Sb., the Civil Code. Close attention is given to the definition of things in the legal sense, the definition of immovable things and related legal concepts, and particularly to the right of superficies and other exceptions to the superficies solo cedit principle. In the first chapter,...
|
283 |
Predictive detection of epileptic seizures in EEG for reactive careValko, Andras, Homsi, Antoine January 2017 (has links)
It is estimated that 65 million people worldwide have epilepsy, and many of them have uncontrollable seizures even with the use of medication. A seizure occurs when the normal electrical activity of the brain is interrupted by sudden and unusually intense bursts of electrical energy, and these bursts can be observed and detected by the use of an electroencephalograph (EEG) machine. This work presents an algorithm that monitors subtle changes in scalp EEG characteristics to predict seizures. The algorithm is built to calibrate itself to every specific patient based on recorded data, and is computationally efficient enough for future on-line applications. The presented algorithm performs ICA-based artifact filtering and Lasso-based feature selection from a large array of statistical features. Classification is based on a neural network using Bayesian regularized backpropagation. The selected method was able to classify 4-second-long preictal segments with an average sensitivity of 99.53% and an average specificity of 99.9% when tested on 15 different patients from the CHB-MIT database.
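The feature-extraction stage the abstract mentions can be illustrated with a toy sketch. The features below are a small, invented subset of the kind of statistical features such pipelines compute, not the thesis's actual feature set:

```python
import statistics

def eeg_features(segment):
    """Toy statistical features of one EEG segment (illustrative subset only)."""
    diffs = [abs(b - a) for a, b in zip(segment, segment[1:])]
    return {
        "mean": statistics.fmean(segment),
        "std": statistics.pstdev(segment),
        "line_length": sum(diffs),   # total point-to-point variation
        "max_abs": max(abs(x) for x in segment),
    }

# A calm segment vs. a burst-like segment: bursts show far higher line length
# and spread, which is the kind of change a classifier can pick up on.
calm = [0.1, -0.1, 0.05, -0.05, 0.1, -0.1]
burst = [0.1, 2.0, -1.8, 2.2, -2.1, 1.9]
f_calm, f_burst = eeg_features(calm), eeg_features(burst)
```

In the thesis's pipeline, a Lasso step would then select the most predictive of such features before classification.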
|
284 |
The Impact of Latency Jitter on the Interpretation of P300 in the Assessment of Cognitive FunctionYu, Xiaoqian 16 June 2016 (has links)
When stimulus processing time varies in an oddball paradigm, the latency of the P300 will vary across trials. In an oddball task requiring difficult response selections, as the variation of stimulus processing time increases, so does the variation of the P300 latency, causing latency jitter in the measurement. Averaging the P300 across trials without adjusting for this latency jitter will lead to diminished P300 amplitude, resulting in inaccurate conclusions from the data. Verleger et al. (2014) reported a diminished P300 amplitude in a difficult oddball task that required subjects to make response selections among stimuli that are difficult to distinguish, but their work did not correct for any latency jitter observed within the sample. The current study replicated the easy and hard oddball tasks conducted in Verleger et al. (2014). Raw ERPs obtained from 16 subjects indicated a successful replication of the study. An examination of the behavioral data showed that there was substantial latency variation in the P300 during the hard oddball tasks, and a latency jitter correction was applied in the analysis. Results indicated that there was a significant increase in the amplitude of the P300 after latency jitter correction, and that this P300 amplitude did not differ significantly between easy and hard oddball tasks. These results suggest that a difficult decision requirement does not reduce the amplitude of the P300, and that latency jitter should be accounted for when analyzing data from tasks involving a difficult decision requirement.
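The effect the study corrects for can be shown with a toy numeric example (not the study's actual pipeline): trials whose peak latency jitters average to a diminished peak, while aligning each trial on its own peak before averaging restores the amplitude.

```python
# Toy demonstration of latency-jitter smearing and peak-alignment correction.

def average(trials):
    return [sum(col) / len(trials) for col in zip(*trials)]

def align_on_peak(trial, target_idx, length):
    """Shift the trial so its maximum lands at target_idx (zero-padded)."""
    shift = target_idx - trial.index(max(trial))
    shifted = [0.0] * length
    for i, v in enumerate(trial):
        j = i + shift
        if 0 <= j < length:
            shifted[j] = v
    return shifted

n = 9
trials = []
for lat in (2, 4, 6):              # jittered peak latencies across trials
    t = [0.0] * n
    t[lat] = 1.0                   # unit "P300" peak
    trials.append(t)

smeared = average(trials)          # naive average: peak diluted to 1/3
aligned = average([align_on_peak(t, 4, n) for t in trials])  # restored to 1.0
```

With three trials peaking at different latencies, the naive average peaks at 1/3 of the true amplitude, while the aligned average recovers the full unit peak.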
|
285 |
Structured interrelations of component architecturesJung, Georg January 1900 (has links)
Doctor of Philosophy / Department of Computing and Information Sciences / John M. Hatcliff / Software architectures—abstract interrelation models which decompose complex artifacts into modular functional units and specify the connections and relationships among them—have become an important factor in the development and maintenance of large scale, heterogeneous, information and computation systems. In system development, software architecture design has become a main starting point, and throughout the life-cycle of a system, conformance to the architecture is important to guarantee a system's integrity and consistency.
For an effective use of software architectures in initial development and ongoing maintenance, the interrelation models themselves have to be clear, consistent, well structured, and—in case substantial functionality has to be added, reduced, or changed at any stage of the life cycle—flexible and manipulable. Further, enforcing the conformance of a software artifact to its architecture is a non-trivial task. Implementation units need to be identifiable, and their association to the abstract constructs of the architecture has to be maintained. Finally, since software architectures can be employed at many different levels of abstraction, with some architectures describing systems that span multiple different computing platforms, associations have to be flexible and abstractions have to be general enough to capture all parts and precise enough to be useful. An efficient and widely used way to employ software architecture in practice is middleware-based component architectures. System development within this methodology relies on the presence of a service layer called middleware, which usually resides between the operating system (possibly spanning multiple operating systems on various platforms) and the application described by the architecture. The uniform set of logistic services provided by a middleware allows the communication and context requirements of the functional units, called components, to be expressed in terms of those services, and therefore more succinctly than without such a layer. Also, component development in the middleware context can focus on high-level functionality, since the low-level logistics is provided by the middleware.
While type systems have proved effective for enforcing structural constraints in programs and data structures, most architectural modeling frameworks include only weak notions of typing or rely on first-order logic constraint languages instead. Nevertheless, consistent, rigorous use of typing can seamlessly enforce a wide range of constraints crucial for the structural integrity of architectures and the computation systems specified by them, without the steep learning curve associated with first-order logic. Type systems also scale better than first-order logic, both in usability and legibility and in computational complexity. This thesis describes component-oriented architecture modeling with CADENA and introduces the CADENA Architecture Language with Meta-modeling (CALM). CALM uses multi-level type systems to specify complex interaction models and enforce a variety of structural properties and consistency constraints relevant for the development of large-scale component-based systems. Further, CALM generalizes the notion of middleware-based architectures and uniformly captures and maintains complex interrelated architectures integrated on multiple, differing, middleware platforms. CADENA is a robust and extensible tool based on the concepts and notions of CALM that has been used to specify a number of industrial-strength component models and applied in multiple industrial research projects on model-driven development and software product lines.
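The role of typing at the architecture level can be illustrated with a minimal sketch. This is not CALM's actual notation, only an invented example of how a type check over component ports rejects ill-formed connections before any code runs:

```python
# Illustrative sketch (invented, not CALM): a type discipline over component
# ports rejects ill-formed connections at the architectural level.

class Port:
    def __init__(self, name, datatype, direction):
        self.name, self.datatype, self.direction = name, datatype, direction

def connect(out_port, in_port):
    """Allow a connection only from an output to an input of the same type."""
    if out_port.direction != "out" or in_port.direction != "in":
        raise TypeError("connection must go from an output to an input")
    if out_port.datatype != in_port.datatype:
        raise TypeError(
            f"type mismatch: {out_port.datatype} -> {in_port.datatype}")
    return (out_port.name, in_port.name)

# Well-typed connection succeeds; a float -> int connection would raise.
link = connect(Port("temp_out", "float", "out"), Port("temp_in", "float", "in"))
```

A first-order-logic constraint language could express the same rule, but as the thesis argues, a type check like this is simpler to state and cheaper to enforce.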
|
286 |
An Exploration of Adolescent Obesity DeterminantsSmith, Anastasia King 13 May 2016 (has links)
In 2010, approximately two-thirds of adults and one-fifth of the adolescent population in the United States were considered either overweight or obese, giving the United States the highest per capita obesity rate among all OECD countries. A considerable body of literature regarding health behavior, health outcomes, and public policy exists on what the Centers for Disease Control and Prevention considers an obesity epidemic. In response to the growing problem of childhood obesity, Congress passed the Child Nutrition and WIC Reauthorization Act of 2004 (CNRA), which required that schools participating in the National School Lunch Program and/or School Breakfast Program have wellness policies on file.
The purpose of this research is to provide additional insight into the origin of the geographic variation in adolescent obesity rates between the U.S. states. Previous research has looked at differences in built environments, maternal employment, food prices, agriculture policies, and technology factors in an effort to explain the variation in adolescent obesity prevalence. This dissertation contributes to the literature by examining the hypothesis that state-level school wellness policies also played a role in determining the rates of childhood obesity. Using School Health Policies and Practices Study (SHPPS) surveys from 2000 to 2012, I derived a state-level school wellness policy measure. This measure, together with Youth Risk Behavior Surveillance survey data on adolescent BMI, was used to measure the effect of the wellness policy mandate on adolescent obesity prevalence. Several models were applied, first to demonstrate that an adolescent's state of residence is indeed related to BMI trends, and then to investigate various determinants of adolescent obesity, including the primary variable of interest, state school wellness policies.
The results of this research provide evidence of a statistically significant, though very small, positive effect of school wellness policies on adolescent BMI, contrary to my hypothesis. Dominance analysis showed that of the four wellness policy factors considered in the principal component composition of the wellness policy measure, policy components meeting state requirements, rather than those meeting health-screening criteria, state recommendations, or national standards, were most important in explaining the overall variance of the regression model. Interestingly, the public school attendance rate itself was also associated with a substantial decrease in adolescent BMI.
Understanding the determinants of adolescent obesity and how to effect change in the rising trend is a national concern. Obese adolescents are at significant risk of becoming obese adults and previous research has already shown the high economic costs associated with adult obesity and its comorbidities. Policies implemented in school, where adolescents consume a considerable portion of their daily calories and participate in physical activity, can help to build healthy habits that have the potential to lower the probability of an adolescent becoming an obese adult. Over time, a healthier adult population may result in lower economic costs associated with medical care and lost productivity.
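As a hedged sketch of how a composite policy measure can be built from several survey items: the thesis uses principal-component weights, while the simpler z-score averaging below, with invented item names and data, only illustrates the general idea of combining standardized items into one index per state.

```python
import statistics

def zscores(values):
    """Standardize one survey item across states."""
    mu, sd = statistics.fmean(values), statistics.pstdev(values)
    return [(v - mu) / sd for v in values]

# Rows: hypothetical wellness-policy items (0-1 coverage); columns: 3 states.
items = {
    "nutrition_rules": [0.2, 0.5, 0.9],
    "pe_minutes":      [0.1, 0.6, 0.8],
    "vending_limits":  [0.3, 0.4, 0.9],
}
cols = [zscores(v) for v in items.values()]
index = [statistics.fmean(state) for state in zip(*cols)]  # one score per state
```

A principal-component index would replace the equal weights with the loadings of the first component, but both yield a single state-level score suitable for regression.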
|
287 |
Advanced process monitoring using wavelets and non-linear principal component analysisFourie, Steven 12 January 2007 (has links)
The aim of this study was to propose a nonlinear multiscale principal component analysis (NLMSPCA) methodology for process monitoring and fault detection based upon multilevel wavelet decomposition and nonlinear principal component analysis via an input-training neural network. Prior to assessing the capabilities of the monitoring scheme on a nonlinear industrial process, the data is first pre-processed to remove heavy noise and significant spikes through wavelet thresholding. The thresholded wavelet coefficients are used to reconstruct the thresholded details and approximations. The significant details and approximations are used as inputs for the linear and nonlinear PCA algorithms in order to construct detail and approximation conformance models. At the same time, non-thresholded details and approximations are reconstructed and combined; these are used in the same way as the thresholded details and approximations to construct a combined conformance model that accounts for noise and outliers. Performance monitoring charts with non-parametric control limits are then applied to identify the occurrence of non-conforming operation, prior to interrogating differential contribution plots to help identify the potential source of the fault. A novel summary display is used to present the information contained in bivariate graphs in order to facilitate global visualization. Positive results were achieved. / Dissertation (M Eng (Control Engineering))--University of Pretoria, 2007. / Chemical Engineering / unrestricted
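The wavelet-thresholding pre-processing step can be illustrated with a minimal one-level Haar sketch; the thesis uses multilevel decomposition combined with PCA conformance models, which this toy example does not attempt:

```python
# One-level Haar transform with soft thresholding of the detail coefficients:
# small, noise-like details are shrunk to zero before reconstruction.

def haar_forward(x):
    """One-level Haar transform: approximation and detail coefficients."""
    a = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    d = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return a, d

def soft_threshold(coeffs, t):
    """Shrink coefficients toward zero; kills small (noise-like) details."""
    return [max(abs(c) - t, 0.0) * (1 if c >= 0 else -1) for c in coeffs]

def haar_inverse(a, d):
    out = []
    for ai, di in zip(a, d):
        out += [ai + di, ai - di]
    return out

signal = [1.0, 1.1, 0.9, 1.0, 5.0, 5.1, 4.9, 5.0]   # step signal + small noise
a, d = haar_forward(signal)
denoised = haar_inverse(a, soft_threshold(d, 0.1))   # small details removed
```

In the NLMSPCA scheme, the thresholded details and approximations from such a decomposition (over several levels) become the inputs to the PCA conformance models.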
|
288 |
Evaluating Multi-level Risk Factors for Malaria and Arboviral Infections in Regions of TanzaniaHomenauth, Esha January 2016 (has links)
Vector-borne diseases, such as those transmitted by mosquitoes, pose a significant public health concern in many countries worldwide. In this thesis, I explored the role of a number of risk factors defined at multiple scales on vector-borne disease prevalence, focusing on malaria and arboviral infections in several regions of North-Eastern Tanzania, with the principal aim of improving the overall diagnosis of febrile illness in this region.
First, I investigated the influence of household wealth on the prevalence of malaria and arboviral infections using principal component analysis (PCA), and then described the methodological challenges associated with this statistical technique when used to compute indices from smaller datasets. I then employed a multilevel modelling approach to simultaneously incorporate household-level anthropogenic factors and village-level environmental characteristics to investigate key determinants of Anopheles malaria vector density among rural households. These analyses provide methodologically rigorous approaches to studying vector-borne diseases at a very fine scale and also have significant public health relevance, as the research findings can assist in guiding policy decisions regarding surveillance efforts and inform where and when to prioritize interventions.
|
289 |
Constructing component-based systems directly from requirements using incremental compositionNordin, Azlin January 2013 (has links)
In software engineering, system construction typically starts from a requirements specification that has been engineered from raw requirements in a natural language. The specification is used to derive intermediate requirements models such as structured or object-oriented models. Throughout the stages of system construction, these artefacts will be used as reference models. In general, in order to derive a design specification out of the requirements, the entire set of requirements specifications has to be analysed. Such models at best only approximate the raw requirements since these design models are derived as a result of the abstraction process according to the chosen software development methodology, and subjected to the expertise, intuition, judgment and experiences of the analysts or designers of the system. These abstraction models require the analysts to elicit all useful information from the requirements, and there is a potential risk that some information may be lost in the process of model construction. As the use of natural language requirements in system construction is inevitable, the central focus of this study was to use requirements stated in natural language in contrast to any other requirements representation (e.g. modelling artefact). In this thesis, an approach that avoids intermediate requirements models, and maps natural language requirements directly into architectural constructs, and thus minimises information loss during the model construction process, has been defined. This approach has been grounded on the adoption of a component model that supports incremental composition. Incremental composition allows a system to be constructed piece by piece. By mapping a raw requirement to elements of the component model, a partial architecture that satisfies that requirement is constructed. 
Consequently, by iterating this process for all the requirements, one at a time, the incremental composition to build the system piece by piece directly from the requirements can be achieved.
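The iteration described above can be sketched as follows; the component model, requirement texts, and element names are invented for illustration, not taken from the thesis:

```python
# Invented sketch: each requirement maps to a small partial architecture
# (components plus connections), and the system architecture grows by
# composing one requirement's partial architecture at a time.

class Architecture:
    def __init__(self):
        self.components = set()
        self.connections = set()

    def compose(self, components, connections):
        """Incrementally merge one requirement's partial architecture."""
        self.components |= set(components)
        self.connections |= set(connections)

requirements = [
    ("R1: the system shall read sensor data",
     {"Sensor", "Reader"}, {("Sensor", "Reader")}),
    ("R2: the system shall log readings",
     {"Reader", "Logger"}, {("Reader", "Logger")}),
]

arch = Architecture()
for _text, comps, conns in requirements:
    arch.compose(comps, conns)      # one requirement at a time
```

Because composition is set union here, shared elements such as `Reader` appear once, and each requirement's contribution remains traceable to its text.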
|
290 |
Two-dimensional landmark analysis of Spinocyrtid brachiopods of Euramerica during the GivetianLayng, Alexander Patrick 01 August 2017 (has links)
Recent inquiry into the nomenclature of several species within Spinocyrtia has led to questions concerning name applicability and validity, particularly whether Delthyris granulosa and Spinocyrtia (Spirifer) granulosa are synonymous. This study utilized two-dimensional outline landmark analysis, a form of geometric morphometric analysis, to evaluate interspecific variation among these species. I took over a thousand photographs of over a hundred specimens of brachiopods belonging to the family Spinocyrtiidae. Ninety-six specimens originated from the Givetian outcrop belt of New York state, three were from northwestern Ohio, one was a Canadian specimen, and one was a German specimen. The results from these analyses indicate that the morphospaces of Spinocyrtia (Spirifer) congesta, S. (Spirifer) granulosa, and S?. (Spirifer) marcyi are statistically (p < 0.05) distinct from one another.
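The normalization underlying landmark-based shape comparison can be sketched as follows (a toy example, not the thesis's actual morphometric software): each 2-D landmark configuration is translated to its centroid and scaled to unit centroid size, so that same-shaped specimens of different sizes coincide.

```python
import math

def normalize(landmarks):
    """Center a 2-D landmark configuration and scale it to unit centroid size."""
    n = len(landmarks)
    cx = sum(x for x, _ in landmarks) / n
    cy = sum(y for _, y in landmarks) / n
    centered = [(x - cx, y - cy) for x, y in landmarks]
    size = math.sqrt(sum(x * x + y * y for x, y in centered))
    return [(x / size, y / size) for x, y in centered]

def shape_distance(a, b):
    """Root summed squared differences between matched, normalized landmarks."""
    return math.sqrt(sum((ax - bx) ** 2 + (ay - by) ** 2
                         for (ax, ay), (bx, by) in zip(normalize(a), normalize(b))))

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
big_square = [(0, 0), (2, 0), (2, 2), (0, 2)]   # same shape, larger specimen
rect = [(0, 0), (3, 0), (3, 1), (0, 1)]         # genuinely different shape
```

After normalization, the square and the scaled square have zero shape distance, while the rectangle remains distinct; a full analysis would also align rotations and then compare species morphospaces statistically.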
|