281
Land cover study in Iowa: analysis of classification methodology and its impact on scale, accuracy, and landscape metrics
Porter, Sarah Ann, 01 July 2011 (has links)
For landscapes dominated by agriculture, land cover plays an important role in the balance between anthropogenic and natural forces. Therefore, the objective of this thesis is to describe two different methodologies that have been implemented to create high-resolution land cover classifications in a predominantly agricultural landscape. First, an object-based segmentation approach will be presented, which was applied to historic, high-resolution, panchromatic aerial photography. Second, a traditional per-pixel technique was applied to multi-temporal, multispectral, high-resolution aerial photography, in combination with light detection and ranging (LIDAR) and independent component analysis (ICA). A critical analysis of each approach will be discussed in detail, as well as the ability of each methodology to generate landscape metrics that can accurately characterize the quality of the landscape. This will be done through the comparison of various landscape metrics derived from the different classification approaches, with the goal of enhancing the literature concerning how these metrics vary across methodologies and across scales. This is a familiar problem encountered when analyzing land cover datasets over time, which are often at different scales or generated using different methodologies. The diversity of remotely sensed imagery, including varying spatial resolutions, landscapes, and extents, as well as the wide range of spatial metrics that can be created, has generated concern about the integrity of these metrics when used to make inferences about landscape quality. Finally, inferences will be made about land cover and land cover change dynamics for the state of Iowa based on insight gained throughout the process.
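As an illustration of the kind of class-level landscape metrics compared across classifications and scales, the sketch below computes proportion of landscape, patch count, and patch density from a classified raster. It uses standard metric definitions only; it is not the workflow from the thesis, and the class codes and cell size are hypothetical.

```python
import numpy as np
from scipy import ndimage

def class_metrics(classified, class_value, cell_size=1.0):
    """Simple class-level landscape metrics for one land cover class.

    classified : 2-D integer array of class codes
    class_value: the class code to evaluate (e.g., 1 = row crop, hypothetical)
    cell_size  : raster resolution in map units (m)
    """
    mask = classified == class_value
    total_area = classified.size * cell_size ** 2
    class_area = mask.sum() * cell_size ** 2

    # Label 8-connected patches of the class to count them.
    structure = np.ones((3, 3), dtype=int)
    _, n_patches = ndimage.label(mask, structure=structure)

    return {
        "pland_percent": 100.0 * class_area / total_area,      # proportion of landscape
        "number_of_patches": n_patches,
        "patch_density_per_ha": n_patches / (total_area / 10_000.0),
    }

# Toy 1 m resolution raster: 0 = other, 1 = row crop, 2 = grassland (hypothetical codes).
raster = np.array([[1, 1, 0, 2],
                   [1, 0, 0, 2],
                   [0, 0, 1, 1],
                   [2, 2, 1, 1]])
print(class_metrics(raster, class_value=1, cell_size=1.0))
```

Because metrics such as patch density are sensitive to cell size, recomputing them after resampling the raster is one simple way to examine how they vary across scales.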
282
Rehabilitating Asymmetric Gait Using Asymmetry
Ramakrishnan, Tyagi, 07 November 2017 (has links)
Human gait is a complex process that involves the coordination of the central nervous and muscular systems. A disruption to either system results in the impairment of a person's ability to walk. Impairments can be caused by neurological disorders such as stroke and physical conditions such as amputation. There is no standardized method to quantitatively assess the gait asymmetry of affected subjects. The purpose of this research is to understand the fundamental aspects of asymmetrical effects on the human body and improve rehabilitation techniques and devices. This research takes an interdisciplinary approach to address the limitations of current rehabilitation methodologies.
The goal of my doctoral research is to understand the fundamental effects of asymmetry caused by physical and neurological impairments. The methods discussed in this document help in developing better solutions to rehabilitate impaired individuals' gait. I studied four major hypotheses regarding gait asymmetry. The first hypothesis is the potential of asymmetric systems to have symmetric output. The second hypothesis is that a method that incorporates a wider range of gait parameter asymmetries can be used as a measure for gait rehabilitation. The third hypothesis is that individuals can visually identify subtle gait asymmetries. The final hypothesis is to establish the relationship between gait quality and function. Current approaches to rehabilitating impaired gait typically focus on achieving the same symmetric gait as an able-body person. This cannot work because an impaired person is inherently asymmetric, and forcing them to walk symmetrically causes them to adopt patterns that are not beneficial long term. Instead, it is more prudent to embrace the asymmetry of the condition and work to minimize asymmetry in the specific gait parameters that may cause more harm over the long run. The combined gait asymmetry metric (CGAM) provides the necessary means to study the effect of the gait parameters: the data are normalized so that each parameter's effect is balanced, and the metric is weighted towards parameters that are more asymmetric. The metric is also designed to combine spatial, temporal, kinematic, and kinetic gait parameter asymmetries. It can also combine subsets of the different gait parameters to provide a more thorough analysis. CGAM will help define quantitative thresholds for achievable balanced overall gait asymmetry.
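As a rough illustration of the idea of a combined asymmetry score, the sketch below computes each gait parameter's left/right asymmetry and folds the per-parameter asymmetries into a single weighted number. The asymmetry formula, the weighted root-mean-square combination, and the parameter values are assumptions for illustration only, not the CGAM formulation from the dissertation.

```python
import numpy as np

def parameter_asymmetry(left, right):
    """Normalized asymmetry of one gait parameter (0 = perfectly symmetric)."""
    return abs(left - right) / (0.5 * (abs(left) + abs(right)) + 1e-12)

def combined_asymmetry(parameters, weights=None):
    """Combine per-parameter asymmetries into a single score.

    parameters : dict of name -> (left value, right value)
    weights    : optional dict of name -> weight; defaults to equal weights
    """
    names = list(parameters)
    asym = np.array([parameter_asymmetry(*parameters[n]) for n in names])
    w = np.ones(len(names)) if weights is None else np.array([weights[n] for n in names])
    w = w / w.sum()                               # normalize weights
    return float(np.sqrt(np.sum(w * asym ** 2)))  # weighted RMS of asymmetries

# Hypothetical spatio-temporal and kinetic values for one trial.
trial = {
    "step_length_m":   (0.62, 0.55),
    "swing_time_s":    (0.42, 0.47),
    "peak_vertical_N": (780.0, 655.0),
}
print(round(combined_asymmetry(trial), 3))
```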
The studies in this dissertation, conducted on able-body and impaired subjects, provide a better understanding of some fundamental aspects of asymmetry in human gait. Able-body subjects tested devices that aim to make an individual's gait more asymmetric. These perturbations included a prosthetic simulator, a stroke simulator, the addition of distal mass, and leg length alterations. Six able-body subjects and one amputee participated in the experiment that studied the effect of asymmetric knee height. The results, which consisted of analyses of individual gait parameters and CGAM scores, revealed evidence of an overall reduction of asymmetry in gait both for able-body subjects on prosthetic simulators and for the transfemoral amputee. The transfemoral amputee also walked with a combination of distal mass and lowered knee height. Although this configuration showed better symmetry, it is detrimental in terms of energy costs. Analyzing the gait data with the stroke simulator showed that the subject's gait does undergo alterations in terms of overall gait asymmetry. The distal mass and leg length alteration study revealed some significant findings that are also reflected in the prosthetic study with distal mass. A leg length discrepancy (LLD) or a change of limb mass can result in asymmetric gait patterns. Although adding mass and LLD have been studied separately, this research studies how gait patterns change as a result of asymmetrically altering both leg length and mass at a leg's distal end. Spatio-temporal and kinetic gait measures are used to study the combined asymmetric effects of placing LLD and mass on the opposite and same side. There were statistically significant differences for the amount of mass and leg length added for all five parameters. When LLD is added to the longer leg, the temporal and kinetic gait parameters of the shorter limb and the altered limb's spatial parameter become more asymmetric. Contrary to the hypothesis, there was no significant interaction between the amount of mass and leg length added. There were cases in all perturbations where a combination of mass and LLD made a gait parameter more symmetric than a single effect. These cases exhibit the potential for configurations with lower overall asymmetries, in which each parameter has a slight asymmetry, as opposed to driving one parameter to symmetry and other parameters to a larger asymmetry. CGAM analysis of the results revealed that the addition of distal mass contributes more towards overall asymmetry than LLD. Analyzing 11 gait parameters for LLD and mass on the same side showed that overall asymmetry decreased for the combination of small LLD and mass. This is consistent with the findings from analyzing five individual gait parameters.
Impaired subjects included individuals with stroke and amputees. The clinical trials for individuals with stroke involved training with the Gait Enhancing Mobile Shoe (GEMS), which provides an asymmetric effect on the subject's step length and time. Training with the GEMS showed improvement in clinical measures such as the timed up and go (TUG), the six minute walk test (6MWT), and gait velocity. The subjects also showed lower step length asymmetry, as intended by the GEMS. The ground reaction forces became more asymmetric as the spatial and temporal parameters became more symmetric. This phenomenon shows evidence that when an individual with stroke is corrected for spatial and temporal symmetry, it comes at the expense of kinetic symmetry. The CGAM scores reflected trends similar to those of spatial and temporal symmetry, and the r2 correlation with the gait parameters showed that double limb support asymmetry has no correlation with CGAM, while ground reaction force asymmetry has a weak correlation. Step length, step time, and swing time showed high correlation to CGAM. I also found the r2 correlation between the clinical measures and the CGAM scores. The CGAM scores were moderately correlated to 6MWT and gait velocity but had a weak correlation with TUG. CGAM has a positive correlation with TUG and a negative correlation with 6MWT and gait velocity. This gives some validation to CGAM as a potential metric that can be used to evaluate gait patterns based on their asymmetries.
Transfemoral amputees were tested for their gait with varied prosthetic knee heights to study the asymmetrical effects, and they were also trained on a split-belt treadmill. Asymmetric knee heights showed improvement in multiple gait parameters such as step length, vertical, propulsive, and braking force asymmetry. They also decreased hip and ankle angle asymmetries. However, these improvements did lead other parameters to become more asymmetric. The CGAM scores reflect this trade-off while still showing overall improvement. Although the lowest knee height showed improvement, the input from the amputee suggested that the quality of gait decreased with the lowest knee height. These exploratory results did show that a slightly lower knee height may not affect the quality of gait but may provide better overall symmetry. Another exploratory study with split-belt treadmill training, similar to the protocol followed for individuals with stroke, showed definitive improvement in double limb support, swing time, and step length and time symmetry. This was also reflected in the post-training improvements seen in the CGAM scores. I found the r2 correlation of CGAM with the gait parameters, including gait velocity. Step length and swing time show consistent correlation to CGAM for individual subjects and for all the data combined. Gait velocity shows a moderate correlation to CGAM for one subject and a high correlation for the other. However, the combined gait velocity data do not have any correlation with CGAM. These results show that CGAM can successfully represent the overall gait parameter asymmetry. The trends seen in the gait parameters are closely reflected in the CGAM scores.
This research combines the study of asymmetry with people's perception of human gait asymmetry, which will help in estimating the thresholds for perceivable asymmetrical changes to gait. Sixteen videos were generated using motion capture data and the Unity game engine. The videos were chosen to represent the largest variation of gait asymmetries. Some videos were also chosen based on CGAM values that were similar but had large variation in the underlying gait parameters. The dataset consisted of the results of the perturbation experiments on able-body subjects and of the asymmetric knee height prosthesis on the transfemoral amputee. The videos were rated by subjects on a seven-point Likert scale, from 7 (normal) to 1 (abnormal). Thirty-one subjects took part in the experiment, of which only 22 subjects' data were used because they rated at least 3 videos. The results show that the subjects were able to differentiate asymmetric gait with perturbations from able-body gait without perturbation at a self-selected speed. An r2 correlation analysis showed that hip angle had a mild correlation to the Likert scale rating of the 16 different gait patterns. Multivariate linear regression analysis with a linear model showed significant contributions of ankle and hip angles and of vertical, propulsive, and braking forces. It is interesting that the majority of parameters that showed significance are not perceivable visually. Ankle and hip angles are visually perceivable, and this significance revealed that subjects seemed to perceive asymmetric ankle and hip angles as abnormal. However, the subjects did not perceive asymmetric knee angles as completely abnormal, with evidence of no significance, no correlation, and neutral Likert ratings for gait patterns that perturbed knee angles.
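A minimal sketch of the kind of analysis described here: regressing mean Likert ratings of the sixteen videos on a few gait parameter asymmetries with ordinary least squares. The data are synthetic placeholders, and the predictor set is an assumption for illustration, not the study's actual measurements.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic stand-ins: one row per video, columns are parameter asymmetries.
n_videos = 16
hip_asym     = rng.uniform(0.0, 0.3, n_videos)
ankle_asym   = rng.uniform(0.0, 0.3, n_videos)
vert_force_a = rng.uniform(0.0, 0.2, n_videos)

# Synthetic mean Likert rating per video (7 = normal, 1 = abnormal).
rating = 7 - 8 * hip_asym - 6 * ankle_asym + rng.normal(0, 0.3, n_videos)

X = sm.add_constant(np.column_stack([hip_asym, ankle_asym, vert_force_a]))
fit = sm.OLS(rating, X).fit()

print(fit.rsquared)   # overall r^2 of the linear model
print(fit.pvalues)    # per-predictor significance
```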
283
Sustainability at U.S. Urban Water Utilities: A Framework to Assess Key Attributes
Ries, Matthew Paul, 22 June 2016 (has links)
Urban water utilities in the United States face challenges due to a combination of external drivers. These include urbanization and population growth, which are stressing a system of aging infrastructure. Compliance with increasing regulations is also a challenge in a fiscally-constrained economic environment. A changing climate threatens infrastructure and past assumptions for water supply and quality. Urban utilities provide clean water and sanitation services to over 80% of the country’s population and its industrial centers. Therefore, the sustainability of these water utilities is crucial to the country’s and the public’s well-being.
New operating models are emerging for a “utility of the future.” Future utilities will recover resources, reduce their overall environmental impact, partner in the local economy, and deliver watershed-wide benefits to improve quality of life. These are all elements of a sustainable utility, but the sector has not agreed upon an applicable definition of sustainability, which intuitively incorporates an inter-generational approach to utility operations. For the purposes of this research, a sustainable utility is defined as one that will provide its crucial services for current and future generations, protect public and environmental health, and enable economic growth, all while minimizing resource consumption. Previous research provided little guidance on the most important sustainable practices for U.S. urban water utilities or the key attributes of those utilities that enable the shift toward sustainability. Additionally, the practice of sustainability measurement, and the closely related practice of performance measurement, have not been widely adopted in the U.S. water sector.
This research program addressed the challenge of providing guidance on, and measurement of, sustainability by developing a framework to quickly and quantitatively assess a utility’s sustainability and key organizational attributes. A mixed methods approach to this research used qualitative and quantitative methodologies. The approach utilized accepted anthropological methods to assess engineering and business concepts at water utilities. Data originated from semi-structured interviews of an external advisory committee of 12 widely-recognized, progressive, U.S. water utility leaders along with online surveys of water utility professionals.
The analyzed data revealed the most important sustainable practices for sustainable utilities and organizational attributes that enable the shift toward sustainable operations. Practices are actionable, quantitative, and in some cases, unique to the water sector. Attributes are generally qualitative; largely controlled by internal decisions and actions; and influence a utility’s ability to operate sustainably. Datasets for sustainable practices and organizational attributes were generated using the techniques of discourse analysis on the semi-structured interview transcripts and freelisting on the online survey results. Top results from each dataset were cross-compared to generate the final, consolidated list of top practices and attributes.
A sustainability index was developed from the top eight sustainable practices, measured via a total of 14 indicators. Indices were tailored to water, wastewater, and combined utilities. The top sustainable practices were: Education and Communication; Financial Management; Green Infrastructure; Habitat/Watershed Protection; Long-term Resource Plan; Resource Recovery; and Water Conservation. These eight practices provided sufficient coverage of the economic, social, environmental, and infrastructure components of the triple bottom line-plus concept used to frame sustainability for this research.
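As a rough illustration of how an index of this kind can be assembled, the sketch below scores each indicator on a common scale, averages indicators within a practice, and averages practices into a single 0-100 index. The practice names follow the list above; the indicator scores, scaling, and equal weighting are assumptions for illustration, not the scoring guidance developed in this research.

```python
# Hypothetical self-assessment scores (1-5 scale) per indicator, grouped by practice.
scores = {
    "Education and Communication":  [3, 3],
    "Financial Management":         [4, 3],
    "Green Infrastructure":         [2],
    "Habitat/Watershed Protection": [3, 4],
    "Long-term Resource Plan":      [5],
    "Resource Recovery":            [2, 2],
    "Water Conservation":           [4],
}

def sustainability_index(scores, scale_max=5.0):
    """Average indicator scores within each practice, then average practices,
    and express the result as a 0-100 index."""
    practice_scores = {p: sum(v) / len(v) for p, v in scores.items()}
    index = 100.0 * sum(practice_scores.values()) / (len(practice_scores) * scale_max)
    return practice_scores, round(index, 1)

per_practice, index = sustainability_index(scores)
print(per_practice)
print("Sustainability index:", index)
```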
This research also established the top six organizational attributes that enable the shift toward sustainability. These attributes were: Board Support / Political Will; Flexible Staff; Innovative Culture; Leadership; Organizational Commitment; and Staff Training / Development. These six attributes were assessed via a total of seven indicators, with guidance and scaling similar to the practices for ease of use by the end user.
Current sustainability and performance measurement frameworks were analyzed for indicators and measurement approaches that matched the top practices and attributes. Some of the practices and only one of the six attributes matched an existing framework. When there was a match, the existing assessment was adopted for ease of use. In other cases, new indicators, guidance, and scaling (for assessment) were developed. Practices and attributes without a match suggest that these aspects of sustainable utilities are relatively new to the sector, or at least that measurement of these practices and attributes is not widespread.
The practices and attributes were combined into the final framework, a survey tool, which was pilot tested with three water utilities. The pilot testing demonstrated that the survey was comprehensive, yet at the same time, concise enough that it could be completed in under two hours by a limited number of utility staff. The application of this framework to a representative sample of U.S. urban water utilities can generate data to establish which attributes correlate to sustainable utilities. This will help utilities focus their limited resources on attributes which are shown to enable the shift toward sustainability.
284
Early Stratification of Gestational Diabetes Mellitus (GDM) by building and evaluating machine learning models
Sharma, Vibhor, January 2020 (has links)
Gestational Diabetes Mellitus (GDM), a condition involving abnormal levels of glucose in the blood plasma, has seen a rapid surge amongst gestating mothers across different regions and ethnicities around the world. The current method of screening and diagnosing GDM is restricted to the Oral Glucose Tolerance Test (OGTT). With the advent of machine learning algorithms, healthcare has seen a surge of machine learning methods for disease diagnosis, which are increasingly being employed in clinical settings. Yet in the area of GDM there has not been widespread utilization of these algorithms to generate multi-parametric diagnostic models to aid clinicians in diagnosing the condition. In the literature there is an evident scarcity of applications of machine learning algorithms for GDM diagnosis; they have been limited to the proposed use of some very simple algorithms such as logistic regression. Hence, we have attempted to address this research gap by employing a wide array of machine learning algorithms, known to be effective for binary classification, for early GDM classification amongst gestating mothers. This can aid clinicians in the early diagnosis of GDM and will offer chances to mitigate the adverse outcomes related to GDM among gestating mothers and their progeny. We set up an empirical study to look into the performance of different machine learning algorithms used specifically for the task of GDM classification. These algorithms were trained on a set of predictor variables chosen by experts. The results were then compared with the existing machine learning methods in the literature for GDM classification, based on a set of performance metrics. Our models could not outperform the already proposed machine learning models for GDM classification. We attribute this to our chosen set of predictor variables and to the under-reporting of various performance metrics, such as precision, in the existing literature, leading to a lack of informed comparison.
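A minimal sketch of the kind of empirical comparison described above: cross-validating a few binary classifiers with scikit-learn on expert-chosen predictor variables and reporting precision alongside other metrics. The features and data here are synthetic placeholders, not the study's dataset or its final choice of models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_validate
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Synthetic stand-ins for expert-chosen predictors (e.g., age, BMI, fasting glucose).
n = 500
X = rng.normal(size=(n, 5))
y = (X[:, 0] + 0.8 * X[:, 2] + rng.normal(scale=1.0, size=n) > 1.0).astype(int)

models = {
    "logistic_regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

for name, model in models.items():
    cv = cross_validate(model, X, y, cv=5,
                        scoring=["accuracy", "precision", "recall", "roc_auc"])
    summary = {m: round(cv[f"test_{m}"].mean(), 3)
               for m in ["accuracy", "precision", "recall", "roc_auc"]}
    print(name, summary)
```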
285
Loan contracting and the credit cycle
Jericevic, Sandra Lynne, Unknown Date (has links)
The performance of financial institutions is significantly influenced by the actions of loan officers. The process by which lending decisions are made is therefore of critical interest to management, shareholders, and regulators alike. Indeed, the drain on bank capital that has often accompanied credit quality problems in the past has encouraged the search for new approaches towards the management of lending and related activities. / This thesis seeks to examine whether existing governance and incentive techniques found in banks are sufficiently comprehensive in guiding loan decision-making. In the context of lending to the corporate sector, the study investigates the endogenous and exogenous influences surrounding the lending role, and assesses the implications for how loan officers are monitored, evaluated, and motivated to act in a financial institution’s best interests. / By first developing an expanded model that conceptualizes the loan offer function, and then grounding this framework within a business cycle context, the study demonstrates the potential for governance and reward systems, that are constant through time, to have variable outcomes/effects. Support for this hypothesis is provided based on publicly available financial market information and other material gathered from private sources. A proposal is then advanced for the development of a management information system that identifies changes in credit standards being applied, thereby enabling banks to benchmark and influence loan officer performance in the context of cyclically changing attitudes to risk and the effects on negotiating power.
287
Provider recommendation based on client-perceived performance
Thio, Niko, January 2009 (has links)
In recent years the service-oriented design paradigm has enabled applications to be built by incorporating third-party services. With the increasing popularity of this new paradigm, many companies and organizations have started to adopt this technology, which has resulted in an increase in the number and variety of third-party providers. With the vast improvement of global networking infrastructure, a large number of providers offer their services to worldwide clients. As a result, clients are often presented with a number of providers that offer services with the same or similar functionalities but differ in terms of non-functional attributes (or Quality of Service, QoS), such as performance. In this environment, the role of provider recommendation, assisting clients in choosing the provider that meets their QoS requirements, has become more important. / In this thesis we focus on provider recommendation based on one of the most important QoS attributes: performance. Specifically, we investigate client-perceived performance, which is the application-level performance measured at the client side every time the client invokes the service. This performance metric has the advantage of accurately representing client experience, compared to the widely used server-side metrics in current frameworks (e.g., Service Level Agreements, or SLAs, in the Web Services context). As a result, provider recommendation based on this metric will be favourable from the client’s point of view. / In this thesis we address two key research challenges related to provider recommendation based on client-perceived performance: performance assessment and performance prediction. We begin by identifying heterogeneity factors that affect client-perceived performance among clients in a global Internet environment. We then perform extensive real-world experiments to evaluate the significance of each factor for client-perceived performance. / From our findings on heterogeneity factors, we then develop a performance estimation technique to address performance assessment for cases where direct measurements are unavailable. This technique is based on the concept of generalization, i.e., estimating performance based on the measurements gathered by similar clients. A two-stage grouping scheme based on the heterogeneity factors we identified earlier is proposed to address the problem of determining client similarity. We then develop an estimation algorithm and validate it using synthetic data as well as real-world datasets. / With regard to performance prediction, we focus on the medium-term prediction aspect to address the needs of emerging technology requirements: distinguishing providers based on medium-term (e.g. one to seven days) performance. Such applications are found when providers require a subscription from their clients to access the service. Another situation where medium-term prediction is important is temporal-aware selection: the providers need to be differentiated based on the expected performance over a particular time interval (e.g. during business hours). We investigate the applicability of classical time series prediction methods, ARIMA and exponential smoothing, as well as their seasonal counterparts, seasonal ARIMA and Holt-Winters. Our results show that these existing models lack the ability to capture the important characteristics of client-perceived performance, thus producing poor medium-term predictions.
We then develop a medium-term prediction method that is specifically designed to account for the key characteristics of a client-perceived performance series, and show that it produces higher accuracy for medium-term prediction than the existing methods. / In order to demonstrate the applicability of our solution in practice, we developed a provider recommendation framework based on client-perceived performance (named PROPPER), which utilizes our findings on performance assessment and prediction. We formulated the recommendation algorithm and evaluated it through a mirror selection case study. Our framework is shown to produce better outcomes in most cases than country-based or geographic distance-based selection schemes, which are the current approaches to mirror selection.
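The sketch below illustrates the kind of seasonal baseline discussed above: fitting Holt-Winters exponential smoothing to a daily-seasonal series of client-perceived response times and forecasting a few days ahead. The data are synthetic, and this is only a classical baseline of the sort evaluated in the thesis, not its own medium-term prediction method.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(1)

# Synthetic hourly response times (ms) with a daily cycle: 14 days of history.
hours = pd.date_range("2009-01-01", periods=14 * 24, freq="h")
daily_cycle = 80 + 40 * np.sin(2 * np.pi * hours.hour / 24)
rt = pd.Series(daily_cycle + rng.normal(0, 8, len(hours)), index=hours)

# Seasonal Holt-Winters with a 24-hour season; additive trend and seasonality.
model = ExponentialSmoothing(rt, trend="add", seasonal="add", seasonal_periods=24).fit()

# Medium-term forecast: the next 3 days, hour by hour.
forecast = model.forecast(3 * 24)
print(forecast.head(6).round(1))
```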
288
Improved effort estimation of software projects based on metrics
Andersson, Veronika; Sjöstedt, Hanna, January 2005 (has links)
Saab Ericsson Space AB develops products for space at a predetermined price. Since the price is fixed, it is crucial to have a reliable prediction model to estimate the effort needed to develop the product. In general, software effort estimation is difficult, and at the software department this is a problem.

By analyzing metrics collected from former projects, different prediction models are developed to estimate the number of person hours a software project will require. Models for predicting the effort before a project begins are developed first; only a few variables are known at this stage of a project. The models developed are compared to the current model used at the company. Linear regression models improve the estimation error by nine percentage points, and nonlinear regression models improve the result even more. The model used today is also calibrated to improve its predictions, and a principal component regression model is developed as well. In addition, a model to improve the estimate during an ongoing project is developed. This is a new approach, and comparison with the first estimate is the only evaluation.

The result is an improved prediction model. Several models perform better than the one used today. In the discussion, positive and negative aspects of the models are debated, leading to the choice of a model recommended for future use.
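As a rough sketch of the prediction-model idea, the example below fits a linear regression of person hours on a couple of early-known project metrics and reports the mean percentage error on held-out projects. The metric names and data are invented for illustration; the thesis's actual models, calibration, and principal component regression are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

# Synthetic historical projects: metrics known before a project begins.
n = 60
new_requirements = rng.integers(10, 200, n)
reused_modules   = rng.integers(0, 50, n)
effort_hours = 1000 + 30 * new_requirements - 8 * reused_modules + rng.normal(0, 200, n)

X = np.column_stack([new_requirements, reused_modules])
X_train, X_test, y_train, y_test = train_test_split(X, effort_hours,
                                                    test_size=0.25, random_state=0)

model = LinearRegression().fit(X_train, y_train)
pred = model.predict(X_test)

# Mean absolute percentage error of the effort estimate on unseen projects.
mape = np.mean(np.abs((y_test - pred) / y_test)) * 100
print(f"Estimated effort error: {mape:.1f}%")
```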
289
Towards Measurable and Tunable Security
Lundin, Reine, January 2007 (has links)
Many security services today provide only one security configuration at run-time and therefore cannot utilize the trade-off between performance and security. In order to make use of this trade-off, tunable security services providing several security configurations that can be selected at run-time are needed. To be able to make intelligent choices about which security configuration to use in different situations, we need to know how good they are, i.e., we need to order the different security configurations with respect to each security attribute using measures for both security and performance. However, a key issue with computer security is that, due to its complex nature, it is hard to measure.

As the title of this thesis indicates, it discusses both security measures and tunable security services, and can thus be seen to consist of two parts. The first part, discussing security measures for tunable security services, investigates the security implications of selective encryption by using guesswork as a security measure. Built on this is an investigation of the relationship between guesswork and entropy. The result shows that guesswork, after a minor redefinition, is equal to the sum of the entropy and the relative entropy.

The second part contributes to the area of tunable security services, i.e., services that provide several security configurations at run-time. In particular, we present the mobile Crowds (mCrowds) system, an anonymity technology for the mobile Internet developed at Karlstad University, and a tunable encryption service that is based on a selective encryption paradigm and designed as middleware. Finally, an investigation of the tunable features provided by Mix-Nets and Crowds is done, using a conceptual model for tunable security services.
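For readers unfamiliar with the measure, the sketch below computes guesswork in its standard form (the expected number of guesses when candidates are tried in order of decreasing probability) and Shannon entropy for a small distribution. It uses the textbook definitions only; the minor redefinition and the entropy relationship established in the thesis are not reproduced here, and the example distribution is illustrative.

```python
import math

def guesswork(probabilities):
    """Expected number of guesses to find the secret when guessing
    values in order of decreasing probability: G = sum_i i * p_(i)."""
    ordered = sorted(probabilities, reverse=True)
    return sum(i * p for i, p in enumerate(ordered, start=1))

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum p * log2 p."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A skewed distribution over four possible plaintexts (illustrative only).
dist = [0.5, 0.25, 0.15, 0.10]
print("guesswork:", guesswork(dist))          # 1.85 guesses on average
print("entropy:  ", round(shannon_entropy(dist), 3), "bits")
```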
290
Systems Modeling and Modularity Assessment for Embedded Computer Control Applications
Chen, Dejiu, January 2004 (has links)
The development of embedded computer control systems (ECS) requires a synergetic integration of heterogeneous technologies and multiple engineering disciplines. With an increasing amount of functionality and expectations for high product quality, short time-to-market, and low cost, the success of complexity control and built-in flexibility turns out to be one of the major competitive edges for many ECS products. For this reason, modeling and modularity assessment constitute two critical subjects of ECS engineering. In the development of ECS, model-based design is currently being exploited in most of the sub-system engineering activities. However, the lack of support for formalization and systematization associated with the overall systems modeling leads to problems in comprehension, cross-domain communication, and integration of technologies and engineering activities. In particular, design changes and exploitation of "components" are often risky due to the inability to characterize components' properties and their system-wide contexts. Furthermore, the lack of engineering theories for modularity assessment in the context of ECS makes it difficult to identify parameters of concern and to perform early system optimization.

This thesis aims to provide a more complete basis for the engineering of ECS in the areas of systems modeling and modularization. It provides solution domain models for embedded computer control systems and their software subsystems. These meta-models describe the key system aspects, design levels, components, component properties, and relationships with ECS-specific semantics. By constituting a common basis for abstracting and relating different concerns, these models will also help to provide better support for obtaining holistic system views and for incorporating useful technologies from other engineering and research communities, so as to improve the process and to perform system optimization. Further, a modeling framework is derived, aiming to provide a perspective on the modeling aspect of ECS development and to codify important modeling concepts and patterns. In order to extend the scope of engineering analysis to cover flexibility-related attributes and multi-attribute trade-offs, this thesis also provides a metrics system for quantifying the component dependencies that are inherent in the functional solutions. Such dependencies are considered key factors affecting complexity control, concurrent engineering, and flexibility. The metrics system targets early system-level design and takes into account several domain-specific features such as replication and timing accuracy.

Keywords: Domain-Specific Architectures, Model-based System Design, Software Modularization and Components, Quality Metrics.
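To make the idea of quantifying component dependencies concrete, the sketch below computes fan-in, fan-out, and an overall coupling density from a boolean dependency matrix between components. These particular measures and the component names are generic illustrations and are not the metrics system defined in the thesis.

```python
import numpy as np

# dep[i, j] = 1 means component i depends on component j (hypothetical system).
components = ["sensor_io", "control_law", "actuator_io", "scheduler"]
dep = np.array([
    [0, 0, 0, 1],   # sensor_io depends on scheduler
    [1, 0, 1, 1],   # control_law depends on sensor_io, actuator_io, scheduler
    [0, 0, 0, 1],   # actuator_io depends on scheduler
    [0, 0, 0, 0],   # scheduler depends on nothing
])

fan_out = dep.sum(axis=1)   # dependencies each component has on others
fan_in  = dep.sum(axis=0)   # how many components depend on it

n = len(components)
coupling_density = dep.sum() / (n * (n - 1))   # fraction of possible dependencies used

for name, fo, fi in zip(components, fan_out, fan_in):
    print(f"{name:12s} fan-out={fo} fan-in={fi}")
print("coupling density:", round(float(coupling_density), 2))
```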