1.
SENS-IT: Semantic Notification of Sensory IoT Data Framework for Smart Environments. Alowaidi, Majed, 12 December 2018.
The Internet of Things (IoT) is becoming commonplace in people's daily lives; many government authorities have already deployed very large numbers of IoT sensors as part of their smart city initiatives and development road-maps. However, the lack of semantics in the presentation of IoT-based sensory data makes that data difficult for the general public to interpret, and adding semantics to IoT sensory data remains a challenge for smart cities and environments. In this thesis, we present an implementation that provides meaningful notifications about indoor and outdoor environment status to people and authorities, derived from IoT sensory data. The approach is based on analyzing spatio-temporal thresholds composed of readings from multiple IoT sensors. Our IoT sensory data analytics adds real-time semantics to the received raw data stream by converting the IoT sensory data into meaningful, descriptive notifications about the environment status, such as green locations, emergency zones, crowded places, green paths, and polluted locations. Our adopted IoT messaging protocol can handle publication and subscription processes for a very large number of dynamically added static and dynamic IoT sensors. People can customize notifications based on their preferences or subscribe to existing semantic notifications in order to be informed of any environmental condition of concern. The thesis makes three contributions. First, a three-layer IoT architecture that extracts raw sensory measurements and converts them into a context-aware format that people can perceive. Second, an ontology that infers semantic notifications from multiple sensory data streams according to an appropriate spatio-temporal reasoning and description mechanism. We modelled our ontology in Protégé, a common IDE for building semantic knowledge, by extending the well-known Semantic Sensor Network (SSN) web ontology; six classes were adopted from this extension to derive our SENS-IT ontology and fulfill our objectives. Third, a fuzzy system approach is proposed to make the system more generic in providing broader semantic notifications, so that it is agile enough to accept additional measurements from multiple sensory sources.
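As a rough illustration of the notification idea (the sensor kinds, thresholds, and labels below are hypothetical and not taken from the thesis; the real system derives its thresholds spatio-temporally), a small rule table can map raw readings that exceed thresholds to descriptive environment labels:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    location: str
    kind: str        # e.g. "pm25", "crowd_count", "noise_db"
    value: float
    timestamp: float

# Hypothetical per-kind thresholds and semantic labels.
RULES = {
    "pm25":        (35.0,  "polluted location"),
    "crowd_count": (200.0, "crowded place"),
    "noise_db":    (85.0,  "noisy zone"),
}

def semantic_notifications(readings):
    """Convert raw sensory readings into human-readable environment notifications."""
    notes = []
    for r in readings:
        threshold, label = RULES.get(r.kind, (None, None))
        if threshold is not None and r.value > threshold:
            notes.append(f"{r.location}: {label} (value={r.value})")
    return notes

print(semantic_notifications([
    Reading("s1", "Main St & 5th", "pm25", 48.2, 1545000000.0),
    Reading("s2", "Central Park", "crowd_count", 60.0, 1545000000.0),
]))
```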
2.
A Hybrid-Genetic Algorithm for Training a Sugeno-Type Fuzzy Inference System with a Mutable Rule Base. Coy, Christopher G., January 2010.
No description available.
3.
Development of a Performance Index for Stormwater Pipe Infrastructure using Fuzzy Inference Method. Velayutham Kandasamy, Vivek Prasad, 30 June 2017.
Stormwater pipe infrastructure collects and conveys surface runoff resulting from rainfall or snowmelt to nearby streams. Traditionally, stormwater pipe systems were integrated with wastewater infrastructure through a combined sewer system. Many of these systems are being separated due to the impact of environmental laws and regulations, and the same factors have led to the creation of stormwater utilities. However, in the current ASCE Infrastructure Report Card, stormwater infrastructure is considered a sub-category of wastewater infrastructure. Stormwater infrastructure has always lacked attention compared to water and wastewater infrastructure. However, this notion has begun to shift, as aging stormwater pipes coupled with changes in climatic patterns and urban landscapes make stormwater infrastructure more complex to manage. These changes, together with a lack of needed maintenance, have resulted in increased rates of deterioration and loss of capacity. Stormwater utility managers have limited resources and funds to manage their pipe systems. To effectively make decisions on allocating limited resources and funds, a utility should be able to understand and assess the performance of its pipe system. There is no standard rating system or comprehensive list of performance parameters for stormwater pipe infrastructure. Previous research has identified performance parameters affecting stormwater pipes and developed a performance index using a weighted factor method; however, the weighted performance index model does not capture interdependencies between performance parameters. This research developed a comprehensive list of parameters affecting stormwater pipe performance. It also developed a performance index using the fuzzy inference method to capture interdependencies among parameters. The performance index was evaluated and validated with the pipe ratings provided by one stormwater utility to document its effectiveness in real-world conditions. / Master of Science / Stormwater pipe infrastructure collects and conveys the surface water resulting from rainfall or snowmelt to nearby streams. Traditionally, stormwater pipe systems were integrated with wastewater infrastructure through combined sewer systems. Environmental regulations forced the creation of stormwater utilities and separate stormwater systems; however, according to the ASCE Infrastructure Report Card, stormwater infrastructure is still considered a sub-category of wastewater infrastructure, and it has always lacked attention compared to water and wastewater infrastructure. However, this notion has begun to shift, as aging stormwater pipes coupled with changes in climatic patterns and urban landscapes make stormwater infrastructure complex to manage, resulting in increased rates of deterioration and loss of design capacity. Stormwater utility managers have limited resources and funds to manage their pipe systems. To effectively make decisions on allocating limited resources and funds, a utility should be able to understand and assess the performance of its pipe system. There is no standard rating system for assessing the condition of stormwater pipe infrastructure. This research developed an index using the fuzzy inference method, which captures the interdependencies between parameters using if-then rule statements; parameters are the individual elements affecting the performance of stormwater pipes. The performance index was evaluated and validated with the pipe ratings provided by one stormwater utility to document its effectiveness in real-world conditions.
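For illustration, a Mamdani-style fuzzy inference system with if-then rules can combine two condition parameters into a crisp performance index. The sketch below uses the scikit-fuzzy control API with hypothetical parameters, membership functions, and rules; the thesis derives its own validated parameter list and rule base.

```python
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

# Two illustrative condition parameters; the thesis uses a larger, validated set.
structural = ctrl.Antecedent(np.arange(0, 11, 1), 'structural_condition')
hydraulic = ctrl.Antecedent(np.arange(0, 11, 1), 'hydraulic_condition')
performance = ctrl.Consequent(np.arange(0, 101, 1), 'performance_index')

# Default three-term partition: 'poor', 'average', 'good'.
structural.automf(3)
hydraulic.automf(3)

performance['low'] = fuzz.trimf(performance.universe, [0, 0, 50])
performance['medium'] = fuzz.trimf(performance.universe, [25, 50, 75])
performance['high'] = fuzz.trimf(performance.universe, [50, 100, 100])

# If-then rules capture interdependencies between the parameters.
rules = [
    ctrl.Rule(structural['poor'] | hydraulic['poor'], performance['low']),
    ctrl.Rule(structural['average'] & hydraulic['average'], performance['medium']),
    ctrl.Rule(structural['good'] & hydraulic['good'], performance['high']),
]

sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input['structural_condition'] = 7   # hypothetical inspection scores (0-10)
sim.input['hydraulic_condition'] = 4
sim.compute()
print(sim.output['performance_index'])  # crisp index after defuzzification
```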
4.
Feasibility Study in Development of a Wearable Device to Enable Emotion Regulation in Children with Autism Spectrum Disorder. Hora, Manpreet Kaur, 17 September 2014.
Autism spectrum disorder (ASD) is a group of developmental disabilities characterized by impairments in social interaction and communication and by difficulties in emotion recognition and regulation. There is currently no cure for autism, but psychosocial interventions and medical treatments exist; however, very few of them have been trialed on young children, and others have limitations. Strengthening young children's capacity to manage their emotions is important for academic success. It is therefore important to design and test the feasibility of an appropriate methodology that can teach emotion regulation to young children (age 3-6 years) with ASD. This thesis addresses the problem by proposing a novel framework that integrates physiology with Cognitive Behavior Theory to enable emotion regulation in the target population by exposing them to real-time stressful situations. The framework uses a feedback loop that measures the participant's physiology, estimates the level of stress being experienced, and provides audio feedback. The feasibility of the individual building blocks of the framework was tested by conducting pilot studies on nine typically developing children (age 3-6 years). The attention-capturing capacity of different audio representations was tested, and a stress-profile generating system was designed and developed to map the measured physiology of the participant onto a relative stress level. Thirty-three out of 43 instances of audio representations proved successful in capturing the participants' attention, and the stress profiles were found to be capable of distinguishing between the stressed and relaxed states of the participants with an average accuracy of 83%. / Master of Science
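A minimal sketch of the feedback-loop idea (the signals, baselines, weights, and threshold below are hypothetical and not the study's calibration): physiological readings are compared with a relaxed baseline to yield a relative stress level, which in turn triggers an audio cue.

```python
import numpy as np

def stress_level(heart_rate, skin_conductance, baseline_hr, baseline_sc):
    """Map physiological readings onto a 0-1 relative stress level by comparing
    them with the participant's relaxed baseline. Weights are illustrative."""
    hr_dev = max(0.0, (heart_rate - baseline_hr) / baseline_hr)
    sc_dev = max(0.0, (skin_conductance - baseline_sc) / baseline_sc)
    return float(np.clip(0.5 * hr_dev + 0.5 * sc_dev, 0.0, 1.0))

def feedback(level, stressed_threshold=0.4):
    """Trigger an audio cue when the estimated stress crosses a threshold."""
    return "play calming audio cue" if level >= stressed_threshold else "no feedback"

level = stress_level(heart_rate=118, skin_conductance=6.2,
                     baseline_hr=95, baseline_sc=4.0)
print(level, feedback(level))
```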
5.
Energy And Power Systems Simulated Attack Algorithm For Defense Testbed And Analysis. Ruttle, Zachary Andrew, 31 May 2023.
The power grid has evolved over the course of many decades with the use of cyber systems and communications such as Supervisory Control And Data Acquisition (SCADA); however, due to their connectivity to the internet, cyber-power systems can be infiltrated by malicious attackers, and encryption alone is not a sufficient solution. Currently, several cyber security measures are in development, including those based on artificial intelligence. However, there is a need for a varied but consistent attack algorithm to serve as a testbed on which these AI-based or other defenses can be trained and tested. This is important because, in the event of a real attacker, it is not possible to know exactly where they will attack and in what order. Therefore, the proposed method in this thesis uses criminology concepts and fuzzy logic inference to create this algorithm and determine its effectiveness in making decisions on a cyber-physical system model. The method takes various characteristics of the attacker as input, builds their ideal target node, compares the system's nodes against this high-impact target, and chooses one as the goal. Based on that target and their knowledge, the attacker attacks nodes as long as resources remain. The results show that the proposed method can be used to create a variety of attacks with varying damaging effects, and an additional set of tests shows the possibility of multiple attack types, such as denial of service and false data injection. The proposed method has been validated using an extended cyber-physical IEEE 13-node distribution system and sensitivity tests to ensure that the created ruleset responds appropriately to each of the inputs. / Master of Science / Over the last few decades, information and communications technology has become more commonplace for electric power and energy systems around the world. As a result, it has attracted hackers who take advantage of cyber vulnerabilities to attack critical systems and cause damage, e.g., to the critical infrastructure for electric energy. The power grid is a wide-area, distributed infrastructure with numerous power plants, substations, transmission and distribution lines, as well as customer facilities. For operation and control, the power grid needs to acquire measurements from substations and send control commands from the control center to substations. The cyber-physical system has vulnerabilities that can be exploited by hackers to inject falsified measurements or commands. Much research is concerned with how to detect and mitigate cyber threats; these methods are used to determine if an attack is occurring and, if so, what to do about it. However, for these techniques to work properly, there must be a way to test how the defense will understand the purpose and target of an actual attack, which is where the proposed modeling and simulation method for an attacker comes in. Using a set of values for the attacker's resources, motivation, and other characteristics, the proposed algorithm determines what the attacker's best target would be, and then finds the closest point on the power grid that they can attack. While resources remain from the initial allotment, the attacker keeps choosing places and executing attacks. The results show that these input characteristic values affect the decisions the attacker makes, and the damage to the system reflects those values as well. This is tested by examining the results for the high-impact nodes for each input value. This shows that it is possible to model an attacker in simulation for testing purposes.
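A toy sketch of the target-selection and resource-constrained attack loop (the node attributes, attacker profile, and scoring function are all invented for illustration; the thesis makes the target choice with fuzzy inference over criminology-based attacker characteristics):

```python
# Hypothetical node attributes and attacker profile.
nodes = {
    "bus_632": {"impact": 0.9, "exposure": 0.4, "cost": 3.0},
    "bus_671": {"impact": 0.7, "exposure": 0.8, "cost": 2.0},
    "bus_652": {"impact": 0.3, "exposure": 0.9, "cost": 1.0},
}
attacker = {"skill": 0.6, "risk_tolerance": 0.5, "resources": 4.0}

def ideal_target(attacker):
    """Build the attacker's ideal target from their characteristics."""
    return {"impact": attacker["skill"],                    # skilled attackers want impact
            "exposure": 1.0 - attacker["risk_tolerance"]}   # cautious ones want easy access

def closeness(attrs, ideal):
    """Score how closely a real node matches the ideal target (higher is closer)."""
    return -abs(attrs["impact"] - ideal["impact"]) - abs(attrs["exposure"] - ideal["exposure"])

def run_attack(nodes, attacker):
    """Rank nodes by closeness to the ideal target, then attack greedily
    while the attacker still has resources to spend."""
    ideal = ideal_target(attacker)
    ranked = sorted(nodes.items(), key=lambda kv: closeness(kv[1], ideal), reverse=True)
    remaining, attacked = attacker["resources"], []
    for name, attrs in ranked:
        if attrs["cost"] <= remaining:
            attacked.append(name)
            remaining -= attrs["cost"]
    return attacked

print(run_attack(nodes, attacker))
```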
6.
A QoE Model to Evaluate Semi-Transparent Augmented-Reality System. Zhang, Longyu, 21 February 2019.
With the development of three-dimensional (3D) technologies, the demand for high-quality 3D content, 3D visualization, and flexible and natural interactions is increasing. As a result, semi-transparent Augmented-Reality (AR) systems are emerging and evolving rapidly. Since there are currently no well-recognized models to evaluate the performance of these systems, we proposed a Quality-of-Experience (QoE) taxonomy for semi-transparent AR systems containing three levels of influential QoE parameters, derived by analyzing existing QoE models in related areas and integrating the feedback received from our user study. We designed a user study to collect training and testing data for our QoE model, and built a Fuzzy-Inference-System (FIS) model to estimate the QoE evaluation and validate the proposed taxonomy. A case study was also conducted to further explore the relationships between QoE parameters and technical QoS parameters associated with functional components of the Microsoft HoloLens AR system. In this work, we illustrate the experiments in detail, thoroughly explain the results obtained, and present conclusions and future work.
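To illustrate how a fuzzy-inference model can map technical QoS measurements onto a QoE estimate, here is a hand-rolled zero-order Sugeno-style sketch with two hypothetical inputs (latency and frame rate); the thesis model's actual inputs, membership functions, and rules are derived from the user-study data.

```python
import numpy as np

def ramp_up(x, lo, hi):
    """0 below lo, 1 above hi, linear in between (a shoulder membership function)."""
    return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

def ramp_down(x, lo, hi):
    return 1.0 - ramp_up(x, lo, hi)

def qoe_score(latency_ms, frame_rate):
    """Zero-order Sugeno-style inference: each rule's firing strength weights a
    crisp QoE output on a 1-5 scale. Breakpoints and outputs are illustrative."""
    low_lat, high_lat = ramp_down(latency_ms, 30, 100), ramp_up(latency_ms, 30, 100)
    low_fps, high_fps = ramp_down(frame_rate, 20, 50), ramp_up(frame_rate, 20, 50)

    rules = [  # (firing strength, crisp output)
        (min(low_lat, high_fps), 5.0),   # responsive and smooth -> excellent
        (min(low_lat, low_fps), 3.5),
        (min(high_lat, high_fps), 3.0),
        (min(high_lat, low_fps), 1.5),   # laggy and choppy -> poor
    ]
    weights = np.array([w for w, _ in rules])
    outputs = np.array([o for _, o in rules])
    return float(weights @ outputs / max(weights.sum(), 1e-9))

print(qoe_score(latency_ms=35, frame_rate=55))  # near the top of the 1-5 scale
```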
7.
Comparison of Topographic Surveying Techniques in Streams. Bangen, Sara G., 01 May 2013.
Fine-scale resolution digital elevation models (DEMs) created from data collected using high precision instruments have become ubiquitous in fluvial geomorphology. They permit a diverse range of spatially explicit analyses including hydraulic modeling, habitat modeling and geomorphic change detection. Yet, the intercomparison of survey technologies across a diverse range of wadeable stream habitats has not yet been examined. Additionally, we lack an understanding regarding the precision of DEMs derived from ground-based surveys conducted by different, and inherently subjective, observers. This thesis addresses current knowledge gaps with the objectives i) to intercompare survey techniques for characterizing instream topography, and ii) to characterize observer variability in instream topographic surveys. To address objective i, we used total station (TS), real-time kinematic (rtk) GPS, terrestrial laser scanner (TLS), and infrared airborne laser scanning (ALS) topographic data from six sites of varying complexity in the Lemhi River Basin, Idaho. The accuracy of derived bare earth DEMs was evaluated relative to higher precision TS point data. Significant DEM discrepancies between pairwise techniques were calculated using propagated DEM errors thresholded at a 95% confidence interval. Mean discrepancies between TS and rtkGPS DEMs were relatively low (≤ 0.05 m), yet TS data collection time was up to 2.4 times longer than rtkGPS. ALS DEMs had lower accuracy than TS or rtkGPS DEMs, but ALS aerial coverage and floodplain topographic representation was superior to all other techniques. The TLS bare earth DEM accuracy and precision were lower than other techniques as a result of vegetation returns misinterpreted as ground returns. To address objective ii, we used a case study where seven field crews surveyed the same six sites to quantify the magnitude and effect of observer variability on DEMs interpolated from the survey data. We modeled two geomorphic change scenarios and calculated net erosion and deposition volumes at a 95% confidence interval. We observed several large magnitude elevation discrepancies across crews, however many of these i) tended to be highly localized, ii) were due to systematic errors, iii) did not significantly affect DEM-derived metric precision, and iv) can be corrected post-hoc.
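As a sketch of the change-detection step described above, a DEM of Difference can be thresholded at a 95% confidence interval using the propagated error of the two surveys before summing erosion and deposition volumes (the grids, per-survey errors, and cell size below are made up for the example):

```python
import numpy as np

def geomorphic_change(dem_new, dem_old, err_new, err_old, cell_area, z=1.96):
    """Threshold a DEM of Difference at a 95% confidence interval using the
    propagated elevation error, then sum erosion and deposition volumes."""
    dod = dem_new - dem_old
    # Propagated error per cell; changes smaller than z * sigma are discarded.
    min_detectable = z * np.sqrt(err_new**2 + err_old**2)
    significant = np.abs(dod) > min_detectable
    deposition = dod[significant & (dod > 0)].sum() * cell_area
    erosion = -dod[significant & (dod < 0)].sum() * cell_area
    return erosion, deposition

# Hypothetical 0.1 m-resolution grids with uniform survey errors.
rng = np.random.default_rng(0)
dem_old = rng.normal(100.0, 0.5, (50, 50))
dem_new = dem_old + rng.normal(0.0, 0.05, (50, 50))
print(geomorphic_change(dem_new, dem_old,
                        err_new=0.03, err_old=0.03, cell_area=0.01))
```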
8.
Local and personalised models for prediction, classification and knowledge discovery on real world data modelling problems. Hwang, Yuan-Chun, January 2009.
This thesis presents several novel methods that address real-world data modelling issues through the use of local and individualised modelling approaches. A set of real-world data modelling issues, such as modelling evolving processes, defining unique problem subspaces, and identifying and dealing with noise, outliers, missing values, imbalanced data and irrelevant features, is reviewed and their impact on the models is analysed. The thesis makes nine major contributions to information science, including four generic modelling methods, three real-world application systems that apply these methods, a comprehensive review of real-world data modelling problems, and a data analysis and modelling software package. Four novel methods have been developed and published in the course of this study: (1) DyNFIS – Dynamic Neuro-Fuzzy Inference System, (2) MUFIS – A Fuzzy Inference System That Uses Multiple Types of Fuzzy Rules, (3) Integrated Temporal and Spatial Multi-Model System, (4) Personalised Regression Model. DyNFIS addresses the issue of unique problem subspaces by identifying them through a clustering process, creating a fuzzy inference system based on the clusters, and applying supervised learning to update both the antecedent and consequent parts of the fuzzy rules. This puts strong emphasis on the unique problem subspaces and allows easy-to-understand rules to be extracted from the model, which adds knowledge to the problem. MUFIS takes DyNFIS a step further by integrating a mixture of different types of fuzzy rules in a single fuzzy inference system. In many real-world problems, some problem subspaces were found to be more suitable for one type of fuzzy rule than others; therefore, by integrating multiple types of fuzzy rules, a better prediction can be made. The type of fuzzy rule assigned to each unique problem subspace also provides additional understanding of its characteristics. The Integrated Temporal and Spatial Multi-Model System takes a different approach, integrating two contrasting views of the problem for better results: the temporal model uses recent data and the spatial model uses historical data to make the prediction. By combining the two through a dynamic contribution adjustment function, the system is able to provide stable yet accurate predictions on real-world data modelling problems that have intermittently changing patterns. The personalised regression model is designed for classification problems. Real-world data modelling problems often involve noisy or irrelevant variables, and the number of input vectors in each class may be highly imbalanced; these issues make the definition of unique problem subspaces less accurate. The proposed method uses a model selection system based on an incremental feature selection method to select the best set of features. A global model is then created based on this set of features and optimised using training input vectors in the test input vector's vicinity. This approach focuses on the definition of the problem space and puts emphasis on the problem subspace in which the test input vector resides. The novel generic prediction methods listed above have been applied to the following three real-world data modelling problems: 1. Renal function evaluation, which achieved higher accuracy than all other existing methods while allowing easy-to-understand rules to be extracted from the model for future studies. 2. A milk volume prediction system for Fonterra, which achieved a 20% improvement over the method currently used by Fonterra. 3. A prognosis system for pregnancy outcome prediction (SCOPE), which achieved more stable and slightly better accuracy than traditional statistical methods. These solutions constitute a contribution to the area of applied information science. In addition to the above contributions, a data analysis software package, NeuCom, was developed primarily by the author prior to and during the PhD study to facilitate some of the standard experiments and analyses on various case studies. This is a full-featured data analysis and modelling package that is freely available for non-commercial purposes (see Appendix A for more details). In summary, many real-world problems consist of many smaller problems. It was found beneficial to acknowledge the existence of these sub-problems and address them through the use of local or personalised models. The rules extracted from the local models also made new knowledge available to researchers and allow more in-depth study of the sub-problems in future research.
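A stripped-down sketch of the personalised-modelling idea behind the Personalised Regression Model: find the training vectors nearest the test vector and fit a small local least-squares model on just that neighbourhood. The thesis additionally performs incremental feature selection and global-model tuning, which are omitted here, and the data below are synthetic.

```python
import numpy as np

def personalised_predict(X_train, y_train, x_test, k=15):
    """Fit a local linear model on the k training vectors nearest x_test.
    The neighbourhood acts as the test vector's personalised problem subspace."""
    dists = np.linalg.norm(X_train - x_test, axis=1)
    idx = np.argsort(dists)[:k]                        # nearest neighbours
    Xk = np.hstack([X_train[idx], np.ones((k, 1))])    # add intercept column
    w, *_ = np.linalg.lstsq(Xk, y_train[idx], rcond=None)
    return float(np.append(x_test, 1.0) @ w)

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = 2.0 * X[:, 0] - X[:, 1] + 0.1 * rng.normal(size=200)
print(personalised_predict(X, y, x_test=np.array([0.5, -0.2, 0.1])))
```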
9.
The foundation of capability modelling: a study of the impact and utilisation of human resources. Shekarriz, Mona, January 2011.
This research aims at finding a foundation for the assessment of capabilities and applying the concept in human resource selection. The research identifies a common ground for assessing individuals' applied capability in a given job based on a literature review of various disciplines in engineering, human sciences and economics. A set of criteria is found to be common and appropriate to use as the basis of this assessment. Applied capability is then described in this research as the impact of the person in fulfilling job requirements and their level of utilisation of their resources with regard to the identified criteria; in other words, how their available resources (abilities, skills, value sets, personal attributes and previous performance records) can be used in completing a job. Translation of the person's resources and task requirements using the proposed criteria is done through a novel algorithm, and two prevalent statistical inference techniques (OLS regression and fuzzy inference) are used to estimate quantitative levels of impact and utilisation. A survey of postgraduate students is conducted to estimate their applied capabilities in a given job. Moreover, expert academics are surveyed on their views on key applied capability assessment criteria, and on how different levels of match between job requirements and a person's resources on those criteria might affect impact levels. The results from both surveys were mathematically modelled, and the predictive ability of the conceptual and mathematical developments was compared and further contrasted with the observed data. The models were tested for robustness using experimental data, and the results for both estimation methods in both surveys are close to one another, with the regression models being closer to the observations. It is believed that this research has provided sound conceptual and mathematical platforms which can satisfactorily predict individuals' applied capability in a given job. This research has contributed to current knowledge and practice by a) providing a comparison of capability definitions and uses in different disciplines, b) defining criteria for applied capability assessment, c) developing an algorithm to capture applied capabilities, d) quantifying an existing parallel model, and finally e) estimating impact and utilisation indices using mathematical methods.
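As a rough illustration of the regression half of the estimation step (the criteria, match coding, ratings, and resulting coefficients are all invented for the example; the thesis also fits a parallel fuzzy-inference model), an OLS model can map person-versus-job match levels on the assessment criteria onto an impact index:

```python
import numpy as np

# Hypothetical match levels (person resource vs. job requirement) on four
# illustrative criteria, coded -1 = below requirement, 0 = meets, 1 = exceeds.
X = np.array([
    [ 0,  1,  0, -1],
    [ 1,  1,  0,  0],
    [-1,  0, -1, -1],
    [ 1,  0,  1,  1],
    [ 0, -1,  0,  0],
], dtype=float)
# Illustrative expert-rated impact of each person in the job (0-100 scale).
impact = np.array([62.0, 78.0, 35.0, 90.0, 55.0])

# Ordinary least squares with an intercept column.
A = np.hstack([X, np.ones((X.shape[0], 1))])
coef, *_ = np.linalg.lstsq(A, impact, rcond=None)

def predict_impact(match_levels):
    """Estimate the applied-capability impact index for a new match profile."""
    return float(np.append(match_levels, 1.0) @ coef)

print(predict_impact(np.array([1, 0, 0, -1], dtype=float)))
```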