1 |
Development of a model for an offshore wind turbine supported by a moored semi-submersible platform. Sahasakkul, Watsamon 12 September 2014 (has links)
Wind energy is one of the fastest growing sources of renewable energy in the world, and there has been considerable research, development, and investment in it in recent years. Offshore sites offer stronger winds and lower turbulence, along with reduced noise and visual impact. Establishing large turbines at deepwater sites offers promising opportunities for generating high power output while utilizing these favorable environmental conditions. Researchers at Sandia National Laboratories (SNL) have developed a very large wind turbine model with a 13.2 MW rating and 100-meter-long blades; this turbine is designated the SNL100 13.2 MW wind turbine. With a hub height of 146 meters and a rotor diameter of 205 meters, such a large turbine is best suited to offshore sites. Developing a wind turbine model for an offshore site requires that a platform model be developed first. Of the various kinds of floating platforms, a moored semi-submersible platform, which offers stability by virtue of the intercepted water-plane area, is an appropriate choice to support the wind turbine. The goal of this study is to develop a semi-submersible platform model to support the 13.2 MW wind turbine while keeping loads and deflections within safe limits.
The platform is developed based on work completed as part of the Offshore Code Comparison Collaboration Continuation (OC4) Phase II project, which involved a 5 MW wind turbine supported by a semi-submersible platform. The present study focuses on three topics: (i) development of the combined offshore wind turbine system model comprising the 13.2 MW wind turbine, a floating semi-submersible platform, and a mooring system; (ii) the entire procedure involved in modeling and analyzing first-order hydrodynamics using two codes, MultiSurf and WAMIT; and (iii) assembly of the integrated aero-hydro-servo-elastic model, including hydrodynamics, in order to verify the steady-state and stochastic response of the integrated wind turbine system.
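The stabilizing role of the intercepted water-plane area mentioned above can be illustrated with standard first-order hydrostatic restoring relations. The sketch below is not drawn from the thesis: the column dimensions, displaced volume, and vertical centers are hypothetical placeholders, and mooring and dynamic effects are ignored.

```python
import math

# Hypothetical three-column semi-submersible; all values are placeholders,
# not the OC4 or SNL platform properties.
rho, g = 1025.0, 9.81                 # sea-water density [kg/m^3], gravity [m/s^2]
n_col, d_col, r_off = 3, 12.0, 30.0   # columns piercing the water plane, column diameter [m], offset from center [m]
displ_vol = 1.4e4                     # displaced volume [m^3]
z_b, z_g = -14.0, -10.0               # centers of buoyancy and gravity relative to the still water line [m]

a_col = math.pi * d_col**2 / 4.0      # water-plane area of one column
a_wp = n_col * a_col                  # total intercepted water-plane area

# Second moment of the water-plane area about a horizontal axis through the platform center:
# local column term plus the parallel-axis term for columns equally spaced on a circle.
i_wp = n_col * math.pi * d_col**4 / 64.0 + a_col * sum(
    (r_off * math.cos(2.0 * math.pi * k / n_col)) ** 2 for k in range(n_col)
)

c33 = rho * g * a_wp                  # heave restoring stiffness [N/m]
gm = (z_b - z_g) + i_wp / displ_vol   # metacentric height [m]; positive GM indicates hydrostatic stability
c55 = rho * g * displ_vol * gm        # pitch restoring stiffness [N*m/rad]

print(f"A_wp = {a_wp:.1f} m^2, C33 = {c33:.3e} N/m, GM = {gm:.2f} m, C55 = {c55:.3e} N*m/rad")
```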
2 |
Statistical model development to identify the best data pooling for early stage construction price forecasts. Tai Yeung, Kam Lan (Daisy) January 2009 (has links)
In the early feasibility study stage, the information concerning the target project is very limited. It is very common in practice for a Quantity Surveyor (Q.S.) to use the mean value of historical building price data (from projects with similar characteristics to the target project) to forecast the early construction cost of a target project. Most clients rely heavily on this early cost forecast provided by the Q.S. and use it to make their investment decisions and advance financial arrangements. The primary aim of this research is to develop a statistical model and demonstrate through this model how to measure the accuracy of the mean value forecast. A secondary aim is to review the homogeneity of construction project cost. The third aim is to identify the best data pooling for mean value cost forecasts in early construction stages by making the best use of the data available. Three types of mean value forecasts are considered: (1) the use of the target base group (relating to a source with similar characteristics to the target project), (2) the use of a non-target base group (relating to sources with fewer or dissimilar characteristics to the target project) and (3) the use of a combined target and non-target base group. A formulation of mean square error is derived for each to measure the forecasting accuracy. To accomplish the above research aims, this research uses cost data from 450 completed Hong Kong projects. The collected data is clustered into two levels: (1) level one, by project nature (i.e. residential, commercial centre, car parking, social community centre, school, office, hotel, industrial, university and hospital); (2) level two, by project specification and construction floor area. In this research, the accuracy of the mean value forecast (i.e. mean square error) is measured for a total of 10,539 combined data groups. From their performance, it may reasonably be concluded that (1) the use of a non-target base group never improves the forecasting performance, (2) the use of a target base group cannot always provide the best forecasting performance, (3) the use of a combined target and non-target base group can in some cases furnish better forecasting performance, and (4) clustering the cost data groups at a more detailed level can improve the forecasting performance.
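The trade-off among the three mean-value forecasts can be sketched numerically. The snippet below is only an illustration of the idea behind a mean-square-error comparison, not the formulation derived in the thesis: the cost levels, group sizes, and spread are invented, and the data are simulated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cost data (e.g., cost per m^2): a target base group and a less similar non-target group.
target_mean, nontarget_mean, sigma = 1500.0, 1800.0, 200.0
n_target, n_nontarget, n_trials = 8, 40, 20_000

errors = {"target only": [], "non-target only": [], "combined": []}
for _ in range(n_trials):
    target = rng.normal(target_mean, sigma, n_target)
    nontarget = rng.normal(nontarget_mean, sigma, n_nontarget)
    actual = rng.normal(target_mean, sigma)          # cost of the new (target) project
    errors["target only"].append((target.mean() - actual) ** 2)
    errors["non-target only"].append((nontarget.mean() - actual) ** 2)
    errors["combined"].append((np.concatenate([target, nontarget]).mean() - actual) ** 2)

for name, sq in errors.items():
    print(f"MSE using {name:>16s} group: {np.mean(sq):10.0f}")
```

With these placeholder numbers the target-only pool happens to win; shifting the group sizes or the gap between the group means changes the ranking, which is the bias-variance trade-off underlying conclusions (1) to (4).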
3 |
Development of a sales forecasting model for canopy windows. Barreira, Jose 23 July 2014 (has links)
M.Com. (Business Management) / Forecasting is an important function used in a wide range of business planning and decision-making situations. The purpose of this study was to build a practical and cost-effective sales forecasting model from the various forecasting methods and techniques available. Various forecasting models, methods and techniques are outlined in the initial part of this study, along with some of the fundamentals and limitations that underlie the preparation of forecasting models. It is not the purpose of this study to microscopically dissect each forecasting model, method or technique; the various forecasting options were assessed in a manner relevant to the study, thus providing a general framework for the construction of the specific sales forecasting model. Appropriate data sources were described and analysed, and the data was further tested using the author's chosen quantitative forecasting techniques. Results were interpreted and incorporated into the author's untested sales model. It is the author's opinion that the sales model is practical, cost effective and gives a general sales forecast.
4 |
Analysis of the implementation of an antiretroviral treatment programme in KwaZulu-Natal province. Sengwana, Manyeke Jeanivive January 2019 (has links)
Philosophiae Doctor - PhD / The rapid expansion of the ART programme in South Africa has placed an additional service demand on an already stretched public health infrastructure. The main aim of this study was therefore to analyse the implementation of the ART programme in KwaZulu-Natal province using the Donabedian model of structure, process and outcome in order to develop an ART delivery model. Ethical approval to conduct this research was issued by the University's Senate Research Committee. The first phase of the study used a descriptive quantitative approach to review existing data from government departments to analyse the ART programme. A checklist with indicators for the three elements of the study (structure, process and outcome) was used to collect data. A pilot study was conducted and the Cronbach's alpha test was used to determine the rigour of the study. In the second phase, a systematic review of studies on the implementation of existing ART programme models was conducted using a quantitative descriptive approach. The Quality Appraisal Tool was used to determine the validity of the research findings from the literature. In phase 3, both qualitative and quantitative approaches were used to conduct a Delphi study, which included a group of experts in the field of HIV and the ART programme. Responses from the participants were modified to determine the reliability of the study. The study found that there were structural problems such as shortages of antiretroviral drugs and delays in the return of laboratory results. The systematic literature review found that there were only two community-based ART models in South Africa, namely adherence clubs and community-based adherence clubs. These two models of ART delivery were implemented only in Cape Town.
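For readers unfamiliar with the reliability statistic mentioned in the first phase above, the sketch below shows how Cronbach's alpha can be computed from a small item-score matrix. The facilities, indicators, and scores are invented for illustration and are not data from this study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variances / total_variance)

# Hypothetical pilot data: 6 facilities scored on 4 checklist indicators (1-5 scale).
scores = np.array([
    [4, 4, 5, 4],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
    [2, 3, 2, 3],
    [4, 5, 4, 4],
    [3, 2, 3, 3],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")  # values above roughly 0.7 are usually taken as acceptable
```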
5 |
How do German industrial leaders evolve their business model towards sustainability: A case study of Adidas AG and Siemens AG. Zagel, Fabian, Tarhonskyi, Volodymyr January 2020 (has links)
No description available.
6 |
TOPFLOW experiments, model development and validation of CFD codes for steam-water flows with phase transition. Lucas, D., Weiß, F. P. 14 March 2012 (has links) (PDF)
The objective of the project was to qualify CFD codes for steam-water flows with phase transition. While CFD methods for single-phase flows are already widely used in industry, their application to two-phase flows is only beginning, owing to the complex interfacial area and the interactions it influences. The further development and validation of suitable closure models require experimental data with high spatial and temporal resolution. Such data were obtained at the TOPFLOW test facility of the HZDR by combining experiments at parameters relevant to reactor safety (large scales, high pressures and temperatures) with innovative measurement techniques. The wire-mesh sensor technique, which provides detailed information on the interfacial area, was employed in adiabatic air-water experiments as well as in condensation and depressurization experiments in a large DN200 pipe. As a result of the project, extensive high-quality databases are available. The technology for fast X-ray tomography, which allows measurements without influencing the flow, was further developed and successfully applied in a first measurement series. High-resolution data were also obtained in experiments on various flow situations (e.g. counter-current flow limitation) in a model of the hot leg of a pressurized water reactor. For the steam-water experiments at pressures of up to 5 MPa, the newly developed innovative pressure-tank technology was used for the first time. To qualify CFD codes for two-phase flows, the inhomogeneous MUSIG model was extended for phase transitions in cooperation with ANSYS and validated against the TOPFLOW experiments mentioned above. In addition, improvements were made, among other things, to turbulence modelling in bubbly flows, and simulations were performed to validate the models for bubble forces and for bubble coalescence and break-up. Substantial progress was achieved in the modelling of free surfaces through the generalization of the AIAD model. Using this model, the flooding curves determined with the hot-leg model can be computed in good agreement with the measurements.
7 |
A methodology for the development of models for the simulation of non-observable systems. Turner, Andrew J. 22 May 2014 (has links)
The use and application of modeling and simulation (M&S) is pervasive in today's world. A key component in the application of models is to conduct appropriate verification and validation (V&V). V&V is conducted to make sure the model represents reality to the appropriate level of detail based on the questions posed. V&V techniques are well documented within the literature for observable systems, i.e. systems for which the required data can be collected from the operations of the real system for comparison with the simulation results; however, V&V techniques for non-observable systems are limited to subjective validation. This subjective validation can be applied to the simulation outputs (operational validation) or to the model development (conceptual validation). Oftentimes subjective operational validation of the simulation is the primary source of validation efforts. It is shown in this thesis that sole reliance on subjective operational validation of the simulation can easily lead to the inaccurate acceptance of a model. In order to improve M&S practices for the representation of non-observable systems, models must be developed in a methodical manner that provides a traceable and defensible argument behind the model's representation of reality. Though there is growing discussion within the recent literature, few methods exist for proper conceptual model development and validation. The research objective of this thesis is to identify a methodology to develop a model in a traceable and defensible manner for a system or system of systems that is non-observable. To address this research objective, the thesis addresses eight aspects of model development. The first is to define a set of terms that are common vernacular in the field of M&S. This is followed by an assessment of what defines a 'good' model and how to determine whether the model is 'good' or not. This leads to a review of V&V and the observation that subjective validation in isolation is not sufficient for model validation. Next, a review of model development procedures is conducted and analyzed against a set of criteria, and a selection is made using the Analytic Hierarchy Process (AHP). A procedure developed by Balci in 1986 is selected for use in the development of models for non-observable systems. Specific steps within Balci's 1986 procedure are investigated further to determine appropriate techniques that should be used when developing models of non-observable systems. These steps are system and objective definition, conceptual model, communicative model, and experimental models and results. Five techniques are identified in the literature that can be applied to system and objective definition: Soft Systems Methodology, Requirements Engineering, Unified Modeling Language, Systems Modeling Language, and Department of Defense Architecture Framework. These techniques are reviewed and a selection is made using AHP. The Systems Modeling Language (SysML) is selected as the best technique to perform system and objective definition. Significant resources are devoted to the study of conceptual model development. Proposed in this thesis is a process to decompose the impacts of the system and apply subjective weightings in order to identify aspects of the system with significant importance. This approach enables the modeling of the system in question to the appropriate level of fidelity based on the identified importance of the system impacts. Additionally, this process provides traceability and defensibility of the final model form.
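Because the Analytic Hierarchy Process is used repeatedly above to choose among candidate procedures and techniques, a minimal sketch of the underlying computation may help. The criteria, the pairwise judgments on Saaty's 1-9 scale, and the resulting weights below are invented for illustration and are not the comparisons performed in the thesis.

```python
import numpy as np

def ahp_priorities(pairwise: np.ndarray):
    """Priority weights (principal eigenvector) and consistency ratio for an AHP pairwise matrix."""
    n = pairwise.shape[0]
    eigvals, eigvecs = np.linalg.eig(pairwise)
    idx = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, idx].real)
    weights /= weights.sum()
    ci = (eigvals[idx].real - n) / (n - 1)                # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index (up to n = 5 here)
    cr = ci / ri if ri > 0 else 0.0                       # consistency ratio; below ~0.1 is usually acceptable
    return weights, cr

# Hypothetical pairwise comparison of three selection criteria.
criteria = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])
weights, cr = ahp_priorities(criteria)
print("criterion weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))
```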
Communicative model development is rarely addressed in the literature; however, many of the techniques used in system and objective definition can be applied to developing a communicative model. In a study similar to that for system and objective definition, AHP was utilized to make a selection, and it was concluded that the Unified Modeling Language provides the best tool for creating a communicative model. In the final step, experimental models and results, the literature was found to be rich in techniques, but a gap was found in the analysis of the outputs of stochastic simulations. Four questions resulted: 'Which stochastic measures should be used in analyzing a stochastic simulation?', 'How many replications are required for an accurate estimation of the stochastic measure?', 'Which least squares method should be used in the regression of a stochastic response?', and 'How many replications are required for an accurate regression of a stochastic measure?' Heuristics are presented for each of these questions. A proof of concept of the methodology developed within this thesis is provided. The selected scenario is a Humanitarian Aid/Disaster Relief mission in which the U.S. Navy has been tasked with distributing aid in an effective manner to the affected population. Upon application of the proposed methodology, it was observed that subjective decomposition and weighting of the scenario proved to be a useful tool for guiding and justifying the form of the eventual model. Shortcomings of the methodology were also identified, primarily the linking of information between the steps of the model development procedure and the difficulty in correctly identifying the structure of the system impacts decomposition. The primary contribution of this thesis is to the field of M&S. Contributions are made to the practice of conceptual model development, a growing discussion within the literature over the past several years; the contribution to conceptual model development will aid in the development of models for non-observable systems. Additional contributions are made to the analysis of stochastic simulations.
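One common answer to the second question above, how many replications are required for an accurate estimate of a stochastic measure, is a fixed-precision rule based on the confidence-interval half-width of the sample mean. The sketch below illustrates that generic heuristic with a toy stochastic output; it is not claimed to be the specific heuristic derived in the thesis.

```python
import random
from statistics import mean, stdev

def run_simulation() -> float:
    """Stand-in for one replication of a stochastic simulation (toy output)."""
    return random.gauss(100.0, 15.0)

def replicate_until_precise(target_half_width: float, z: float = 1.96, n0: int = 10, n_max: int = 10_000):
    """Add replications until the confidence-interval half-width of the sample mean drops below the target."""
    outputs = [run_simulation() for _ in range(n0)]
    while len(outputs) < n_max:
        half_width = z * stdev(outputs) / len(outputs) ** 0.5
        if half_width <= target_half_width:
            break
        outputs.append(run_simulation())
    return mean(outputs), len(outputs)

estimate, n_used = replicate_until_precise(target_half_width=1.0)
print(f"mean estimate {estimate:.2f} from {n_used} replications")
```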
The methodology presented in this thesis will provide a new and robust method to develop and validate models in a traceable and defensible manner.
8 |
What is Professionalism? The Validation of a Comprehensive Model of Professionalism. Rowland, Andrew W 01 October 2016 (has links)
Professionalism is a term frequently used in organizations, yet perceptions of what it means differ from person to person. Given its frequent use and its link to various job outcomes, such as organizational commitment (Bartol, 1979), there is a need for a universal definition of professionalism. While existing models of professionalism are available, they are typically developed for a specific field or industry; thus, there is also a need for a comprehensive model of professionalism that can be used across multiple fields and industries. This study worked to develop a comprehensive model of professionalism that addresses both of these issues, using eleven existing measures of professionalism as its foundation. Four dimensions of professionalism were identified via these models and defined using a combination of existing research and researcher expertise. These dimensions were divided into elements, which were used as items in a measure to validate the new model. A five-factor model demonstrated the best fit and was found to have both convergent and discriminant validity.
9 |
Analýza storna pojistných smluv / Lapse Analysis of Insurance Contracts. Strnad, Jan January 2013 (has links)
The aim of the present work is to develop a tool for identifying Motor Third Party Liability insurance contracts that are at risk of cancellation. Methods for exploratory data analysis, building a logistic regression model, and comparing, validating, and calibrating models are presented. Several models are developed on a real dataset using these methods, and the final model is then chosen. The behavior of the final model is verified by validation on an out-of-time sample. The last step is calibration of the model to the expected value of the future portfolio cancellation rate.
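A minimal sketch of the modelling step described above is given below. It uses scikit-learn rather than the software used in the thesis, and the predictors (relative premium change, policy age, recent claim count) and the simulated portfolio are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 5_000

# Simulated MTPL portfolio: hypothetical predictors of contract cancellation.
premium_change = rng.normal(0.0, 0.1, n)   # relative premium change at renewal
policy_age = rng.integers(1, 15, n)        # years the contract has been in force
n_claims = rng.poisson(0.2, n)             # claims reported in the last period

# Hypothetical true lapse mechanism used only to generate labels for the example.
logit = -2.0 + 6.0 * premium_change - 0.1 * policy_age + 0.5 * n_claims
lapsed = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))   # 1 = contract cancelled

X = np.column_stack([premium_change, policy_age, n_claims])
X_train, X_test, y_train, y_test = train_test_split(X, lapsed, test_size=0.3, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
probs = model.predict_proba(X_test)[:, 1]
print("coefficients:", np.round(model.coef_[0], 2))
print("out-of-sample AUC:", round(roc_auc_score(y_test, probs), 3))
```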
10 |
Categorizing conference room climate using K-means. Asp, Jin, Bergdahl, Saga January 2019 (has links)
Smart environments are increasingly common. By utilizing sensor data from the indoor environment and applying methods like machine learning, they can autonomously control the environment and increase the productivity, comfort, and well-being of occupants. The aim of this thesis was to model the indoor climate in conference rooms and use K-means clustering to determine quality levels. Together, these enable categorization of conference room quality during meetings and, in theory, by alerting users, may enhance occupant productivity, comfort, and well-being. Moreover, the objective was to determine which features and which k would produce the highest-quality clusters given the chosen evaluation measures. To do this, a quasi-experiment was used. CO2, temperature, and humidity sensors were placed in four conference rooms and sampled continuously. K-means clustering was then used to generate clusters from 10 days of sensor data. To evaluate which feature combination and which k created optimal clusters, we used the Silhouette score, the Davies-Bouldin index, and the Elbow method. The resulting model, using three clusters to represent quality levels, enabled categorization of the quality of specific meetings. Additionally, all three methods indicated that a feature combination of CO2 and humidity, with k = 2 or k = 3, was suitable.
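A minimal sketch of the clustering and evaluation pipeline described above is shown here, using scikit-learn; the synthetic CO2, temperature, and humidity readings are placeholders rather than the ten days of data collected in the study.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import silhouette_score, davies_bouldin_score

rng = np.random.default_rng(1)
# Synthetic sensor readings: columns = CO2 [ppm], temperature [C], relative humidity [%].
readings = np.vstack([
    rng.normal([450, 21.0, 35.0], [40, 0.5, 3.0], (300, 3)),    # empty or well-ventilated room
    rng.normal([800, 23.0, 45.0], [80, 0.7, 4.0], (300, 3)),    # occupied meeting
    rng.normal([1400, 24.5, 55.0], [120, 0.8, 5.0], (300, 3)),  # crowded, poorly ventilated meeting
])
X = StandardScaler().fit_transform(readings)

for k in range(2, 7):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
    print(f"k={k}  inertia={km.inertia_:8.1f}  "                # inertia supports the Elbow method
          f"silhouette={silhouette_score(X, km.labels_):.3f}  "
          f"davies-bouldin={davies_bouldin_score(X, km.labels_):.3f}")
```

Reading the elbow in the inertia column together with the silhouette and Davies-Bouldin scores is one simple way to settle on k, mirroring the three evaluation measures used in the study.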