  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

An assessment of mesoscale wind modelling techniques in complex terrain

Guo, Xuewen January 1989 (has links)
No description available.
2

Theory, modelling, and applications of advanced electromagnetic materials

Naeem, Majid January 2017 (has links)
A multitude of recent work predicts many novel concepts based on the availability of non-natural materials; prominent examples include transformation optics (TO) and the perfect lens. Interest in this field has grown dramatically due to the speculated possibility of continuously varying material properties to steer an incident wave at will, as in TO. The challenges posed for their realisation include the limitations of numerical modelling and manufacturing techniques. A design scheme has been proposed in this thesis for composite materials: the desired electromagnetic properties of composites can be engineered by judiciously varying the volume fraction of the inclusion-to-host materials, by manipulating the geometric arrangement of inclusions, or by altering their dielectric contrast. Analysis of the homogenised response of the designed materials at the macro-scale requires effective medium modelling techniques. The existing effective medium approximation techniques have been discussed and their pros and cons outlined. A homogenisation scheme has been introduced that is based on the interaction of the incident wave with the nanoparticles at the micro-scale, which in turn requires efficient electromagnetic modelling. The conventional nanoparticle modelling techniques, as well as the state of the art, have been reviewed, and a dipole-moment-based method to efficiently solve modern nanoparticle-based electromagnetic problems has been outlined. The applicability of the proposed scheme has been demonstrated by employing it to design various EM devices. An improved permittivity extraction scheme has been proposed for the homogenisation of composites: unlike classical homogenisation schemes, the parameters extracted using the proposed technique satisfy the relation between the real and imaginary parts, that is, the Kramers-Kronig relations. Several random and periodic structures have been simulated in order to extract the effective electromagnetic properties and to interpret the results so as to establish a connection between them.
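The thesis's own homogenisation scheme is dipole-moment based and is not detailed in this abstract, but one of the classical effective medium approximations it reviews, the Maxwell Garnett mixing rule for spherical inclusions, is simple enough to sketch (the permittivity values below are illustrative, not taken from the thesis):

```python
def maxwell_garnett(eps_i, eps_h, f):
    """Classical Maxwell Garnett effective permittivity for spherical
    inclusions of permittivity eps_i at volume fraction f in a host of
    permittivity eps_h."""
    num = eps_i + 2 * eps_h + 2 * f * (eps_i - eps_h)
    den = eps_i + 2 * eps_h - f * (eps_i - eps_h)
    return eps_h * num / den

# Sanity checks: f = 0 recovers the host, f = 1 recovers the inclusion.
print(maxwell_garnett(12.0, 2.25, 0.0))   # 2.25
print(maxwell_garnett(12.0, 2.25, 0.2))
```

The effective permittivity interpolates monotonically between host and inclusion as the volume fraction grows, which is exactly the "engineer the response by varying the volume fraction" idea the abstract describes.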
3

Modelling South African social unrest between 1997 and 2016

Smart, Sally-Anne January 2019 (has links)
Social unrest, terrorism, and other forms of political violence are highly unpredictable. These events are driven by human intent and intelligence, both of which are extremely difficult to model accurately. This has resulted in a scarcity of insurance products that cover these types of perils. Links have been found between the incidence of political violence and various economic and socio-economic variables, but to date no such relationships have been identified in South Africa. The aim of this study was to address this: first, by identifying relationships between the incidence of social unrest events and economic and socio-economic variables in South Africa, and second, by using these interactions to model social unrest. Spearman's rank correlation and trendline analysis were used to compare the direction and strength of the relationships between protests and the economic and socio-economic variables. To gain additional insight into South African protests, daily, monthly, quarterly, and annual protest models were created using four different modelling techniques: univariate time series, linear regression, lagged regression, and the VAR(1) model. The forecasting abilities of the models were analysed using both one-step and n-step forecasting procedures. Variations in relationships for different types of protests were also considered across five subcategories. Spearman's rank correlation and trendline analysis showed that the relationships between protests and economic and socio-economic variables were sensitive to changes in data frequency and to the use of national versus provincial data. The daily, monthly, quarterly, and annual models all had power in explaining the variation observed in the protest data. The annual univariate model had the highest explanatory power (R² = 0.8721), followed by the quarterly VAR(1) model (R² = 0.8659), while the monthly lagged regression model had an R² of 0.8138. The one-step forecasting procedure found that the monthly lagged regression model outperformed the monthly VAR(1) model in the short term; the converse held for the short-term performance of the quarterly models. In the long term, the VAR(1) model outperformed the other models. Limitations were identified in the lagged regression model's forecasting abilities. As a model's long-term forecasting ability is important in the insurance world, the VAR(1) model was deemed the best modelling technique for South African social unrest. Further model limitations were identified when the subcategories of protests were considered. This study demonstrates that, with the use of the applicable economic and socio-economic variables, social unrest events in South Africa can be modelled. / Dissertation (MSc)--University of Pretoria, 2019. / Absa Chair in Actuarial Science (UP) / South African Department of Science and Technology (DST) Risk Research Platform, under coordination of the North-West University (NWU) / Insurance and Actuarial Science / MSc Actuarial Mathematics / Unrestricted
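The study's data and fitted coefficients are not reproduced in this abstract, but the VAR(1) technique it settles on can be sketched on synthetic stand-in data: stack the series, regress each observation on the previous one, and forecast one step ahead. All series and coefficients below are hypothetical.

```python
import numpy as np

# Toy stand-in for two jointly evolving monthly series (e.g. a protest
# count and one economic indicator; not the study's actual data).
rng = np.random.default_rng(0)
n = 120
A_true = np.array([[0.6, 0.2], [0.1, 0.5]])
y = np.zeros((n, 2))
for t in range(1, n):
    y[t] = A_true @ y[t - 1] + rng.normal(0, 0.1, 2)

# Fit VAR(1) by ordinary least squares: y_t ≈ c + A y_{t-1}
X = np.hstack([np.ones((n - 1, 1)), y[:-1]])   # intercept + lagged values
B, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
c, A = B[0], B[1:].T

# One-step-ahead forecast from the last observation
y_next = c + A @ y[-1]
print(np.round(A, 2))
```

Iterating the one-step forecast (feeding each prediction back in as the next lag) gives the n-step procedure the study uses to assess long-term forecasting ability.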
4

Towards an integrative modelling technique between business and information system development

Joubert, Pieter 02 August 2013 (has links)
There are many situations during information system development (ISD) where modelling is needed at the business level before more detailed and robust modelling is done at the technical system level. Most business-level modelling uses some form of natural-language constructs which are, on the one hand, easy for untrained users to use but, on the other, too vague and ambiguous for subsequent systems-level modelling by systems analysts. The goal of this study is to develop an integrative modelling technique that is easy enough to be used by most business users with little training, yet robust and structured enough to be used in subsequent ISD modelling. The term "integrative" in the title refers to the fact that this technique attempts to bridge the current gap between modelling at the business level and modelling at the technical level. The research consists of two major phases. During the first phase, an integrative modelling technique is developed using a grounded approach. The data used for analysis is a representative sample of the major ISD modelling techniques in current use; for instance, the UML 2 standard is used to represent the UML techniques. The purpose of this first phase is to understand the fundamental concepts and relationships in ISD and to develop an integrative technique based on them. During the second phase, the artefact created in the first phase is evaluated and improved using the design science research approach. The artefact is applied in a representative set of business modelling situations to evaluate its applicability and suitability as an integrative modelling technique between business and ISD. The integrative modelling technique is evaluated from three perspectives: how it represents business rules, how it handles various aspects of ISD, and how it represents requirements expressed as use cases. These evaluations used two main design criteria: ease of use for business users, and sufficient expressive power that the resulting models can be easily translated into existing ISD modelling languages. The integrative modelling technique developed identifies three levels of modelling entities and their relationships: • Base entities (corresponding to the morphological level in linguistics) • Structure entities (corresponding to the syntactical level in linguistics) • Role entities (corresponding to the semantic level in linguistics) The contribution of this research is a better understanding of the fundamental entities in business and ISD modelling and their relationships, in order to improve informal, mostly textual, business modelling. / Thesis (PhD)--University of Pretoria, 2012. / Informatics / unrestricted
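The three entity levels could be rendered, purely as a hypothetical illustration (the thesis defines its own notation and attributes), roughly as follows:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the thesis's three modelling levels; names and
# attributes here are invented for illustration only.

@dataclass
class BaseEntity:            # morphological level: atomic business concepts
    name: str

@dataclass
class StructureEntity:       # syntactical level: compositions of base entities
    name: str
    parts: list = field(default_factory=list)

@dataclass
class RoleEntity:            # semantic level: a role a base entity plays in context
    name: str
    played_by: BaseEntity

customer = BaseEntity("Customer")
order = StructureEntity("Order", parts=[customer, BaseEntity("Product")])
payer = RoleEntity("Payer", played_by=customer)
print(order.parts[0].name, payer.played_by.name)  # Customer Customer
```

The layering mirrors the linguistic analogy: base entities are the "words", structure entities the "sentences", and role entities the contextual meaning attached to them.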
5

An evaluation of modelling approaches and column removal time on progressive collapse of building

Stephen, D., Lam, Dennis, Forth, J., Ye, J., Tsavdaridis, K.D. 25 October 2018 (has links)
Over the last few decades, progressive collapse disasters have drawn the attention of codified bodies around the globe and, as a consequence, there has been renewed research interest. Structural systems are prone to progressive collapse when subjected to abnormal loads beyond the ultimate capacity of critical structural members. Sudden loss of critical structural member(s) triggers failure mechanisms which may result in a total or partial collapse of the structure, proportionate or disproportionate to the triggering event. Currently, researchers adopt different modelling techniques to simulate the loss of critical load-bearing members for progressive collapse assessment. The GSA guidelines recommend a column removal time of less than one tenth of the period of the structure in the vertical vibration mode. This recommendation permits a wide range of column removal times, which produce inconsistent results while still satisfying the GSA recommendation. The choice of load time-history function assumed for gravity and for the internal column force interaction affects the response of the structure. This paper compares alternative numerical approaches for simulating sudden column removal in frame buildings and investigates the effect of rise time on the structural response.
6

Simulation of current crowding mitigation in GaN core-shell nanowire LED designs

Connors, Benjamin James 07 July 2011 (has links)
Core-shell nanowire LEDs are light-emitting devices which, due to a high aspect ratio, have low substrate sensitivity, allowing the possibility of low-defect-density GaN light-emitting diodes. Current growth techniques and physical non-idealities make the production of high-conductivity p-type GaN for the shell region of these devices difficult. Due to the structure of core-shell nanowires and the difference in conductivity between n-type and p-type GaN, the full junction area of a core-shell nanowire is not used efficiently. To address this problem, a series of possible doping profiles are applied to the core of a simulated device to determine their effects on current crowding and overall device efficiency. With a simplified model, it is shown that current crowding may depend on the doping of the core in regions other than those directly in contact with the shell. Device efficiency is found to be improved through the use of non-constant doping profiles in the core region, with particularly large efficiency increases for profiles which modify portions of the core not in contact with the shell.
7

Multivariate GARCH and portfolio optimisation : a comparative study of the impact of applying alternative covariance methodologies

Niklewski, Jacek January 2014 (has links)
This thesis investigates the impact of applying different covariance modelling techniques on the efficiency of asset portfolio performance. The scope of this thesis is limited to the exploration of theoretical aspects of portfolio optimisation rather than the development of a practical tool for portfolio managers; future work may take these results further and produce a more practical tool from a fund management perspective. The thesis contributes to the literature by applying a number of different covariance models to a unique dataset that focuses on the 2007 global financial crisis, and by employing a methodology that distinguishes between developed and emerging/frontier regional markets. This has resulted in the following findings. First, it identifies the impact of the 2007–2009 financial crisis on time-varying correlations and volatilities as measured by the dynamic conditional correlation model (Engle 2002). This is examined from the perspective of a United States (US) investor, given that the crisis had its origin in the US market. Prima facie evidence is found that economic structural adjustment has resulted in long-term increases in the correlation between the US and other markets. In addition, the magnitude of the increase in correlation is found to be greater for emerging/frontier markets than for developed markets. Second, the long-term impact of the 2007–2009 financial crisis on time-varying correlations and volatilities is further examined by comparing estimates produced by different covariance models. The selected time-varying models (DCC, copula DCC, GO-GARCH: MM, ICA, NLS, ML; EWMA and SMA) produce statistically significantly different correlation and volatility estimates. This finding has potential implications for the estimation of efficient portfolios.
Third, the different estimates derived using the selected covariance models are found to have a significant impact on the calculated weights and turnovers of efficient portfolios. Interestingly, however, there was no significant difference between their respective returns. This is the main finding of the thesis, which has potentially very important implications for portfolio management.
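Among the covariance estimators the thesis compares, EWMA is the simplest; a RiskMetrics-style sketch on synthetic returns follows (the thesis's data, decay parameter, and initialisation choices are not reproduced here):

```python
import numpy as np

def ewma_cov(returns, lam=0.94):
    """Exponentially weighted moving-average covariance, one of the
    simpler estimators compared alongside DCC and GO-GARCH.
    returns: (T, N) array of asset returns; lam is the decay factor."""
    cov = np.cov(returns.T)               # initialise with sample covariance
    for r_t in returns:
        r = r_t[:, None]
        cov = lam * cov + (1 - lam) * (r @ r.T)
    return cov

# Synthetic stand-in returns for three assets.
rng = np.random.default_rng(1)
rets = rng.normal(0, 0.01, size=(500, 3))
S = ewma_cov(rets)
print(S.shape)   # (3, 3)
```

Feeding each model's covariance estimate into the same mean-variance optimiser is what lets the thesis attribute differences in portfolio weights and turnover to the covariance model alone.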
8

A comparison of the impact of data vault and dimensional modelling on data warehouse performance and maintenance / Marius van Schalkwyk

Van Schalkwyk, Marius January 2014 (has links)
This study compares the impact of dimensional modelling and data vault modelling on the performance and maintenance effort of data warehouses. Dimensional modelling is a data warehouse modelling technique pioneered by Ralph Kimball in the 1980s that is much more effective at querying large volumes of data in relational databases than third-normal-form data models. Data vault modelling is a relatively new modelling technique for data warehouses that, according to its creator Dan Linstedt, was created in order to address the weaknesses of dimensional modelling. To date, no scientific comparison between the two modelling techniques has been conducted. A scientific comparison was achieved in this study through the implementation of several experiments. The experiments compared data warehouse implementations based on dimensional modelling techniques with data warehouse implementations based on data vault modelling techniques in terms of load performance, query performance, storage requirements, and flexibility to business requirements changes. An analysis of the results of each of the experiments indicated that the data vault model outperformed the dimensional model in terms of load performance and flexibility. However, the dimensional model required less storage space than the data vault model. With regard to query performance, no statistically significant differences existed between the two modelling techniques. / MSc (Computer Science), North-West University, Potchefstroom Campus, 2014
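The two modelling styles can be contrasted in miniature; the schemas below are illustrative toys, not the study's actual warehouse designs:

```python
import sqlite3

# Dimensional (star schema): one fact table joined to dimension tables.
# Data vault: hubs hold business keys, links hold relationships, and
# satellites hold descriptive attributes with load timestamps.
con = sqlite3.connect(":memory:")
con.executescript("""
-- Dimensional
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales   (customer_key INTEGER, amount REAL);

-- Data vault
CREATE TABLE hub_customer (customer_hk INTEGER PRIMARY KEY, customer_id TEXT);
CREATE TABLE sat_customer (customer_hk INTEGER, name TEXT, load_date TEXT);
CREATE TABLE link_sale    (customer_hk INTEGER, amount REAL, load_date TEXT);
""")
con.execute("INSERT INTO dim_customer VALUES (1, 'Acme')")
con.execute("INSERT INTO fact_sales VALUES (1, 99.5)")

# The same question needs a single join in the star schema; the data vault
# answer would join hub, satellite, and link tables instead.
row = con.execute("""SELECT d.name, f.amount FROM fact_sales f
                     JOIN dim_customer d USING (customer_key)""").fetchone()
print(row)   # ('Acme', 99.5)
```

The extra tables on the data vault side hint at why it loads and adapts faster (narrow, insert-only structures) while needing more storage and more joins at query time, which is consistent with the study's findings.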
10

Komplexní animace v 3D Studiu Max / Complex Animation in 3D Studio Max

Černý, Miloš January 2011 (has links)
The goal of this project is to walk the reader through the complete workflow of creating a complex computer animation in the 3D modelling and animation package 3ds Max. It covers the whole process, from creating, texturing, and skinning the models through to animation, with emphasis on the more difficult parts of the animation process. Besides the practical examples, the paper includes the necessary theoretical explanation of the particular problems. After reading this paper, the reader should be well acquainted with the complete process of creating a complex animation.
