311. Municipal boundary demarcation in South Africa: processes and effects on governance in traditional rural areas. Nxumalo, Cleotilda (January 2013)
Includes bibliographical references. This research adopts a case study approach to investigate disputes involving municipal boundaries in rural communities under traditional authority; a multiple case study design is used to develop an in-depth understanding of these disputes. The causes of the disputes are investigated, and the processes of municipal demarcation and boundary dispute resolution are analysed against a number of frameworks, including the goals of good governance in land administration, the land management paradigm, rights, restrictions and responsibilities, Kotter's eight stages of change management, and the 7Es performance measurement framework. From this critique, conclusions are drawn about the municipal demarcation process and improvements are recommended. The study finds that the frameworks and tools applied are suitable for the analysis and evaluation of the municipal boundary demarcation process. The main findings indicate that several municipal demarcations negatively affected service delivery and threatened the role of traditional leaders.
312. The viscosity of bituminous substances. Cogill, William Henry (22 November 2016)
No description available.
313. The thermoelectric probe for tissue blood flow measurements. Hancock, Frederick Maynard (22 November 2016)
No description available.
314. Concrete at elevated temperature. Roux, Fred Johannes Paul (22 November 2016)
No description available.
315. Discriminative training of hidden Markov models for gesture recognition. Combrink, Jan Hendrik (1 February 2019)
As homes and workplaces become increasingly automated, an efficient, inclusive and language-independent human-computer interaction mechanism will become more necessary. Isolated gesture recognition can be used to this end. Gesture recognition is a problem of modelling temporal data. Non-temporal models can be used for gesture recognition, but require that the signals be adapted to the models; support-vector machine classification, for example, requires fixed-length inputs. Hidden Markov models are probabilistic graphical models that were designed to operate on time-series data and are invariant to sequence length. However, in traditional hidden Markov modelling, models are trained via the maximum likelihood criterion and cannot perform as well as a discriminative classifier. This study employs minimum classification error training to produce a discriminative HMM classifier. The classifier is then applied to an isolated gesture recognition problem using skeletal features. The Montalbano gesture dataset is used to evaluate the system on the skeletal modality alone. This positions the problem as one of fine-grained dynamic gesture recognition, as the hand-pose information contained in the other modalities is ignored. The method achieves a highest accuracy of 87.3%, comparable to other results reported on the Montalbano dataset using discriminative non-temporal methods. The research shows that discriminative hidden Markov models can be used successfully as a solution to the problem of isolated gesture recognition.
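As a rough illustration of the pipeline the abstract describes (not the thesis code), the sketch below trains one Gaussian-emission HMM per gesture class by maximum likelihood using the hmmlearn library and classifies a sequence of skeletal feature vectors by highest log-likelihood; the minimum classification error step described above would replace this generative training with a discriminative parameter update. The state count, feature layout and library choice are assumptions.

import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_models(sequences_by_class, n_states=8):
    # One HMM per gesture class, trained by maximum likelihood (Baum-Welch / EM).
    models = {}
    for label, seqs in sequences_by_class.items():
        X = np.vstack(seqs)                    # all frames: (n_frames, n_skeletal_features)
        lengths = [len(s) for s in seqs]       # frame count of each training sequence
        m = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=25)
        m.fit(X, lengths)
        models[label] = m
    return models

def classify(models, sequence):
    # Assign the class whose model gives the sequence the highest log-likelihood.
    return max(models, key=lambda label: models[label].score(sequence))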
316. Using spatial multi-criteria analysis as an appraisal tool for bus rapid transit trunk and feeder routes: a case study in the City of Tshwane, South Africa. Chitate, Nyasha (11 February 2019)
Private car use around the world has grown steadily over recent decades. One effect of this is traffic congestion, which results in various detrimental environmental, economic and social impacts. Public transport has been identified as an effective solution to congestion. In South Africa, investment in public transport has led to the implementation of full and partial Bus Rapid Transit (BRT) systems. The policy and implementation measures of these BRT systems have been modelled, to varying degrees, on Colombia's TransMilenio BRT. However, BRT systems in South Africa have not been as successful as TransMilenio. The failures of South African BRTs can be traced to many causes, one of which is inadequate ex ante appraisal. This dissertation intended to close a literature gap on the use of ex ante appraisal in South African transport projects. At the time of writing (February 2017), South Africa did not have a standard appraisal tool for the selection of appropriate transport projects and road-based public transport routes. This resulted in systems that were not designed in context and hence underperformed in the context in which they were implemented. The contextually insensitive design of BRTs, and its effects, constituted the conceptual departure point for this research. Accordingly, this dissertation aimed to explore Spatial Multi-Criteria Analysis (SMCA) as a viable appraisal tool for BRT routes, with the City of Tshwane as the study area. SMCA is a decision-support tool that combines multi-criteria analysis (MCA) and geographic information systems to evaluate decision problems whose criteria and alternatives have spatially explicit dimensions. This method was chosen over traditional appraisal tools such as MCA and cost-benefit analysis because it is better suited to routing problems. Suitable evaluation criteria were identified from five themes drawn from international and local trends: equity, transport efficiency, and economic, social and environmental impact. Composite suitability maps were generated for these themes, and optimal trunk and feeder routes were extracted by means of a vector-based network analysis. Four trunk and four feeder routes were analysed quantitatively and qualitatively. The quantitative analysis determined each route's average impedance, length and travel time; the qualitative analysis determined whether the optimal routes corresponded to current or planned city routes. On average, trunk routes obtained a higher average impedance than feeder routes, and all optimal routes differed to some degree from planned city routes. An uncertainty analysis showed that trunk routes were more sensitive than feeder routes, and the sensitivity analysis showed that the transport efficiency criteria were the most sensitive, causing the largest mean change in average impedance of all criteria. Transport efficiency criteria are thus the most important criteria in finding optimal routes. The research method adopted in this study can be reproduced in any contemporary South African city with plans for BRT, and can be improved upon by investigating standard evaluation criteria for inclusion in an SMCA routing problem to ensure a uniform appraisal standard.
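A minimal sketch of the core SMCA step, the weighted combination of normalised criterion rasters into a composite suitability surface, is given below. The criterion names, weights and the raster least-cost routing mentioned in the comments are illustrative assumptions; the thesis extracts routes with a vector-based network analysis.

import numpy as np

def composite_suitability(criteria, weights):
    # criteria: dict of equally sized 2-D arrays on a common grid; weights sum to 1.
    total = np.zeros_like(next(iter(criteria.values())), dtype=float)
    for name, raster in criteria.items():
        lo, hi = float(raster.min()), float(raster.max())
        norm = (raster - lo) / (hi - lo) if hi > lo else np.zeros_like(raster, dtype=float)
        total += weights[name] * norm          # higher value = more suitable
    return total

# Hypothetical usage with illustrative criteria and weights:
# criteria = {"population_density": pop, "employment_access": jobs,
#             "flat_terrain": 1 - slope_norm, "low_env_sensitivity": 1 - env_norm}
# weights  = {"population_density": 0.35, "employment_access": 0.35,
#             "flat_terrain": 0.15, "low_env_sensitivity": 0.15}
# suitability = composite_suitability(criteria, weights)
# impedance = 1.0 - suitability   # raster stand-in for routing, e.g. with
#                                 # skimage.graph.route_through_array(impedance, start_cell, end_cell)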
317. A virtual element method for transversely isotropic elasticity. Van Huyssteen, Daniel (7 May 2019)
This work studies the approximation of plane problems in transversely isotropic elasticity using a low-order virtual element method (VEM). The VEM is an alternative finite element method characterised by complete freedom in element geometry: elements may be arbitrary polygons in two dimensions or polyhedra in three. Transversely isotropic materials are characterised by an axis of symmetry perpendicular to a plane of isotropy, and have applications ranging from fibre reinforcement to biological materials. The governing equations of the transversely isotropic elasticity problem are derived, and a virtual element formulation of the problem is presented along with a sample implementation of the method. This work focuses on the treatment of near-incompressibility and near-inextensibility. These are explored both for homogeneous problems, in which the plane of isotropy is fixed, and for non-homogeneous problems, in which the fibre directions defining the plane of isotropy vary with position. In the latter case, various options are explored for approximating the non-homogeneous terms at the element level. A range of numerical examples shows the VEM approximations to be robust and locking-free for a selection of element geometries and for fibre directions corresponding to mild and strong inhomogeneity.
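The constitutive input to such a formulation is the transversely isotropic elasticity tensor; a minimal sketch of its Voigt-notation construction from five independent constants is shown below, assuming the 1-2 plane is the plane of isotropy and axis 3 the symmetry axis (the convention and variable names are assumptions, not taken from the thesis). For plane problems this 6x6 matrix would be reduced to plane strain or plane stress and, in the non-homogeneous case, rotated element by element to follow the local fibre direction.

import numpy as np

def transversely_isotropic_stiffness(Ep, Et, nu_p, nu_pt, Gt):
    # Ep, nu_p: in-plane modulus and Poisson ratio (plane of isotropy, axes 1-2).
    # Et: modulus along symmetry axis 3; nu_pt: Poisson ratio coupling in-plane
    # stress to strain along axis 3; Gt: transverse shear modulus.
    Gp = Ep / (2.0 * (1.0 + nu_p))            # in-plane shear modulus
    S = np.zeros((6, 6))                      # compliance, Voigt order 11,22,33,23,13,12
    S[0, 0] = S[1, 1] = 1.0 / Ep
    S[2, 2] = 1.0 / Et
    S[0, 1] = S[1, 0] = -nu_p / Ep
    S[0, 2] = S[2, 0] = S[1, 2] = S[2, 1] = -nu_pt / Ep
    S[3, 3] = S[4, 4] = 1.0 / Gt
    S[5, 5] = 1.0 / Gp
    return np.linalg.inv(S)                   # stiffness C = S^-1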
318. Reconstruction of Functions From Non-uniformly Distributed Sampled Data in Shift-Invariant Frame Subspaces. Mkhaliphi, Mkhuseli Bruce (14 May 2019)
The focus of this research is to study and implement efficient iterative reconstruction algorithms. Iterative reconstruction algorithms are used to reconstruct bandlimited signals in shift-invariant L2 subspaces from a set of non-uniformly distributed sampled data. The Shannon-Whittaker reconstruction formula commonly used in uniform sampling problems is insufficient for reconstructing functions from non-uniformly distributed sampled data, so new techniques are required. There are many traditional approaches to non-uniform sampling and reconstruction, of which the Adaptive Weights (AW) algorithm is considered the most efficient. Recently, the Partitions of Unity (PoU) algorithm has been suggested to outperform AW, although there has not been much literature covering its numerical performance. A study and analysis of the implementation of the AW and PoU reconstruction methods is conducted. The algorithms consider the missing data problem, defined as reconstructing continuous-time (CT) signals from non-uniform samples that result from missing samples on a uniform grid; in essence, the algorithms convert the non-uniform grid to a uniform grid. The implemented iterative methods construct CT bandlimited functions in frame subspaces, where bandlimited functions are considered to be a superposition of basis functions called frames. PoU is a variation of AW; the two differ in their choice of frame, and each frame produces a different approximation operator and convergence rate. If efficiency is defined in terms of norm convergence and computational time, then of the two methods discussed, the PoU method is more efficient. The AW method is slower and converges to a higher error than the PoU method; however, AW compensates for its slowness and lower accuracy by remaining convergent and robust for large sampling gaps and by being less sensitive to sampling irregularities. The impact of additive white Gaussian noise on the performance of the two algorithms is also investigated. The numerical tools utilised in this research consist of the theory of discrete irregular sampling, frames, and iterative techniques. The developed software provides a platform for sampling signals under non-ideal conditions with real devices.
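A minimal discrete sketch of an adaptive-weights-style frame iteration is given below, assuming samples taken at integer positions on a length-n grid, Voronoi-type gap weights, and projection onto the bandlimited subspace via the FFT. The relaxation parameter and weight normalisation are assumptions and may need tuning for convergence; the thesis implementation is not reproduced here.

import numpy as np

def lowpass_project(x, cutoff):
    # Orthogonal projection onto signals bandlimited to DFT bins |k| <= cutoff.
    X = np.fft.rfft(x)
    X[cutoff + 1:] = 0.0
    return np.fft.irfft(X, n=len(x))

def adaptive_weights_reconstruct(idx, vals, n, cutoff, n_iter=200, lam=1.0):
    # idx: sorted integer sample positions; vals: samples of the unknown signal.
    idx = np.asarray(idx)
    vals = np.asarray(vals, dtype=float)
    mid = (idx[:-1] + idx[1:]) / 2.0
    edges = np.concatenate(([0.0], mid, [float(n)]))
    w = np.diff(edges)                         # Voronoi gap weights
    f = np.zeros(n)
    for _ in range(n_iter):
        resid = np.zeros(n)
        resid[idx] = w * (vals - f[idx])       # weighted residual impulse train
        f = f + lam * lowpass_project(resid, cutoff)
    return f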
319. Modelling computer network traffic using wavelets and time series analysis. Ntlangu, Mbulelo Brenwen (15 May 2019)
Modelling network traffic is a notoriously difficult problem, primarily because of the ever-increasing complexity of network traffic and the different ways in which a network may be excited by user activity. The ongoing development of new network applications, protocols and usage profiles further creates the need for models that can adapt to the specific networks in which they are deployed. These considerations have in large part driven the evolution of statistical profiles of network traffic from simple Poisson processes to non-Gaussian models that incorporate traffic burstiness, non-stationarity, self-similarity, long-range dependence (LRD) and multifractality, and have led to the specification of a myriad of traffic models, many of which are listed in [91, 14]. In networks composed of IoT devices, much of the traffic is generated by devices that function autonomously and in a more deterministic fashion; this dissertation therefore undertakes the task of building time series models for IoT network traffic. A broad review of the historical development of network traffic modelling is presented, tracing a path that leads to the use of time series analysis for this task. An introduction to time series analysis is provided to support the theoretical discussion of the feasibility and suitability of time series techniques for modelling network traffic. The theory is followed by a summary of the techniques and methodology that may be used to detect, remove and/or model the characteristics typically associated with network traffic, such as linear trends, cyclic trends, periodicity, fractality and long-range dependence. A set of experiments is conducted to determine the effect of fractality on the estimation of the AR and MA components of a time series model, and a comparison of various Hurst estimation techniques is performed on synthetically generated data. The wavelet-based Abry-Veitch Hurst estimator is found to perform consistently well relative to its competitors, and the subsequent removal of fractality via fractional differencing is found to provide a substantial improvement in the estimation of time series model parameters.
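To make the procedure concrete, a small sketch is given below of one simple Hurst estimator (the aggregated-variance method, rather than the wavelet-based Abry-Veitch estimator used in the thesis), fractional differencing with d = H - 0.5 to remove long-range dependence, and an ARMA fit with statsmodels; the scales, model orders and file name are illustrative assumptions.

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def hurst_aggregated_variance(x, scales=(2, 4, 8, 16, 32, 64)):
    # Variance of block means scales as m^(2H - 2) for self-similar traffic.
    variances = []
    for m in scales:
        blocks = len(x) // m
        means = np.asarray(x[:blocks * m]).reshape(blocks, m).mean(axis=1)
        variances.append(means.var())
    slope = np.polyfit(np.log(scales), np.log(variances), 1)[0]
    return 1.0 + slope / 2.0

def fractional_difference(x, d, n_terms=100):
    # Apply the (1 - B)^d filter to remove long-range dependence (d = H - 0.5).
    coeffs = np.zeros(n_terms)
    coeffs[0] = 1.0
    for k in range(1, n_terms):
        coeffs[k] = coeffs[k - 1] * (k - 1 - d) / k
    return np.convolve(x, coeffs, mode="valid")

# Illustrative usage on a per-interval byte-count series (hypothetical file):
# traffic = np.loadtxt("byte_counts.txt")
# H = hurst_aggregated_variance(traffic)
# y = fractional_difference(traffic, d=H - 0.5)
# arma = ARIMA(y, order=(2, 0, 1)).fit()   # estimate AR and MA components on the filtered series
# print(arma.summary())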
320. US energy policy and its position in the United Nations Framework Convention on Climate Change (UNFCCC) negotiations: a theory-guided historical analysis. Azarch, Anna (22 January 2020)
The multilateral negotiations aimed at securing international cooperation on climate change and its mitigation have been widely criticised as a political deadlock since the establishment of the United Nations Framework Convention on Climate Change (UNFCCC) in 1992. Since the formation of the UNFCCC, the United States of America (USA) has been recognised as both an essential and a highly controversial actor in the negotiations, owing to its historical responsibility for anthropogenic climate change and its relative structural power, which has allowed it to wield immense influence. Because energy is an essential input into all economic sectors, any attempt to mitigate climate change will affect a sector that is essential to a country's economic strength, creating a link between energy policy and the UNFCCC negotiations. The overarching aim of this interdisciplinary study is to understand the historical interaction between US energy policy and the US negotiating position in the UNFCCC; within these dynamics, how different administrations attempt to balance competing policy goals under domestic and international constraints is pivotal. The study analyses this by conducting a historical case study of the USA's position in the negotiations and of how its energy policy interacts with that position. It uses a neoclassical realist framework to understand US cooperation through the interaction of two levels of analysis, the unit level and the structural level, referring to factors found at the state level and the international distribution of power respectively. The policy positions and energy policies of the administrations of George H.W. Bush (1989-1993), Bill Clinton (1993-2001), George W. Bush (2001-2009) and Barack Obama (2009-2017) are investigated through the interaction of these two levels. A historical overview indicates the challenges that successive administrations faced in grappling with contradictory policy objectives, shaped by the perceived costs of various policy goals at both the domestic and international levels, and the implications for their position and ability to cooperate within the UNFCCC. The thesis indicates that the history of the United States' position in the negotiations, and the analysis of the influences on this position, reveal that simple mono-causal explanations cannot satisfactorily account for the differences among US administrations. Since 1992, successive US administrations have displayed varying degrees of cooperation with the UNFCCC, based on the domestic and international distribution of power and on policy-makers' perceptions of the costs and benefits of pursuing a given set of policy goals. The research establishes that, since the 1970s, efforts to design US energy policy to factor in environmental externalities have made haphazard progress as two ideational frameworks emerged: one that viewed economic growth and environmental regulation as compatible, and another that promoted the opposing view. The result has been stalemate and a cyclical approach that complicates the interaction between US energy policy and the US position in the UNFCCC.
Within these dynamics, domestic constraints, à la the two-level game, place an important limitation on US participation in and ratification of climate change agreements and on its energy policy, and highlight the important role played by domestic institutions, with partisan politics and ideology forming a deep fissure. The thesis finds that, rather than following an objective set of criteria, policy-makers are influenced by a complex range and interaction of factors in their approach to energy policy, international negotiations, and international opportunities and threats. The structure of the international system is essential to understanding state behaviour: the distribution of power in the international system complements domestic factors in analysing the motivation and behaviour of policy-makers acting on behalf of the state, although it is imperative to understand how its influence is filtered at the unit level. Understanding the historical context permits deeper insights into the multi-dimensional influences on decision-makers; it is therefore necessary to delve into the historical origins of state behaviour and the evolution of domestic and foreign policies.