1351

Modelling of Automotive Suspension Damper / Modellering av spjäll för fordon

Vyas, Saurabh, Jonnalagadda, Venkata Dinesh Raju January 2020 (has links)
A hydraulic damper plays an important role in tuning the handling and comfort characteristics of a vehicle. Tuning and selecting a damper based on subjective evaluation, by considering the opinions of various users, would be an inefficient method, since the comfort requirements of users vary widely. Instead, mathematical models of the damper, simulated across various operating conditions, are preferred to standardize the tuning procedure, quantify comfort levels and reduce the cost of testing. This requires a model that is good enough to capture the behaviour of the damper in various operating and extreme conditions.

The Force-Velocity (FV) curve is one of the most widely used damper models. The curve is implemented either as an equation or as a look-up table, and plots the maximum force at each peak velocity point. Certain dynamic phenomena, such as hysteresis and dependency on the displacement of the damper, cannot be captured with an FV curve model but are required for a better understanding of vehicle behaviour.

This thesis was conducted in cooperation with Volvo Cars with the aim of improving the existing damper model, which is a Force-Velocity curve. The work focuses on developing a damper model that is complex enough to capture the phenomena discussed above and simple enough to be implemented in real-time simulations. The thesis also aims to establish a standard method to parameterise the damper model and to generate the Force-Velocity curve from tests performed on the damper test rig. A test matrix is developed that includes the standard tests for parameterisation and the extreme test cases for validating the developed model. The final focus is to implement the damper model in multi-body simulation (MBS) software.

The thesis starts with an introduction, where the background for the project is described and the thesis goals are set. It is followed by a literature review in which a few advanced damper models are discussed briefly. Then, a step-by-step process of developing the damper model is discussed, along with a few more possible options. Later, the construction of the test matrix is discussed in detail, followed by the parameter identification process. Next, the validation of the developed damper model is discussed using test data from Volvo Hällered Proving Ground (HPG). After validation, the implementation of the model in VI CarRealTime and Adams Car is presented along with the results. Finally, the thesis is concluded and recommendations are made for future work on further improving the model.
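A look-up-table FV model of the kind described above can be sketched in a few lines. This is a minimal illustration: the velocity breakpoints and force values are invented placeholders, not data from the thesis.

```python
import numpy as np

# Illustrative FV curve: peak force (N) at each peak piston velocity (m/s).
# Negative velocities: rebound; positive: compression. Values are made up.
fv_velocity = np.array([-1.5, -0.5, -0.1, 0.0, 0.1, 0.5, 1.5])
fv_force    = np.array([-2800.0, -1500.0, -400.0, 0.0, 300.0, 900.0, 1600.0])

def damper_force(v: float) -> float:
    """Quasi-static damper force from the FV look-up table.

    Linear interpolation between breakpoints. Note this captures no
    hysteresis or displacement dependency, which is exactly the
    limitation the thesis sets out to address.
    """
    return float(np.interp(v, fv_velocity, fv_force))

print(damper_force(0.3))  # force at 0.3 m/s piston velocity
```

Because the table is quasi-static, the same velocity always returns the same force; reproducing hysteresis requires the state-dependent extensions the thesis develops.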
1352

Online Secondary Path Modelling for Spatial Active Noise Control with Arbitrarily Spaced Arrays / Sekundärvägsmodellering för Aktiv Brusreducering i Rum med Godtyckligt Placerade Arrayer

Brunnström, Jesper January 2021 (has links)
In this work we explore online secondary path modelling (SPM) in the context of spatial active noise control (ANC). Specifically, we are interested in the reduction of broadband noise over a three-dimensional region, without restrictions on microphone and loudspeaker array placement. As spatial ANC generally requires many channels, both the ANC and SPM methods must have low computational cost. The SPM methods are intended to be used with a specific spatial ANC algorithm based on kernel interpolation. By incorporating SPM, the spatial ANC method can operate under time-varying secondary paths. Four SPM methods are considered in detail, of which three are based on the auxiliary-noise technique. Descriptions of the algorithms are presented for the multichannel case, in addition to block-based implementations that take advantage of the fast Fourier transform to drastically reduce computational cost. Impulse responses for simulating a sound field are recorded using a programmable robot arm. The algorithms are evaluated through simulations to show their respective strengths and weaknesses. It is found that the auxiliary-noise-based SPM methods have good convergence properties for both the control filter and the secondary path estimate, although the degrading effect of the auxiliary noise on the residual noise leads to a total noise reduction similar to that of the auxiliary-noise-free method. For all algorithms, the noise control performance worsens, and the convergence time increases by more than an order of magnitude, compared to when the secondary paths are known. It is verified that the kernel-interpolation-based spatial ANC method successfully reduces noise over a region even when used with online SPM.
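As a hedged, single-channel illustration of the auxiliary-noise SPM technique mentioned above (the thesis treats the multichannel, block-based FFT case; the filter lengths, step sizes, and auxiliary-noise level here are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
L = 32                      # filter lengths (assumed)
w = np.zeros(L)             # control filter
s_hat = np.zeros(L)         # secondary path estimate
mu_w, mu_s = 1e-3, 1e-3     # step sizes (assumed)

x_buf = np.zeros(L)         # reference-signal buffer
fx_buf = np.zeros(L)        # filtered-reference buffer
v_buf = np.zeros(L)         # auxiliary-noise buffer

def step(x_n, e_n):
    """One sample of FxLMS with auxiliary-noise online SPM.

    x_n: reference sample, e_n: error-microphone sample.
    Returns the loudspeaker output: anti-noise plus auxiliary noise.
    """
    x_buf[:] = np.roll(x_buf, 1); x_buf[0] = x_n
    y_n = -(w @ x_buf)                       # anti-noise output
    v_n = 0.01 * rng.standard_normal()       # injected auxiliary noise
    v_buf[:] = np.roll(v_buf, 1); v_buf[0] = v_n
    # SPM: LMS identification of the secondary path from the auxiliary noise.
    e_s = e_n - s_hat @ v_buf
    s_hat[:] += mu_s * e_s * v_buf
    # FxLMS: filter the reference through the current path estimate.
    fx_buf[:] = np.roll(fx_buf, 1); fx_buf[0] = s_hat @ x_buf
    w[:] += mu_w * e_n * fx_buf
    return y_n + v_n
```

In a real system the returned output feeds the physical secondary path; the point here is only the two coupled LMS updates — one identifying the secondary path from the injected noise, one adapting the control filter with the filtered reference.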
1353

Methods for 3D Structured Light Sensor Calibration and GPU Accelerated Colormap

Kurella, Venu January 2018 (has links)
In manufacturing, metrological inspection is a time-consuming process: the higher the required precision, the longer the inspection time. This is due both to slow devices that collect measurement data and to slow computational methods that process the data. The goal of this work is to propose methods to speed up some of these processes. Conventional measurement devices like Coordinate Measuring Machines (CMMs) have high precision but low measurement speed, while new digitizer technologies have high speed but low precision. Using these devices in synergy gives a significant improvement in measurement speed without loss of precision. The method of synergistically integrating an advanced digitizer with a CMM is discussed. Computational aspects of the inspection process are addressed next. Once a part is measured, the measurement data are compared against the part's model to check tolerances. This comparison is a time-consuming process on conventional CPUs. We develop and benchmark GPU accelerations of this comparison. Finally, naive data-fitting methods can produce misleading results on non-uniform data. Weighted total least-squares methods can compensate for the non-uniformity. We show how they can be accelerated with GPUs, using plane fitting as an example. / Thesis / Doctor of Philosophy (PhD)
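As a hedged illustration of the weighted plane-fitting step mentioned above (a weighted total-least-squares fit via the SVD; the data and weights are synthetic stand-ins, and this does not reproduce the thesis's GPU implementation):

```python
import numpy as np

def fit_plane_wtls(points: np.ndarray, weights: np.ndarray):
    """Weighted total-least-squares plane fit.

    Minimises the weighted sum of squared orthogonal distances.
    points: (N, 3) array; weights: (N,) nonnegative array.
    Returns (point on plane, unit normal).
    """
    w = weights / weights.sum()
    centroid = w @ points                    # weighted centroid lies on the plane
    d = (points - centroid) * np.sqrt(w)[:, None]
    # The normal is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(d, full_matrices=False)
    return centroid, vt[-1]

# Noisy samples of the plane z = 0, with uniform confidence weights.
rng = np.random.default_rng(1)
pts = rng.uniform(-1, 1, size=(500, 3))
pts[:, 2] = 0.01 * rng.standard_normal(500)
c, n = fit_plane_wtls(pts, weights=np.ones(500))
print(c, n)   # normal should be close to (0, 0, ±1)
```

The square-root weight scaling makes the SVD minimise the weighted orthogonal distances, which is what lets the fit discount sparse or low-confidence regions of a non-uniform scan.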
1354

An Evaluation of Technological, Organizational and Environmental Determinants of Emerging Technologies Adoption Driving SMEs’ Competitive Advantage

Dobre, Marius January 2022 (has links)
This research evaluates the technological, organizational, and environmental determinants of emerging-technologies adoption, represented by Artificial Intelligence (AI) and the Internet of Things (IoT), driving SMEs' competitive advantage within a resource-based view (RBV) theoretical approach supported by the technological-organizational-environmental (TOE) framework. Current literature on SME competitive advantage as an outcome of emerging technologies in the technological, organisational, and environmental contexts presents models focused on these contexts' individual components. There are no models in the literature that represent the TOE framework as an integrated structure with gradual levels of complexity, allowing for incremental evaluation of the business context in support of decision-making towards emerging-technologies adoption supporting the firm's competitive advantage. This research gap is addressed with the introduction of a new concept, IT resource-based renewal, underpinned by the RBV and supported by the TOE framework, providing a holistic understanding of SMEs' strategic renewal decisions through information technology. This is achieved through a complex measurement model with four-level constructs, leading to a parsimonious structural model that evaluates the relationships between IT resource-based renewal and emerging-technologies adoption driving SMEs' competitive advantage. The model confirms the positive association between IT resource-based renewal and emerging-technologies adoption, and between IT resource-based renewal and SME competitive advantage, in the SME managers' model; the outcomes of the SME owners' model are found not to support emerging-technologies adoption driving SME competitive advantage. As methodology, PLS-SEM is used for its capability of assessing complex paths among model variables. Analysis is done on three models: one for the full sample, and two subsequent ones for owners and managers, respectively, as SME decision-makers, with data collected using a web-based survey in Canada, the UK, and the US that provided 510 usable answers. The theoretical contribution of this research is the introduction of the IT resource-based renewal concept, which integrates the RBV perspective and the TOE framework to support an organization's decision on emerging-technologies adoption driving SME competitive advantage. As practical implications, this thesis provides SMEs with a reference framework on adopting emerging technologies, offering SME managers and owners a comprehensive model of the hierarchical factors contributing to SME competitive advantage acquired as an outcome of AI and IoT adoption. This research makes an original contribution to the enterprise management, information systems adoption, and SME competitive advantage literature, with an empirical approach that verifies a model of the determinants of emerging-technologies adoption driving SME competitive advantage.
1355

A multivariate approach to characterization of drug-like molecules, proteins and the interactions between them

Lindström, Anton January 2008 (has links)
A disease is often associated with a cascade reaction pathway involving proteins, co-factors and substrates. Hence, to treat the disease, elements of this pathway are often targeted using a therapeutic agent, a drug. Designing new drug molecules for use as therapeutic agents involves the application of methods collectively known as computer-aided molecular design, CAMD. When the three-dimensional (3D) geometry of a macromolecular target (usually a protein) is known, structure-based CAMD is undertaken, and structural information of the target guides the design of new molecules and their interactions with the binding sites in targeted proteins. Many factors influence the interactions between the designed molecules and the binding sites of the target proteins, such as the physico-chemical properties of the molecule and the binding site, the flexibility of the protein and the ligand, and the surrounding solvent. For structure-based CAMD to be successful, two important aspects must be considered that take the abovementioned factors into account. These are: i) 3D fitting of molecules to the binding site of the target protein (like fitting pieces of a jigsaw puzzle), and ii) predicting the affinity of molecules to the protein binding site. The main objectives of the work underlying this thesis were: to create models for predicting the affinity between a molecule and a protein binding site; to refine the geometry of the molecule-protein complex derived in 3D fitting (also known as docking); to characterize the proteins and their secondary structure; and to evaluate the effects of different generalized-Born (GB) and Poisson-Boltzmann (PB) implicit solvent models on the refinement of the molecule-protein complex geometry created in the docking and on the prediction of the molecule-to-binding-site affinity. A further objective was to apply chemometric methodologies for modeling and data analysis to all of the above. To summarize, this thesis presents methodologies and results applicable to structure-based CAMD. Results show that predictive chemometric models for molecule-to-binding-site affinity could be created that yield results comparable to similar, commonly used methods. In addition, chemometric models could be created to model the effects of software settings on the molecule-protein complex geometry using software for molecule-to-binding-site docking. Furthermore, the use of chemometric models provided a more profound understanding of protein secondary structure descriptors. Refining the geometry of molecule-protein complexes created through molecule-to-binding-site docking gave similar results for all investigated implicit solvent models, with the geometry significantly improved in only a few of the examined cases (six of 30). However, using the geometry-refined molecule-protein complexes was highly valuable for the prediction of molecule-to-binding-site affinity. Indeed, using the PB solvent model yielded improvements of 0.7 in the correlation coefficient (R2) for binding affinity parameters of a set of Factor Xa protein drug molecules, relative to those obtained using the fitting software.
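Chemometric affinity models of the kind described above are commonly partial-least-squares (PLS) regressions from molecular descriptors to measured affinity. A minimal sketch under that assumption, with entirely synthetic descriptors and affinities standing in for the thesis's data:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins: 60 molecules x 40 descriptors, affinities generated
# from a hidden linear rule plus noise (the real work uses computed
# descriptors of molecule-protein complexes and experimental binding data).
rng = np.random.default_rng(2)
X = rng.standard_normal((60, 40))
y = X[:, :5] @ rng.standard_normal(5) + 0.3 * rng.standard_normal(60)

pls = PLSRegression(n_components=3)   # latent variables, chosen by CV in practice
print(cross_val_score(pls, X, y, cv=5, scoring="r2").mean())
```

Cross-validated R² of this kind is also how such models are typically compared against established scoring functions, as in the Factor Xa comparison reported above.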
1356

The existence and disclosure of intangible assets in merger and acquisition processes in France and corporate financial performance / A existência e a divulgação de ativos intangíveis em processos de fusões & aquisições na França e o desempenho empresarial financeiro

Feitosa, Evelyn Seligmann 10 November 2011 (has links)
Fundo Mackenzie de Pesquisa / The allocation of resources and the constant search for competitive differentiators to reach better results are perennial business challenges. In the contemporary context, achieving superior performance reinforces the company's need to hold, and make good use of, scarce, valuable, non-substitutable and inimitable resources. These resources include brands, the customer base, knowledge, the ability and competence of work teams, corporate culture, partnerships and established operational processes, among other intangible assets, usually arising from long and risky development processes. Mergers and acquisitions (M&A) arise, then, as important strategic actions, being an alternative means to obtain and accelerate the accumulation of these resources within companies. That is the subject of this work, which discusses the importance of the intangible assets existing and disclosed prior to M&A transactions, their classification into various types, their measurement, and their impact on the resulting firm's long-term financial performance. The overall objective of this thesis was to analyse how this performance, measured at least 36 months after the event, is related to the existence, level of disclosure and nature of the intangible assets of the organizations involved. One hundred and eighteen (118) companies were investigated across fifty-nine (59) cases of M&A that occurred in France between 1997 and 2007, in a pluralistic, multi-method research design with both qualitative and quantitative strands. Indicators of intangible-asset disclosure were built by applying content analysis to the financial and accounting reports published by the companies prior to the events, and financial indicators (proxies) for the existence of intangibles were calculated. These indicators were first compared with each other, and their explanatory power was then tested against financial ratios of growth and profitability (for the corporation and its shareholders), the analysed dimensions of financial performance. Multivariate statistical methods (correlation and factor analyses, multiple regressions) and structural equation modelling via Partial Least Squares (PLS-SEM) were used. A total of twelve statistically significant models were established to express the relationships among the constructs examined. The best results were achieved by the models built on variables of semantic origin, rather than those using financial indicators alone. The results obtained in this thesis lead to the conclusion that there are positive relationships between the existence and disclosure of intangible assets by firms involved in the M&A operations studied and the subsequent financial performance of the resulting organization, measured by corporate profitability and growth. This suggests that the strategic choice of corporate growth via M&A operations favours the accumulation of intangible resources in firms in the search for better results.
1357

Adaptive least-squares finite element method with optimal convergence rates

Bringmann, Philipp 29 January 2021 (has links)
The least-squares finite element methods (LSFEMs) are based on the minimisation of the least-squares functional, consisting of the squared norms of the residuals of first-order systems of partial differential equations. This functional provides a reliable and efficient built-in a posteriori error estimator and allows for adaptive mesh refinement. The established convergence analysis with rates for adaptive algorithms, as summarised in the axiomatic framework by Carstensen, Feischl, Page, and Praetorius (Comp. Math. Appl., 67(6), 2014), fails for two reasons. First, the least-squares estimator lacks prefactors in terms of the mesh size, which seemingly prevents a reduction under mesh refinement. Second, the first-order divergence LSFEMs measure the flux or stress errors in the H(div) norm and thus involve a data resolution error of the right-hand side f. These difficulties led to a twofold paradigm shift in the convergence analysis with rates for adaptive LSFEMs in Carstensen and Park (SIAM J. Numer. Anal., 53(1), 2015) for the lowest-order discretisation of the 2D Poisson model problem with homogeneous Dirichlet boundary conditions. Accordingly, a novel explicit residual-based a posteriori error estimator accomplishes the reduction property. Furthermore, a separate marking strategy in the adaptive algorithm ensures sufficient data resolution. This thesis presents the generalisation of these techniques to three linear model problems, namely the Poisson problem, the Stokes equations, and the linear elasticity problem. It verifies the axioms of adaptivity with separate marking by Carstensen and Rabus (SIAM J. Numer. Anal., 55(6), 2017) in three spatial dimensions. The analysis covers discretisations with arbitrary polynomial degree and inhomogeneous Dirichlet and Neumann boundary conditions. Numerical experiments confirm the theoretically proven optimal convergence rates of the h-adaptive algorithm.
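For concreteness, a standard form of the least-squares functional for the Poisson model problem, consistent with the description above (the notation is assumed here, not quoted from the thesis): the problem −Δu = f is recast as the first-order system σ = ∇u, f + div σ = 0, and the LSFEM minimises

```latex
LS(f;\sigma,v) := \| f + \operatorname{div}\sigma \|_{L^2(\Omega)}^2
                + \| \sigma - \nabla v \|_{L^2(\Omega)}^2 ,
\qquad (\sigma,v) \in H(\operatorname{div},\Omega) \times H^1_0(\Omega).
```

Minimising over a finite element subspace yields the discrete solution, and the value of the functional at that solution is the built-in a posteriori error estimator; the novel estimator referred to above replaces it with explicit residual terms carrying mesh-size prefactors, which is what enables the reduction property.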
1358

An evaluation of the anti-corruption initiatives in Botswana and their relationship to Botswana's development

Mwamba, Leon Tshimpaka 12 1900 (has links)
The study focuses on an evaluation of the anti-corruption initiatives in Botswana and their relation to Botswana's development. An evaluation was needed to find out whether the anti-corruption initiatives were effective and whether there was a correlation between the effectiveness of the Directorate on Corruption and Economic Crimes (DCEC) and the level of development in Botswana. This study showed that the DCEC has succeeded in keeping corruption low in Botswana through its most successful public-education mandate and its more debatable contribution to good governance. The DCEC has helped to enhance service delivery in the public sector through the establishment of Anti-Corruption Units (ACUs) within the ministries, aimed at tackling corruption in-house. Consequently, a slight improvement was registered in both the public health and education sectors. However, that improvement was minimal, hampered as it was by the challenging working conditions of the DCEC, attributable to inadequate legislation, a lack of manpower, a shortage of required skills and a slow criminal justice system, as well as the debatable independence of the DCEC, evidenced by its reporting and appointing lines. This implies that the impact of the DCEC on the development of Botswana has been minimal, as the country is still devastated by socio-economic disparities, especially in rural areas. / Development Studies / M.A. (Development Studies)
1359

Political and economic events 1988 to 1998 : their impact on the specification of the nonlinear multifactor asset pricing model described by the arbitrage pricing theory for the financial and industrial sector of the Johannesburg Stock Exchange

Stephanou, Costas Michael 05 1900 (has links)
The impact of political and economic events on the asset pricing model described by the arbitrage pricing theory (APTM) was examined in order to establish whether they had caused any changes in its specification. It was concluded that the APTM is not stationary and must be continuously tested before it can be used, as political and economic events can change its specification. It was also found that political events had a more direct effect on the specification of the APTM, in that their effect is more immediate, than economic events, which influenced the APTM by first influencing the economic environment in which it operated. The conventional approach, which would have evaluated important political and economic events case by case to determine whether they affected the linear factor model (LFM), and subsequently the APTM, could not be used, since no correlation was found between the pricing of a risk factor in the LFM and its subsequent pricing in the APTM. A new approach was therefore followed, in which a correlation with a political or economic event was sought whenever a change was detected in the specification of the APTM. This was achieved by first finding the best-subset LFM, chosen for producing the highest adjusted R2, month by month, over 87 periods from 20 October 1991 to 21 June 1998, using a combination of nine prespecified risk factors (five of which were proxies for economic events and one for political events). Multivariate analysis techniques were then used to establish which risk factors were priced most often during the three equal subperiods into which the 87 periods were divided. Using the above methodology, the researcher was able to conclude that political events changed the specification of the APTM in late 1991. After the national elections in April 1994, it was found that the acceptance of South Africa into the world economic community had again changed the specification of the APTM, and the two most important factors were proxies for economic events. / Business Leadership / DBL
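For reference, the LFM and the APT pricing restriction discussed above take the standard form below (notation assumed; in this study there were k = 9 prespecified candidate factors, five proxying economic events and one political events):

```latex
% Linear factor model (LFM) for the return on asset i:
R_i = a_i + \sum_{j=1}^{k} b_{ij} F_j + \varepsilon_i ,
% APT pricing restriction, with \lambda_j the premium on factor j:
\mathbb{E}[R_i] = \lambda_0 + \sum_{j=1}^{k} b_{ij} \lambda_j .
```

A factor is "priced" in the APTM when its premium λ_j is significantly different from zero; the finding that a factor's appearance in the best-subset LFM did not correlate with its being priced in the APTM is what forced the new approach described above.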
1360

Socio-economic factors that affect livestock numbers : a case study of smallholder cattle and sheep farmers in the Free State province of South Africa

Ogunkoya, Folasade Temitope 05 1900 (has links)
The study was conducted across the four district municipalities of the Free State province of South Africa. The objective of the study was to determine the socio-economic factors that affected livestock numbers among smallholder cattle and sheep farmers in the province. The research was qualitative and quantitative in nature. A proportionate random sampling method was used to collect data. The population comprised smallholder cattle and sheep farmers who kept at least 30 livestock. Data covering the 2008 to 2012 farming seasons were collected by administering well-structured questionnaires to 250 smallholder cattle and sheep farmers. The data were captured and analysed using the Statistical Package for the Social Sciences (SPSS, version 22 of 2013) to obtain frequencies, cross-tabulations, descriptive statistics and an ordinary least squares (OLS) regression. The descriptive results indicated that lack of camp systems, drought, increased feed costs, poor veterinary interventions, insufficient breeding stock, high fuel and transportation costs, lack of equipment, diseases, stock theft and pilfering, and insufficient grazing land were the prevalent factors affecting cattle and sheep farming in the province. The OLS regression results indicated that the variables that significantly affected livestock numbers were district, household size, livestock numbers in 2008, planted pastures, grazing-land condition, grazing-land acquisition, service, advice / training, veterinary services, purchase of dosing products and sales per year. The results also indicated that the majority (96.8%) of the smallholder cattle and sheep farmers would like to increase their livestock numbers. It was therefore recommended that extension and veterinary services be strengthened in the study area, that smallholder livestock farmers be encouraged to plant pastures to reduce pressure on the natural veld and make forage available throughout the year, and that government provide subsidies with distribution policies that ensure all smallholder livestock farmers can benefit. / Agriculture and Animal Health / M. Sc. (Agriculture)
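A hedged sketch of the kind of OLS specification described above (the study used SPSS; the variable names and synthetic data below are assumptions for illustration only):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the survey data (n = 250 farmers); the real
# study's significant regressors included district, household size,
# 2008 herd size, planted pastures, grazing-land condition, and more.
rng = np.random.default_rng(3)
n = 250
df = pd.DataFrame({
    "livestock_2012": rng.poisson(60, n),
    "livestock_2008": rng.poisson(55, n),
    "household_size": rng.integers(1, 12, n),
    "planted_pastures": rng.integers(0, 2, n),
    "district": rng.choice(["A", "B", "C", "D"], n),
})

model = smf.ols(
    "livestock_2012 ~ livestock_2008 + household_size"
    " + planted_pastures + C(district)", data=df
).fit()
print(model.summary().tables[1])   # coefficients and p-values
```

The C(district) term mirrors the study's finding that district was a significant factor; the coefficients from this synthetic data are meaningless beyond illustrating the specification.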
