141

[pt] ENSAIOS EM FINANÇAS COMPORTAMENTAIS / [en] ESSAYS ON BEHAVIORAL FINANCE

ARNALDO JOAO DO NASCIMENTO JUNIOR 31 May 2021 (has links)
[en] Based on Cumulative Prospect Theory, three essays are presented in this thesis, linked by a deeper understanding of probability weighting functions and their connection with decision-making under risk. The first essay is an empirical work using prospect theory to analyze the narrow framing bias in investment decisions in several emerging countries: Brazil, China, Russia, Mexico, and South Africa. In all cases, we empirically identified the predictive power of prospect theory for stock returns, and we found that the probability weighting function is the most important factor in this predictive power. The second essay is a theoretical work proposing an axiomatization for the Goldstein-Einhorn weighting function. Since 1987, the well-known Goldstein-Einhorn weighting function has been widely used in empirical and theoretical papers, and Richard Gonzalez and George Wu proposed an axiomatization for it in 1999. The present work analyzes their preference condition and finds a larger family of weighting functions. We provide useful examples and suggest a new preference condition that is necessary and sufficient for the Goldstein-Einhorn function; this condition reflects people's behavior in situations involving risky attitudes. The third essay proposes a measure of the psychological features of attractiveness and discriminability in the context of probability weighting functions. These concepts are important for understanding how some emotions drive our behavior. We propose measures in both the absolute and the relative sense and compare them with particular cases found in the literature. Our findings are consistent with the qualitative understanding widespread in the literature and provide a quantitative analysis of it.
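The Goldstein-Einhorn (linear-in-log-odds) weighting function at the heart of the second essay has a well-known two-parameter form; a minimal sketch follows (the parameter values are illustrative defaults, not estimates from the thesis):

```python
def goldstein_einhorn(p, delta=0.77, gamma=0.69):
    """Goldstein-Einhorn probability weighting function.

    w(p) = delta * p**gamma / (delta * p**gamma + (1 - p)**gamma)

    delta controls elevation (related to attractiveness) and gamma
    controls curvature (related to discriminability).  The default
    parameter values here are illustrative.
    """
    if p <= 0.0:
        return 0.0
    if p >= 1.0:
        return 1.0
    num = delta * p ** gamma
    return num / (num + (1.0 - p) ** gamma)

# Typical inverse-S shape: small probabilities are overweighted,
# large probabilities are underweighted.
low = goldstein_einhorn(0.05)   # greater than 0.05
high = goldstein_einhorn(0.95)  # less than 0.95
```

With gamma < 1 the function is shallow in the middle of the unit interval and steep near 0 and 1, the pattern prospect theory uses to explain simultaneous lottery-buying and insurance-buying.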
142

Arabic Language Processing for Text Classification. Contributions to Arabic Root Extraction Techniques, Building An Arabic Corpus, and to Arabic Text Classification Techniques.

Al-Nashashibi, May Y.A. January 2012 (has links)
The impact and dynamics of Internet-based resources for Arabic-speaking users are increasing in significance, depth, and breadth at a higher pace than ever, and thus require updated mechanisms for the computational processing of Arabic texts. Arabic is a complex language and as such requires in-depth investigation to analyze and improve available automatic processing techniques, such as root extraction methods or text classification techniques, and to develop text collections that are already labeled, whether with single or multiple labels. This thesis proposes new ideas and methods to improve available automatic processing techniques for Arabic texts. Any automatic processing technique requires data in order to be used, critically reviewed, and assessed, so an attempt to develop a labeled Arabic corpus is also proposed. The thesis is composed of three parts: 1- Arabic corpus development; 2- proposing, improving, and implementing root extraction techniques; and 3- proposing and investigating the effect of different pre-processing methods on single-labeled text classification methods for Arabic. The thesis first develops an Arabic corpus that is used here for testing root extraction methods as well as single-label text classification techniques. It also enhances a rule-based root extraction method by handling irregular cases (which appear in about 34% of texts). It proposes and implements two expanded algorithms as well as an adjustment to a weight-based method, adds the irregular-case handling algorithm to all of these methods, and compares the performance of the proposed methods with the original ones. The thesis thus develops a root extraction system that handles foreign Arabized words by constructing a list of about 7,000 foreign words.
The technique with the best accuracy in extracting the correct stem and root for the respective words in texts, an enhanced rule-based method, is used in the third part of the thesis. The thesis finally proposes and implements a variant term frequency-inverse document frequency (TF-IDF) weighting method, and investigates the effect of different choices of features in document representation (words, stems, or roots, as well as these choices extended with their respective phrases) on single-label text classification performance. Forty-seven classifiers are applied to all proposed representations and their performances compared. One challenge for researchers in Arabic text processing is that root extraction techniques reported in the literature are either not accessible or take a long time to reproduce, while labeled benchmark Arabic text corpora are not fully available online. Moreover, to date few machine learning techniques have been investigated for Arabic with the usual preprocessing steps before classification. Such challenges are addressed in this thesis by developing a new labeled Arabic text corpus for extended applications of computational techniques. The results show that the proposed algorithm for handling irregular words in Arabic improved the performance of all implemented root extraction techniques. The algorithm is evaluated in terms of accuracy improvement and execution time; its efficiency is investigated for different document lengths and is empirically found to be linear in time for document lengths below about 8,000. Among the implemented root extraction methods, the rule-based technique improved the most when the irregular-case handling algorithm was included.
The thesis validates that choosing roots or stems instead of words in document representations significantly improves single-label classification performance for most of the classifiers used. Extending such representations with their respective phrases, however, yields no significant improvement. Many classifiers, such as the ripple-down rule classifier, had not yet been tested on Arabic. Comparing the classifiers' performances shows that the Bayesian network classifier is significantly the best in terms of accuracy, training time, and root mean square error for all proposed and implemented representations. / Petra University, Amman (Jordan)
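The abstract does not specify the proposed TF-IDF variant; for reference, the baseline term frequency-inverse document frequency weighting it builds on can be sketched as follows (the tokens and function name are illustrative):

```python
import math
from collections import Counter

def tfidf(docs):
    """Baseline TF-IDF weights for a list of tokenized documents.

    The thesis proposes a *variant* of this scheme; the standard
    formula shown here is tf(t, d) * log(N / df(t)), where df(t) is
    the number of documents containing term t.
    """
    n_docs = len(docs)
    # Document frequency: count each term once per document.
    df = Counter()
    for doc in docs:
        df.update(set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({t: tf[t] * math.log(n_docs / df[t]) for t in tf})
    return weights

# Toy transliterated tokens, purely illustrative.
docs = [["kitab", "qalam"], ["kitab", "madrasa"], ["qalam"]]
w = tfidf(docs)
# "kitab" appears in 2 of 3 documents, so its idf is log(3/2);
# a term appearing in every document would get weight 0.
```

Stems or roots would simply replace the surface-form tokens before this step, which is exactly the representation choice the thesis evaluates.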
143

Development of an Advanced Two-Dimensional Microdosimetric Detector based on THick Gas Electron Multipliers / Development of an Advanced 2D THGEM Microdosimetric Detector

Darvish-Molla, Sahar January 2016 (has links)
The THick Gas Electron Multiplier (THGEM) based tissue-equivalent proportional counter (TEPC) has proven useful for microdosimetry due to its flexibility in varying the gaseous sensitive volume and achieving high multiplication gain. Aiming at measuring the spatial distribution of radiation dose in mixed neutron-gamma fields, an advanced two-dimensional (2D) THGEM TEPC was designed and constructed at McMaster University, which overcomes the operational limitations of classical TEPCs, particularly in high dose rate fields. Compared to traditional TEPCs, the anode wire electrodes were replaced by a THGEM layer, which not only enhances the gas multiplication gain but also offers a flexible and convenient way of building 2D detectors. The 2D THGEM TEPC consists of an array of 3×3 sensitive volumes, equivalent to 9 individual TEPCs, each 5 mm in diameter and length. Taking overall cost, size, and flexibility into account, a multi-input digital pulse processing system was developed to process the 9 detector signals simultaneously, using modern microcontrollers, each coupled to a 12-bit sampling ADC with a sampling rate of 42 Msps. The signal processing system was tested with a NaI(Tl) detector and proved faster than both a traditional analogue system and a commercial digital system. Using the McMaster Tandetron 7Li(p,n) accelerator neutron source, both the fundamental detector performance and the neutron dosimetric response of the 2D THGEM TEPC were extensively investigated and compared to data acquired by a spherical TEPC. The microdosimetric response and measured absorbed dose rate of the 2D THGEM detector developed in this study were shown to be comparable to those of the commercially available standard 1/2" TEPC. / Thesis / Doctor of Philosophy (PhD)
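As background, the basic quantity a microdosimetric spectrum is built from is the lineal energy y = ε / l̄, the energy deposited in an event divided by the mean chord length of the sensitive volume; by Cauchy's formula l̄ = 4V/S for a convex body, which for a right cylinder whose height equals its diameter (the shape of the sensitive volumes above) reduces to (2/3)·d. A sketch under these assumptions, with illustrative numbers:

```python
import math

def mean_chord_length(diameter):
    """Cauchy mean chord length l_bar = 4V/S for a right cylinder
    whose height equals its diameter; algebraically this reduces to
    (2/3) * diameter."""
    r = diameter / 2.0
    volume = math.pi * r ** 2 * diameter
    surface = 2.0 * math.pi * r ** 2 + 2.0 * math.pi * r * diameter
    return 4.0 * volume / surface

def lineal_energy(energy_deposited, diameter):
    """Lineal energy y = epsilon / l_bar for one energy-deposition
    event; units follow the inputs (e.g. keV and um give keV/um)."""
    return energy_deposited / mean_chord_length(diameter)

# For a site simulating a 1 um tissue-equivalent diameter, a 0.3 keV
# deposition gives y = 0.3 / (2/3) = 0.45 keV/um (illustrative).
y = lineal_energy(0.3, 1.0)
```

The 5 mm gas cavities simulate micrometre-scale tissue sites by operating at reduced gas density, so the diameter used in this calculation is the simulated site diameter, not the physical cavity size.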
144

Estimating the Causal Effect of High School Mathematics Coursetaking on Placement out of Postsecondary Remedial Mathematics

Showalter, Daniel A. 12 June 2014 (has links)
No description available.
145

Multialternative Decision Field Theory Model Fitting Using Different Measures of Attribute Weighting

Zhang, Ruohui 14 July 2015 (has links)
No description available.
146

Integrating planetary boundaries into the life cycle assessment of electric vehicles : A case study on prioritising impact categories through environmental benchmarking in normalisation and weighting methods when assessing electric heavy-duty vehicles / Integrering av planetära gränser i en livscykelanalys av elektriska fordon : En fallstudie för att prioritera påverkanskategorier genom metoderna normalisering och viktning i en livscykelanalys av elektriska tunga fordon

Pehrson, Ida January 2020 (has links)
The transport sector faces great challenges in achieving development within the Earth's boundaries. Currently, LCA studies on heavy- and medium-duty vehicles have mainly assessed the 'well-to-wheel' stage and the impact category climate change. To understand the full range of environmental impacts of a truck, a holistic view needs to be adopted that acknowledges several sustainability dimensions. With new vehicle technologies such as battery electric vehicles (BEVs), the impact will mainly occur in the production and end-of-life stages, so it is crucial to adopt a cradle-to-grave approach in LCA. This thesis interprets Scania's current LCA results through normalisation and weighting. The normalisation and weighting methods used are based on the planetary boundaries (PBs) and other scientific thresholds of the Earth's carrying capacity. The normalised results show that for a heavy-duty truck running on diesel (B5), climate change is the major impact, but for a BEV charged with the EU electricity mix, freshwater ecotoxicity, stratospheric ozone formation, and climate change are the main impacts to consider. For a BEV charged with wind electricity, freshwater ecotoxicity and climate change are the major impacts. According to the weighted results, 'climate change' and 'fossil resource scarcity' are most important for diesel (B5); for a BEV with the EU mix, the leading categories are 'climate change' and 'fossil resource depletion' followed by 'mineral resource scarcity'; and for a BEV with wind electricity, they are 'mineral resource scarcity' followed by 'climate change' and 'fossil resource scarcity'. The weighted results also show that the impact categories 'human toxicity cancer', 'freshwater ecotoxicity', 'particulate matter', and 'water resource scarcity' are important to consider in an LCA of a BEV.
In conclusion, there is a need for future research on connecting the PBs with the LCA framework. Moreover, normalisation references (NRs) and weighting factors (WFs) based on company and sectoral allowances of the carrying capacity need to be developed in order to understand a product's or company's environmental impact in absolute terms.
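The normalisation and weighting steps follow the standard LCA pattern: divide each impact-category result by a reference value (here derived from planetary boundaries or other carrying-capacity thresholds), then multiply the normalised scores by importance factors before aggregating. A minimal sketch with illustrative numbers (not Scania's data):

```python
def normalise_and_weight(impacts, references, weights):
    """Return per-category weighted scores and their sum.

    impacts, references, weights: dicts keyed by impact category.
    Normalised score = impact / reference (dimensionless);
    weighted score = weight * normalised score.
    """
    scores = {}
    for category, value in impacts.items():
        scores[category] = weights[category] * (value / references[category])
    return scores, sum(scores.values())

# Illustrative values only; references stand in for the
# planetary-boundary-derived carrying capacities.
impacts = {"climate change": 120.0, "freshwater ecotoxicity": 3.0}
references = {"climate change": 1000.0, "freshwater ecotoxicity": 10.0}
weights = {"climate change": 0.5, "freshwater ecotoxicity": 0.5}
scores, total = normalise_and_weight(impacts, references, weights)
```

Expressing the references as company- or sector-level shares of the carrying capacity, as the conclusion calls for, would change only the `references` dict, not the calculation.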
147

MULTI-CRITERIA ANALYSIS FOR HUMAN-LIKE DECISION MAKING IN AUTONOMOUS VEHICLE OPERATIONS

Aishwarya Sharma (18429147) 25 April 2024 (has links)
Highway safety continues to pose a serious challenge to the social sustainability of transportation systems, and initiatives are being pursued at all levels of government to reduce the high fatality count of 42,000. At the same time, higher travel efficiency is sought in order to increase economic productivity. The emergence of automated transportation holds great promise for mitigating these ills of the transportation sector, which have persisted for decades. With regard to safety, this promise is rooted in the capability of autonomous vehicles to self-drive some or all of the time, thus reducing the impact of inherently errant human driving, to which 95% of all crashes have been attributed. With regard to mobility, it is guided by the capability of the autonomous vehicle to carry out path planning, navigation, and vehicle control far more efficiently than the human brain, thereby facilitating mobility and reducing congestion-related issues such as delay, emissions, and driver frustration.

Unfortunately, the two key outcomes (safety and mobility) are reciprocal: navigation solutions that enhance safety generally tend to reduce mobility, and vice versa. As such, explicit values need to be assigned to these performance criteria in order to develop balanced solutions for AV decisions. Most existing machine-learning-based path planning algorithms derive these weights using a learning approach; unfortunately, the stability of those weights across time, individuals, and trip types is not guaranteed, so weights and processes that are trip-situation-specific need to be developed. Secondly, user trust in automation remains a key issue, given the relatively recent emergence of this technology and a few highly publicized crashes, which have led to reservations among potential users.

To address these research questions, this thesis identifies various situational contexts of the problem, identifies the alternatives (the viable trajectories obtained by fitting curves between the vehicle maneuver's initial and final positions), develops the decision criteria (safety, mobility, comfort), weights the criteria to reflect their relative significance, and scales the criteria to develop dimensionless equivalents of their raw values. Finally, a process for amalgamating the overall impacts of each driving decision alternative is developed based on the weighted and scaled criteria, to identify the best decision (optimal trajectory path). This multi-criteria decision making (MCDM) problem involves the collection of data through questionnaire surveys.

The weights obtained early in the MCDM process could be integrated into either of two types of planning algorithms: interpolating curve-based planning algorithms, to identify the optimal trajectory based on human preferences, or optimization-based planning algorithms, to allocate weights to the various functions used.

Overall, this research aims to align the behavior of autonomous vehicles closely with that of human-driven vehicles, serving two primary purposes: facilitating their seamless coexistence on mixed-traffic roads and enhancing public acceptance of autonomous vehicles.
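The weight-scale-amalgamate pipeline described above corresponds to a weighted-sum MCDM model. A minimal sketch assuming min-max scaling and illustrative criterion weights (not the survey-elicited weights from the thesis):

```python
def min_max_scale(values, higher_is_better=True):
    """Scale raw criterion values to dimensionless [0, 1] scores."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    scaled = [(v - lo) / (hi - lo) for v in values]
    return scaled if higher_is_better else [1.0 - s for s in scaled]

def best_alternative(raw, weights, directions):
    """Pick the trajectory with the highest weighted-sum score.

    raw: {criterion: [value per alternative]}; weights: {criterion:
    weight}; directions: {criterion: True if higher raw is better}.
    """
    n = len(next(iter(raw.values())))
    totals = [0.0] * n
    for crit, values in raw.items():
        for i, s in enumerate(min_max_scale(values, directions[crit])):
            totals[i] += weights[crit] * s
    return max(range(n), key=totals.__getitem__), totals

# Three candidate trajectories; all numbers are illustrative.
raw = {"safety": [0.9, 0.7, 0.8],       # higher is better
       "mobility": [30.0, 20.0, 25.0],  # travel time: lower is better
       "comfort": [0.6, 0.9, 0.7]}      # higher is better
weights = {"safety": 0.6, "mobility": 0.25, "comfort": 0.15}
directions = {"safety": True, "mobility": False, "comfort": True}
best, totals = best_alternative(raw, weights, directions)
```

Swapping in situation-specific weights for different trip contexts, as the thesis proposes, changes only the `weights` dict while the amalgamation step stays the same.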
148

An investigation of sea-breeze driven convection along the northern Gulf Coast

Ford, Caitlin 13 May 2022 (has links) (PDF)
Although sea-breezes frequently initiate convection, it is often challenging to forecast the precise location of storm development. This research examines the temporal and spatial characteristics of sea-breeze driven convection and the environmental conditions that support convective or non-convective sea-breeze days along the Northern Gulf Coast. Base reflectivity products were used to identify the initial time of convection (values greater than 30 dBZ) along the sea-breeze front. Convective sea-breezes were found to initiate earlier in the day than non-convective sea-breezes. Mapping convective cells in ArcGIS revealed favored locations of thunderstorm development, including the southeastern cusp of Mobile County, Alabama, and convex coastlines. Meteorological variables from the North American Regional Reanalysis dataset were compared between convective and non-convective sea-breeze days via a bootstrap analysis to reveal environmental characteristics pertinent to forecasting sea-breeze driven convection. Lapse rates, CAPE, CIN, specific humidity, dew point temperature, relative humidity, and precipitable water values were statistically significant.
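A bootstrap comparison of an environmental variable between convective and non-convective days can be sketched as follows; the CAPE values here are invented for illustration and are not from the study:

```python
import random

def bootstrap_mean_diff(a, b, n_boot=10_000, alpha=0.05, seed=42):
    """Bootstrap confidence interval for mean(a) - mean(b).

    Resamples each group with replacement; if the resulting
    (1 - alpha) percentile interval excludes zero, the difference
    between the groups is judged statistically significant.
    """
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_boot):
        ra = [rng.choice(a) for _ in a]
        rb = [rng.choice(b) for _ in b]
        diffs.append(sum(ra) / len(ra) - sum(rb) / len(rb))
    diffs.sort()
    lo = diffs[int(alpha / 2 * n_boot)]
    hi = diffs[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Illustrative CAPE values (J/kg) on convective vs. non-convective
# sea-breeze days; a real analysis would use the reanalysis fields.
convective = [1800, 2100, 1950, 2200, 2050, 1900]
non_convective = [900, 1100, 1000, 950, 1200, 1050]
lo, hi = bootstrap_mean_diff(convective, non_convective)
# An interval entirely above zero indicates a significant difference.
```

The same routine would be run once per variable (lapse rate, CIN, specific humidity, and so on) to produce the set of significant discriminators reported in the abstract.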
149

Non-response error in surveys

Taljaard, Monica 06 1900 (has links)
Non-response is an error common to most surveys. In this dissertation, the error of non-response is described in terms of its sources and its contribution to the Mean Square Error of survey estimates. Various response and completion rates are defined. Techniques are examined that can be used to identify the extent of non-response bias in surveys. Methods to identify auxiliary variables for use in non-response adjustment procedures are described. Strategies for dealing with non-response are classified into two types, namely preventive strategies and post hoc adjustments of data. Preventive strategies discussed include the use of call-backs and follow-ups and the selection of a probability sub-sample of non-respondents for intensive follow-ups. Post hoc adjustments discussed include population and sample weighting adjustments and raking ratio estimation to compensate for unit non-response, as well as various imputation methods to compensate for item non-response. / Mathematical Sciences / M. Com. (Statistics)
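Raking ratio estimation, one of the post hoc adjustments discussed, iteratively scales respondent weights until the weighted sample totals match known population margins on each auxiliary variable. A minimal sketch with toy data (the categories and totals are illustrative):

```python
def rake_weights(rows, row_margins, col_margins, n_iter=50):
    """Raking ratio estimation (iterative proportional fitting).

    rows: one (row_category, col_category) pair per respondent.
    row_margins / col_margins: known population totals per category.
    Returns one adjusted weight per respondent so that weighted
    sample totals match both sets of margins (when consistent
    margins make a solution possible).
    """
    weights = [1.0] * len(rows)
    for _ in range(n_iter):
        # Alternate between adjusting to row and column margins.
        for margins, idx in ((row_margins, 0), (col_margins, 1)):
            totals = {cat: 0.0 for cat in margins}
            for w, r in zip(weights, rows):
                totals[r[idx]] += w
            for i, r in enumerate(rows):
                weights[i] *= margins[r[idx]] / totals[r[idx]]
    return weights

# Toy sample of 4 respondents: (sex, age group), with known margins.
rows = [("m", "young"), ("m", "old"), ("f", "young"), ("f", "young")]
row_margins = {"m": 50.0, "f": 50.0}        # population sex totals
col_margins = {"young": 60.0, "old": 40.0}  # population age totals
weights = rake_weights(rows, row_margins, col_margins)
```

Unlike a full cross-classified weighting adjustment, raking needs only the marginal totals, which is why it suits the situation the dissertation describes where joint population distributions are unavailable.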
