  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
241

Using Large-Scale Datasets to Teach Abstract Statistical Concepts: Sampling Distribution

Kanyongo, Gibbs Y. 16 March 2012 (has links)
No description available.
242

KodEd

Pettersson, Erik, Holmberg, Matthias January 2017 (has links)
This project investigated the tools and methods currently used to teach programming. It summarizes earlier studies, draws conclusions about how well suited they are, and, based on those conclusions, a web application for teaching was designed. The report describes how the web application was constructed and which methods and tools were used during its development. The subject is highly topical, as the Swedish government intends to introduce programming in compulsory school, which demands both good methods and good tools.
243

An Accelerated Method for Mean Flow Boundary Conditions for Computational Aeroacoustics

Samani, Iman January 2018 (has links)
No description available.
244

Multi-period portfolio optimization given a priori information on signal dynamics and transactions costs

Yassir, Jedra January 2018 (has links)
Multi-period portfolio optimization (MPO) has gained considerable interest in modern portfolio theory because it accounts for intertemporal trading effects, especially market impact and transaction costs, and because of its reliance on return predictability. However, owing to the heavy computational demand, portfolio policies based on this approach have been sparsely explored. In that regard, the tractable MPO framework proposed by N. Gârleanu and L. H. Pedersen was investigated. Using stochastic control, the authors derived a closed-form expression for the optimal policy. They also used a specific yet flexible return-predictability model: excess returns were expressed through a linear factor model, the predicting factors were modeled as mean-reverting processes, and transaction costs and market impact entered the problem formulation as a quadratic function. The methodology elaborated here assumes that market return dynamics are governed by fast and slow mean-reverting factors, and that market transaction costs are not necessarily quadratic. By controlling the exposure to the return-predicting factors, the aim was to uncover how much the mean-reversion speeds matter for the performance of the constructed trading strategies under realistic market costs. For comparison, trading strategies based on single-period mean-variance optimization were also considered. The findings suggest an overall performance advantage for the studied MPO approach even when market costs are not quadratic, along with evidence of better use of the factors' mean-reversion speeds, especially fast-reverting factors, and robustness in adapting to transaction costs.
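The abstract above hinges on two ideas from the Gârleanu-Pedersen framework: predicting factors that revert to zero at different speeds, and a closed-form policy that, under quadratic transaction costs, trades only a fraction of the way toward a target ("aim") portfolio each period. A minimal, noise-free sketch of both ideas (all function names and rates here are hypothetical illustrations, not the thesis's actual code):

```python
def ar1_factor_path(x0, mean_reversion, n_steps):
    """Deterministic path of a mean-reverting predicting factor:
    x_{t+1} = (1 - phi) * x_t, so a larger phi (faster reversion) decays quicker."""
    path = [x0]
    for _ in range(n_steps):
        path.append((1.0 - mean_reversion) * path[-1])
    return path

def trade_toward_aim(current_position, aim_position, trading_rate):
    """With quadratic transaction costs, the optimal policy moves only a
    fraction (the trading rate) of the way from the current position to the aim."""
    return current_position + trading_rate * (aim_position - current_position)

# A fast-reverting factor's signal dies out sooner, so it is worth less
# to a slow trader -- the intuition behind the thesis's exposure control.
fast = ar1_factor_path(1.0, 0.5, 2)   # [1.0, 0.5, 0.25]
slow = ar1_factor_path(1.0, 0.1, 2)   # [1.0, 0.9, ~0.81]
```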
245

How to Get Rich by Fund of Funds Investment - An Optimization Method for Decision Making

Colakovic, Sabina January 2022 (has links)
Optimal portfolios have historically been computed using standard deviation as a risk measure. However, extreme market events have become the rule rather than the exception. To capture tail risk, investors have started to look for alternative risk measures such as Value-at-Risk and Conditional Value-at-Risk. This thesis analyzes the financial model referred to as Markowitz 2.0, provides historical context and perspective on the model, and gives a mathematical formulation. A practical implementation is presented, and an optimizer that captures the risk of non-extreme events is constructed, meeting the need for more customized investment decisions based on investment preferences. Optimal portfolios are generated and an efficient frontier is constructed. The results are then compared with those obtained through the mean-variance optimization framework. As the data show, the optimal portfolio with the generated optimal weights performs better in terms of expected portfolio return relative to the risk level of the investment.
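The tail-risk measures named above have simple historical (empirical) estimators: VaR is a quantile of the loss distribution, and CVaR is the mean loss beyond that quantile. A minimal sketch, using a non-interpolating quantile for clarity (not the thesis's optimizer):

```python
def value_at_risk(losses, alpha):
    """Historical VaR: the alpha-quantile of observed losses."""
    ordered = sorted(losses)
    index = int(alpha * len(ordered))  # simple, non-interpolating quantile
    return ordered[min(index, len(ordered) - 1)]

def conditional_value_at_risk(losses, alpha):
    """Historical CVaR: mean loss in the tail at or beyond VaR."""
    var = value_at_risk(losses, alpha)
    tail = [loss for loss in losses if loss >= var]
    return sum(tail) / len(tail)
```

Because CVaR averages over the whole tail rather than reading off a single quantile, it always satisfies CVaR ≥ VaR, which is why it is preferred for capturing extreme events.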
246

Spatial Modelling of Monthly Climate Across Mountainous Terrain in Southern Yukon and Northern British Columbia

Ackerman, Hannah 11 November 2022 (has links)
Two measures of air temperature trends across southern Yukon and northern British Columbia were modelled based on measurements from 83 monitoring sites across seven areas, operating for up to 14 years. Both mean monthly air temperature (MMAT) and freezing and thawing degree days (FDD and TDD, respectively) were modelled across this area (59 °N to 64.5 °N) at elevations ranging from 330-1480 m asl. Lapse rates in this region show inversions in the winter months (November - March) varying in inversion strength and length in relation to degree of continentality. The spatial and elevation range of these sites allowed for regional lapse rate modelling at the monthly scale for MMAT and at the annual scale for FDD and TDD. Lapse rates below treeline were found to be correlated (p < 0.1) with degree of continentality in the colder months (November - April) and August. In these months, lapse rates were modelled using kriging trend surfaces. In months where degree of continentality was not found to have a significant impact on lapse rates (p > 0.1) (May - October, excluding August), an average lapse rate calculated from the seven study regions was used across the study region. A combination of lapse rate trend surfaces, elevation, and temperatures at sea level were used to model MMAT and F/TDD below treeline. A treeline trend surface was created using a 4th order polynomial, allowing for temperatures at treeline to be determined. MMAT and F/TDD above treeline were calculated using a constant lapse rate of -6 °C/km, elevation, and temperature at treeline. The above and below treeline models were combined to create continuous models of MMAT and F/TDD. Modelled MMAT showed a high degree of homogeneity across the study region in warmer months. Inversions in lapse rates are evident in the colder months, especially December through February, when colder temperatures are easily identified in valley bottoms, increasing to treeline, and decreasing above treeline. 
Modelled MMAT values were validated using 20 sites across the study region, using both Environment and Climate Change Canada and University of Ottawa sites. The RMSE between modelled and observed MMAT was highest in January (4.4 °C) and lowest in June (0.7 °C). Sites below treeline showed a stronger relationship between modelled and observed values than sites above treeline. Edge effects of the model were evident in the northeast of the study region as well as in the ice fields in the southwest along the Alaska border. The new MMAT maps can be used to help understand species range change, underlying permafrost conditions, and climate patterns over time. FDD values were found to be highly influenced by both degree of continentality as well as latitude, whereas TDD values were mainly dependent on elevation, with degree of continentality and latitude being lesser influences. FDD and TDD were validated using the same 20 sites across the study region, with FDD showing a larger RMSE (368 degree days) between modelled and observed values than TDD (150 degree days). TDD modelling performed better on average, with a lower average absolute difference (254 degree days) between modelled and observed values at the validation sites than FDD modelling (947 degree days). The models of FDD and TDD represent a component of temperature at top of permafrost (TTOP) modelling for future studies. Two mean annual air temperature (MAAT) maps were created, one calculated from the MMAT models, and the other from the F/TDD models. Most of the study region showed negative MAAT, mainly between -6 °C and 0 °C for both methods. The average MAAT calculated from FDD and TDD values was -2.4 ºC, whereas the average MAAT calculated from MMAT values was -2.8 ºC. Models of MAAT were found to be slightly warmer than in previous studies, potentially indicating warming temperature trends.
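The degree-day totals and lapse-rate extrapolation used throughout this abstract follow standard definitions: FDD sums the magnitudes of sub-zero daily mean temperatures, TDD sums the above-zero ones, and temperature above treeline is extrapolated along a constant lapse rate (-6 °C/km in the study). A minimal sketch of both calculations (illustrative only, not the study's code):

```python
def degree_days(daily_mean_temps_c):
    """Split daily mean temperatures (deg C) into freezing and thawing
    degree days: FDD accumulates below 0, TDD above 0."""
    fdd = sum(-t for t in daily_mean_temps_c if t < 0.0)
    tdd = sum(t for t in daily_mean_temps_c if t > 0.0)
    return fdd, tdd

def temp_at_elevation(base_temp_c, lapse_rate_c_per_km, elevation_gain_m):
    """Extrapolate temperature along a lapse rate, e.g. -6 deg C/km
    applied above treeline in the study."""
    return base_temp_c + lapse_rate_c_per_km * (elevation_gain_m / 1000.0)
```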
247

Analysis of Weighted Fraction of Length for Interfacial Gap in Cervical Composite Restorations as a Function of the Number of B-Scans of OCT Volume Scans

Schneider, Hartmut, Meißner, Tobias, Rüger, Claudia, Haak, Rainer 26 April 2023 (has links)
In dental research, the morphometric assessment of restorations is a challenge. This also applies to the assessment of the length of interfacial adhesive defects in composite restorations as a measure of tooth-restoration bond failure. The determined mean fractions of interfacial gap length on enamel and dentin interfaces deviate from the true means (N → ∞), depending on the number (Ni) of object layers assessed. Cervical composite restorations were imaged with spectral domain optical coherence tomography (SD-OCT). The mean fractions of interfacial gap length on enamel and dentin were determined for an increasing number of OCT cross-sectional images (B-scans) per restoration and were graphically displayed as a function of the number of B-scans. As the number of B-scans increased, the calculated object means approached a range of ±2.5%. This analysis is appropriate for displaying the relationship between the determined mean fraction of interfacial gap length at the enamel/dentin-restoration interface and the number of B-scans.
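The convergence behaviour described above (the calculated means approaching a ±2.5% band as B-scans are added) is a running-mean analysis. A minimal sketch of how one might compute the running mean per added B-scan and find the scan count after which it stays within a tolerance of the full-sample mean (hypothetical helper names, not the study's software):

```python
def cumulative_means(gap_fractions):
    """Running mean of the interfacial gap fraction after each added B-scan."""
    means, total = [], 0.0
    for count, value in enumerate(gap_fractions, start=1):
        total += value
        means.append(total / count)
    return means

def scans_to_converge(gap_fractions, tolerance):
    """Smallest scan count from which every subsequent running mean
    stays within +/- tolerance of the full-sample mean."""
    means = cumulative_means(gap_fractions)
    final = means[-1]
    for i in range(len(means)):
        if all(abs(m - final) <= tolerance for m in means[i:]):
            return i + 1
    return len(means)
```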
248

Överpresterar små bolag i en sektor som strukturellt missgynnar dem? : En studie om storlekseffekten i halvledarsektorn / Are Small Companies Outperforming in a Sector that Structurally Disadvantages them? : A Study of the Size Effect in the Semiconductor Sector

Eriksson, Caroline, Jakobsson, Rasmus January 2021 (has links)
This thesis examines the relationship between firm size and stock return, otherwise known as the size effect, within the semiconductor industry. We construct two portfolios comprising the ten largest and the ten smallest semiconductor companies, respectively, and conduct a backtest over the period 2004-2015. Three allocation strategies are examined: equal weight, mean-variance, and equal risk contribution, along with three different rebalancing periods. Our results show a negative relationship between firm size and risk-adjusted return regardless of allocation strategy, and indicate that the size effect is neither a proxy for fundamental differences nor due to a misspecification of β.
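Of the allocation strategies named in this abstract, equal weight is trivial, and equal risk contribution reduces to inverse-volatility weighting in the special case where assets are uncorrelated (the general ERC problem requires the full covariance matrix). A minimal sketch of those two weighting rules under that simplifying assumption (not the thesis's implementation):

```python
def equal_weights(n_assets):
    """Equal-weight allocation: 1/N to each asset."""
    return [1.0 / n_assets] * n_assets

def inverse_volatility_weights(volatilities):
    """For uncorrelated assets, equal risk contribution reduces to
    weights proportional to 1/volatility, so low-vol assets get more capital."""
    inverse = [1.0 / v for v in volatilities]
    total = sum(inverse)
    return [w / total for w in inverse]
```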
249

PREDICTING NET GAME REVENUE USING STATISTICAL MODELING : A seasonal ARIMA model including exogenous variables

Engman, Amanda, Venell, Alva January 2024 (has links)
Spelbolag AB has a long history in the Swedish market. Its products are all based on randomness, with a predetermined probability of winning. Some of Spelbolag AB's products sell at a stable rate throughout the year, while others fluctuate with holidays. The company also offers products whose sales are strongly influenced by the prize value: a higher prize pot attracts more gamblers, and a lower one fewer. Through campaigns, the company wishes to increase interest in its products. To estimate the total revenue from the products, a statistical tool has been used; its predictions feed different key performance indicators (KPIs) that form the basis for strategic decisions. Because the existing tool has performed poorly, a wish to improve it arose, and this thesis aimed to create an updated statistical tool. The tool was based on a time-series analysis of weekly net game revenue (NGR), with the goal of finding a statistical model with high forecast accuracy. To find the optimal model, a grid-search algorithm was used, with the mean squared prediction error (MSPE) and the mean absolute percentage error (MAPE) as decision criteria; the Akaike information criterion (AIC) was also estimated as a goodness-of-fit measure. The work resulted in two SARIMAX models, both including the same exogenous variables, which were analyzed and tested. The recommended SARIMAX(1, 0, 2)(1, 1, 1)52 model obtained a MAPE of 4.49%.
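The two forecast-accuracy measures used in the grid search above are standard and easy to state precisely: MSPE averages squared forecast errors, while MAPE averages absolute errors relative to the actual values, expressed in percent. A minimal sketch (illustrative helpers, not the thesis's tool):

```python
def mspe(actual, predicted):
    """Mean squared prediction error."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def mape(actual, predicted):
    """Mean absolute percentage error, in percent.
    Assumes no actual value is zero."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)
```

In a grid search over SARIMAX orders, each candidate model's out-of-sample forecasts would be scored with these functions, and the reported 4.49% corresponds to the recommended model's MAPE.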
250

Channel Equalization and Spatial Diversity for Aeronautical Telemetry Applications

Williams, Ian E. 10 1900 (has links)
ITC/USA 2010 Conference Proceedings / The Forty-Sixth Annual International Telemetering Conference and Technical Exhibition / October 25-28, 2010 / Town and Country Resort & Convention Center, San Diego, California / This work explores aeronautical telemetry communication performance with the SOQPSK-TG ARTM waveforms when frequency-selective multipath corrupts received information symbols. A multi-antenna equalization scheme is presented where each antenna's unique multipath channel is equalized using a pilot-aided optimal linear minimum mean-square error filter. Following independent channel equalization, a maximal ratio combining technique is used to generate a single receiver output for detection. This multi-antenna equalization process is shown to improve detection performance over maximal ratio combining alone.
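The maximal ratio combining step described above weights each receive branch in proportion to its channel quality before summing, so cleaner branches dominate the combined decision statistic. A minimal real-valued sketch with SNRs as the weights (the paper's actual combiner operates on equalized complex baseband samples):

```python
def maximal_ratio_combine(branch_outputs, branch_snrs):
    """Combine per-antenna soft outputs, weighting each branch by its SNR.
    Normalizing by the total SNR keeps the combined output on the same scale."""
    total_snr = sum(branch_snrs)
    return sum(y * g for y, g in zip(branch_outputs, branch_snrs)) / total_snr

# Two branches disagree; the higher-SNR branch (SNR 3 vs 1) wins,
# pulling the combined statistic toward its +1 decision.
combined = maximal_ratio_combine([1.0, -1.0], [3.0, 1.0])  # 0.5
```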
