11

Estimation of Turbulence using Magnetic Resonance Imaging

Dyverfeldt, Petter January 2005
In the human body, turbulent flow is associated with many complications. Turbulence typically occurs downstream from stenoses and heart valve prostheses and at branch points of arteries. A proper means of studying turbulence could enhance the understanding of the effects of stenoses and improve the functional assessment of damaged heart valves and heart valve prostheses. Today's methods for studying turbulence in the human body lack either precision or speed. This thesis exploits a magnetic resonance imaging (MRI) phenomenon referred to as signal loss in order to develop a method for estimating turbulence intensity in blood flow. MRI measurements were carried out on an appropriate flow phantom, and the turbulence intensity results obtained with the proposed method were compared with previously reported results. The comparison indicates that the proposed method has great potential for estimating turbulence intensity.
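As a rough illustration of the signal-loss idea (not taken from the thesis itself): under a Gaussian intravoxel velocity-distribution assumption, the velocity standard deviation is often modeled as recoverable from the ratio of signal magnitudes acquired with and without flow sensitivity, via |S(kv)| = |S(0)|·exp(−σ²kv²/2). A minimal sketch, with all numbers hypothetical:

```python
import numpy as np

def turbulence_intensity(mag_kv, mag_0, kv):
    """Estimate the intravoxel velocity standard deviation (turbulence
    intensity) from two MRI signal magnitudes, assuming a Gaussian
    intravoxel velocity distribution: |S(kv)| = |S(0)| * exp(-s^2 kv^2 / 2)."""
    ratio = np.clip(mag_0 / mag_kv, 1.0, None)  # guard against noise pushing ratio below 1
    return np.sqrt(2.0 * np.log(ratio)) / kv

# Hypothetical example: 30% signal loss at a flow sensitivity of kv = pi / VENC
kv = np.pi / 1.0  # rad per (m/s), for VENC = 1 m/s
sigma = turbulence_intensity(mag_kv=0.7, mag_0=1.0, kv=kv)
print(f"turbulence intensity ~ {sigma:.3f} m/s")
```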
12

Prissättningsmetoder vid börsintroduktioner : En studie om volatilitet och avkastning / Pricing methods at an IPO : A study about volatility and return

Johansson Rydell, Marta, Vendela Rosenblad, Lisa January 2011
Background: In the past, most companies performing an Initial Public Offering (IPO) applied a fixed pricing method, which often led to substantial underpricing of the shares and made it easy for investors to earn high returns on the first trading day. Nowadays, companies more often use auction pricing, where investors bid for a certain number of shares at a certain price, a procedure that has reduced underpricing. The study assumes that some of the market's expectations are included in the price when auction pricing is used, which would make these stocks less volatile after the IPO. Purpose: The aim of this study is to investigate whether the volatility of a share after its introduction on the market differs depending on which of the two pricing methods was applied, and to what extent the choice of pricing method influences the underpricing and return of the share after the introduction. Method: The study comprises quantitative historical data, such as share prices and additional information gathered from the prospectus of each IPO. Besides organizing the data and performing analyses in Excel, numerous econometric tests were carried out using non-linear regressions, where pricing method, beta, first-day underpricing, and variance were each examined as the dependent variable against several combinations of explanatory variables. Findings: The study finds that companies that applied a fixed pricing method show higher volatility after their introduction on the market, so the choice of pricing method has some influence on volatility. Furthermore, the results indicate that these companies were generally more underpriced and earned higher returns during the first year of trading than companies using an auction pricing method.
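The abstract does not spell out its computations, but the study's key variables, first-day underpricing and post-listing volatility, can be sketched as follows; the offer price and closing prices are invented for illustration:

```python
import numpy as np

def first_day_underpricing(offer_price, first_close):
    """First-day return relative to the offer price."""
    return first_close / offer_price - 1.0

def post_ipo_volatility(closes, trading_days=252):
    """Annualized volatility of daily log returns after listing."""
    log_ret = np.diff(np.log(np.asarray(closes, dtype=float)))
    return log_ret.std(ddof=1) * np.sqrt(trading_days)

# Hypothetical IPO: offered at 52 SEK, first six closing prices after listing
closes = [58.0, 57.2, 59.1, 58.4, 60.0, 59.3]
print(f"underpricing: {first_day_underpricing(52.0, closes[0]):.1%}")
print(f"annualized volatility: {post_ipo_volatility(closes):.1%}")
```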
13

Ambient Noise Analysis in Shallow Water at Southwestern Sea of Taiwan

Tsai, Chung-Ting 31 December 2007
Sound waves propagate far better in the ocean than electromagnetic waves, so sonar systems are widely used in underwater investigations. However, a sonar receives not only the target signal but also noise arriving from many directions, and this noise degrades sonar performance; understanding ocean ambient noise is therefore an important issue in both academic study and military applications. The ambient noise data for this research were collected by a passive acoustic recording system deployed in the sea southwest of Taiwan, together with wind velocity records for the experimental area. The influence of wind velocity variation on noise level fluctuations was first examined using correlation analysis, and the fluctuations were described by their statistical distribution, mean value, and standard deviation over different time series. The 500 Hz and 1.5 kHz bands were saturated by high-level signals from unknown sources in spring and summer, so their average sound levels were higher than in fall and winter, by about 10 dB and 5 dB respectively. In the seasonal analysis, the 2.4 kHz and 3.6 kHz bands had quite stable mean levels, with standard deviations around 3 dB; the 3.6 kHz noise level showed the least fluctuation throughout the year of all the frequencies analyzed. It was also observed that the noise level decreased with increasing frequency. Using linear regression, this research derived an estimation equation for the ambient noise level at high wind speed. The estimated values were, however, higher than the measured data: the wind data in this study were skewed towards lower velocities, so the predicted values were overestimated.
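The thesis's exact regression is not reproduced here, but wind-driven ambient noise is commonly fitted as a linear function of the logarithm of wind speed. A minimal sketch with hypothetical measurements:

```python
import numpy as np

# Hypothetical measurements: wind speed (m/s) and noise level (dB re 1 uPa)
wind = np.array([3.0, 5.0, 7.0, 10.0, 12.0, 15.0])
noise_db = np.array([62.0, 66.5, 69.0, 72.5, 74.0, 76.5])

# Fit NL = a + b * log10(U), a form commonly used for wind-driven noise
b, a = np.polyfit(np.log10(wind), noise_db, deg=1)
print(f"NL ~ {a:.1f} + {b:.1f} * log10(U)")

# Predict the noise level at a high wind speed (e.g. 20 m/s)
print(f"predicted NL at 20 m/s: {a + b * np.log10(20.0):.1f} dB")
```

A fit dominated by low-wind samples, as in the skewed data the abstract describes, extrapolates poorly to high wind speeds, which is consistent with the overestimation reported.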
14

Understanding Operating Speed Variation of Multilane Highways with New Access Density Definition and Simulation Outputs

Huang, Bing 01 January 2012
Traffic speed is generally considered a core issue in roadway safety. Previous studies show that faster travel is not necessarily associated with an increased risk of being involved in a crash: when vehicles travel at the same speed in the same direction (even at high speeds, as on interstates), they are not passing one another and cannot collide as long as they maintain that speed. Conversely, crash frequency increases when vehicles travel at different speeds; the greater the speed variation, the more interactions occur among vehicles, and the higher the crash potential. This research identifies the major factors associated with speed variation on multilane highways, including roadway access density, considered the most obvious contributing factor, along with the configuration of speed limits, traffic volume characteristics, roadway geometrics, driver behavior, and environmental factors. A microscopic traffic simulation method based on TSIS (Traffic Software Integrated System) is used to develop mathematical models that quantify the impacts of these factors on speed variation.
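As a rough sketch of the kind of quantities involved (the vehicle speeds, access densities, and fitted coefficients below are hypothetical, not simulation results from the thesis):

```python
import numpy as np

# Hypothetical TSIS-style output: mean speeds (mph) of vehicles on one segment
speeds = np.array([52.1, 48.7, 55.3, 50.2, 47.9, 53.8, 49.5, 51.0])

# Speed variation is typically summarized as the standard deviation of speeds
speed_variation = speeds.std(ddof=1)
print(f"speed variation: {speed_variation:.2f} mph")

# Hypothetical segment-level data: access density (points/mile) vs. variation
access_density = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
variation = np.array([3.1, 3.6, 4.4, 5.0, 5.8])
slope, intercept = np.polyfit(access_density, variation, deg=1)
print(f"variation ~ {intercept:.2f} + {slope:.2f} * access_density")
```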
15

Reinforcement learning for qualitative group behaviours applied to non-player computer game characters

Bradley, Jay January 2010
This thesis investigates how to train the increasingly large cast of characters in modern commercial computer games. Such games can contain hundreds or even thousands of non-player characters, each of which should act coherently in complex dynamic worlds and engage appropriately with other non-player characters and human players. Too often, it is obvious that computer-controlled characters are brainless zombies portraying the same repetitive hand-coded behaviour. Commercial computer games would seem a natural domain for reinforcement learning and, with the trend of selling games on better graphics peaking as shelves saturate with visually excellent titles, better artificial intelligence looks like the next big thing. The main contribution of this thesis is a novel style of utility function for reinforcement learning, the group utility function, which could provide automated behaviour specification for large numbers of computer game characters. Group utility functions allow arbitrary functions of the characters' performance to represent relationships between characters and groups of characters, and these qualitative relationships are learned alongside the characters' main quantitative goal. Group utility functions can be considered both a multi-agent extension of the existing programming-by-reward method and a generalization of the team utility function, replacing its sum with potentially any other function. Hierarchical group utility functions, which are group utility functions arranged in a tree structure, allow relationships between groups of characters to be learned. For illustration, the empirical work uses the negative standard deviation function to create balanced (equal-performance) behaviours, and experiments show that such a balancing group utility function can engender equal performance between characters, between groups, and between groups and single characters. It is shown that some quantitatively measured performance can be traded for qualitative behaviour using group utility functions. Further experiments show how the results degrade, as expected, when the number of characters and groups increases, and that approximating the learners' value functions with function approximation is one way to overcome these issues of scale. All experiments are undertaken in a commercially available computer game engine. In summary, this thesis contributes a novel type of utility function potentially suitable for training many computer game characters, together with empirical work on reinforcement learning in a modern computer game engine.
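A minimal sketch of the balancing idea described above: a group utility combining the quantitative goal (here the sum of member performances, as in the team utility function) with the negative standard deviation as the qualitative balance term. The weighting and the scores are illustrative assumptions, not values from the thesis:

```python
import numpy as np

def balance_utility(performances):
    """Qualitative group term rewarding equal performance: the negative
    standard deviation of the members' performance values."""
    return -np.std(np.asarray(performances, dtype=float))

def group_utility(performances, weight=0.5):
    """Combine the quantitative goal (total performance, the team-utility
    sum) with the qualitative balance term."""
    total = float(np.sum(performances))
    return total + weight * balance_utility(performances)

# Two hypothetical groups of characters with per-character scores
balanced = [10.0, 10.5, 9.5]
lopsided = [18.0, 6.0, 6.0]
print(group_utility(balanced))  # higher: same total score, better balance
print(group_utility(lopsided))
```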
16

A study of the surface finish produced by grinding

Jones, G. J. January 1985
A survey of the literature on grinding and surface texture shows the influence of dressing and wear on the surfaces involved in the process, and the advantages of stylus profilometry for collecting data from both grinding wheels and ground surfaces. Statistical analysis is favoured for surface profile characterization and, of the various parameters used, power spectral density alone offers some prospect of effective comparison between these surfaces. Work on grinding with single crystals of natural corundum was eventually discontinued in favour of experiments with conventional bonded grinding wheels subjected to a dressing operation and some wear in grinding steel surfaces. Statistical parameters representing the surfaces are computed from data obtained from profilograms. Results in terms of power spectral density are presented, showing progressive improvement as developments in apparatus and methods made larger surface profile samples practicable. Transfer functions are used to relate the power spectra of corresponding pairs of surfaces. The significance of power spectral density applied to surface profile characterization is discussed and, in this context, it is suggested that such spectra should be described as variance spectra. Attention is drawn to certain disadvantages of variance spectra applied to grinding wheel and ground surface profiles. Methods designed to improve the presentation of variance spectra lead to a proposed new and more suitable spectrum, in which the density of the standard deviation of surface profile ordinates with respect to frequency is plotted against frequency. Transfer functions calculated from related pairs of these standard deviation spectra show a strong linear correlation with frequency and offer the prospect of convenient comparison between the profiles of the various surfaces involved in grinding.
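As an illustration of the spectra discussed (not the thesis's own computation): the variance spectrum of a profile is its power spectral density, and one natural reading of the proposed standard deviation spectrum is the square root of that density. The profile below is synthetic:

```python
import numpy as np
from scipy.signal import welch

# Hypothetical stylus trace: 4096 profile ordinates sampled every 1 um
dx = 1e-6                                        # sampling interval (m)
rng = np.random.default_rng(0)
profile = rng.normal(scale=0.5e-6, size=4096)    # surface heights (m)

# Variance spectrum: power spectral density of the profile ordinates.
# Integrating psd over spatial frequency recovers the profile variance.
freqs, psd = welch(profile, fs=1.0 / dx, nperseg=1024)

# A "standard deviation spectrum" as the square root of the variance
# density -- one plausible reading of the spectrum proposed in the thesis.
sd_spectrum = np.sqrt(psd)
print(freqs[:3], sd_spectrum[:3])
```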
17

IT-bubblans inverkan på den amerikanska aktiemarknadens volatilitet

Zhang, Henry, Sahlman, Alex January 2013
Purpose: The purpose of this thesis is to examine how and why the volatility of the DJIA, S&P 500 and NASDAQ Composite was affected during the dot-com bubble. Method: The yearly and monthly volatility of the DJIA, S&P 500 and NASDAQ Composite was computed for the period 1995-2004, using data collected from Yahoo Finance. Empirical results: The results show that volatility was substantially higher in the NASDAQ Composite than in the DJIA and S&P 500, which in turn exhibited similar volatility relative to each other. Analysis: The analysis shows that volatility rose considerably when the bubble burst in May 2000 and only subsided once the panic died down shortly after May 2002. The results agree closely with earlier studies, and the theories applied were mostly well suited to the data. Conclusion: Over the period 1995-2004, the volatility of the DJIA, S&P 500 and NASDAQ Composite was highest between 2000 and 2002. The dot-com bubble arose and burst as a result of irrational investment behavior in the stock market, and the ensuing panic kept volatility relatively high until it settled shortly after May 2002. The NASDAQ Composite had the highest volatility as a result of the bubble, while the DJIA and S&P 500 had similar volatility. All three indexes followed a similar pattern, probably because companies in the NASDAQ Composite also appear in the S&P 500 and DJIA.
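A sketch of how monthly volatility might be computed from daily index closes; the data here are simulated, and the annualization convention (square root of 252 trading days) is an assumption, since the abstract does not state the formula used:

```python
import numpy as np
import pandas as pd

def monthly_volatility(closes: pd.Series) -> pd.Series:
    """Standard deviation of daily log returns within each calendar
    month, annualized with sqrt(252)."""
    log_ret = np.log(closes).diff().dropna()
    return log_ret.groupby(log_ret.index.to_period("M")).std() * np.sqrt(252)

# Simulated daily closes for a hypothetical index
idx = pd.bdate_range("2000-01-03", periods=120)
rng = np.random.default_rng(1)
closes = pd.Series(4000 * np.exp(rng.normal(0, 0.02, 120).cumsum()), index=idx)
print(monthly_volatility(closes).head())
```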
18

Abordagem do ensino de desvio padrão em livros didáticos

Francisco, Sérgio Luiz 30 September 2013
Teaching Statistics, as part of the Information Treatment strand, is seen by the PCN (Parâmetros Curriculares Nacionais) as a tool for interpreting the world that surrounds the student, given the diversity of fields that use the typical elements of Statistics (tables, graphs, etc.) to disseminate information. The teaching of this subject should therefore strive to contextualize its content and provide grounds for decision-making. The objective of this work was to present a didactic sequence judged to be more meaningful to the student for teaching some of these elements of Statistics, specifically the standard deviation, through an exercise applied in a 3rd-year class of a public high school in the city of Jahu. In addition, an analysis of seven textbooks approved by the PNLEM (Programa Nacional do Livro Didático do Ensino Médio) and PNLD (Programa Nacional do Livro Didático) programs, with respect to their approach to teaching Statistics, found that only one adopts this didactic sequence, in which the definitions of the Normal curve are connected to the probabilities associated with a frequency distribution. The work is intended to lead teachers and pedagogical advisors to reflect on the teaching of this subject, both in classroom practice and in guidance for education professionals, as well as in the choice of textbooks and other teaching materials.
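A small example in the spirit of the didactic sequence described, linking the standard deviation of a data set to probabilities under the Normal curve; the class scores are invented:

```python
import numpy as np
from scipy.stats import norm

# Exam scores for a hypothetical class
scores = np.array([5.0, 6.5, 7.0, 7.5, 8.0, 6.0, 7.0, 8.5, 6.5, 7.0])
mean, sd = scores.mean(), scores.std(ddof=1)
print(f"mean = {mean:.2f}, standard deviation = {sd:.2f}")

# The link the didactic sequence draws: if scores follow a Normal curve,
# about 68% of them fall within one standard deviation of the mean.
within = norm.cdf(1) - norm.cdf(-1)
print(f"P(mean - sd < X < mean + sd) ~ {within:.2%}")
```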
19

Meranie výkonnosti portfólia / Portfolio performance measurement

Csörgö, Tomáš January 2013
The goal of this master's thesis is to analyze portfolio performance. The theoretical part describes risk, portfolio performance measurement, investment funds, and portfolio theory. In the analytical part, portfolio performance is assessed with several different performance measurement tools.
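The abstract does not name the measurement tools used, but the Sharpe and Treynor ratios are among the standard ones; a minimal sketch with hypothetical returns:

```python
import numpy as np

def sharpe_ratio(returns, risk_free=0.0):
    """Excess return per unit of total risk (standard deviation)."""
    excess = np.asarray(returns, dtype=float) - risk_free
    return excess.mean() / excess.std(ddof=1)

def treynor_ratio(returns, beta, risk_free=0.0):
    """Excess return per unit of systematic risk (beta)."""
    return (np.mean(returns) - risk_free) / beta

# Hypothetical monthly portfolio returns and a monthly risk-free rate
rets = [0.012, -0.004, 0.021, 0.008, -0.010, 0.015]
print(f"Sharpe:  {sharpe_ratio(rets, risk_free=0.002):.3f}")
print(f"Treynor: {treynor_ratio(rets, beta=1.1, risk_free=0.002):.4f}")
```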
20

Napěťové reference v bipolárním a CMOS procesu / Voltage References in Bipolar and CMOS Process

Kotrč, Václav January 2015
This diploma thesis deals with the precise design of a Brokaw bandgap voltage reference and compares it with MOS references. The proposed devices are decomposed and analyzed step by step using Monte Carlo analysis. Methods are also presented for achieving a lower deviation of the output voltage in a high-yield device that needs no trimming.
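A toy Monte Carlo sketch of the kind of analysis described: estimating the spread of a Brokaw bandgap's output voltage under random component mismatch. All process numbers and mismatch sigmas are invented, and the simplified output equation Vref = VBE + 2·(R2/R1)·VT·ln(N) is a textbook idealization, not the thesis's design:

```python
import numpy as np

rng = np.random.default_rng(42)
N_RUNS = 10_000

# Nominal Brokaw cell values (hypothetical); R2/R1 chosen so the
# nominal output sits near the usual ~1.25 V bandgap voltage.
VBE, VT, N = 0.62, 0.02585, 8   # base-emitter drop (V), kT/q at 300 K, emitter area ratio
R2_R1 = 5.9                     # resistor ratio setting the PTAT gain

# Monte Carlo: apply random mismatch to VBE and the resistor ratio
vbe = rng.normal(VBE, 0.005, N_RUNS)                # 5 mV sigma on VBE
ratio = rng.normal(R2_R1, R2_R1 * 0.002, N_RUNS)    # 0.2% resistor matching

# Brokaw output: Vref = VBE + 2 * (R2/R1) * VT * ln(N)
vref = vbe + 2.0 * ratio * VT * np.log(N)
print(f"mean = {vref.mean():.4f} V, sigma = {vref.std():.4f} V")
```

The output sigma indicates how much trimming would be needed; design methods that shrink the dominant mismatch contributions (here, VBE spread) reduce the deviation without trimming.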
