11 |
Task-based optimization of flip angle for fibrosis detection in T1-weighted MRI of liver. Brand, Jonathan F.; Furenlid, Lars R.; Altbach, Maria I.; Galons, Jean-Philippe; Bhattacharyya, Achyut; Sharma, Puneet; Bhattacharyya, Tulshi; Bilgin, Ali; Martin, Diego R. 21 July 2016 (has links)
Chronic liver disease is a worldwide health problem, and hepatic fibrosis (HF) is one of the hallmarks of the disease. The current reference standard for diagnosing HF is biopsy followed by pathologist examination; however, this is limited by sampling error and carries a risk of complications. Pathology diagnosis of HF is based on textural change in the liver as a lobular collagen network develops within portal triads. The scale of the collagen lobules is characteristically on the order of 1 to 5 mm, which approximates the resolution limit of in vivo gadolinium-enhanced magnetic resonance imaging (Gd-MRI) in the delayed phase. We use MRI of formalin-fixed human ex vivo liver samples as phantoms that mimic the textural contrast of in vivo Gd-MRI. We have developed a local texture analysis that is applied to phantom images, and the results are used to train model observers to detect HF. The performance of the observer is assessed with the area under the receiver operating characteristic curve (AUROC) as the figure of merit. To optimize the MRI pulse sequence, phantoms were scanned multiple times over a range of flip angles. The flip angle associated with the highest AUROC was chosen as optimal for the task of detecting HF. (C) The Authors. Published by SPIE under a Creative Commons Attribution 3.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
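As an illustration of the task-based selection step described in the abstract, the following sketch (not the authors' code) picks the flip angle whose observer scores give the highest AUROC; the `auroc` helper, the candidate angles, and the simulated score arrays are hypothetical placeholders.

```python
import numpy as np

def auroc(scores_pos, scores_neg):
    """Rank-based (Mann-Whitney) estimate of the area under the ROC curve."""
    scores = np.concatenate([scores_pos, scores_neg])
    ranks = scores.argsort().argsort() + 1          # 1-based ranks (ties ignored)
    rank_sum_pos = ranks[:len(scores_pos)].sum()
    n_pos, n_neg = len(scores_pos), len(scores_neg)
    u = rank_sum_pos - n_pos * (n_pos + 1) / 2      # Mann-Whitney U statistic
    return u / (n_pos * n_neg)

# Hypothetical observer outputs: one pair of score arrays per candidate flip angle.
flip_angles = [10, 20, 30, 40, 50]                  # degrees (illustrative values)
rng = np.random.default_rng(0)
scores = {a: (rng.normal(1.0 + 0.02 * a, 1, 200),   # fibrosis-present ROIs
              rng.normal(0.0, 1, 200))              # fibrosis-absent ROIs
          for a in flip_angles}

aucs = {a: auroc(pos, neg) for a, (pos, neg) in scores.items()}
best_angle = max(aucs, key=aucs.get)                # task-optimal flip angle
print(aucs, best_angle)
```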
|
12 |
A Theoretical and Empirical Analysis of the Impact of the Digital Age on the Music Industry. Michel, Norbert. 19 December 2003 (has links)
We present an in-depth analysis of the music industry and use our findings to judge the practical assumptions and design of an original theoretical model. The model has three stages: in a Hotelling-type framework, the last agents to act are consumers, who choose between copying, purchasing, or staying out of the market for music. Prior to the last stage, the record label chooses its profit-maximizing price, and in the first stage the artist-label bargaining agreement is incorporated using the Nash cooperative bargaining solution. The current structure of the music industry is a combination of the oligopoly and monopolistic competition models, consisting of five major labels and many independents. Despite major labels' advantage in large-scale distribution, we argue that digital downloading has the potential to radically alter the current industry structure, and that artists would be unable to sell their music in such an environment without enforceable copyrights. Our model assumes that the most important determinants of CD and copy demand are consumers' tastes, the transaction costs of copying, CD prices, and the substitutability between CDs and copies. We hypothesize that Internet file-sharing has been undertaken both by consumers who were previously not in the market and by those who decided to copy rather than buy. In regard to firm strategy, the model suggests that labels could increase the sales of CDs by trying to increase consumers' taste for music, perhaps by reducing the price of CDs. Our model also predicts a positive relationship between artists' optimal share of album sales and their bargaining power, as well as a negative relationship between artists' optimal share and their risk aversion. Since lowering the reliance on labels for distribution would increase artists' bargaining power, our model predicts that artists' share of profits should increase as legitimate digital distribution gains prominence. We also provide empirical testing of our hypothesis that some music file-sharing has been done by consumers not previously in the market. After examining consumers' expenditures and aggregate industry sales, we are unable to reject our hypothesis.
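To make the first-stage comparative statics concrete, here is a small numerical sketch of a Nash cooperative bargaining split between artist and label; the payoff functions, the CRRA risk-aversion parameter `rho`, and the zero disagreement points are illustrative assumptions, not the dissertation's actual specification.

```python
import numpy as np

def nash_share(profit=1.0, beta=0.5, rho=0.0, grid=10001):
    """Artist's revenue share s that maximizes the Nash bargaining product
    (u_artist)^beta * (u_label)^(1 - beta) with zero disagreement payoffs and
    a CRRA-style artist utility to mimic risk aversion rho."""
    s = np.linspace(1e-6, 1 - 1e-6, grid)
    u_artist = (s * profit) ** (1 - rho) / (1 - rho) if rho != 1 else np.log(s * profit)
    u_label = (1 - s) * profit                      # label assumed risk neutral
    objective = beta * np.log(u_artist) + (1 - beta) * np.log(u_label)
    return s[np.argmax(objective)]

# Share rises with bargaining power and falls with risk aversion,
# matching the comparative statics described in the abstract.
print(nash_share(beta=0.3), nash_share(beta=0.7))   # higher beta -> higher share
print(nash_share(rho=0.0), nash_share(rho=0.8))     # higher rho  -> lower share
```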
|
13 |
An Analysis of Multi-Product Choice Strategies and Network Externalities Using the Hotelling Model. 洪麗文. Unknown Date (has links)
This thesis extends the Hotelling spatial model. Unlike previous work on related questions, it drops the restriction that a firm can choose only one product and instead allows multi-product choice, analyzing firms' product-choice decisions in a duopoly with sequential entry. Products are further assumed to exhibit network externalities and to be mutually incompatible. The thesis examines the optimal degree of product differentiation and variety under monopoly, and, in a duopoly where consumers value the network effects of the two firms' products differently, how the equilibrium market shares and prices that emerge under Bertrand price competition depend on the strength of the network effect.
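For context, a minimal sketch of the standard single-product Hotelling benchmark under Bertrand pricing when consumers attach different network values to the two firms' incompatible products; this is a textbook closed form, not the thesis's sequential-entry, multi-product model, and `delta`, `t`, and `c` are illustrative parameters.

```python
def hotelling_bertrand(t=1.0, c=0.5, delta=0.3):
    """Bertrand equilibrium on a unit Hotelling line with firms at the endpoints,
    transport cost t, marginal cost c, and a net network-value advantage
    delta = v1 - v2 perceived for firm 1's product."""
    p1 = c + t + delta / 3
    p2 = c + t - delta / 3
    share1 = 0.5 + delta / (6 * t)          # location of the indifferent consumer
    return p1, p2, share1, 1 - share1

print(hotelling_bertrand(delta=0.0))   # symmetric network values: equal prices and shares
print(hotelling_bertrand(delta=0.6))   # stronger network advantage: higher price and share
```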
|
14 |
Location Decisions of Charities and Network Effects / Charity Niche Marketing and Network Effect. 劉建宏. Unknown Date (has links)
When donors seek to maximize their own utility, their first consideration should be whether a charity's mission and the beneficiaries it serves are close to their own concerns. This thesis defines the positioning of these beneficiaries as the charity's location and examines charities' location strategies under competition. The influence of a charity's scale and credibility on donors is also taken into account.
A linear-city model is used to capture the locations of donors and of the beneficiaries that charities serve. To reflect donors' concern with organizational scale, the "network effect" traditionally applied to telecommunications products is introduced into the donor's utility function, so that both the location gap and organizational scale affect donors. The study yields several findings. First, to avoid excessive fund-raising competition, charities avoid too much overlap in the populations they serve. Second, introducing the network effect intensifies competition among charities and thereby raises their fund-raising effort. Third, when charities differ in the strength of their network effects, donations concentrate in the larger charity, which also widens the range of beneficiaries it serves, while the smaller charity receives correspondingly fewer donations. / Naturally, utility-maximizing donors first consider charities that share their ideology towards those in need of help. Therefore, in setting up fund-raising charities, the choice of "location" in the spectrum of all potential donees affects fund-raising results. In addition, donors often take operating scale and credibility into account when choosing among charities. This study proposes a model in which the location choice of two homogeneous charities is captured through a linear-city framework, and a "network effect" in donor utility is introduced to account for the influence of a charity's scale and credibility. Several findings emerge. First, in equilibrium, charities differentiate in their choice of location to avoid intense competition in fund-raising effort. Second, the existence of the network effect drives competing charities to exert more fund-raising effort. Third, an asymmetric network externality redistributes donations away from the small-network charity when the large-network charity moves toward the center of the market. Finally, some welfare implications are explored.
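A rough numerical sketch of the share dynamics described above, assuming a unit linear city, a linear cost on ideological distance, and a network term proportional to each charity's donor share; all functional forms and parameter values (`x1`, `x2`, `t`, `nu1`, `nu2`) are assumptions for illustration rather than the thesis's model.

```python
def equilibrium_shares(x1=0.25, x2=0.75, t=1.0, nu1=0.3, nu2=0.1, iters=200):
    """Fixed point for donor shares when charity i at location x_i gives a donor
    at x utility  -t*|x - x_i| + nu_i * s_i  (nu_i = strength of its network effect)."""
    s1 = 0.5
    for _ in range(iters):
        # Indifferent donor x*: -t*(x* - x1) + nu1*s1 = -t*(x2 - x*) + nu2*(1 - s1)
        x_star = (x1 + x2) / 2 + (nu1 * s1 - nu2 * (1 - s1)) / (2 * t)
        s1 = min(max(x_star, 0.0), 1.0)             # share of donors to the left of x*
    return s1, 1.0 - s1

print(equilibrium_shares())          # asymmetric network effects: the larger charity gains
print(equilibrium_shares(nu2=0.3))   # symmetric network effects: shares stay at one half
```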
|
15 |
An attempt to value Canadian oil and natural gas reserves: an extension of the Hotelling valuation principle. Shumlich, Michael. 16 July 2008
The importance of the Hotelling Valuation Principle (HVP) in economic study lies in its ability to examine and drive the decision of how much of a non-renewable natural resource to produce now versus how much to conserve for future generations - the root of natural resource policy, conservation, regulation, and taxation. Hotelling (1931) assumes that net price (selling price less cost per unit of production) will grow at the discount rate, which in a deterministic setting implies that reserve value is equal to current net price. However, the application of this ideal theory to the oil and gas industry may be difficult.
The oil and gas industry is influenced by government regulation, potential monopolistic forces, and well production characteristics - each of which violates the assumptions of Hotelling's (1931) basic theory. How these violations affect the HVP is an open question. Most have the effect of limiting current supply, and thus driving prices higher than they would be in a perfectly competitive market. On the other hand, at least in the Canadian context, government regulation tends to increase costs, whereas technological advancement tends to reduce costs. The net result of these effects on future net prices and their discounted value, and therefore the effect on the HVP, is not clear a priori.
Another problem relating Hotelling's (1931) basic theory to the oil and gas industry lies in the stochastic nature of a firm's future net prices and extraction quantities, the product of which gives the firm's future cash flows. Correlation between quantity and net price may result from expanding production when prices are high and reducing production when prices are low. Such correlation will affect the expected cash flows, and therefore firm value. In other words, the ability to adjust production quantity provides real options for oil and gas firms, which may add value.
Previous tests of the HVP on oil and gas reserves have utilized data that may contain confounding information, leading to unreliable conclusions. The two major deficiencies are the use of (1) acquisition values, which rely on basin-average rather than firm-specific net price data, and (2) conventional oil and gas company market valuations, which incorporate additional management exploration expertise value beyond the reserve value.
This study contributes to the literature by providing a more definitive test of the HVP through the use of Canadian oil and gas royalty trusts. These pure-play publicly traded entities are focused on production rather than exploration and essentially remediate the deficiencies found in the previous literature. Additionally, I include an ancillary variable to proxy real option value and control variables for firm characteristics such as oil weighting (proportion of oil relative to natural gas reserves), reserve quality (proportion of proven producing reserves relative to proven non-producing reserves), and firm size (based on enterprise value). This gives the reader a better understanding of value drivers in the Canadian oil and gas royalty trust sector and how they relate to the HVP.
My study generally fails to find support for the HVP. In particular, the results indicate that the HVP overestimates reserve value. This suggests that market participants expect net prices to grow at a rate significantly lower than the fair cost of capital, and that production constraints limiting the extraction rate are binding.
I do find that the real-option proxy explains a significant amount of the difference between the observed value and the value predicted by the HVP. This differs markedly from the previous literature that applies the HVP to market data for the oil and gas industry, each paper of which fails to reject the HVP. That I generally find values lower than the HVP predicts is not surprising given that earlier work: those studies use conventional oil and gas companies, which likely overvalue reserves because of an exploration premium, so their support for the HVP suggests that royalty trusts should correspond to a value lower than predicted; the difference could account for the exploration premium. On the other hand, when I use the log-linear specification over the second, more volatile sub-sample, I also fail to reject Hotelling's theoretical value, which is consistent with the previous literature using market data.
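A stylized sketch of the kind of cross-sectional test the HVP implies: regress per-unit reserve value on current net price and check whether the slope is close to one; the data arrays below are hypothetical placeholders, not the study's royalty-trust sample.

```python
import numpy as np

# Hypothetical per-trust observations: market value per barrel of reserves (V/Q)
# and current netback (price less lifting cost per barrel), in dollars.
value_per_unit = np.array([11.2, 14.8, 9.5, 13.1, 16.4, 10.7])
net_price      = np.array([18.0, 22.5, 15.2, 20.1, 25.3, 16.8])

# OLS of V/Q on net price: the HVP predicts an intercept near 0 and a slope near 1;
# a slope well below 1 means the market discounts reserves relative to the HVP.
X = np.column_stack([np.ones_like(net_price), net_price])
(intercept, slope), *_ = np.linalg.lstsq(X, value_per_unit, rcond=None)
print(f"intercept = {intercept:.2f}, slope = {slope:.2f}")
```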
|
17 |
Objective assessment of image quality (OAIQ) in fluorescence-enhanced optical imaging. Sahu, Amit K. 15 May 2009 (has links)
The statistical evaluation of molecular imaging approaches for detecting, diagnosing, and monitoring molecular response to treatment is required prior to their adoption. The assessment of fluorescence-enhanced optical imaging is particularly challenging since neither instrument nor agent has been established. Small-animal imaging does not adequately address depth-of-penetration issues, and the risk of administering molecular optical imaging agents to patients remains unknown. Herein, we focus upon the development of a framework for OAIQ which includes a lumpy-object model to simulate natural anatomical tissue structure as well as the non-specific distribution of fluorescent contrast agents. This work is required for adoption of fluorescence-enhanced optical imaging in the clinic.
The imaging system is simulated by the diffusion approximation of the time-dependent radiative transfer equation, which describes near-infrared light propagation through clinically relevant volumes. We predict the time-dependent light propagation within a 200 cc breast interrogated with 25 points of excitation illumination and 128 points of fluorescent light collection. We simulate the fluorescence generation from Cardio-Green at tissue target concentrations of 1, 0.5, and 0.25 µM with backgrounds containing 0.01 µM. The fluorescence boundary measurements for 1 cc spherical targets simulated within lumpy backgrounds of (i) endogenous optical properties (absorption and scattering), as well as (ii) exogenous fluorophore cross-section, are generated with lump strength varying up to 100% of the average background.
The imaging data are then used to validate a PMBF/CONTN tomographic reconstruction algorithm. Our results show that image recovery is sensitive to the heterogeneous background structures. Further analysis of the imaging data by a Hotelling observer affirms that the detection capability of the imaging system is adversely affected by the presence of heterogeneous background structures. The same issue is also addressed with human-observer studies, in which multiple cases of randomly located targets superimposed on random heterogeneous backgrounds are presented in a "double-blind" setting; the results are consistent with the outcome of the above analyses. Finally, the Hotelling observer's analysis is used to demonstrate (i) the inverse correlation between detectability and target depth, and (ii) the plateauing of detectability with improved excitation light rejection.
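For readers unfamiliar with the figure of merit, a compact sketch of a Hotelling-observer calculation on simulated measurements with a correlated ("lumpy") background: detectability is SNR² = Δs̄ᵀK⁻¹Δs̄. The dimensions, background model, and signal profile below are assumptions for illustration, not the paper's simulation settings.

```python
import numpy as np
from math import erf

rng = np.random.default_rng(1)
n_meas, n_samples = 64, 2000          # measurement channels, training samples

# Correlated "lumpy" background (low-rank Gaussian) plus white noise, illustrative only.
A = rng.normal(size=(n_meas, 8))
def background():
    return A @ rng.normal(size=8) + 0.1 * rng.normal(size=n_meas)

signal = 0.2 * np.exp(-0.5 * ((np.arange(n_meas) - 30) / 5.0) ** 2)  # assumed target profile

absent  = np.array([background() for _ in range(n_samples)])
present = absent + signal             # signal known exactly, background varies

delta_s = present.mean(axis=0) - absent.mean(axis=0)
K = np.cov(absent, rowvar=False) + 1e-6 * np.eye(n_meas)   # regularized covariance
w = np.linalg.solve(K, delta_s)       # Hotelling template
snr2 = delta_s @ w                    # observer detectability, SNR^2
auc = 0.5 * (1 + erf(np.sqrt(snr2) / 2))  # AUC under Gaussian statistics
print(snr2, auc)
```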
|
18 |
Multivariate statistical analysis. Καλκούνου, Δήμητρα. 02 April 2014 (has links)
In recent decades, largely because of the advent of computers, the character of statistical science has changed and its applications have spread to many fields, since processing large volumes of data has become quick and easy. These new conditions for statistical analysis have led statisticians to develop many theoretical methods. A large body of these methods is Multivariate Statistical Analysis, an exceptionally interesting branch of statistics.
The final product of a study arises from a set of measurements taken across a set of experiments. Conventional statistical analyses, however, examine one variable at a time rather than their joint effect, so we must turn to Multivariate Analysis.
Multivariate Analysis deals with statistical methods for collecting, describing, and analyzing data consisting of measurements of many variables on a number of individuals or, more generally, experimental units.
In this thesis we review the relevant techniques and results in order to develop methods for data analysis. My aim is to present a complete statistical analysis of data based on simultaneous confidence statements. One of the central messages of multivariate analysis is that the p variables must be analyzed jointly. We must therefore use multivariate hypothesis tests, which examine the whole vector of each observation rather than individual variables.
In addition, we examine Multivariate Analysis of Variance, which generalizes univariate Analysis of Variance to the case of more than one variable. It is thus a method for testing whether the means of two or more groups differ and, generalizing to designs with several factors, whether those factors affect the mean (now a vector of means). Much of what holds in the univariate case carries over analogously to the multivariate case, for example the decomposition of the total variance into between-group and within-group variance.
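As a concrete instance of analyzing the p variables jointly, a minimal one-sample Hotelling T² test; the data are simulated and the hypothesized mean is arbitrary, with the usual conversion of T² to an F statistic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, p = 30, 3
mu0 = np.zeros(p)                          # hypothesized mean vector
X = rng.multivariate_normal([0.5, 0.0, -0.3], np.eye(p), size=n)  # simulated sample

xbar = X.mean(axis=0)
S = np.cov(X, rowvar=False)                # unbiased sample covariance
T2 = n * (xbar - mu0) @ np.linalg.solve(S, xbar - mu0)

# Under H0, T^2 ~ ((n-1)p/(n-p)) * F(p, n-p), so convert and take the F tail probability.
F = (n - p) / (p * (n - 1)) * T2
p_value = stats.f.sf(F, p, n - p)
print(T2, p_value)
```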
|
19 |
Decision Making for Information Security Investments. Yeo, M. Lisa. Unknown Date
No description available.
|
20 |
Quantification and Maximization of Performance Measures for Photon Counting Spectral Computed Tomography. Yveborg, Moa. January 2015 (has links)
During my time as a PhD student in the Physics of Medical Imaging group at KTH, I have taken part in the work of developing a photon-counting, spectrally resolved silicon detector for clinical computed tomography. This work has largely motivated the direction of my research and is the main reason for my focus on certain issues. Early in the work, a need to quantify and optimize the performance of a spectrally resolved detector was identified. A large part of my work has thus consisted of reviewing the conventional methods used for performance quantification and optimization in computed tomography, and identifying which are best suited for the characterization of a spectrally resolved system. In addition, my work has included comparisons of conventional systems with the detector we are developing. The collected result after a little more than four years of work is four publications and three conference papers.
This compilation thesis consists of five introductory chapters and my four publications. The introductory chapters are not self-contained, in the sense that they do not include all of the theory and results from my published work; rather, they are written to provide a context in which the papers should be read. The first two chapters treat the purpose of the introductory chapters and the theory of computed tomography, including the distinction between conventional (non-spectral) computed tomography and different practical implementations of spectral computed tomography. The third chapter reviews the conventional methods developed for quantification and optimization of image quality in terms of detectability and signal-to-noise ratio, part of which are included in my published work. In addition, the theory on which the method of material basis decomposition is based is presented, together with a condensed version of the results from my work comparing two systems with fundamentally different practical solutions for material quantification. In the fourth chapter, previously unpublished measurements on the photon-counting, spectrally resolved detector we are developing are presented and compared to Monte Carlo simulations. In the fifth and final chapter, a summary of the appended publications is included.
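A simplified sketch of the two-basis material decomposition idea mentioned above: given log-normalized counts in several energy bins, solve a linearized least-squares problem for the basis coefficients; the bin energies, basis functions, incident fluence, and counts are illustrative assumptions, not the detector's calibration.

```python
import numpy as np

# Assumed effective energies of the detector's bins (keV) and toy basis functions:
# a photoelectric-like 1/E^3 term and a flat Compton-like term.
energies = np.array([40.0, 55.0, 70.0, 85.0, 100.0])
basis = np.column_stack([(energies / 70.0) ** -3,      # "photoelectric" basis
                         np.ones_like(energies)])      # "Compton" basis

true_a = np.array([0.8, 1.5])                 # true basis line integrals (unknown in practice)
line_integrals = basis @ true_a               # ideal log-measurements per bin
counts = np.random.default_rng(3).poisson(1e5 * np.exp(-line_integrals))  # noisy bin counts

# Linearized decomposition: recover the basis coefficients from -log of normalized counts.
measured = -np.log(counts / 1e5)
a_hat, *_ = np.linalg.lstsq(basis, measured, rcond=None)
print(true_a, a_hat)                          # estimated coefficients approximate the true ones
```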
|