41 |
ENERGIEFFEKTIVISERING AV INDUSTRIELLA VERKSAMHETER : Värderingar grundade i ekonomiska, miljö- och sociala aspekter för GKN ePowertrain, Köping / Energy efficiency in industrial operations: valuations based on economic, environmental and social aspects for GKN ePowertrain, Köping Söder Altschul, Joakim, Karlsson, Tomas January 2019 (has links)
To decrease the ecological footprint, humans either have to adjust their lifestyles, or large-scale industries must take corporate social responsibility. This study draws on the well-developed field of industrial energy efficiency, applying technology- and organization-focused proposals. The proposals are based on three aspects: the economic, the environmental, and the social. Combined, these three aspects are called the triple bottom line perspective. A baseline case of the study object's energy balance was established with the simulation program IDA ICE to underpin the conclusions. The study object was GKN ePowertrain, located in Köping. Energy-efficiency cases were simulated in IDA ICE to observe the change in the energy balance. These cases, together with interviews of the employees, formed the foundation of the discussion, in which the improvements were critically reviewed from the triple bottom line perspective. The results showed that the temperature was too high for good working conditions, that the ventilation system consumes a large quantity of energy, and that the internal flow of information is insufficient. In conclusion, GKN ePowertrain would increase its overall value by investing in a cooling system and more efficient heat exchangers for the ventilation system. These investments would notably increase the short-term value of environmental sustainability and the social aspect, and the economic value would increase in the long term. The cooling system would improve the working environment, and a new ventilation system would increase heat recovery and decrease the energy consumed, by more than the consumption of the cooling system. Finally, GKN should also be more distinct in its information to the employees in the building regarding energy aims and the working environment, in order to gain value in all three fields.
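The abstract's central quantitative claim, that improved heat recovery can save more energy than a new cooling system consumes, can be sketched with a back-of-the-envelope ventilation energy balance. All figures below (airflow, temperature difference, heat-exchanger effectiveness, operating hours, cooling consumption) are hypothetical illustrations, not values from the GKN ePowertrain study:

```python
# Hypothetical annual energy balance for ventilation heat recovery in an
# industrial building (illustrative numbers only).
def ventilation_heating_demand(airflow_m3s, dT, effectiveness, hours):
    """Annual heating demand [kWh] for ventilation air after heat recovery.

    airflow_m3s   -- supply airflow [m^3/s]
    dT            -- mean indoor/outdoor temperature difference [K]
    effectiveness -- heat-exchanger (recovery) effectiveness, 0..1
    hours         -- operating hours per year
    """
    # air density [kg/m^3] * specific heat [kJ/(kg K)] -> kW per (m^3/s) per K
    rho_cp = 1.2 * 1.005
    return airflow_m3s * rho_cp * dT * (1 - effectiveness) * hours

old = ventilation_heating_demand(10, 15, 0.55, 5000)  # existing exchangers
new = ventilation_heating_demand(10, 15, 0.80, 5000)  # more efficient exchangers
cooling = 40_000   # assumed annual consumption of a new cooling system [kWh]
net_saving = (old - new) - cooling
```

With these assumed numbers, raising the recovery effectiveness from 0.55 to 0.80 saves far more heating energy than the assumed cooling system consumes, which is the shape of the trade-off the study describes.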
|
42 |
A framework for simulation-based integrated design of multiscale products and design processes Panchal, Jitesh H. 23 November 2005 (has links)
The complexity in multiscale systems design is significantly greater than in conventional systems because in addition to interactions between components, couplings between physical phenomena and scales are also important. This complexity amplifies two design challenges: a) complexity of coupled simulation models prohibits design space exploration, and b) unavailability of complete simulation models that capture all the interactions. Hence, the challenge in design of multiscale systems lies in managing this complexity and utilizing the available simulation models and information in an efficient manner to support effective decision-making.
In order to address this challenge, our primary hypothesis is that information and computational resources can be utilized efficiently by designing design processes (meta-design) along with the products. The primary hypothesis is embodied in this dissertation as a framework for integrated design of products and design processes. The framework consists of three components: 1) a Robust Multiscale Design Exploration Method (RMS-DEM), 2) information-economics-based metrics and methods for simplification of complex design processes and refinement of simulation models, and 3) an information modeling strategy for implementation of the theoretical framework in a computational environment.
The framework is validated using the validation-square approach, which consists of theoretical and empirical validation. Empirical validation of the framework is carried out using various examples, including pressure vessel design, datacenter cooling system design, linear cellular alloy design, and multifunctional energetic structural materials design. The contributions of this dissertation fall into three research domains: a) multiscale design methodology, b) materials design, and c) computer-based support for collaborative, simulation-based multiscale design. In the domain of design methodology, new methods and metrics are developed for integrating the design of products and design processes. The methods and metrics are applied in the field of materials design to develop design processes and specifications for Multifunctional Energetic Structural Materials. In the domain of computer-based support for design, an information modeling strategy is developed to provide computational support for meta-design. Although the framework is developed in the context of multiscale systems, it is equally applicable to the design of any other complex system.
|
43 |
Assessing reservoir performance and modeling risk using real options Singh, Harpreet 02 August 2012 (has links)
Reservoir economic performance is based upon the future cash flows that a reservoir can generate. Future cash flows are a function of the hydrocarbon volumetric flow rates the reservoir can produce and of market conditions, and both of these drivers are associated with uncertainty. Estimates of future hydrocarbon flow rates are uncertain because of uncertainty in the geological model, the limited availability and type of data, and the complexities involved in the reservoir modeling process. The second source of uncertainty in future cash flows comes from changing oil prices, rates of return, and other factors governed by market dynamics. Robust integration of these two sources of uncertainty (future hydrocarbon flow rates and market dynamics) in a model that predicts cash flows from a reservoir is an essential part of risk assessment, but a difficult task. Current practices that assess a reservoir's economic performance with Deterministic Cash Flow (DCF) methods have been unsuccessful in their predictions because they lack the parametric capability to incorporate both types of uncertainty robustly and completely.
This thesis presents a procedure that accounts for uncertainty in hydrocarbon production forecasts due to incomplete geologic information, and a novel real options methodology to assess project economics for the upstream petroleum industry. The modeling approach entails determining future hydrocarbon production rates under incomplete geologic information, with and without secondary information. The price of hydrocarbons is modeled separately, and the costs to produce them are determined based on market dynamics. A real options methodology is used to assess the effective cash flows from the reservoir and, hence, to determine the project economics. This methodology associates realistic probabilities, quantified through the method's parameters, with benefits and costs. The results from this methodology are compared against the results from the DCF methodology to examine whether real options can identify hidden potential in a reservoir's performance that DCF might not uncover. The methodology is then applied to various case studies and to strategies for planning and decision making.
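The contrast between DCF and real options valuation can be illustrated with a deliberately small one-period binomial example (not the thesis's model; all numbers are invented): DCF commits to development now and discounts the expected cash flow, while the option to defer develops only in the favourable state.

```python
# Toy comparison of deterministic cash flow (DCF) valuation with a one-period
# binomial real option to defer development. Illustrative numbers only.
def dcf_value(cash_flow, cost, r):
    """Commit now: discounted expected cash flow minus development cost."""
    return cash_flow / (1 + r) - cost

def deferral_option_value(cf_up, cf_down, p_up, cost, r):
    """Wait one period, then develop only in the favourable state."""
    payoff_up = max(cf_up - cost, 0.0)
    payoff_down = max(cf_down - cost, 0.0)
    return (p_up * payoff_up + (1 - p_up) * payoff_down) / (1 + r)

expected_cf = 0.5 * 140 + 0.5 * 70                    # = 105
npv = dcf_value(expected_cf, 100, 0.1)                # commit-now NPV
opt = deferral_option_value(140, 70, 0.5, 100, 0.1)   # value with flexibility
```

In this sketch the commit-now NPV is negative, so DCF rejects the project, while the deferral option has positive value: the flexibility to avoid the bad state is exactly the "hidden potential" that a real options analysis can surface.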
|
44 |
Essays in Financial Econometric Investigations of Farmland Valuations Xu, Jin 16 December 2013 (has links)
This dissertation consists of three essays wherein tools of financial econometrics are used to study three aspects of the farmland valuation puzzle: short-term boom-bust cycles, overpricing of farmland, and the inconclusive effects of direct government payments.
Essay I addresses the causes of unexplained short-term boom-bust cycles in farmland values in a dynamic land pricing model (DLPM). The analysis finds that the gross return rate of farmland assets decreases as the farmland asset level increases, and that this diminishing return function contributes to the boom-bust cycles in farmland values. Furthermore, it is mathematically proved that land values are potentially unstable under diminishing return functions. We also find that the intertemporal elasticity of substitution, risk aversion, and transaction costs are important determinants of farmland asset values.
Essay II examines the apparent overpricing of farmland by decomposing the forecast error variance of farmland prices into forward-looking and backward-looking components. The analysis finds that, in the short run, the forward-looking Capital Asset Pricing Model (CAPM) portion of the forecast errors is significantly higher in a boom or bust stage than in a stable stage. This shows that the farmland market absorbs economic information in a discriminative manner according to the stability of the market, and that the market (and the actors therein) responds to new information gradually, as theory suggests. This helps to explain the overpricing of farmland, though primarily in the short run.
Finally, essay III investigates the dual effects of direct government payments and climate change on farmland values. This study uses a smooth coefficient semi-parametric panel data model. The analysis finds that land valuation is affected by climate change and government payments, both through discounted revenues and through effects on the risk aversion of land owners. This essay shows that including heterogeneous risk aversion is an efficient way to mitigate the impact of misspecification in a DLPM, and that precipitation is a good explanatory variable. In particular, precipitation affects land values in a bimodal manner, indicating that farmland prices could have multiple peaks in precipitation due to adaptation through crop selection and changes in technology.
|
45 |
圖書書目著錄加值分析之研究 / A Study of Value-Added Information Analysis on Bibliographic Descriptions 鄧英蘭, Deng, Ying-Lan Unknown Date (has links)
With the development of information technology, traditional cataloging operations face pressure to transform; they must pursue self-improvement and provide more value-added services to meet the needs of the knowledge-economy era. This paper examines which elements of bibliographic description for books should receive strengthened value-added analysis in order to improve the retrieval and use of materials. The study surveyed cataloging librarians by questionnaire and interviewed teachers of classification and cataloging courses, with the aims of examining the transformation of traditional cataloging work, identifying the key points of value-added processing in the organization of information, and exploring how catalogers' in-service training and cataloging education should respond to the needs of value-added analysis.
The study finds that, under the impact of new technology on cataloging, cataloging departments have adjusted their organization and workflows; cataloging staff have been reduced; the proportion of original cataloging has declined; and the descriptive elements that should be strengthened are content analysis, tables of contents, abstracts, book reviews, cover images, appendices, and the linking relationships among bibliographic records. Good value-added analysis requires overcoming shortages of staff and system support, understanding readers' retrieval needs, and promoting value-added bibliographic description through the National Bibliographic Information Network (NBINet). At the same time, catalogers' in-service training and cataloging education must be emphasized, cultivating their subject backgrounds, foreign-language abilities, and capacity to use modern techniques and equipment.
Recommendations for libraries carrying out value-added analysis of bibliographic description: (1) libraries should revise their operational guidelines for value-added bibliographic description; (2) every library should take the value-added analysis of bibliographic records seriously; (3) libraries should understand readers' needs in using the catalog and design diversified retrieval methods; (4) catalogers should keep abreast of developments in standards and new technologies; (5) library and information science programs should emphasize the concept of value-added analysis in their curricula or offer related courses, and strengthen the content of in-service training. It is hoped that this will draw attention to bibliographic description, support thorough value-added analysis, and provide complete information that satisfies readers' diverse retrieval needs.
|
46 |
Valor da flexibilização e informação em desenvolvimento de campo por modulos / Value of flexibility and information in development of an oil field by modules Hayashi, Suzana Hisako Deguchi 15 August 2018 (has links)
Orientador: Denis Jose Schiozer / Dissertação (mestrado) - Universidade Estadual de Campinas, Faculdade de Engenharia Mecanica e Instituto de Geociencias
Previous issue date: 2006 / Resumo: O risco é inerente às várias fases da vida de um campo de petróleo, devido às incertezas geológicas, econômicas e tecnológicas que influenciam o valor de um projeto. A aquisição de informações e a adição de flexibilidade na implantação de um projeto são os principais processos que permitem a mitigação dos riscos associados. O conceito de Valor da Informação (VDI) permite medir quantitativamente os benefícios resultantes da aquisição adicional de dados, que permite definir o projeto de desenvolvimento com mais precisão, podendo trazer modificações significativas em relação à concepção inicial (projeto conceitual). O conceito de Valor de Flexibilização (VDF) permite medir os benefícios de adicionar flexibilidade, por exemplo, no cronograma de implantação de um projeto, com o objetivo de possibilitar um melhor gerenciamento de reservatórios frente aos possíveis cenários. Os conceitos de VDI e VDF são usados neste trabalho para determinar o valor de adquirir novas informações para o projeto, considerando um atraso no cronograma causado pela flexibilização do momento de definição e aprovação do projeto básico. Uma técnica baseada nos Modelos Geológicos Representativos (MGR) e nas árvores de decisão é aplicada no processo de análise de decisão. Os resultados deste trabalho mostram que a metodologia proposta neste trabalho é aplicável em modelos de grande porte. Outras conclusões são que a relevância da aquisição de informações aumenta em cenários de preço de óleo mais baixo e que é importante analisar a redução de risco como variável adicional ao retorno financeiro no processo de decisão como o analisado neste trabalho / Abstract: The risk is inherent to several phases of a petroleum field development due to geological, economic and technological uncertainties, which influence the value of a project. 
The acquisition of additional information and the flexibility in the implementation of a project are the main processes that permit risk mitigation. The concept of Value of Information (VoI) permits quantitative measurement of the benefits of new information, which yields more accuracy in the definition of the development project and can bring important modifications in comparison with the initial conception of the project. The concept of Value of Flexibility (VoF) allows measuring the benefits of flexibility in the implementation of a project, yielding better reservoir management. The concepts of VoI and VoF are used in this work to determine the value of new information in a project, considering a delay in the schedule caused by the flexibility in the moment of definition and approval of the final project. A decision tree technique, associated with Geological Representative Models (GRM), is applied in the process of quantifying the value of information and flexibility. Based on the results of this work, it is possible to conclude that the methodology is useful for large fields; that the relevance of information acquisition increases in low-price scenarios; and that it is important to analyze risk mitigation in addition to financial gain in decision-making processes like the one studied in this work / Mestrado / Engenharia de Petroleo / Mestre em Ciências e Engenharia de Petróleo
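The Value of Information concept used in this work can be sketched in its simplest decision-tree form: VoI is the difference between the expected outcome when the uncertainty is resolved before choosing and the expected outcome of the best fixed choice. The scenarios, probabilities and payoffs below are invented for illustration, not taken from the case study.

```python
# Value of (perfect) Information for a two-scenario development decision,
# computed from a simple decision-tree expectation. Illustrative figures only.
def expected_value(action_payoffs, probs):
    return sum(p * v for p, v in zip(probs, action_payoffs))

scenarios = {"good": 0.6, "poor": 0.4}          # scenario probabilities
payoffs = {                                      # payoff of each plan per scenario
    "large_project": {"good": 200.0, "poor": -80.0},
    "small_project": {"good": 90.0, "poor": 10.0},
}
probs = list(scenarios.values())

# Without information: commit to the single plan with the best expected payoff.
ev_without = max(
    expected_value([pay[s] for s in scenarios], probs) for pay in payoffs.values()
)
# With perfect information: pick the best plan in each revealed scenario.
ev_with = sum(
    p * max(pay[s] for pay in payoffs.values()) for s, p in scenarios.items()
)
voi = ev_with - ev_without   # upper bound on what appraisal data is worth
```

The resulting VoI is the most one should pay for appraisal data that resolves the scenario; imperfect information (as in the thesis, where new data only narrows the set of geological models) is worth less than this bound.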
|
47 |
Management of Civil Infrastructure based on Structural Health Monitoring Tonelli, Daniel 30 July 2020 (has links)
The interest in structural health monitoring (SHM) has grown considerably in the past half century, due to an explosive growth in the availability of new sensors, the development of powerful data analysis techniques, and the increasing number of civil infrastructure assets that are approaching or exceeding their initial design life. In SHM, we acquire observations of the behavior of a structure to understand its condition state, based on which we decide how to manage it properly. However, this optimistic view of SHM is in contrast with what happens in real life: infrastructure operators are typically skeptical about the capacity of monitoring to support decisions, and instead of following the suggestions provided by SHM, they often act based on their experience or common sense. The reason is that at present it is not fully clear how, in practice, to make decisions based on monitoring observations. To fill this gap between theory and practice, I propose to consider SHM as a logical process of making decisions based on observations, consisting of two steps: judgment, in which the condition state of the structure is inferred from SHM data, and decision, in which the optimal action is identified based on a rational economic principle. From this perspective, a monitoring system should provide information that improves the manager's knowledge of the structural condition state enough to allow them to make better decisions on the management of the structure. Therefore, in designing a monitoring system, the design target must be the accuracy of the knowledge of the structural state achieved by analyzing the observations the system provides. However, when an engineer designs a monitoring system, the approach is often heuristic, with performance evaluation based on experience or common sense rather than on quantitative analysis.
For this reason, I propose a performance-based monitoring system design: a quantitative method for calculating the expected performance of a monitoring solution in a preposterior analysis and for checking its effectiveness in the design phase. It is based on the calculation of the monitoring capacity and the monitoring demand, the counterparts of structural capacity and demand in semi-probabilistic structural design; as in structural design, the solution is satisfactory if the capacity is equal to or better than the demand. The choice of whether to invest a limited budget in a monitoring system or in a retrofit is another critical choice for infrastructure managers: a retrofit can increase the capacity and the safety of a structure, while sensors neither change the capacity nor reduce the loads. Recently, the SHM community has acknowledged that the benefit of installing a monitoring system can be properly quantified using the concept of Value of Information (VoI). A typical assumption in VoI estimation is that a single decision-maker is in charge of both the investment in SHM for a structure and its management based on SHM data. However, this process is usually more complex in the real world, with more individuals involved in the decision chain. Therefore, I formalize a rational method for quantifying the conditional value of information when two different actors are involved in the decision chain: the manager, who operates the structure based on monitoring data, and the owner, who chooses whether to install the monitoring system or not, before having access to these data. The results are particularly interesting, showing that under appropriate conditions the owner may be willing to pay to prevent the manager from using the monitoring system. Applications to case studies are presented for all the research contributions of this doctoral thesis.
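The "judgment" step described above, inferring the condition state from monitoring data, can be sketched as a Bayesian update, with the capacity-versus-demand check reduced to comparing the achieved posterior confidence with a required confidence level. The prior, sensor likelihoods and demand threshold below are assumptions for illustration, not values from the thesis:

```python
# Minimal sketch of a monitoring "capacity vs demand" check: the capacity is
# taken as the posterior confidence in the structural state after an
# observation, the demand as a required confidence level. Assumed numbers.
def posterior_damaged(prior, p_alarm_if_damaged, p_alarm_if_intact, alarm=True):
    """Bayes update of the damage probability given the monitoring outcome."""
    if alarm:
        num = prior * p_alarm_if_damaged
        den = num + (1 - prior) * p_alarm_if_intact
    else:
        num = prior * (1 - p_alarm_if_damaged)
        den = num + (1 - prior) * (1 - p_alarm_if_intact)
    return num / den

prior = 0.05   # assumed prior probability of the damaged state
post = posterior_damaged(prior, p_alarm_if_damaged=0.95, p_alarm_if_intact=0.02)
demand = 0.60  # assumed confidence required to justify an intervention
capacity_ok = post >= demand
```

If the achievable posterior confidence stays below the demand for every plausible outcome, the monitoring solution fails the check at the design stage, before any sensor is installed.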
|
48 |
Money talks and matters / three essays on the theory of monetary policy Stoltenberg, Christian 03 November 2009 (has links)
Wie sollten Zentralbanken Geldpolitik gestalten und der Öffentlichkeit kommunizieren, um die Ökonomie bestmöglich zu stabilisieren? Diese Dissertation, bestehend aus drei selbständigen Essays in dynamischer Makroökonomik, widmet sich in erster Linie dem normativen Aspekt von Geldpolitik. Das Hauptresultat im ersten Essay ist, dass bei idiosynkratischen Risiko die öffentliche Bekanntgabe von Informationen zu aggregierten Risiko einen negativen Effekt auf die soziale Wohlfahrt haben kann: durch die Veröffentlichung von Informationen zu nicht-versicherbaren aggregierten Risiko werden die Versicherungsanreize der Individuen verzerrt und damit das individuelle Konsumrisiko erhöht. Als eine Anwendung, analysieren wir die Situation einer Zentralbank, die die Möglichkeit hat, Veränderungen in ihren Inflationszielen anzukündigen und dokumentieren, das der negative Effekt der verzerrten Versicherungsanreize konventionelle positive Aspekte der Ankündigung überwiegt. In zweiten Essay untersuchen wir optimale Geldpolitik in Falle von nominalen Rigiditäten und einer Transaktionsfriktion. In einem Standardmodell, Money-in-the-Utility function, zeigen wir, dass das langfristige Optimum durch die Friedmansche Regel gegeben ist. Daraus folgt für die kurze Frist, dass das Primat von Geldpolitik auf die Stabilisierung der Zinsen und nicht auf Inflationsstabilisierung ausgelegt sein sollte. Im dritten Essay untersuche ich, ob die Existenz und die Terminierung von Realkasseneffekten eine wichtige Rolle für die Determiniertheit des allgemeinen Preisniveaus spielen. Als wichtigstes neues Resultat zeige ich, dass auch bei Zinspolitik ein eindeutiges Preisniveau bestimmt werden kann, wenn die Geldmenge zu Beginn der Periode in Transaktionen verwendet wird. Unter diesen Umständen, hat prädeterminiertes reales Geld die Funktion einer Zustandsvariable und die Zinspolitik sollte passiv sein, um eindeutige, stabile und nicht-oszillierende Gleichgewichtssequenzen zu erreichen. 
/ How should central banks conduct and communicate their policies to serve the goal of stabilizing the macroeconomy? This thesis, consisting of three self-contained essays on dynamic macroeconomics, is mainly intended as a progress report on exploring the normative aspect of monetary policy. The main result of the first essay is that, in the presence of idiosyncratic risk, the public revelation of information about uncertain aggregate outcomes can be detrimental. By announcing informative signals on non-insurable aggregate risk, the policy maker distorts agents' insurance incentives and increases the riskiness of the optimal allocation that is feasible in self-enforceable arrangements. We consider a monetary authority that may reveal changes in the inflation target, and document that the negative effect of distorted insurance incentives can very well dominate the conventional effects in favor of the release of better information. In the second essay, we study optimal monetary policy with the nominal interest rate as the single policy instrument. Firms set prices in a staggered way without indexation, and real money balances contribute separately to households' utility. The optimal deterministic steady state under commitment is the Friedman rule for a broad range of parameters. Optimal monetary policy in the short run is then characterized by stabilization of the nominal interest rate, instead of inflation stabilization, as the predominant principle. In the third essay, I examine whether the existence and the timing of real balance effects contribute to the determination of the absolute price level. As the main novel result, I show that there exists a unique price level sequence that is consistent with an equilibrium under interest rate policy if beginning-of-period money yields transaction services. Predetermined real money balances can then serve as a state variable, implying that interest rate setting should be passive, a violation of the Taylor principle.
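The Friedman rule mentioned in the second essay can be stated compactly in standard textbook notation (this is the generic statement, not the thesis's own derivation):

```latex
% With real interest rate r and inflation rate \pi, the Fisher relation is
%   i = r + \pi .
% The Friedman rule drives the opportunity cost of holding money to zero:
\begin{align}
  i^{*} &= 0, \\
  \pi^{*} &= -r,
\end{align}
% i.e. steady-state deflation at the real interest rate, so that the private
% opportunity cost of holding real balances matches the near-zero marginal
% cost of supplying money.
```

In the essay's short-run analysis, this long-run optimum is what motivates giving interest-rate stabilization precedence over inflation stabilization.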
|
49 |
Optimization of production allocation under price uncertainty : relating price model assumptions to decisions Bukhari, Abdulwahab Abdullatif 05 October 2011 (has links)
Allocating production volumes across a portfolio of producing assets is a complex optimization problem. Each producing asset possesses different technical attributes (e.g. crude type), facility constraints, and costs. In addition, there are corporate objectives and constraints (e.g. contract delivery requirements). While complex, such a problem can be specified and solved using conventional deterministic optimization methods. However, there is often uncertainty in many of the inputs, and in these cases the appropriate approach is neither obvious nor straightforward. One of the major uncertainties in the oil and gas industry is the commodity price assumption(s). This paper investigates this problem in three major sections: (1) We specify an integrated stochastic optimization model that solves for the optimal production allocation for a portfolio of producing assets when there is uncertainty in commodity prices, (2) We then compare the solutions that result when different price models are used, and (3) We perform a value of information analysis to estimate the value of more accurate price models. The results show that the optimum production allocation is a function of the price model assumptions. However, the differences between models are minor, and thus the value of choosing the “correct” price model, or similarly of estimating a more accurate model, is small. This work falls in the emerging research area of decision-oriented assessments of information value.
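A minimal version of the allocation problem can make the paper's setup concrete: two fields with capacities and lifting costs, a contracted total delivery, and discrete price scenarios, solved here by brute force over a grid. All names and numbers are hypothetical, not the paper's model:

```python
# Illustrative two-field production allocation under price-scenario
# uncertainty, maximising expected profit by brute force. Invented numbers.
scenarios = [(0.5, 90.0), (0.5, 60.0)]           # (probability, oil price $/bbl)
capacity = {"field_A": 60.0, "field_B": 80.0}    # max rate per field [bbl/d]
cost = {"field_A": 30.0, "field_B": 45.0}        # lifting cost [$/bbl]
contract = 100.0                                  # required total delivery

def expected_profit(q_a):
    """Expected profit of producing q_a from field A and the rest from B."""
    q_b = contract - q_a
    if not (0 <= q_a <= capacity["field_A"] and 0 <= q_b <= capacity["field_B"]):
        return float("-inf")   # infeasible allocation
    return sum(
        p * ((price - cost["field_A"]) * q_a + (price - cost["field_B"]) * q_b)
        for p, price in scenarios
    )

# Search a 0.1-bbl grid for the allocation with the best expected profit.
best_qa = max((q / 10 for q in range(0, 1001)), key=expected_profit)
```

In this linear toy case the optimal split (the cheaper field at full capacity) is the same under either price scenario, echoing the paper's finding that the choice of price model can matter little for the allocation decision itself.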
|
50 |
Analyse de connectivité et techniques de partitionnement de données appliquées à la caractérisation et la modélisation d'écoulement au sein des réservoirs très hétérogènes / Connectivity analysis and clustering techniques applied for the characterisation and modelling of flow in highly heterogeneous reservoirs Darishchev, Alexander 10 December 2015 (has links)
Les techniques informatiques ont gagné un rôle primordial dans le développement et l'exploitation des ressources d'hydrocarbures naturelles ainsi que dans d'autres opérations liées à des réservoirs souterrains. L'un des problèmes cruciaux de la modélisation de réservoir et les prévisions de production réside dans la présélection des modèles de réservoir appropriés à la quantification d'incertitude et au le calage robuste des résultats de simulation d'écoulement aux réelles mesures et observations acquises du gisement. La présente thèse s'adresse à ces problématiques et à certains autres sujets connexes.Nous avons élaboré une stratégie pour faciliter et accélérer l'ajustement de tels modèles numériques aux données de production de champ disponibles. En premier lieu, la recherche s'était concentrée sur la conceptualisation et l'implémentation des modèles de proxy reposant sur l'analyse de la connectivité, comme une propriété physique intégrante et significative du réservoir, et des techniques avancées du partitionnement de données et de l'analyse de clusters. La méthodologie développée comprend aussi plusieurs approches originales de type probabiliste orientées vers les problèmes d'échantillonnage d'incertitude et de détermination du nombre de réalisations et de l'espérance de la valeur d'information d'échantillon. Afin de cibler et donner la priorité aux modèles pertinents, nous avons agrégé les réalisations géostatistiques en formant des classes distinctes avec une mesure de distance généralisée. Ensuite, afin d'améliorer la classification, nous avons élargi la technique graphique de silhouettes, désormais appelée la "séquence entière des silhouettes multiples" dans le partitionnement de données et l'analyse de clusters. Cette approche a permis de recueillir une information claire et compréhensive à propos des dissimilarités intra- et intre-cluster, particulièrement utile dans le cas des structures faibles, voire artificielles. 
Finalement, la séparation spatiale et la différence de forme ont été visualisées graphiquement et quantifiées grâce à la mesure de distance probabiliste.Il apparaît que les relations obtenues justifient et valident l'applicabilité des approches proposées pour améliorer la caractérisation et la modélisation d'écoulement. Des corrélations fiables ont été obtenues entre les chemins de connectivité les plus courts "injecteur-producteur" et les temps de percée d'eau pour des configurations différentes de placement de puits, niveaux d'hétérogénéité et rapports de mobilité de fluides variés. Les modèles de connectivité proposés ont produit des résultats suffisamment précis et une performance compétitive au méta-niveau. Leur usage comme des précurseurs et prédicateurs ad hoc est bénéfique en étape du traitement préalable de la méthodologie. Avant le calage d'historique, un nombre approprié et gérable des modèles pertinents peut être identifié grâce à la comparaison des données de production disponibles avec les résultats de... / Computer-based workflows have gained a paramount role in development and exploitation of natural hydrocarbon resources and other subsurface operations. One of the crucial problems of reservoir modelling and production forecasting is in pre-selecting appropriate models for quantifying uncertainty and robustly matching results of flow simulation to real field measurements and observations. This thesis addresses these and other related issues. We have explored a strategy to facilitate and speed up the adjustment of such numerical models to available field production data. Originally, the focus of this research was on conceptualising, developing and implementing fast proxy models related to the analysis of connectivity, as a physically meaningful property of the reservoir, with advanced cluster analysis techniques. 
The developed methodology also includes several original probability-oriented approaches to the problems of sampling uncertainty and of determining the sample size and the expected value of sample information. For targeting and prioritising relevant reservoir models, we aggregated geostatistical realisations into distinct classes with a generalised distance measure. Then, to improve the classification, we extended the silhouette-based graphical technique, called hereafter the "entire sequence of multiple silhouettes", in cluster analysis. This approach provided clear and comprehensive information about the intra- and inter-cluster dissimilarities, especially helpful in the case of weak, or even artificial, structures. Finally, the spatial separation and form-difference of clusters were graphically visualised and quantified with a scale-invariant probabilistic distance measure. The obtained relationships appeared to justify and validate the applicability of the proposed approaches to enhance the characterisation and modelling of flow. Reliable correlations were found between the shortest "injector-producer" pathways and water breakthrough times for different configurations of well placement, various heterogeneity levels and mobility ratios of fluids. The proposed graph-based connectivity proxies provided sufficiently accurate results and competitive performance at the meta-level. Using them as precursors and ad hoc predictors is beneficial at the pre-processing stage of the workflow. Prior to history matching, a suitable and manageable number of appropriate reservoir models can be identified by comparing the available production data with the selected centrotype-models regarded as the class representatives, for which alone full fluid-flow simulation is required. The findings of this research work can easily be generalised and considered in a wider scope.
Possible extensions, further improvements and implementations of these approaches may also be expected in other fields of science and technology.
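The silhouette technique that the thesis extends starts from the standard per-point silhouette value s(i) = (b - a) / max(a, b), where a is the mean intra-cluster distance and b is the smallest mean distance to another cluster. A minimal self-contained computation is sketched below (illustrative 1-D points, not reservoir realisations; the thesis's "entire sequence of multiple silhouettes" is not reproduced here):

```python
# Standard (Rousseeuw) silhouette values for a small 1-D clustering.
def silhouette(points, labels):
    """Return the silhouette value s(i) for every point."""
    def dist(a, b):
        return abs(a - b)

    out = []
    for i, (x, li) in enumerate(zip(points, labels)):
        # a: mean distance to the other members of the same cluster
        same = [dist(x, y) for j, (y, lj) in enumerate(zip(points, labels))
                if lj == li and j != i]
        a = sum(same) / len(same) if same else 0.0
        # b: smallest mean distance to the members of any other cluster
        b = min(
            sum(dist(x, y) for y, lj in zip(points, labels) if lj == lc)
            / labels.count(lc)
            for lc in set(labels) if lc != li
        )
        out.append((b - a) / max(a, b) if max(a, b) > 0 else 0.0)
    return out

pts = [1.0, 1.2, 0.8, 8.0, 8.3, 7.9]   # two well-separated groups
labs = [0, 0, 0, 1, 1, 1]
svals = silhouette(pts, labs)
```

Values near 1 indicate well-separated clusters; values near 0 or below flag the weak or artificial structures that the thesis's extended silhouette sequence is designed to diagnose.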
|