121 |
Pricing and Hedging the Guaranteed Minimum Withdrawal Benefits in Variable Annuities. Liu, Yan, January 2010.
The Guaranteed Minimum Withdrawal Benefits (GMWBs) are optional riders provided by insurance companies in variable annuities. They guarantee the policyholders' ability to get the initial investment back by making periodic withdrawals regardless of the impact of poor market performance. With GMWBs attached, variable annuities become more attractive. This type of guarantee can be challenging to price and hedge. We employ two approaches to price GMWBs. Under the constant static withdrawal assumption, the first approach is to decompose the GMWB and the variable annuity into an arithmetic average strike Asian call option and an annuity certain. The second approach is to treat the GMWB alone as a put option whose maturity and payoff are random.
Hedging helps insurers specify and manage the risks of writing GMWBs, as well as find their fair prices. We propose semi-static hedging strategies that offer several advantages over dynamic hedging. The idea is to construct a portfolio of European options that replicates the conditional expected GMWB liability over a short time period, and to update the portfolio after the options expire. This strategy requires fewer portfolio adjustments, and it outperforms the dynamic strategy when there are random jumps in the underlying price. We also extend the semi-static hedging strategies to the Heston stochastic volatility model.
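As a rough illustration of static-withdrawal GMWB valuation (a plain Monte Carlo sketch, not the thesis's Asian-option decomposition), the snippet below simulates the sub-account under geometric Brownian motion while the policyholder withdraws a fixed amount each year until the premium is recovered, with withdrawals continuing even after the account hits zero; all parameter values are invented.

```python
import numpy as np

def gmwb_contract_value(premium=100.0, g_rate=0.10, r=0.05, sigma=0.2,
                        n_paths=100_000, seed=0):
    """Monte Carlo value of a variable annuity with a GMWB under static withdrawals.

    The policyholder withdraws g = g_rate * premium each year until the guarantee
    base (the initial premium) is repaid; withdrawals continue even if the
    sub-account is exhausted.  All parameters are illustrative only.
    """
    rng = np.random.default_rng(seed)
    g = g_rate * premium                    # fixed annual withdrawal
    n_years = int(round(premium / g))       # withdrawals stop once the base is repaid
    disc = np.exp(-r * np.arange(1, n_years + 1))

    account = np.full(n_paths, premium)
    pv_withdrawals = np.zeros(n_paths)
    for t in range(n_years):
        z = rng.standard_normal(n_paths)
        account *= np.exp((r - 0.5 * sigma**2) + sigma * z)   # one-year GBM step
        account = np.maximum(account - g, 0.0)                # guaranteed withdrawal
        pv_withdrawals += disc[t] * g
    pv_terminal = disc[-1] * account        # residual account value at maturity
    return pv_withdrawals.mean() + pv_terminal.mean()

if __name__ == "__main__":
    # With no rider fee charged, the excess of this value over the premium
    # indicates the cost of the guarantee.
    print(f"Contract value per 100 premium: {gmwb_contract_value():.2f}")
```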
|
122 |
Object Histories in Java. Nair, Aakarsh, 21 April 2010.
Developers are often faced with the task of implementing new features or diagnosing problems in large software systems. Convoluted control and data flows in large object-oriented software systems, however, make even simple tasks extremely difficult, time-consuming, and frustrating. Specifically, Java programs manipulate objects by adding and removing them from collections and by putting and getting them from other objects' fields. Complex object histories hinder program understanding by forcing software maintainers to track the provenance of objects through their past histories when diagnosing software faults.
In this thesis, we present a novel approach that answers queries about the evolution of objects throughout their lifetime in a program. On-demand answers to object history queries aid the maintenance of large software systems by allowing developers to pinpoint relevant details quickly.
We describe an event-based, flow-insensitive, interprocedural program analysis technique for computing object histories and answering history queries. Our analysis technique identifies all relevant events affecting an object and uses pointer analysis to filter out irrelevant events. It uses prior knowledge of the meanings of methods in the Java collection classes to improve the quality of the histories.
We present the details of our technique and experimental results that highlight the utility of object histories in common programming tasks.
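To give a concrete feel for the kind of question an object-history query answers, here is a toy dynamic sketch in Python; the thesis's analysis is static, interprocedural, and works on Java bytecode, so every name and structure below is hypothetical.

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class Event:
    kind: str        # e.g. "collection.add", "collection.remove", "field.put"
    target: int      # identity of the object the event happened to
    site: str        # hypothetical source location, "File.java:line"

class ObjectHistory:
    """Dynamic stand-in for the static object-history analysis: log events,
    then answer "what happened to object X?" queries on demand."""
    def __init__(self):
        self.events: list[Event] = []

    def record(self, kind: str, obj: Any, site: str) -> None:
        self.events.append(Event(kind, id(obj), site))

    def history_of(self, obj: Any) -> list[Event]:
        # Filter out events that cannot affect this object (the static
        # analysis uses pointer analysis for the same purpose).
        return [e for e in self.events if e.target == id(obj)]

# Usage sketch
log = ObjectHistory()
order = {"id": 42}
pending = []
pending.append(order);  log.record("collection.add", order, "Orders.java:17")
pending.remove(order);  log.record("collection.remove", order, "Orders.java:60")
for e in log.history_of(order):
    print(e.kind, "at", e.site)
```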
|
123 |
Collection Disjointness Analysis in Java. Chu, Hang, January 2011.
This thesis presents a collection disjointness analysis that finds disjointness relations between collections in Java. We define three types of disjointness relations between collections: must-shared, may-shared, and not-may-shared. The collection-disjointness analysis is implemented as a forward data-flow analysis using the Soot Java bytecode analysis framework. For method calls, which are usually difficult to handle in static analysis, our analysis provides a way of generating and reading method annotations to approximate the behavior of called methods. Finally, this thesis presents experimental results of the collection-disjointness analysis on several test programs.
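A minimal sketch of the data-flow flavor of such an analysis (written in Python rather than on top of Soot, with invented statement forms) tracks one of the three facts for a pair of collections and merges facts conservatively at control-flow joins.

```python
from enum import IntEnum

class Disjoint(IntEnum):
    NOT_MAY_SHARED = 0   # the two collections cannot share any element
    MAY_SHARED     = 1   # they might share an element
    MUST_SHARED    = 2   # they definitely share at least one element

def join(a: Disjoint, b: Disjoint) -> Disjoint:
    """Merge facts from two control-flow predecessors (conservative)."""
    if a == b:
        return a
    return Disjoint.MAY_SHARED   # disagreement degrades to "may"

def transfer(fact: Disjoint, stmt: str) -> Disjoint:
    """Tiny transfer function for one collection pair (A, B).
    The statement forms are invented stand-ins for Soot IR statements."""
    if stmt == "B.addAll(A)":
        return Disjoint.MUST_SHARED
    if stmt in ("A.clear()", "B.clear()"):
        return Disjoint.NOT_MAY_SHARED
    if stmt == "B.add(someElementOf(A))":
        return max(fact, Disjoint.MAY_SHARED)
    return fact   # statements not touching A or B leave the fact unchanged

# Example: two branches, then a join point.
entry = Disjoint.NOT_MAY_SHARED
branch1 = transfer(entry, "B.addAll(A)")                 # MUST_SHARED
branch2 = transfer(entry, "B.add(someElementOf(A))")     # MAY_SHARED
print(join(branch1, branch2).name)                       # MAY_SHARED
```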
|
124 |
Modeling of Deterministic Within-Die Variation in Timing Analysis, Leakage Current Analysis, and Delay Fault Diagnosis. Choi, Munkang, 04 April 2007.
As semiconductor technology advances into the nano-scale era and more functional blocks are added to systems on chip (SoC), the interface between circuit design and manufacturing is becoming blurred. An increasing number of features traditionally ignored by designers now influence both circuit performance and yield. As a result, design tools need to incorporate new factors. One important source of circuit performance degradation is deterministic within-die variation arising from lithography imperfections and Cu interconnect chemical mechanical polishing (CMP).
To determine how these within-die variations impact circuit performance, a new analysis tool is required. A methodology is therefore proposed to incorporate layout-dependent within-die variations into static timing analysis. The methodology combines a set of scripts and commercial tools to analyze a full chip, and it has been applied to analyze the delay of ISCAS85 benchmark circuits in the presence of imperfect lithography and CMP variation.
This thesis also presents a methodology to generate test sets that diagnose the sources of within-die variation. Specifically, a delay fault diagnosis algorithm is developed to link failing signatures to physical mechanisms and to distinguish among different sources of within-die variation. The algorithm relies on layout-dependent timing analysis, path enumeration, test pattern generation, and correlation of pass/fail signatures to diagnose lithography-caused delay faults. The effectiveness of the diagnosis is evaluated on ISCAS85 benchmark circuits.
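As a sketch of the timing step such a methodology builds on (the netlist, nominal delays, and lithography/CMP delay adders below are all invented), a layout-dependent static timing pass can add per-gate deterministic deltas to the nominal delays and propagate worst-case arrival times through the circuit DAG.

```python
from collections import defaultdict

# Invented toy netlist: gate -> list of fanout gates, plus per-gate delays in ps.
fanout = {"g1": ["g3"], "g2": ["g3", "g4"], "g3": ["g5"], "g4": ["g5"], "g5": []}
nominal_delay   = {"g1": 10.0, "g2": 12.0, "g3": 8.0, "g4": 9.0, "g5": 7.0}
litho_cmp_delta = {"g1": 0.4,  "g2": 1.1,  "g3": 0.0, "g4": 2.3, "g5": 0.6}

def worst_arrival_times(fanout, delay):
    """Longest-path arrival times over the combinational DAG (topological order)."""
    indeg = defaultdict(int)
    for g, outs in fanout.items():
        indeg[g] += 0
        for o in outs:
            indeg[o] += 1
    ready = [g for g in fanout if indeg[g] == 0]
    arrival = {g: 0.0 for g in fanout}
    while ready:
        g = ready.pop()
        arrival[g] += delay[g]                       # arrival at the gate output
        for o in fanout[g]:
            arrival[o] = max(arrival[o], arrival[g]) # worst-case input arrival
            indeg[o] -= 1
            if indeg[o] == 0:
                ready.append(o)
    return arrival

nominal = worst_arrival_times(fanout, nominal_delay)
varied = worst_arrival_times(
    fanout, {g: nominal_delay[g] + litho_cmp_delta[g] for g in nominal_delay})
print("critical path slowdown:", max(varied.values()) - max(nominal.values()), "ps")
```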
|
125 |
Worst Case Execution Time Analysis Support for the ARM Processor Using GCC. Yen, Cheng-Yu, 09 August 2010.
This thesis presents a tool for obtaining worst-case execution time (WCET) guarantees for ARM processors. The tool is an interface between ARM's GCC compiler and the SWEET WCET analyzer. SWEET is an open-source static analyzer that derives a guaranteed upper bound on the WCET of a program.
The WCET of a program is an important metric in real-time systems. The task scheduler must decide how much time to allot to each process; if the allotted time exceeds the WCET, the process is guaranteed to always finish in time. Although the WCET value is therefore useful, it is difficult to find. For the purpose of guaranteeing that a process finishes on time, however, an upper bound on the WCET suffices. Static program analysis has been proposed as a method to derive such an upper bound on the WCET by conservatively approximating the runtime of the individual parts of a complete program. SWEET is one such static analyzer.
Our tool works inside ARM-GCC, extracting all of the information that SWEET needs about the program's behavior. The tool then packages this information into SWEET's ALF format. The tool has been tested and works correctly for every input source we have tried, including all 34 benchmarks from the WCET benchmark suite [1].
This work was funded by Taiwan's National Science Council, grant NSC 97-2218-E-110-003.
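A simplified sketch of how a static analyzer composes a safe WCET bound (an illustration of the idea, not SWEET's actual algorithm; the block times and loop bound are invented):

```python
# Invented per-basic-block worst-case cycle counts for a toy loop:
#   entry -> loop_header -> (loop_body -> loop_header)* -> exit
block_wcet = {"entry": 12, "loop_header": 4, "loop_body": 30, "exit": 8}
LOOP_BOUND = 100   # maximum iteration count, e.g. derived by flow analysis

def structural_wcet_bound(block_wcet, loop_bound):
    """Conservative WCET bound: straight-line blocks execute once, the loop
    body at most loop_bound times, the header once more for the final test."""
    return (block_wcet["entry"]
            + (loop_bound + 1) * block_wcet["loop_header"]
            + loop_bound * block_wcet["loop_body"]
            + block_wcet["exit"])

bound = structural_wcet_bound(block_wcet, LOOP_BOUND)
print(f"WCET upper bound: {bound} cycles")   # any actual run takes at most this long
```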
|
126 |
Integrating the SWEET WCET Analyzer into ARM-GCC with Extra WCFP Information to Enable WCET-Targeted Compiler Optimizations. Hao, Wen-Chuan, 23 December 2011.
Finding the worst-case execution time (WCET) on a hard real-time system is extremely important. Only static WCET analysis can give an upper bound on the WCET that guarantees the deadline; however, industrial practice still relies on measurement-based WCET analysis, even for many hard real-time systems, because static analysis tools are not yet a mature technology.
We use SWEET (SWEdish Execution Time tool) to provide WCET analysis support for the ARM. SWEET is a static WCET analyzer developed by the Mälardalen Real-Time Research Center (MRTC). We modified ARM-GCC to produce input files in the specific formats SWEET requires: ALF, TCD, and MAP. In addition, to support WCET-targeted optimization and to address the over-optimization issue, we modified SWEET to report the worst-case flow path (WCFP) and second-worst-case information.
Testing with the benchmark files from [1] shows that our modified ARM-GCC creates correct input files for SWEET, and that the modified SWEET produces the additional worst-case information.
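To illustrate the kind of extra information the modified SWEET reports, the sketch below (an invented acyclic control-flow graph and invented block times, not SWEET's algorithm) enumerates the worst-case and second-worst-case flow paths, which is what a WCET-targeted optimizer needs in order to avoid over-optimizing a single path.

```python
# Invented acyclic CFG: block -> successors, and per-block worst-case times.
succ = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
time = {"A": 5, "B": 40, "C": 25, "D": 10}

def all_paths(block):
    """Enumerate every path from `block` to an exit block."""
    if not succ[block]:
        return [[block]]
    return [[block] + rest for s in succ[block] for rest in all_paths(s)]

paths = sorted(all_paths("A"), key=lambda p: sum(time[b] for b in p), reverse=True)
wcfp, second = paths[0], paths[1]
print("WCFP        :", wcfp,   sum(time[b] for b in wcfp))     # ['A','B','D'], 55
print("second worst:", second, sum(time[b] for b in second))   # ['A','C','D'], 40
```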
|
127 |
Code Classification Based on Structure Similarity. Yang, Chia-hui, 14 September 2012.
Automatically classifying the source code of malware variants is one of the most important research issues in digital forensics. By classifying malware, we can obtain a complete picture of its behavior, which simplifies the forensics task. Previous research performs dynamic analysis on malware binaries, or static analysis after reverse engineering. Malware developers, in turn, use anti-VM and obfuscation techniques to try to defeat malware classifiers.
As honeypots are increasingly deployed, researchers can obtain more and more malware source code, and analyzing this source code may be the best basis for malware classification. In this thesis, a novel classification approach is proposed based on the logic and directory-structure similarity of malware samples. The collected source code is classified by a hierarchical clustering algorithm. The proposed system not only classifies known malware correctly but also identifies new types of malware, and it saves forensic staff from spending time reanalyzing known malware. The system also helps reveal an attacker's behavior and intent. The experimental results demonstrate that the system classifies malware correctly and can be applied to other source code classification tasks.
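A minimal sketch of the directory-structure part of the idea (invented sample data; the thesis's measure also uses logic similarity): represent each sample by the set of relative paths in its source tree, use Jaccard distance over those sets, and group samples with hierarchical clustering.

```python
from itertools import combinations
from scipy.cluster.hierarchy import linkage, fcluster

# Invented samples: each maps to the set of relative paths in its source tree.
samples = {
    "botA_v1":  {"src/main.c", "src/irc.c", "src/scan.c", "Makefile"},
    "botA_v2":  {"src/main.c", "src/irc.c", "src/scan.c", "src/crypt.c", "Makefile"},
    "stealerX": {"loader.py", "modules/keylog.py", "modules/exfil.py"},
}
names = list(samples)

def jaccard_distance(a, b):
    return 1.0 - len(a & b) / len(a | b)

# Condensed pairwise distance matrix in the order combinations() yields pairs.
dists = [jaccard_distance(samples[i], samples[j]) for i, j in combinations(names, 2)]
tree = linkage(dists, method="average")            # agglomerative clustering
labels = fcluster(tree, t=0.5, criterion="distance")
for name, label in zip(names, labels):
    print(f"{name}: family {label}")
```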
|
128 |
Forces on Laboratory Model Dredge Cutterhead. Young, Dustin Ray, December 2009.
Dredge cutting forces produced by the movement of the cutterhead through the sediment were measured with the laboratory dredge carriage located at the Haynes Coastal Engineering Laboratory. The sediment bed used for the dredging tests was relatively smooth, and the sediment was sand with d50 = 0.27 mm. Forces on the dredge carriage were measured using five 13.3 kN (3000 lb) one-directional load cells placed at various locations on the dredge ladder so that the transmitted cutting forces could be obtained. The objective of this study is to determine the vertical, horizontal, and axial forces produced by the cutterhead during testing. To find these cutter forces, a static analysis was performed on the carriage by applying static loads to the cutterhead in the vertical, horizontal, and axial directions and recording the readings of all five load cells for each applied load. Static equilibrium equations were then developed for the dredge carriage ladder to determine the loads in the five load cells; applying the measured load-cell data from a dredging test to these equations yields the cutterhead forces. The static equilibrium equations were confirmed using SolidWorks, a modeling package that performs static finite element analysis of structural systems to determine stresses, displacements, and pin and bolt forces. The data gathered from the experimental procedure and the theoretical calculations show that the force on the dredge cutterhead can be determined.
However, when the results from the static equilibrium calculations and from SolidWorks were compared with the experimental results, the experimental results showed irregularities when a force of approximately 0.889 kN (200 lb) or more was applied to the cutterhead in the north, south, west, or east orientation. The SolidWorks displacement results for the dredge carriage ladder system showed large displacements at the cutterhead location; when the cutterhead displaces, the carriage ladder also moves, which causes false readings in the five load cells. From this analysis it was determined that a sixth force transducer was needed to provide more resistance on the ladder, and that the cell #1 location needed to be redesigned to make the ladder system as rigid as possible and able to produce good test results. SolidWorks was used to determine the location where the sixth force transducer would give the best results: the lower south-west corner, oriented east to west. The static equilibrium equations were rewritten to include the redesigned cell #1 location and the new sixth load cell. With the new system of equations, forces on the cutterhead can be determined for future dredging studies conducted with the dredge carriage.
Finally, the forces on the laboratory model cutter suction dredge cutterhead were scaled up to a prototype 61 cm (24 in) cutter suction dredge. These scaled-up cutting forces on the dredge cutterhead can be used in the design of the swing winches, swing cable size, ladder supports, and ladder.
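As a purely numerical sketch of the equilibrium step (the calibration matrix and readings below are invented, not the laboratory carriage's actual geometry), the three cutterhead force components can be recovered from the six load-cell readings by solving the overdetermined equilibrium system in a least-squares sense.

```python
import numpy as np

# Invented calibration matrix A: row i gives the reading of load cell i (kN)
# produced by unit vertical, horizontal, and axial forces at the cutterhead.
# With six cells and three unknowns the system is overdetermined.
A = np.array([
    [ 0.90,  0.10,  0.05],
    [ 0.85, -0.12,  0.04],
    [ 0.10,  0.95,  0.02],
    [ 0.08,  0.92, -0.03],
    [ 0.05,  0.03,  0.98],
    [-0.04,  0.06,  0.96],
])
readings = np.array([0.81, 0.74, 0.88, 0.83, 0.27, 0.24])   # measured cell forces, kN

# Least-squares solution of A @ [Fv, Fh, Fa] = readings
(Fv, Fh, Fa), residual, *_ = np.linalg.lstsq(A, readings, rcond=None)
print(f"vertical {Fv:.2f} kN, horizontal {Fh:.2f} kN, axial {Fa:.2f} kN")
```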
|
129 |
Programming Language Evolution and Source Code Rejuvenation. Pirkelbauer, Peter Mathias, December 2010.
Programmers rely on programming idioms, design patterns, and workaround techniques to express fundamental design not directly supported by the language. Evolving languages often address frequently encountered problems by adding language and library support to subsequent releases. By using new features, programmers can express their intent more directly. As new concerns, such as parallelism or security, arise, early idioms and language facilities can become serious liabilities. Modern code sometimes benefits from optimization techniques not feasible for code that uses less expressive constructs. Manual source code migration is expensive, time-consuming, and prone to errors.
This dissertation discusses the introduction of new language features and libraries, exemplified by open-methods and a non-blocking growable array library. We describe the relationship of open-methods to various alternative implementation techniques. The benefits of open-methods materialize in simpler code, better performance, and a similar memory footprint when compared to alternative implementation techniques.
Based on these findings, we develop the notion of source code rejuvenation, the automated migration of legacy code. Source code rejuvenation leverages enhanced programming language and library facilities by finding and replacing coding patterns that can be expressed through higher-level software abstractions. Raising the level of abstraction improves code quality by lowering software entropy. In conjunction with extensions to programming languages, source code rejuvenation offers an evolutionary trajectory towards more reliable, more secure, and better performing code.
We describe the tools that allow us to implement code rejuvenations efficiently. The Pivot source-to-source translation infrastructure and its traversal mechanism form the core of our machinery. In order to free programmers from representation details, we use a light-weight pattern matching generator that turns a C++-like input language into pattern matching code. The generated code integrates seamlessly with the rest of the analysis framework.
We utilize the framework to build analysis systems that find common workaround techniques for designated language extensions of C++0x (e.g., initializer lists). Moreover, we describe a novel system (TACE: template analysis and concept extraction) for the analysis of uninstantiated template code. Our tool automatically extracts requirements from the bodies of template functions. TACE helps programmers understand the requirements that their code de facto imposes on arguments and compare those de facto requirements to formal and informal specifications.
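As a toy illustration of the rejuvenation idea (a regular expression in Python stands in for the Pivot infrastructure and the generated pattern-matching code, so every detail below is hypothetical), the sketch rewrites one workaround idiom, repeated push_back calls, into a C++0x initializer list.

```python
import re

LEGACY = """
std::vector<int> primes;
primes.push_back(2);
primes.push_back(3);
primes.push_back(5);
"""

# Match a vector declaration followed by consecutive push_back calls on it.
pattern = re.compile(
    r"std::vector<(?P<elem>[^>]+)>\s+(?P<name>\w+);\s*"
    r"(?P<pushes>(?:(?P=name)\.push_back\([^;]+\);\s*)+)")

def rejuvenate(source: str) -> str:
    def replace(m: re.Match) -> str:
        values = re.findall(r"push_back\(([^;]+)\);", m.group("pushes"))
        init = ", ".join(v.strip() for v in values)
        return f"std::vector<{m.group('elem')}> {m.group('name')} = {{{init}}};\n"
    return pattern.sub(replace, source)

print(rejuvenate(LEGACY))
# std::vector<int> primes = {2, 3, 5};
```

A real rejuvenation tool works on the program representation rather than on text, which is exactly the representation detail the pattern matching generator described above is meant to hide.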
|
130 |
The Study of the Bioeconomics Analysis of Grey Mullet in Taiwan. Cheng, Man-chun, 29 January 2007.
Abstract
This study draws on biological and economic theory to establish three mathematical fishery models, an open access model, a dynamic optimization model, and a static optimization model, and uses them to discuss problems of fishery management.
To obtain the equilibrium of resource stock and effort, the research data are analyzed mainly by comparative statics. We collect grey mullet catch data and estimate the exogenous variables, then use Mathematica to calculate the equilibrium values of resource stock and effort, and perform sensitivity analysis with respect to changes in the estimated exogenous variables.
The results of the analysis are as follows: the equilibrium resource stock and effort of the three fishery models are consistent. From the viewpoint of CPUE (catch per unit effort), the economic effect of the open access model is not obvious. Management of the grey mullet fishery must be strengthened so that fishermen can earn the highest economic benefit.
Keywords: open access model, static optimization model, dynamic optimization model
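A minimal numerical sketch of the open access equilibrium, using a standard Gordon-Schaefer formulation with invented parameter values rather than the thesis's estimates:

```python
# Gordon-Schaefer bioeconomic model, open access equilibrium.
# All parameter values are illustrative, not the thesis's estimates.
r = 0.8      # intrinsic growth rate of the grey mullet stock
K = 1000.0   # carrying capacity (tonnes)
q = 0.002    # catchability coefficient
p = 50.0     # price per tonne
c = 40.0     # cost per unit of effort

def open_access_equilibrium(r, K, q, p, c):
    """Open access: effort expands until rent p*q*E*x - c*E = 0, and the stock
    is at the biological steady state q*E*x = r*x*(1 - x/K)."""
    x_oa = c / (p * q)                # zero-rent stock level
    e_oa = (r / q) * (1 - x_oa / K)   # effort sustaining that stock
    return x_oa, e_oa

x_oa, e_oa = open_access_equilibrium(r, K, q, p, c)
print(f"equilibrium stock: {x_oa:.1f} t, equilibrium effort: {e_oa:.1f} units")

# Simple sensitivity check: a 10% rise in price lowers the equilibrium stock.
x_hi, e_hi = open_access_equilibrium(r, K, q, 1.1 * p, c)
print(f"with 10% higher price: stock {x_hi:.1f} t, effort {e_hi:.1f} units")
```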
|