91.
A Statistical Analysis of Key Factors Influencing the Location of Biomass-using Facilities. Liu, Xu. 01 December 2009.
Bioenergy and biofuels are emerging industries in the U.S. economy that will require statistical and economic analyses of woody biomass resources, supply chains, and other key factors that influence the siting of industrial facilities. This thesis develops models using logistic regression to improve the understanding of the key factors that influence the locations of existing wood-using bioenergy and biofuels plants, and other wood-using plants. The scope of the study is 13 Southeastern states.[1] Logistic regression models are developed at the state and regional levels. The resolution of the study is the ZIP Code tabulation area (ZCTA). There are 9,416 ZCTAs in the 13-state study region.
Because only a small number of woody biomass-using bioenergy and biofuels plants exist relative to the large number of traditional woody biomass-using facilities (e.g., wood composites, sawmills, and secondary mills), two sample groups are developed. The first group combines all wood-using mills with wood-using bioenergy and biofuels plants, and compares ZCTAs containing these types of mills with ZCTAs that do not contain any such facilities. This follows a more modern planning view of total woody biomass management. The second group combines only one type of mill, pulp and paper mills, with wood-using bioenergy and biofuels plants, and compares ZCTAs containing these mill types with ZCTAs that do not contain such facilities.
For both groups in the entire study region, logging residue harvesting costs (negative influence) and the availability of thinnings within an 80-mile haul distance (positive influence) are statistically significant factors (p-value < 0.0001) in the logistic models. Population is statistically significant and has a negative influence on site location for six of the thirteen states in the region (p-values ranged from < 0.0001 to 0.0197) for the first group. Twenty-five optimal locations (ZCTAs) in the Southeastern states are predicted from the logistic regression models. A de-clustering algorithm is developed as part of this study to avoid locating potential bioenergy and biofuels sites in close proximity to competing mills within the same ZCTA.
______________________
[1] Alabama, Arkansas, Florida, Georgia, Kentucky, Louisiana, Mississippi, North Carolina, Oklahoma, South Carolina, Tennessee, Texas, Virginia.
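As a concrete illustration of the modelling approach described above, the sketch below fits a logistic siting model to synthetic ZCTA-style data by Newton's method. The covariate names, units, and coefficients are invented for the example and are not the thesis's actual data or variables:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Hypothetical ZCTA-level covariates (illustrative names and units):
# logging-residue harvesting cost and thinnings available within an
# assumed 80-mile haul distance.
cost = rng.normal(50.0, 10.0, n)        # $/dry ton
thinnings = rng.normal(100.0, 30.0, n)  # thousand dry tons/yr

# Simulate mill presence: high cost discourages siting, supply encourages it.
z = -2.0 - 0.08 * (cost - 50.0) + 0.03 * (thinnings - 100.0)
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-z))).astype(float)

# Fit the logistic model by Newton's method (iteratively reweighted
# least squares) on standardized covariates.
X = np.column_stack([np.ones(n),
                     (cost - cost.mean()) / cost.std(),
                     (thinnings - thinnings.mean()) / thinnings.std()])
beta = np.zeros(3)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    W = p * (1.0 - p)
    beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - p))

print(beta)  # beta[1] < 0 (cost deters siting), beta[2] > 0 (supply attracts)
```

The recovered signs mirror the thesis's finding that harvesting cost deters, and available thinnings attract, facility siting.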
92.
On 4-Regular Planar Hamiltonian Graphs. High, David. 01 May 2006.
In order to research knots with large crossing numbers, one would like to be able to select a random knot from the set of all knots with n crossings with as close to uniform probability as possible. The underlying graph of a knot diagram can be viewed as a 4-regular planar graph. The existence of a Hamiltonian cycle in such a graph is necessary in order to use the graph to compute an upper bound on rope length for a given knot. The algorithm to generate such graphs is discussed and an exact count of the number of graphs is obtained. In order to allow for the existence of such a count, a somewhat technical definition of graph equivalence is used. The main result of the thesis is an asymptotic description of how fast the number of graphs with n vertices (crossings) grows with n.
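A small illustration of the objects involved: the octahedron is 4-regular, planar, and Hamiltonian, and a naive backtracking search (not the thesis's generation or counting algorithm) finds a Hamiltonian cycle in it:

```python
# Octahedron graph: 4-regular, planar, and Hamiltonian. Vertex v is
# adjacent to every vertex except its antipode 5 - v.
adj = {v: {u for u in range(6) if u not in (v, 5 - v)} for v in range(6)}

def is_k_regular(adj, k):
    return all(len(nbrs) == k for nbrs in adj.values())

def hamiltonian_cycle(adj, start=0):
    """Backtracking search; returns a closed Hamiltonian cycle or None."""
    n = len(adj)
    path = [start]
    def extend():
        if len(path) == n:
            return start in adj[path[-1]]   # can we close the cycle?
        for u in adj[path[-1]]:
            if u not in path:
                path.append(u)
                if extend():
                    return True
                path.pop()
        return False
    return path + [start] if extend() else None

print(is_k_regular(adj, 4))     # True
print(hamiltonian_cycle(adj))   # a closed tour visiting all six vertices
```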
93.
Hedging Contingent Claims in Markets with Jumps. Kennedy, J. Shannon. 20 September 2007.
Contrary to the Black-Scholes paradigm, an option-pricing model which incorporates the possibility of jumps more accurately reflects the evolution of stocks in the real world. However, hedging a contingent claim in such a model is a non-trivial issue: in many cases, an infinite number of hedging instruments are required to eliminate the risk of an option position. This thesis develops practical techniques for hedging contingent claims in markets with jumps. Both regime-switching and jump-diffusion models are considered.
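To make the jump-diffusion setting concrete, here is a minimal Monte Carlo sketch of a Merton-style jump-diffusion and a discounted terminal call payoff. All parameters are illustrative, the drift is a real-world rather than risk-neutral one, and this is not the hedging scheme developed in the thesis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Merton-style jump-diffusion: geometric Brownian motion plus
# Poisson-timed, normally distributed log-jumps.
S0, mu, sigma = 100.0, 0.05, 0.2
lam, jmu, jsig = 0.5, -0.1, 0.15       # jump intensity, mean, std of log-jump
T, n_steps, n_paths = 1.0, 252, 20000
dt = T / n_steps

S = np.full(n_paths, S0)
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    # At most one jump per small step is the usual discretization shortcut.
    jump = rng.normal(jmu, jsig, n_paths) * (rng.random(n_paths) < lam * dt)
    S *= np.exp((mu - 0.5 * sigma**2) * dt + sigma * dW + jump)

# Discounted mean terminal payoff of an at-the-money European call
# (an expected payoff under the simulated measure, not a no-arbitrage price).
K, r = 100.0, 0.03
price = np.exp(-r * T) * np.maximum(S - K, 0.0).mean()
print(round(price, 2))
```

Simulating paths like these is typically the starting point for testing any hedging strategy in a model with jumps.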
94.
Comparison of Approximation Schemes in Stochastic Simulation Methods for Stiff Chemical Systems. Wells, Chad. January 2009.
Interest in stochastic simulations of chemical systems is growing. One of the aspects
of simulation of chemical systems that has been the prime focus over the past
few years is accelerated simulation methods applicable when there is a separation
of time scales. With so many new methods being developed, we have decided to look
at four methods that we consider to be the main foundation for this research area.
The four methods that will be the focus of this thesis are: the slow scale stochastic
simulation algorithm, the quasi steady state assumption applied to the stochastic
simulation algorithm, the nested stochastic simulation algorithm and the implicit
tau leaping method. These four methods are designed to deal with stiff chemical
systems so that the computational time is decreased from that of the "gold
standard" Gillespie algorithm, the stochastic simulation algorithm.
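For reference, a minimal implementation of Gillespie's direct method (the SSA) on a toy reversible isomerization A <-> B; the rate constants and the system itself are illustrative, not the thesis's test problems:

```python
import numpy as np

rng = np.random.default_rng(2)

# Gillespie's direct method for A <-> B with illustrative rate constants.
k_f, k_r = 1.0, 0.5
x = np.array([100, 0])          # counts of A and B
stoich = np.array([[-1, 1],     # reaction 0: A -> B
                   [1, -1]])    # reaction 1: B -> A

t, t_end = 0.0, 50.0
while t < t_end:
    prop = np.array([k_f * x[0], k_r * x[1]])   # reaction propensities
    a0 = prop.sum()
    if a0 == 0:
        break
    t += rng.exponential(1.0 / a0)              # time to next reaction
    r = rng.choice(2, p=prop / a0)              # which reaction fires
    x += stoich[r]

print(x)  # roughly the equilibrium ratio A:B = k_r:k_f
```

This exact loop is what the four approximation schemes above try to accelerate: for stiff systems the fast reactions force a0 to be large, so the exponential waiting times become tiny and the loop executes enormously many steps.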
These approximation methods will be tested against a variety of stiff examples
such as: a fast reversible dimerization, a network of isomerizations, a fast species
acting as a catalyst, an oscillatory system and a bistable system. Also, these
methods will be tested against examples that are marginally stiff, where the time
scale separation is not that distinct.
From the results of testing stiff examples, the slow scale SSA was typically the
best approximation method to use. The slow scale SSA was highly accurate and
extremely fast in comparison with the other methods. We also found for certain
cases, where the time scale separation was not as distinct, that the nested SSA was
the best approximation method to use.
95.
A Multilevel Method for Image Segmentation. Au, Adley. January 2010.
Image segmentation is a branch of computer vision that has received a considerable
amount of interest in recent years. Segmentation describes a process that divides or partitions the pixels of a digital image into groups that correspond to the entities represented in the image. One such segmentation method is the Segmentation by Weighted Aggregation algorithm (SWA). Inspired by Algebraic Multigrid (AMG), the SWA algorithm provides a fast multilevel method for image segmentation.
The SWA algorithm takes a graph-based approach to the segmentation problem. Given
an image Ω, the weighted undirected graph A = (N,E) is constructed with each pixel corresponding to a node in N and each weighted edge connecting neighbouring nodes in E. The edge weight between nodes is calculated as a function of the difference in intensity between connected pixels.
To determine whether a group of pixels should be declared as a segment in the SWA
algorithm, a new scale-invariant measure to calculate the saliency of the group of pixels is introduced. This new measure determines the saliency of a potential segment by comparing its average similarity to its neighbours against its internal similarity. For complex images, intensity alone is not sufficient to provide a suitable segmentation. The SWA algorithm provides a way to improve the segmentation by incorporating other vision cues
such as texture, shape and colour.
The SWA algorithm with the new scale-invariant saliency measure was implemented
and its performance was tested on simple test images and more complex aerial-view images.
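The saliency idea can be sketched on a toy image: with the common exp(-alpha |ΔI|) intensity coupling, a well-aligned segment has strong internal and weak boundary affinity. The weights and the ratio below are illustrative stand-ins, not the thesis's exact formula:

```python
import numpy as np

# Toy 4x4 image with a bright 2x2 block in the middle.
img = np.zeros((4, 4))
img[1:3, 1:3] = 1.0
alpha = 5.0

def weight(p, q):
    # Edge weight as a function of intensity difference between pixels.
    return np.exp(-alpha * abs(img[p] - img[q]))

def neighbours(p):
    i, j = p
    for q in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
        if 0 <= q[0] < 4 and 0 <= q[1] < 4:
            yield q

def saliency(segment):
    """Boundary-to-internal affinity ratio: low means a salient segment."""
    seg = set(segment)
    internal = sum(weight(p, q) for p in seg for q in neighbours(p) if q in seg)
    external = sum(weight(p, q) for p in seg for q in neighbours(p) if q not in seg)
    return external / (internal + 1e-12)

block = [(1, 1), (1, 2), (2, 1), (2, 2)]
print(saliency(block))  # small: strong interior coupling, weak boundary coupling
```

A candidate segment straddling the block boundary scores a ratio near 1, while the aligned block scores near exp(-alpha), which is the kind of separation a saliency threshold exploits.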
96.
Adaptive finite element methods for linear-quadratic convection dominated elliptic optimal control problems. January 2010.
The numerical solution of linear-quadratic elliptic optimal control problems requires the solution of a coupled system of elliptic partial differential equations (PDEs), consisting of the so-called state PDE, the adjoint PDE and an algebraic equation. Adaptive finite element methods (AFEMs) attempt to locally refine a base mesh in such a way that the solution error is minimized for a given discretization size. This is particularly important for the solution of convection dominated problems where inner and boundary layers in the solutions to the PDEs need to be sufficiently resolved to ensure that the solution of the discretized optimal control problem is a good approximation of the true solution.
This thesis reviews several AFEMs based on energy norm based error estimates for single convection dominated PDEs and extends them to the solution of the coupled system of convection dominated PDEs arising from the optimality conditions for optimal control problems.
Keywords: Adaptive finite element methods, optimal control problems, convection-diffusion equations, local refinement, error estimation.
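The refine-where-the-indicator-is-large loop can be sketched in one dimension on a boundary-layer function typical of convection-dominated solutions. The indicator below is a simple interpolation-error surrogate, not one of the energy-norm estimators reviewed in the thesis:

```python
import numpy as np

# Boundary-layer profile u(x) = 1 - exp(-x/eps), with a sharp layer at x = 0
# of width eps, as arises in convection-dominated problems.
eps = 1e-2
u = lambda x: 1.0 - np.exp(-x / eps)

nodes = np.linspace(0.0, 1.0, 5)
for _ in range(12):
    mid = 0.5 * (nodes[:-1] + nodes[1:])
    # Indicator: deviation of u at each element midpoint from the linear
    # interpolant of its endpoint values.
    eta = np.abs(u(mid) - 0.5 * (u(nodes[:-1]) + u(nodes[1:])))
    # Bisect every element whose indicator exceeds half the maximum.
    refine = eta > 0.5 * eta.max()
    nodes = np.sort(np.concatenate([nodes, mid[refine]]))

print(len(nodes), np.diff(nodes).min())  # mesh graded into the layer at x = 0
```

The loop concentrates new nodes inside the layer while leaving the smooth part of the domain coarse, which is exactly the behaviour an AFEM needs to resolve inner and boundary layers economically.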
97.
Implicitly Restarted DEIM_Arnoldi: An inner product free Krylov method for eigenproblems. January 2010.
This thesis proposes an inner product free Krylov method called Implicitly Restarted DEIM_Arnoldi (IRD) to solve large scale eigenvalue problems. This algorithm is based on the Implicitly Restarted Arnoldi (IRA) scheme, which is very efficient for solving eigenproblems. IRA uses the Arnoldi factorization, which requires inner products. In contrast, IRD employs the Discrete Empirical Interpolation (DEIM) technique and the DEIM_Arnoldi algorithm to avoid inner products, thereby resulting in faster running times for large eigenproblems. Furthermore, IRD may be able to greatly reduce the latency caused by inner products in parallel computation. This work conducts many numerical experiments to compare the performance of IRD and IRA in serial computation, and discusses the possible ways to avoid the need for communication in parallel computation.
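For contrast with the inner-product-free approach, a plain Arnoldi factorization is sketched below; the Gram-Schmidt step marked in the loop is where the inner products (and, in parallel computation, the communication) occur. This is standard Arnoldi, not the IRD algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)

# Arnoldi factorization A V_k = V_{k+1} H, built with modified Gram-Schmidt.
n, k = 200, 20
A = rng.normal(size=(n, n))
V = np.zeros((n, k + 1))
H = np.zeros((k + 1, k))
V[:, 0] = rng.normal(size=n)
V[:, 0] /= np.linalg.norm(V[:, 0])

for j in range(k):
    w = A @ V[:, j]
    for i in range(j + 1):
        H[i, j] = V[:, i] @ w          # <-- the inner-product bottleneck
        w -= H[i, j] * V[:, i]
    H[j + 1, j] = np.linalg.norm(w)
    V[:, j + 1] = w / H[j + 1, j]

# Ritz values (eigenvalue estimates) come from the small Hessenberg matrix.
ritz = np.linalg.eigvals(H[:k, :k])
resid = np.linalg.norm(A @ V[:, :k] - V[:, :k + 1] @ H)
print(resid)  # residual of the Arnoldi relation, near machine precision
```

Every column of the basis requires j + 1 inner products against all previous columns, each a global reduction on a parallel machine, which is the latency the DEIM-based variant is designed to avoid.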
98.
Source localization in cluttered acoustic waveguides. January 2010.
Mode coupling due to scattering by weak random inhomogeneities leads to the loss of coherence in the wave field measured at long distances of propagation. This in turn leads to the deterioration of coherent source localization methods such as matched-field processing. In this dissertation, we study with analysis and numerical simulations how such deterioration occurs, and we introduce an original incoherent source localization approach for random waveguides. This approach is based on a special form of transport theory for the incoherent fluctuations of the wave field. The statistical stability of the method is analyzed and its performance is illustrated with numerical simulations. In addition, this method is used to estimate the correlation function of the random fluctuations of the wave speed.
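The coherent baseline that clutter degrades can be sketched with a toy Bartlett matched-field estimator, here in free space rather than a waveguide for simplicity; all parameters and geometry are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(4)

# A vertical line array of 16 receivers along the y-axis measures the field
# of a point source; we correlate the data with replica fields on a search
# grid and take the best match (the Bartlett matched-field processor).
wavelength = 1.5
k = 2.0 * np.pi / wavelength
rcv = np.column_stack([np.zeros(16), np.linspace(0.0, 15.0, 16)])

def replica(src):
    r = np.linalg.norm(rcv - src, axis=1)
    return np.exp(1j * k * r) / np.sqrt(r)   # free-space point-source field

true_src = np.array([40.0, 7.0])
data = replica(true_src)
data = data + 0.01 * (rng.normal(size=16) + 1j * rng.normal(size=16))  # noise

# Ambiguity surface over a coarse range-depth search grid.
xs, ys = np.arange(20.0, 61.0, 1.0), np.arange(0.0, 15.0, 1.0)
amb = np.array([[abs(np.vdot(replica(np.array([x, y])), data))
                 / np.linalg.norm(replica(np.array([x, y])))
                 for y in ys] for x in xs])
ix, iy = np.unravel_index(np.argmax(amb), amb.shape)
print(xs[ix], ys[iy])   # estimate close to the true source (40, 7)
```

This estimator relies on the measured field staying coherent with the replicas; random mode coupling in a cluttered waveguide destroys that coherence at long range, which is what motivates the incoherent, transport-based approach of the dissertation.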
99.
The syntax-pragmatics interface of Bangla. Ghosh, Sanjukta. 12 1900.
Syntax-pragmatics interface
100.
A morphological analyzer for Tamil. Ramaswamy, Vaishnavi. 09 1900.
Analyzer for Tamil