71 |
A Basic Scheme for Displaying Fonts and Images on Hexagonal Grid Devices
Hsu, Ming-Jin, 26 June 2001 (has links)
With the advances in imaging systems, most research targets high-resolution systems. Low-resolution systems, however, have advantages over high-resolution systems in processing speed, storage space, and power consumption. Research on hexagonal grids shows that, at the microscopic level, the angular resolution and connectivity of a hexagonal grid are better than those of a rectangular grid, so images on a hexagonal grid can also have better quality.
Almost all image input and output devices are built on rectangular grids, so the associated technology and theory have been developed for rectangular grid systems. For a display system, fonts and images are the main elements; if we want to substitute a hexagonal grid system for a rectangular one, they are the primary factors to consider.
In this research, starting from the rectangular grid system, we apply plane parametric curves and a fill algorithm to the hexagonal grid system and investigate methods for displaying fonts and images on it.
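As a hedged illustration of the kind of geometry such a display scheme relies on (not code from the thesis), the sketch below maps axial hexagonal-grid coordinates to Cartesian pixel centres for a pointy-top layout; the cell size and the coordinate convention are assumptions.

```python
# A minimal sketch: axial hex coordinates (q, r) -> Cartesian centres,
# pointy-top layout. All conventions here are assumed for illustration.
import math

def hex_to_pixel(q: int, r: int, size: float = 1.0) -> tuple[float, float]:
    """Return the Cartesian centre of the hex cell at axial coordinates (q, r)."""
    x = size * (math.sqrt(3) * q + math.sqrt(3) / 2 * r)
    y = size * (1.5 * r)
    return x, y

# The six neighbours of the origin cell are all equidistant, which reflects
# the angular-resolution and connectivity advantage mentioned above.
neighbours = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]
for dq, dr in neighbours:
    x, y = hex_to_pixel(dq, dr)
    print(f"({dq:2d},{dr:2d}) -> distance {math.hypot(x, y):.3f}")
```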
|
72 |
Modeling software artifact count attribute with s-curves
Ma, Norman K, 10 October 2008 (has links)
The estimation of software project attributes, such as size, is important for software project resource planning and process control. However, research on modeling software attributes such as size, effort, and cost is high-level and static in nature. This research defines a new operation-level software project attribute that describes the operational characteristics of a software project. The result is a measurement, based on the s-curve parameters, that can be used as a control variable for software project management. It is derived by modeling the count of artifact instances created by the software engineering process and stored by software tools. Because this attribute is orthogonal in origin to traditional static estimators, the s-curve-based attribute can serve as an additional indicator of software project activity and as a quantitative metric for assessing development team capability.
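As a rough sketch of the kind of s-curve fit the abstract describes (not the thesis's model or data), the example below fits a logistic curve to a synthetic weekly count of artifact instances; the logistic parameterization and all data values are assumptions.

```python
# Fit a logistic s-curve to a cumulative artifact count over project weeks.
# The counts below are made-up illustration data, not repository data.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic s-curve: carrying capacity K, growth rate r, midpoint t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

weeks = np.arange(1, 21)
counts = np.array([3, 5, 9, 14, 22, 35, 52, 74, 98, 121,
                   140, 155, 165, 172, 177, 180, 182, 183, 184, 184])

params, _ = curve_fit(logistic, weeks, counts, p0=[200, 0.5, 10])
K, r, t0 = params
print(f"K={K:.1f} artifacts, r={r:.2f}/week, midpoint at week {t0:.1f}")
```

The fitted parameters (K, r, t0) are the operation-level quantities that could then be tracked as control variables.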
|
73 |
Efficient Implementation of the Weil Pairing
Lu, Yi-shan, 31 August 2009 (has links)
The most efficient known algorithms for solving the elliptic curve discrete logarithm problem run in exponential time; hence, elliptic curves can be used in many cryptographic applications. The Weil pairing is a mapping that takes a pair of points on an elliptic curve to the multiplicative group of a finite field, and it satisfies nondegeneracy and bilinearity. Pairings were found to reduce the elliptic curve discrete logarithm problem to the discrete logarithm problem in a finite field, and they have been an important topic ever since. In 1986, Miller proposed an efficient algorithm for computing Weil pairings, and many researchers have focused on improving it. In 2006, Blake et al. proposed reducing the total number of lines based on the conjugate of a line. Liu et al. expanded this concept and proposed two improved methods. In this paper, we use both the NAF and segmentation algorithms to implement the Weil pairing and analyse its complexity.
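As a small hedged example of one of the named ingredients (not the paper's implementation), the sketch below computes the non-adjacent form (NAF) of an integer; using the NAF of the group order reduces the number of nonzero digits and hence the number of addition steps in Miller's loop. The sample value is arbitrary.

```python
def naf(k: int) -> list[int]:
    """Signed-binary digits of k in non-adjacent form, least significant first."""
    digits = []
    while k > 0:
        if k & 1:
            d = 2 - (k % 4)   # d is 1 or -1, chosen so that k - d is even
            k -= d
        else:
            d = 0
        digits.append(d)
        k >>= 1
    return digits

m = 443                      # an arbitrary sample order
d = naf(m)
assert sum(di * 2**i for i, di in enumerate(d)) == m
print(d, "nonzero digits:", sum(1 for di in d if di))
```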
|
74 |
Studies of poly(ethylene succinate) and its copolyesters with poly(trimethylene succinate)
Tsai, Chia-jung, 01 September 2009 (has links)
Poly(ethylene succinate) (PES), poly(trimethylene succinate) (PTS) and their copolyesters with various compositions were synthesized through direct polycondensation with titanium tetraisopropoxide as the catalyst. Results from intrinsic viscosity and gel permeation chromatography (GPC) studies contributed significantly to the preparation of polyesters with high molecular weight. Compositions and sequence distributions of the synthesized copolyesters were determined from the 1H NMR and 13C NMR spectra; the sequence distributions of the ethylene succinate (ES) and trimethylene succinate (TS) units were found to be random. Thermal properties were then characterized using differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA). All copolymers exhibited a single glass transition temperature (Tg), and the polyesters did not differ significantly in thermal stability. Thermal stability was also estimated using polarized light microscopy (PLM): isothermal growth rates were observed after pre-melting at various temperatures, and the thermal degradation temperature (Td) was estimated as the temperature at which the growth rate of the polyesters increased abruptly. The Td values of PES and PETSA 95/05 were found to be 213 and 200 °C, respectively, 35−45 °C lower than those determined by TGA.
Wide-angle X-ray diffractograms (WAXD) were obtained for polyesters that were crystallized isothermally at temperatures 5−10 °C below their melting temperatures. Only the crystal form of PES appeared in the diffractograms of the PES-rich copolyesters; the TS units may be excluded from the crystals and located in the amorphous part of the polyesters. The WAXD results indicate that incorporating TS units into PES could significantly inhibit its crystallization behavior. Additionally, dynamic mechanical properties of the moldable polyesters were investigated using a rheometer operated at 1 Hz. Below Tg, incorporating TS units into PES led to a decline in the storage modulus, while above Tg the effect of crystallinity on the storage modulus could be observed.
The spherulite growth rates of the crystallizable polyesters were measured by PLM. The growth rate of the polyesters decreased with an increasing content of TS units. The regime II→III transition of PES was estimated to occur at ca. 71 °C, extremely close to values in the literature; the regime transitions of PETSA 95/05 and PETSA 80/20 were found to be 65.0 °C and 51.4 °C, respectively. A dynamic crystallization experiment was also performed by PLM and compared with the time-consuming isothermal experiments. The dynamic data closely corresponded to the data points determined in the isothermal experiments, and the regime analysis of the continuous data closely resembled that of the isothermal experiments.
The maximum growth rate was formulated with Arrhenius and WLF expressions for the molecular transport term. A master curve of the crystal growth rate of PES was constructed from the continuous PES data, and plotting the normalized (reduced) growth rates against the reduced temperatures revealed a universal master curve for PES and the two PES-rich copolyesters. Finally, the lateral surface free energy, fold surface free energy and work of chain folding of the polyesters were evaluated from the kinetic analysis; according to those results, the work of chain folding decreased with increasing TS content.
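As a hedged sketch of the kinetic expression such an analysis typically rests on (the Lauritzen-Hoffman form with a WLF-type transport term), the code below evaluates a growth rate G(T); every parameter value is an illustrative placeholder, not a fitted value from this work.

```python
# Lauritzen-Hoffman growth rate with a WLF-type transport term (illustrative).
import numpy as np

R = 8.314            # J/(mol K)
U_star = 6280.0      # "universal" WLF activation energy, J/mol (assumed)
T_inf_offset = 30.0  # T_infinity = Tg - 30 K (assumed convention)

def growth_rate(T, G0, Kg, Tg, Tm0):
    """Spherulite growth rate G(T) = G0 * transport term * nucleation term."""
    T_inf = Tg - T_inf_offset
    dT = Tm0 - T                      # undercooling
    f = 2.0 * T / (T + Tm0)           # correction factor
    transport = np.exp(-U_star / (R * (T - T_inf)))
    nucleation = np.exp(-Kg / (T * dT * f))
    return G0 * transport * nucleation

# Placeholder PES-like parameters, evaluated near the ~71 C (344 K) regime
# transition temperature quoted above.
T = np.linspace(320.0, 360.0, 5)
print(growth_rate(T, G0=1e6, Kg=2.5e5, Tg=262.0, Tm0=377.0))
```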
|
75 |
The environmental Kuznets curve case for the USA and the BRIC countries
Rashid, Shehryar, 20 November 2009 (has links)
Previous literature on the Environmental Kuznets Curve has focused extensively on why such a relationship is or is not observed in specific scenarios. More recent literature has shifted attention towards factors that may explain differences in the distribution or threshold of the curve. The purpose of this paper is to determine why we witness different cutoff points for environmental improvement given the same dependent variable. For this analysis, the relationship between CO2 emissions and GDP growth is observed in the United States and the BRIC countries (Brazil, Russia, India and China) from 1981 to 2006. The results suggest that the standard for environmental improvement is lower for the BRIC countries than for the United States. Factors that explain this include FDI inflow, the share of production from different industries, the share of energy from different sources, and overall incentives.
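As a hedged illustration of how an EKC turning point is commonly located (not the paper's estimation), the sketch below fits a quadratic of emissions in income and solves for the maximum; the data are synthetic and the functional form is an assumption.

```python
# Locate the EKC turning point from a quadratic fit (synthetic data).
import numpy as np

rng = np.random.default_rng(0)
income = np.linspace(1, 50, 26)                      # GDP per capita, thousands
emissions = 2 + 0.8 * income - 0.012 * income**2 + rng.normal(0, 0.5, income.size)

# Fit emissions = b0 + b1*income + b2*income^2 (polyfit returns high degree first)
b2, b1, b0 = np.polyfit(income, emissions, 2)
turning_point = -b1 / (2 * b2)
print(f"estimated turning point at income = {turning_point:.1f} thousand")
```

A lower turning point corresponds to the lower "standard for environmental improvement" the abstract reports for the BRIC countries.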
|
76 |
Digital Aperture Photometry Utilizing Growth Curves
Overcast, William Chandler, 01 May 2010 (has links)
Point source extraction is critical to proper analysis of images containing point sources obtained by focal plane array cameras. Two popular methods of extracting the intensity of a point source are aperture photometry and point spread function fitting. Digital aperture photometry encompasses the procedures used to extract the intensity of an imaged point source. It has been used by astronomers in various forms for calculating stellar brightness, and it is also useful for analyzing data associated with other unresolved radiating objects. The various aperture photometry methods include the two-aperture method, aperture correction, and the growth curve method. The growth curve method tracks the integrated irradiance within an aperture as the aperture size grows. Signal-to-noise ratio, imperfect backgrounds, moving and off-centered targets, and noise structure are just a few of the items that can cause problems with point source extraction. This thesis presents a study of how best to apply the growth curve method. Multiple synthetic image sets were produced to replicate real-world data. The synthetic images contain a Gaussian target of known intensity; noise was added to the images, and various image-related parameters were altered. The growth curve method is then applied to each data set using every reasonable aperture size combination to calculate the target intensity. It is shown that, for different types of data, the most suitable application of the growth curve method can be determined, and an algorithm is presented that can be applied to all data sets that fall within the scope of this study.
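As a minimal sketch of the growth-curve idea under assumed image parameters (not the thesis's algorithm), the code below places a Gaussian target of known intensity on a noisy frame and sums counts inside apertures of increasing radius.

```python
# Synthetic Gaussian point source + noise, then a simple growth curve.
import numpy as np

rng = np.random.default_rng(1)
size, sigma_psf, flux = 64, 2.0, 1000.0      # assumed frame size, PSF width, intensity
y, x = np.mgrid[:size, :size]
cx = cy = size / 2.0

psf = np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma_psf ** 2))
image = flux * psf / psf.sum() + rng.normal(0.0, 0.2, (size, size))

r = np.hypot(x - cx, y - cy)
for radius in (2, 4, 6, 8, 10, 12):
    aperture_sum = image[r <= radius].sum()
    print(f"r={radius:2d}  integrated counts = {aperture_sum:8.1f}")
```

The integrated counts approach the known intensity as the aperture grows, while background noise increasingly contaminates large apertures, which is the trade-off the growth curve method must balance.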
|
77 |
Dynamic modeling approach to forecast the term structure of government bond yields
Fu, Min (active 2013), 09 December 2013 (has links)
Since freedom from arbitrage is a desirable theoretical feature of a healthy financial market, many efforts have been made to construct arbitrage-free models for yield curves. However, little attention has been paid to whether such a restriction improves yield forecasts. We evaluate the importance of the arbitrage-free restriction on the dynamic Nelson-Siegel term structure model when forecasting yield curves, and we find that it does not help. We also compare the two Nelson-Siegel dynamic models with a benchmark dynamic model and show that the Nelson-Siegel structure improves forecasts for long-maturity yields.
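As a brief hedged sketch of the Nelson-Siegel curve that the dynamic models build on (factor values here are illustrative, not estimates from the paper):

```python
# Nelson-Siegel yield curve: level, slope and curvature factors.
import numpy as np

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Yield at maturity tau (in years) under the Nelson-Siegel form."""
    x = lam * tau
    loading1 = (1 - np.exp(-x)) / x          # slope loading
    loading2 = loading1 - np.exp(-x)         # curvature loading
    return beta0 + beta1 * loading1 + beta2 * loading2

maturities = np.array([0.25, 1, 2, 5, 10, 30])
print(nelson_siegel(maturities, beta0=0.045, beta1=-0.02, beta2=0.01, lam=0.6))
```

In the dynamic versions, the three betas are allowed to evolve over time and are forecast, with or without the arbitrage-free restriction.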
|
78 |
Uncertainty in proved reserves estimation by decline curve analysis
Apiwatcharoenkul, Woravut, 03 February 2015 (links)
Proved reserves estimation is a crucial process since it impacts many aspects of the petroleum business. By the definition of the Society of Petroleum Engineers, proved reserves must be estimated by reliable methods such that there is at least a 90 percent probability (P90) that the actual quantities recovered will equal or exceed the estimates. Decline curve analysis (DCA) is a commonly used method in which a trend is fitted to a production history and extrapolated to an economic limit for the reserves estimation. The trend is the “best estimate” line that represents the well performance, which corresponds to the 50th percentile value (P50). This practice, therefore, conflicts with the proved reserves definition. An exponential decline model is used as a base case because it forms a straight line in rate-cumulative coordinates. Two straight-line fitting methods, ordinary least squares and errors-in-variables, are compared; the least squares method works better, in that its result is consistent with the Gauss-Markov theorem. In compliance with the definition, the proved reserves can be estimated by determining the 90th percentile value of the descending-order data from the variance. A conventional estimation using the principle of confidence intervals is first introduced to quantify the spread, the difference between P50 and P90, from the variability of the cumulative production. Because the conventional method overestimates the spread, an analytical formula is derived for estimating the variance of the cumulative production; the formula comes from integrating the production rate over time together with an error model. The variance estimates agree with Monte Carlo simulation (MCS) results. The variance is then used to quantify the spread under the assumption that the ultimate cumulative production is normally distributed. Hyperbolic and harmonic models are also studied, and the spread discrepancy between the analytical results and the MCS is acceptable. However, the results depend on the accuracy of the decline model and the error model used; if the decline curve changes during the estimation period, the estimated spread will be inaccurate. In the sensitivity analysis, the trend of the spread follows how the uncertainty changes as a parameter changes; for instance, the spread decreases if the uncertainty decreases with the changing parameter, and vice versa. The field application of the analytical solution is consistent with the assumed model. The spread depends on how much uncertainty there is in the data: the higher the uncertainty assumed in the data, the larger the spread.
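As a hedged sketch of the base-case workflow described above (exponential decline is a straight line in rate-versus-cumulative coordinates), the code below fits an ordinary least-squares trend and reads off a crude P90 from the residual scatter; this simple percentile shift is an illustration only, not the analytical variance formula derived in the thesis, and all data are synthetic.

```python
# Exponential decline in rate-cum coordinates: OLS trend (P50) and a crude P90.
import numpy as np

rng = np.random.default_rng(7)
qi, D, q_limit = 500.0, 0.004, 50.0                 # assumed initial rate, decline, economic limit

cum = np.linspace(0, 80_000, 60)                    # cumulative production
rate = qi - D * cum + rng.normal(0, 10, cum.size)   # noisy observed rates

# Ordinary least squares for rate = a + b * cum (b is negative)
b_hat, a_hat = np.polyfit(cum, rate, 1)
eur_p50 = (a_hat - q_limit) / (-b_hat)              # cum at the economic-limit rate

# Crude P90: shift the trend down by 1.28 residual standard deviations
resid_sd = np.std(rate - (a_hat + b_hat * cum), ddof=2)
eur_p90 = (a_hat - 1.28 * resid_sd - q_limit) / (-b_hat)
print(f"P50 EUR = {eur_p50:,.0f}, crude P90 EUR = {eur_p90:,.0f}")
```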
|
79 |
A Learning Curve in Aortic Dissection Surgery with the Use of Cumulative Sum Analysis
SONG, MIN-HO, 02 1900 (has links)
No description available.
|
80 |
A warehouse benchmarking model utilizing frontier production functions
Hollingsworth, Keith Brian, 08 1900 (has links)
No description available.
|