221 |
An Experimental Study of Single / Two Phase Flow and Heat Transfer in Microchannels. Lin, Chih-yi, 27 January 2010.
An experimental investigation was carried out to examine the flow and thermal field characteristics, with and without phase change, in microchannels, and the results were compared with those for conventional channels. There are three parts in this study. The first part investigated the 2-D flow field measured by micro particle image velocimetry (μPIV) in a single PMMA microchannel fabricated by an ArF excimer laser. The slip boundary condition at the microchannel wall was also discussed. The second part studied the influence of surface condition (hydrophilic vs. hydrophobic) on the flow and thermal fields in a micro cooling device comprising twenty parallel microchannels, fabricated by the SU-8 microfabrication technique and replicated by the PDMS replica technique. A UV/ozone device was used to change the surface condition of the PDMS microchannels from hydrophobic to hydrophilic, and the μPIV/μLIF system was used to measure the velocity and temperature distributions. The third part investigated the two-phase subcooled flow boiling phenomena (onset of nucleate boiling, boiling curve, flow patterns, bubble departure diameter and frequency) in seventy-five parallel microchannels fabricated by the SU-8 microfabrication technique, and aimed to raise the critical heat flux (CHF) and heat transfer coefficient to enhance the cooling efficiency. Three major methods were used in this study, as follows:
(1) To add cavities with angles of θ = 60°, 90°, and 120° on the microchannel side walls.
(2) To coat a 2 μm diamond film on the Cu heated surface.
(3) To add 1 vol. % Multi-walled Carbon Nanotube (MCNT) into the working medium (deionized water).
The goal of this study is to develop a high-heat-flux cooling technique and to apply the experimental results to the cooling problems caused by the exceedingly high heat fluxes of electronic components.
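As an illustration of the quantities tracked in such flow-boiling experiments, the sketch below computes a channel's hydraulic diameter, the Reynolds number of the liquid flow, and a wall heat transfer coefficient from heat flux and wall superheat. The channel dimensions, velocity, heat flux, and fluid properties are assumed values for illustration only, not the conditions of this study.

```python
# Illustrative only: assumed channel geometry and water properties (~25 C liquid),
# not the experimental conditions of this study.

rho = 997.0              # water density, kg/m^3
mu = 8.9e-4              # dynamic viscosity, Pa*s
w, h = 200e-6, 100e-6    # channel width and height, m (assumed)
u = 0.5                  # mean liquid velocity, m/s (assumed)

# Hydraulic diameter of a rectangular channel: D_h = 4*A / P
area = w * h
perimeter = 2 * (w + h)
D_h = 4 * area / perimeter

# Reynolds number of the single-phase liquid flow
Re = rho * u * D_h / mu

# Boiling heat transfer coefficient from wall heat flux and wall superheat:
#   h = q'' / (T_wall - T_sat)
q_flux = 5.0e5                   # applied wall heat flux, W/m^2 (assumed)
T_wall, T_sat = 112.0, 100.0     # wall and saturation temperatures, deg C (assumed)
h_boil = q_flux / (T_wall - T_sat)

print(f"D_h = {D_h*1e6:.1f} um, Re = {Re:.0f}, h = {h_boil:.0f} W/m^2K")
```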
|
222 |
Effects of sub-optimal component performance on overall cooling system energy consumption and efficiency. Khazaii, Javad, 04 April 2012.
Predicted cooling system performance plays an important role in choices among alternative system selections and designs. When system performance is expressed in proper indicators, such as "overall system energy consumption" or "overall system efficiency", it provides decision makers with a quantitative measure of the extent to which a cooling system satisfies the design requirements and objectives. Predictions of cooling system energy consumption and efficiency imply assumptions about component performance. Quantitative appraisal of the uncertainty (lack of knowledge) in these assumptions can be used by design practitioners to select and design systems, by energy contractors to guarantee future system energy cost savings, and by codes and standards officials to set proper energy conservation goals.
Our lack of knowledge has different sources, notably unknown tolerances in equipment nameplate data and unpredictable load profiles. Both cause systems to under-perform relative to current predictions and, as a result, decrease the accuracy of the energy simulations that are commonly used to verify system performance during the design and construction stages. There can be many other causes of unpredictable system behavior, for example bad workmanship in the installation, faults in the operation of certain system parts, deterioration over time, and others. These uncertainties are typically much harder to quantify, and propagating them into the calculated energy consumption is much harder to accomplish. In this thesis, these categories of failure are not considered; the treatment is limited to component tolerances and load variability.
In this research, the effects of equipment nameplate tolerances and cooling load profile variability on the overall energy consumption and efficiency of commonly used commercial cooling systems are quantified. The main target of this thesis is to present a methodology for calculating the chances that a specific cooling system could deviate from a certain efficiency level by a certain margin, and to use these results to guide practitioners and energy performance contractors in selecting and guaranteeing system performance more realistically. By doing so, the plan is to establish a systematic approach for developing expressions of risk in commercial cooling system consumption and efficiency calculations, and thus to advocate the use of expressions of risk as design targets.
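One way to turn component tolerances and load variability into an expression of risk is a simple Monte Carlo propagation: sample the uncertain inputs, run the energy calculation for each sample, and read the probability of missing an efficiency target off the resulting distribution. The sketch below is a minimal illustration of that idea with assumed distributions and a toy chiller-energy model; it is not the thesis's actual simulation workflow.

```python
import random

# Minimal Monte Carlo sketch (assumed numbers, toy model): propagate a chiller
# COP tolerance and cooling-load variability into a distribution of annual
# electrical energy consumption, then read off a probability of missing a target.

def annual_energy_kwh(cop, annual_load_kwh_thermal):
    """Toy model: electrical energy = thermal load removed / COP."""
    return annual_load_kwh_thermal / cop

random.seed(1)
n_samples = 10_000
target_kwh = 210_000          # assumed consumption target for illustration

results = []
for _ in range(n_samples):
    # Nameplate COP 5.0 with an assumed tolerance (normal, sigma = 0.15)
    cop = random.gauss(5.0, 0.15)
    # Annual cooling load of 1 GWh_thermal with assumed +/-10% variability
    load = random.gauss(1_000_000, 100_000)
    results.append(annual_energy_kwh(cop, load))

risk = sum(e > target_kwh for e in results) / n_samples
print(f"Probability of exceeding the target consumption: {risk:.1%}")
```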
This thesis makes a contribution to improving our fundamental understanding of performance risk in selecting and sizing certain HVAC design concepts.
|
223 |
Interest Curves: Concept, Evaluation, Implementation and Applications. Li, Bo, January 2015.
Image features play important roles in a wide range of computer vision applications, such as image registration, 3D reconstruction, object detection and video understanding. These image features include edges, contours, corners, regions, lines, curves, interest points, etc. However, research in these areas is fragmented, especially when it comes to line and curve detection. In this thesis, we aim to discover, integrate, evaluate and summarize past research, as well as our own contributions, in the area of image features. This thesis provides a comprehensive framework of concept, evaluation, implementation, and applications for image features. Firstly, this thesis proposes the novel concept of interest curves. Interest curves are a concept derived and extended from interest points: significant lines and arcs in an image that are repeatable under various image transformations. Interest curves bring clear guidelines and structures for future curve and line detection algorithms and related applications. Secondly, this thesis presents an evaluation framework for detecting and describing interest curves. The evaluation framework provides a new paradigm for comparing the performance of state-of-the-art line and curve detectors under image perturbations and transformations. Thirdly, this thesis proposes an interest curve detector (Distinctive Curves, DICU), which unifies the detection of edges, corners, lines and curves. DICU represents our state-of-the-art contribution to the detection of edges, corners, curves and lines. Our research efforts cover the most important attributes required of these features with respect to robustness and efficiency. Interest curves preserve richer geometric information than interest points, which opens new ways of solving computer vision problems. We propose a simple description method for curve matching applications, and we have found that our proposed interest curve descriptor outperforms state-of-the-art interest point descriptors (SIFT, SURF, BRISK, ORB, FREAK). Furthermore, we design a novel object detection algorithm that utilizes only DICU geometries, without using local feature appearance. We organize image objects as curve chains, and to detect an object we search for this curve chain in the target image using dynamic programming. The curve chain matching is scale- and rotation-invariant as well as robust to image deformations. These properties make it possible to resolve the rotation-variance problem in object detection applications. In our face detection experiments, the curve chain matching method proves to be scale- and rotation-invariant and very computationally efficient. / INTRO – INteractive RObotics research network
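The curve-chain matching by dynamic programming mentioned above can be illustrated with a generic sequence-alignment recurrence; the sketch below aligns two chains of simple curve descriptors with a DTW-style dynamic program. It illustrates the general technique only, not the DICU matching algorithm itself, and the descriptor (curve length and turning angle) is an assumption made for the example.

```python
import math

# Generic dynamic-programming alignment of two curve chains.
# Each chain element is a small descriptor vector; here (length, turning angle)
# is an assumed toy descriptor, not the one used by DICU.

def cost(a, b):
    """Local matching cost between two curve descriptors."""
    return math.dist(a, b)

def match_chains(chain_a, chain_b):
    """DTW-style DP: minimal total cost of aligning chain_a to chain_b."""
    n, m = len(chain_a), len(chain_b)
    INF = float("inf")
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            step = cost(chain_a[i - 1], chain_b[j - 1])
            dp[i][j] = step + min(dp[i - 1][j],      # skip a curve in chain_a
                                  dp[i][j - 1],      # skip a curve in chain_b
                                  dp[i - 1][j - 1])  # match the two curves
    return dp[n][m]

# Toy example: a model curve chain and a slightly perturbed observed chain.
model    = [(1.0, 0.2), (2.5, -0.8), (1.2, 1.1)]
observed = [(1.1, 0.25), (2.4, -0.75), (0.3, 0.0), (1.3, 1.0)]
print(f"alignment cost: {match_chains(model, observed):.3f}")
```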
|
224 |
Alternate Compactifications of Hurwitz Spaces. Deopurkar, Anand, 19 December 2012.
We construct several modular compactifications of the Hurwitz space \(H^d_{g/h}\) of genus g curves expressed as d-sheeted, simply branched covers of genus h curves. They are obtained by allowing the branch points of the cover to collide to a variable extent, generalizing the spaces of twisted admissible covers of Abramovich, Corti, and Vistoli. The resulting spaces are very well-behaved if d is small or if relatively few collisions are allowed. In particular, for d = 2 and 3, they are always well-behaved. For d = 2, we recover the spaces of hyperelliptic curves of Fedorchuk. For d = 3, we obtain new birational models of the space of triple covers. We describe in detail the birational geometry of the spaces of triple covers of \(P^1\) with a marked fiber. In this case, we obtain a sequence of birational models that begins with the space of marked (twisted) admissible covers and proceeds through the following transformations: (1) sequential contractions of the boundary divisors, (2) contraction of the hyperelliptic divisor, (3) sequential flips of the higher Maroni loci, (4) contraction of the Maroni divisor (for even g). The sequence culminates in a Fano variety in the case of even g, which we describe explicitly, and a variety fibered over \(P^1\) with Fano fibers in the case of odd g. / Mathematics
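For orientation, the basic count behind these Hurwitz spaces follows from the Riemann-Hurwitz formula; the short computation below is a standard fact recalled for context, not a result of the thesis.

```latex
% For a degree-d cover f: C -> D with C of genus g, D of genus h, and only
% simple branching (each of the b branch points contributes 1), Riemann-Hurwitz
% gives the number of branch points:
\[
  2g - 2 \;=\; d\,(2h - 2) + b
  \qquad\Longrightarrow\qquad
  b \;=\; 2g - 2 - d\,(2h - 2).
\]
% Since the b branch points move on D and the target curve varies in moduli,
% the Hurwitz space H^d_{g/h} has dimension b + \dim M_h (with \dim M_h = 3h - 3
% for h >= 2); the compactifications above control how these b points may collide.
```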
|
225 |
Modeling achievement in the presence of student mobility: a growth curve model for multiple membership data. Grady, Matthew William (1981-), 03 December 2010.
The current study evaluated a multiple-membership growth curve model that can be used to model growth in student achievement in the presence of student mobility. The purpose of the study was to investigate the impact of ignoring multiple school membership when modeling student achievement across time. Part one of the study consisted of an analysis of real longitudinal student achievement data. This analysis compared parameter estimates, standard error estimates, and model-fit statistics obtained from a growth curve model that ignores multiple membership to those obtained from a growth model that accounts for multiple school membership via the MMREM approach. Part two of the study consisted of a simulation study designed to determine the impact of ignoring multiple membership and the accuracy of parameter estimates obtained under the two modeling approaches across a series of data conditions. The goal of the study was to assess the importance of incorporating the more flexible MMREM approach when modeling students' academic achievement across time. Overall, the results of the current study indicated that the cross-classified multiple membership growth curve model (CCMM-GCM) may provide more accurate parameter estimates than competing approaches for a number of data conditions. Both modeling approaches, however, yielded substantially biased parameter estimates for some experimental conditions. Taken together, the results demonstrate that incorporating student mobility into achievement growth modeling can result in more accurate estimates of school effects. / text
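A generic form of the kind of multiple-membership growth model compared here can be written as follows; this is a standard MMREM specification sketched for context, with notation assumed for illustration rather than taken from the thesis.

```latex
% Repeated test scores y_{ti} of student i at time t, where the school effect is
% a weighted sum over the set S(i) of schools the student attended, with weights
% w_{ij} (e.g., proportion of time enrolled in school j) summing to 1:
\[
  y_{ti} \;=\; \beta_0 + \beta_1\,\mathrm{time}_{ti}
        \;+\; u_{0i} + u_{1i}\,\mathrm{time}_{ti}
        \;+\; \sum_{j \in S(i)} w_{ij}\, v_{j}
        \;+\; e_{ti},
\]
% with student random intercepts/slopes (u_{0i}, u_{1i}), school random effects
% v_j, and residual e_{ti}. A model that ignores mobility instead assigns each
% student's entire school effect to a single school.
```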
|
226 |
Decline curve analysis in unconventional resource plays using logistic growth models. Clark, Aaron James, 06 October 2011.
Current models used to forecast production in unconventional oil and gas formations often do not produce valid results. When traditional decline curve analysis models are used in shale formations, Arps b-values greater than 1 are commonly obtained, and these values yield infinite cumulative production, which is non-physical. Additional methods have been developed to prevent such unrealistic values, such as truncating hyperbolic declines with exponential declines once a minimum production rate is reached. Truncating a hyperbolic decline with an exponential decline solves some of the problems associated with decline curve analysis, but it is not an ideal solution: the exponential decline rate used is arbitrary, and the value picked greatly affects the results of the forecast.
A new empirical model has been developed and used as an alternative to traditional decline curve analysis with the Arps equation. The new model is based on the concept of logistic growth models, which were originally developed in the 1830s by the Belgian mathematician Pierre Verhulst to model population growth. The new logistic model for production forecasting in ultra-tight reservoirs uses the concept of a carrying capacity. The carrying capacity provides the maximum recoverable oil or gas from a single well, and it causes all forecasts produced with this model to fall within a reasonable range of the known volumetrically available oil. Additionally, the carrying capacity causes the forecast production rate to eventually terminate as the cumulative production approaches the carrying capacity.
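One common logistic-growth parameterization for cumulative production is Q(t) = K t^n / (a + t^n), where K is the carrying capacity; the sketch below fits that form to assumed monthly cumulative-production data with SciPy. The functional form and the data are assumptions for illustration and may differ from the exact model developed in the thesis.

```python
import numpy as np
from scipy.optimize import curve_fit

# Logistic growth model for cumulative production (illustrative form):
#   Q(t) = K * t**n / (a + t**n)
# K is the carrying capacity (maximum recoverable volume), so Q(t) -> K and the
# rate dQ/dt -> 0 as t grows, avoiding the infinite cumulative production of
# hyperbolic Arps declines with b > 1.

def logistic_cum(t, K, a, n):
    return K * t**n / (a + t**n)

# Assumed example data: months on production vs. cumulative production (Mbbl).
t_data = np.array([1, 3, 6, 12, 18, 24, 36], dtype=float)
q_data = np.array([18, 42, 68, 103, 125, 141, 163], dtype=float)

popt, _ = curve_fit(logistic_cum, t_data, q_data, p0=(300.0, 20.0, 0.9))
K, a, n = popt
print(f"carrying capacity K = {K:.0f} Mbbl, a = {a:.1f}, n = {n:.2f}")

# Forecast: cumulative production after 10 years, bounded above by K.
print(f"Q(120 months) = {logistic_cum(120.0, K, a, n):.0f} Mbbl (<= {K:.0f})")
```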
The new model provides a more realistic method for forecasting reserves in unconventional formations than the traditional Arps model. The typical problems encountered when using conventional decline curve analysis are not present when using the logistic model.
Predictions of the future are always difficult and often subject to factors such as operating conditions, which can never be predicted. The logistic growth model is well established, robust, and flexible. It provides a method to forecast reserves, which has been shown to accurately trend to existing production data and provide a realistic forecast based on known hydrocarbon volumes. / text
|
227 |
Fractals: an exploration into the dimensions of curves and surfaces. Wheeler, Jodi Lynette, 02 February 2012.
When many people think of fractals, they think of the beautiful images created by Mandelbrot's set or the intricate dragons of Julia's set. However, these are just the artistic stars of the fractal community. The theory behind fractals is not necessarily pretty, but it is very important to many areas outside the world of mathematics.
This paper takes a closer look at various types of fractals, the fractal dimensionality of surfaces, and chaotic dynamical systems. Some of the history of fractals and an introduction to creating them are discussed. The tools used to prevent a modified Koch curve from overlapping itself, to find the limit of a curve's length, and to solve for a surface's dimensional measurement are explored. Lastly, an investigation of the theories of chaos and how they bring order into what initially appears to be random and unpredictable is presented. Practical purposes and uses of fractals are also discussed throughout. / text
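As a concrete instance of the dimension calculations referred to above, the similarity dimension of a self-similar fractal is D = log N / log(1/s), where each iteration keeps N copies at scale s; the sketch below computes it for the Koch curve and the Sierpinski triangle and shows the Koch curve's length growing without bound. These are standard textbook values, included here only as an illustration.

```python
import math

def similarity_dimension(n_copies, scale):
    """D = log(N) / log(1/s) for a self-similar set made of N copies at scale s."""
    return math.log(n_copies) / math.log(1.0 / scale)

# Koch curve: each segment is replaced by 4 segments at 1/3 the length.
print(f"Koch curve:          D = {similarity_dimension(4, 1/3):.4f}")   # ~1.2619
# Sierpinski triangle: 3 copies at 1/2 the scale.
print(f"Sierpinski triangle: D = {similarity_dimension(3, 1/2):.4f}")   # ~1.5850

# The Koch curve's length after k iterations is (4/3)^k times the original,
# so the limiting "length" diverges even though the curve stays bounded.
for k in (1, 5, 10, 50):
    print(f"length after {k:>2} iterations: {(4/3)**k:.2f} x original")
```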
|
228 |
On the Applicability of a Cache Side-Channel Attack on ECDSA Signatures: The Flush+Reload attack on the point multiplication in the ECDSA signature generation process. Josyula, Sai Prashanth, January 2015.
Context. Digital counterparts of handwritten signatures are known as digital signatures. The Elliptic Curve Digital Signature Algorithm (ECDSA) is an Elliptic Curve Cryptography (ECC) primitive used for generating and verifying digital signatures. Attacks that target an implementation of a cryptosystem are known as side-channel attacks. The Flush+Reload attack is a cache side-channel attack that relies on cache hits/misses to recover secret information from the target program's execution. In elliptic curve cryptosystems, side-channel attacks are particularly targeted at the point multiplication step. The Gallant-Lambert-Vanstone (GLV) method for point multiplication is a special method that speeds up the computation for elliptic curves with certain properties. Objectives. In this study, we investigate the applicability of the Flush+Reload attack on ECDSA signatures that employ the GLV method to protect point multiplication. Methods. We demonstrate the attack through an experiment using the curve secp256k1. We perform a pair of experiments to estimate both the applicability and the detection rate of the attack in capturing side-channel information. Results. Through our attack, we capture side-channel information about the decomposed GLV scalars. Conclusions. Based on an analysis of the results, we conclude that for certain implementation choices, the Flush+Reload attack is applicable to the ECDSA signature generation process that employs the GLV method. Practitioners should be aware of the implementation choices that introduce vulnerabilities and avoid the use of such ECDSA implementations.
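To make the target of such a cache probe concrete, the toy sketch below implements textbook double-and-add scalar multiplication on a small illustrative curve (parameters chosen for the example, not secp256k1): the secret-dependent "add" is the kind of operation sequence that Flush+Reload recovers, and under GLV the probed scalars are the two shorter components of the decomposed nonce rather than the nonce itself. This is a conceptual illustration only, not the attacked implementation.

```python
# Toy, non-constant-time scalar multiplication on a small illustrative curve
# y^2 = x^3 + a*x + b over F_p. Parameters below are assumed for the example;
# they are NOT secp256k1 and NOT a secure curve.
p, a, b = 10007, 3, 7          # small prime field and curve coefficients (assumed)

def inv(x):
    return pow(x, p - 2, p)    # modular inverse (p is prime)

def ec_add(P, Q):
    """Affine point addition; None represents the point at infinity."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    lam = ((3 * x1 * x1 + a) * inv(2 * y1) if P == Q
           else (y2 - y1) * inv(x2 - x1)) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def scalar_mult(k, P, trace):
    """Left-to-right double-and-add; records the D/A operation sequence,
    which is exactly the kind of pattern a cache side channel observes."""
    R = None
    for bit in bin(k)[2:]:
        R = ec_add(R, R); trace.append("D")         # double: every bit
        if bit == "1":
            R = ec_add(R, P); trace.append("A")     # add: only for 1-bits
    return R

def find_point():
    """Find any point on the toy curve (p % 4 == 3, so sqrt is a single power)."""
    for x in range(1, p):
        rhs = (x * x * x + a * x + b) % p
        y = pow(rhs, (p + 1) // 4, p)
        if y * y % p == rhs:
            return (x, y)

G = find_point()
trace = []
scalar_mult(0b101101, G, trace)   # 0b101101 stands in for a secret scalar
# "DA" pairs mark 1-bits and lone "D"s mark 0-bits, so the trace leaks the scalar.
print("".join(trace))
```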
|
229 |
Near real-time runoff estimation using spatially distributed radar rainfall data. Hadley, Jennifer Lyn, 30 September 2004.
The purpose of this study was to evaluate variations of the Natural Resources Conservation Service (NRCS) curve number (CN) method for estimating near real-time runoff for naturalized flow, using high resolution radar rainfall data for watersheds in various agro-climatic regions of Texas. The CN method is an empirical method for calculating surface runoff which has been tested on various systems over a period of several years. Many of the findings of previous studies indicate the need to develop variations of this method to account for regional and seasonal changes in weather patterns and land cover that might affect runoff. This study seeks to address these issues, as well as the inherent spatial variability of rainfall, in order to develop a means of predicting runoff in near real-time for water resource management. In the past, raingauge networks have provided data for hydrologic models. However, these networks are generally unable to provide data in real-time or capture the spatial variability associated with rainfall. Radar networks, such as the Next Generation Weather Radar (NEXRAD) of the National Weather Service (NWS), which are widely available and continue to improve in quality and resolution, can accomplish these tasks. In general, a statistical comparison of the raingauge and NEXRAD data, where both were available, shows that the radar data is as representative of observed rainfall as raingauge data. In this study, watersheds of mostly homogenous land cover and naturalized flow were used as study areas. Findings indicate that the use of a dry antecedent moisture condition CN value and an initial abstraction (Ia) coefficient of 0.1 produced statistically significant results for eight out of the ten watersheds tested. The urban watershed used in this study produced more significant results with the use of the traditional 0.2 Ia coefficient. The predicted results before and during the growing season, in general, more closely agreed with the observed runoff than those after the growing season. The overall results can be further improved by altering the CN values to account for seasonal vegetation changes, conducting field verification of land cover condition, and using bias-corrected NEXRAD rainfall data.
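The CN method itself is compact: potential retention S = 1000/CN − 10 (inches) and runoff Q = (P − Ia)² / (P − Ia + S) for rainfall P > Ia, with Ia = λS and λ traditionally 0.2 (0.1 in the variation tested here). The sketch below evaluates both coefficients for an assumed storm and curve number, purely as an illustration of the calculation.

```python
def scs_runoff(p_in, cn, ia_coeff=0.2):
    """NRCS curve number runoff (inches).

    p_in     -- storm rainfall depth in inches
    cn       -- curve number (0 < CN <= 100)
    ia_coeff -- initial abstraction coefficient lambda, Ia = lambda * S
    """
    s = 1000.0 / cn - 10.0          # potential maximum retention, inches
    ia = ia_coeff * s               # initial abstraction, inches
    if p_in <= ia:
        return 0.0                  # all rainfall abstracted, no runoff
    return (p_in - ia) ** 2 / (p_in - ia + s)

# Assumed example: a 3.0-inch storm on a watershed with CN = 75. These numbers
# are illustrative only, not values used in the study.
p, cn = 3.0, 75
print(f"Q (Ia = 0.2S): {scs_runoff(p, cn, 0.2):.2f} in")
print(f"Q (Ia = 0.1S): {scs_runoff(p, cn, 0.1):.2f} in")
```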
|
230 |
Elipsinių kreivių taškų skaičiavimo algoritmai ir jų taikymai / Elliptic Curve Points Calculation Algorithms and their Application. Pocienė, Jurgita, 08 June 2006.
Pocienė, Jurgita. Informatics Master's Final Thesis: Elliptic Curve Points Calculation Algorithms and their Application. Supervisor: Dr. R. Steuding. Siauliai University, Siauliai, 2006. 35 pages.
In this work, I analyse algorithms for calculating the points on elliptic curves over a field Fp (a prime field) and their possible applications.
The basic aims set for the master's thesis were attained: to analyse and compare elliptic curve point calculation algorithms, to review their applications, to implement Schoof's elliptic curve point counting algorithm, and to analyse possibilities for making it more efficient. Elliptic curve point calculation algorithms were analysed and compared, and a system implementing Schoof's algorithm, one of the most important elliptic curve point counting algorithms, was created.
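For contrast with Schoof's polynomial-time algorithm, the order of a small curve can be found naively from the Legendre symbol: #E(Fp) = p + 1 + Σ_x ((x³ + ax + b) / p). The sketch below performs this exhaustive count for an assumed small curve; it is only a baseline illustration of what "order of the curve" means, not the Schoof implementation described in the thesis, which is what makes counting feasible at cryptographic sizes.

```python
def legendre(n, p):
    """Legendre symbol (n/p) via Euler's criterion: n^((p-1)/2) mod p."""
    n %= p
    if n == 0:
        return 0
    return 1 if pow(n, (p - 1) // 2, p) == 1 else -1

def naive_curve_order(a, b, p):
    """#E(F_p) = p + 1 + sum_x legendre(x^3 + a*x + b, p). Runs in O(p) time,
    hopeless for cryptographic p, which is why Schoof's algorithm matters."""
    return p + 1 + sum(legendre(x * x * x + a * x + b, p) for x in range(p))

# Assumed small example curve y^2 = x^3 + 2x + 3 over F_97 (illustrative only).
print(naive_curve_order(2, 3, 97))
```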
The main problems encountered were that the data types of standard programming systems are insufficient for operating with large (~2^150) numbers; therefore the MIRACL library was employed, enabling the program to use large numbers, to test whether a number is prime, and to perform calculations with polynomials. HTML was used for result output, as a form more convenient for the user (for displaying polynomial equations).
It was also concluded that finding the order of an elliptic curve is important for a cryptosystem in order to select a safe curve. A number of methods have been created to calculate the order of a curve, including one or the most... [to full text]
|