261 |
Evaluation of the Community Balance and Mobility Scale in a cardiac rehabilitation population. Martelli, Luke, 05 December 2013.
Recent research indicates the need for a functional balance assessment in cardiac rehabilitation (CR) programs. One assessment technique that may be appropriate is the Community Balance and Mobility Scale (CBMS). The purpose of this study was to investigate psychometric properties of the CBMS when testing patients with cardiovascular disease (CVD). Thirty-one participants from community CR programs were recruited to perform the CBMS and measures of computerized dynamic posturography. Convergent validities between the measures were investigated using correlation coefficients, and floor and ceiling effects of the CBMS were analysed. The results indicated that the CBMS was moderately correlated with all computerized posturography variables, with no floor or ceiling effects present. Analysis of posturography results indicated that CR patients have decreased movement characteristics in the anterior and posterior directions. These findings indicate that the CBMS is a suitable tool to assess and monitor balance in a CR population.
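A hypothetical sketch of the two analyses summarized above, convergent validity via correlation coefficients and a floor/ceiling check, is shown below. The scores are invented for illustration (they are not study data), and the 15% extreme-score convention in the comment is a common rule of thumb rather than anything reported in this study.

```python
# Hypothetical illustration of the analyses described in the abstract:
# (1) convergent validity via a correlation coefficient, and
# (2) a floor/ceiling check (share of participants at the scale extremes).
# All scores below are invented for demonstration only.
import numpy as np
from scipy import stats

cbms = np.array([54, 61, 47, 70, 58, 65, 52, 49, 73, 60])              # hypothetical CBMS totals (scale range 0-96)
postur = np.array([6.1, 7.0, 5.2, 8.3, 6.5, 7.4, 5.8, 5.5, 8.6, 6.9])  # hypothetical posturography variable

r, p = stats.pearsonr(cbms, postur)
print(f"convergent validity: Pearson r = {r:.2f} (p = {p:.3f})")

# A floor or ceiling effect is often flagged when more than ~15% of scores
# sit at the minimum or maximum of the scale.
floor_pct = np.mean(cbms == 0) * 100
ceiling_pct = np.mean(cbms == 96) * 100
print(f"floor: {floor_pct:.0f}% of scores, ceiling: {ceiling_pct:.0f}% of scores")
```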
|
262 |
Experimental and Numerical Investigation on Fouling Parameters in a Small-Scale Rotating Unit. Lane, Matthew Ryan, 16 December 2013.
Fouling, a problem since the first heat exchanger was built, has been the focus of various studies since the 1970s. In particular, crude oil fouling is a costly and problematic type of heat exchanger fouling that occurs in the preheat train to the atmospheric distillation column in petroleum refineries. Previous experiments were designed to determine the causes of fouling while using less than one gallon of crude oil and producing test results within a day. These experiments form the basis of the Rotating Fouling Unit (RFU) at Heat Transfer Research Inc. (HTRI). The RFU aims to better control the shear stress and heat transfer distribution along the surface of the heated test section; Taylor-Couette flow experiments are analyzed and used as a basis to predict the flow across that heated surface. Additionally, the equations for Taylor-Couette flow are used to verify the 2D flow simulations of the RFU and ensure the accuracy of the results. The design of the RFU incorporates data acquisition with a variety of measurements to facilitate automatic and accurate data collection, so the results can be easily compared to previous fouling experiments. The RFU will act as a supplement to the High Temperature Fouling Unit (HTFU) at HTRI and provide data comparable to that of the HTFU in order to better understand crude oil fouling. Computer simulations can accurately predict the shear stress and heat transfer coefficient along the surface of the test probe and help verify the improvements made to the original batch stirred cell designs.
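For orientation only, the sketch below evaluates a rotational Reynolds number, one common form of the Taylor number, and the laminar narrow-gap wall shear stress for an assumed annular geometry and fluid. The geometry, fluid properties, and the particular Taylor-number definition are illustrative assumptions, not the RFU's design values; Taylor-number definitions vary between references.

```python
# Rough Taylor-Couette estimates for an assumed annular geometry.
# All numbers and the specific Taylor-number form are illustrative assumptions,
# not values from the RFU design.
r_i = 0.025      # inner (rotating) cylinder radius, m   (assumed)
r_o = 0.030      # outer cylinder radius, m              (assumed)
omega = 50.0     # inner cylinder angular velocity, rad/s (assumed)
nu = 5e-6        # kinematic viscosity of the crude, m^2/s (assumed)
mu = 4.2e-3      # dynamic viscosity, Pa*s                 (assumed, consistent with nu)

d = r_o - r_i                                  # annular gap width
Re = omega * r_i * d / nu                      # rotational Reynolds number
Ta = (omega * r_i * d / nu) ** 2 * (d / r_i)   # one common Taylor-number form
tau_wall = mu * omega * r_i / d                # laminar narrow-gap wall shear stress, Pa

print(f"Re = {Re:.0f}, Ta = {Ta:.0f}, wall shear ~ {tau_wall:.2f} Pa")
# Taylor vortices are expected above Ta ~ 1700 for a narrow gap with the
# outer cylinder at rest.
```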
|
263 |
Towards More Efficient Delay Measurements on the Internet. Webster, Patrick Jordan, 16 December 2013.
As more applications rely on distributed systems (peer-to-peer services, content distribution networks, cloud services), it becomes necessary to identify hosts that return content to the user with minimal delay. A large-scale map of delays would aid in solving this problem. Existing methods, which either deploy devices to every region of the Internet or use a single vantage point, have yet to produce such a map. While services such as PlanetLab offer a distributed network for measurements, they cover only 0.3% of the Internet. The focus of our research is to increase the speed of the single vantage point approach so that it becomes a feasible solution.
We evaluate the feasibility of large scale measurements by conducting an experiment using more hosts than any previous study. First, an efficient scanning algorithm is developed to perform the measurement scan. We then find that a custom Windows network driver is required to overcome bottlenecks in the operating system. After developing a custom driver, we perform a measurement scan larger than any previous study. Analysis of the results reveals previously unidentified drawbacks to the existing architectures and measurement methodologies. We propose novel methods for increasing the speed of experiments, improving the accuracy of measurement results, and reducing the amount of traffic generated by the scan. Finally, we present architectures for performing an Internet scale measurement scan.
We found that with custom drivers, the Windows operating system is a capable platform for performing large scale measurements. Scan results showed that in the eleven years since the original measurement technique was developed, the response patterns it relied upon had changed from what was expected. With our suggested improvements to the measurement algorithm and proposed scanning architectures, it may be possible to perform Internet scale measurement studies in the future.
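A much-simplified sketch of the single-vantage-point idea follows. It times ordinary TCP connection setup from user space with asyncio, whereas the thesis works at the packet level through a custom Windows driver; the target hosts are placeholders.

```python
# Simplified single-vantage-point delay probe: time TCP connection setup to
# several targets concurrently. This only illustrates the basic idea; it is
# not the packet-level probing performed in the thesis.
import asyncio
import time

TARGETS = [("example.com", 80), ("example.net", 80), ("example.org", 80)]  # placeholder hosts

async def probe(host: str, port: int, timeout: float = 2.0):
    start = time.perf_counter()
    try:
        _reader, writer = await asyncio.wait_for(
            asyncio.open_connection(host, port), timeout)
        delay_ms = (time.perf_counter() - start) * 1000.0
        writer.close()
        await writer.wait_closed()
        return host, delay_ms
    except (OSError, asyncio.TimeoutError):
        return host, None

async def main():
    results = await asyncio.gather(*(probe(h, p) for h, p in TARGETS))
    for host, delay in results:
        label = "timeout" if delay is None else f"{delay:.1f} ms"
        print(f"{host}: {label}")

asyncio.run(main())
```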
|
264 |
HYPOTHESIS TESTING IN FINITE SAMPLES WITH TIME DEPENDENT DATA: APPLICATIONS IN BANKING. Allen, Jason (1974-), 26 September 2007.
This thesis is concerned with hypothesis testing in models where data exhibits time dependence. The focus is on two cases where the dependence of observations across time leads to non-standard hypothesis testing techniques.
This thesis first considers models estimated by Generalized Method of Moments (GMM, Hansen (1982)) and the approach to inference. The main problem with standard tests is size distortion in the test statistics. An innovative resampling method, which we label Empirical Likelihood Block Bootstrapping, is proposed. The first-order asymptotic validity of the proposed procedure is proven, and a series of Monte Carlo experiments shows it may improve test sizes over conventional block bootstrapping. Staying within the GMM context, this thesis also shows that the test correction given in Hall (2000), which improves power, can distort size with time-dependent data. In this case it is of even greater importance to use a bootstrap that can have good size in finite samples.
The empirical likelihood approach is applied to a multifactor model of U.S. bank risk estimated by GMM. The approach to inference is found to be important to the overall conclusion about bank risk. The results suggest U.S. bank stock returns are sensitive to movements in market and liquidity risk.
In the context of panel data, this thesis is, to my knowledge, the first to consider the estimation of cost functions as well as conduct inference while taking into account the strong dependence of the data across time. This thesis shows that standard approaches to estimating cost functions for a set of Canadian banks lead to a downward bias in the estimated coefficients and therefore an upward bias in the measure of economies of scale. When non-stationary panel techniques are applied, results suggest economies of scale of around 6 per cent in Canadian banking, as well as cost-efficiency differences across banks that are correlated with size. / Thesis (Ph.D., Economics) -- Queen's University, 2007.
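The resampling idea underlying the approach above can be illustrated with a generic moving block bootstrap. The sketch below applies that generic version to the mean of a simulated AR(1) series; it is not the Empirical Likelihood Block Bootstrap developed in the thesis, and the tuning values (block length, number of replications) are arbitrary assumptions.

```python
# Generic moving-block bootstrap of a sample mean for weakly dependent data.
# Illustrates the resampling idea only; it is not the Empirical Likelihood
# Block Bootstrap proposed in the thesis.
import numpy as np

rng = np.random.default_rng(0)

# Simulated AR(1) series standing in for time-dependent data.
T, phi = 400, 0.6
eps = rng.standard_normal(T)
x = np.empty(T)
x[0] = eps[0]
for t in range(1, T):
    x[t] = phi * x[t - 1] + eps[t]

def block_bootstrap_means(series, block_len, n_boot, rng):
    """Resample overlapping blocks and return the bootstrap sample means."""
    n = len(series)
    blocks_per_draw = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=(n_boot, blocks_per_draw))
    means = np.empty(n_boot)
    for b in range(n_boot):
        resampled = np.concatenate(
            [series[s:s + block_len] for s in starts[b]])[:n]
        means[b] = resampled.mean()
    return means

boot_means = block_bootstrap_means(x, block_len=20, n_boot=2000, rng=rng)
# Bootstrap 95% interval for the mean, reflecting the serial dependence in x.
lo, hi = np.quantile(boot_means, [0.025, 0.975])
print(f"sample mean = {x.mean():.3f}, block-bootstrap 95% CI = ({lo:.3f}, {hi:.3f})")
```

Resampling whole blocks rather than individual observations is what preserves the short-run dependence of the series inside each bootstrap draw, which is why block schemes are needed for time-dependent data.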
|
265 |
Atom transfer radical polymerization with low catalyst concentration in continuous processes. Chan, Nicky, 30 April 2012.
Atom transfer radical polymerization (ATRP) is a versatile technique with tremendous potential for the synthesis of novel polymeric materials that are not accessible through conventional free radical polymerization. However, its use on an industrial scale has been limited by the high level of transition metal complex required. Significant advances have been made in the last 5 years towards lowering the level of copper complexes used in ATRP, resulting in novel variants called “activators regenerated by electron transfer” (ARGET) ATRP and “single electron transfer-living radical polymerization” (SET-LRP).
To fully realize the potential of ATRP, its use in industrially relevant processes must be studied. Continuous processes such as tubular flow reactors and continuous stirred tank reactors (CSTRs) can reduce waste, improve productivity, and facilitate process scale-up compared to common batch reactors. The combination of low copper concentration ATRP techniques and continuous processes is especially attractive for the design of a commercially viable process. This thesis presents a study of ARGET ATRP and SET-LRP as applied to continuous tubular and stirred tank reactors for the production of acrylic and methacrylic polymers.
The equilibrium that governs polymerization rate and control over molecular architecture is studied through batch ARGET ATRP experiments. The improved understanding of ARGET ATRP enabled the ligand to be reduced from the 3- to 10-fold excess over copper salts used previously down to a stoichiometric ratio. ARGET ATRP was then adapted to a continuous tubular reactor, as well as to a semi-automated CSTR. The design of the reactors and the effect of reaction conditions such as reducing agent concentration and residence time are discussed.
The use of common elemental copper(0) sources such as copper wire and copper tubing is also investigated with SET-LRP for room temperature polymerization of methyl acrylate. SET-LRP is adapted to a CSTR to observe the effects of residence time on reaction rate, molecular weight control, and copper consumption rate. The use of copper tubing as a catalyst source for SET-LRP is demonstrated, and the design of a continuous tubular reactor using a combination of copper and stainless steel tubing is discussed. / Thesis (Ph.D., Chemical Engineering) -- Queen's University, 2012.
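For context, the activation/deactivation equilibrium referred to above is conventionally written as below (standard textbook form rather than notation taken from the thesis), where P• is the propagating radical, PX the dormant alkyl halide chain end, M the monomer, and L the ligand:

```latex
% Standard ATRP equilibrium and rate law (textbook form, not from the thesis).
\[
K_{\mathrm{ATRP}} = \frac{k_{\mathrm{act}}}{k_{\mathrm{deact}}}
= \frac{[\mathrm{P^{\bullet}}]\,[\mathrm{Cu^{II}X_{2}/L}]}{[\mathrm{PX}]\,[\mathrm{Cu^{I}X/L}]},
\qquad
R_{p} = k_{p}\,[\mathrm{M}]\,[\mathrm{P^{\bullet}}]
= k_{p}\,K_{\mathrm{ATRP}}\,[\mathrm{M}]\,[\mathrm{PX}]\,
\frac{[\mathrm{Cu^{I}X/L}]}{[\mathrm{Cu^{II}X_{2}/L}]}
\]
```

Because the rate depends on the ratio of Cu(I) to Cu(II) rather than on the absolute copper concentration, the total copper loading can in principle be cut drastically, which is what ARGET exploits by continuously regenerating Cu(I) from Cu(II) with a reducing agent.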
|
266 |
Methods for determining whether subscore reporting is warranted in large-scale achievement assessments. Babenko, Oksana Illivna, Unknown Date.
No description available.
|
267 |
Generation of dicing damage in passivated silicon wafers. Repole, Kenzo K. D., 12 1900.
No description available.
|
268 |
Asymmetric thermal cycles: a different approach to accelerated reliability assessment of microelectronic packages. Classe, Francis Christopher, 08 1900.
No description available.
|
269 |
Generation of dicing damage in silicon wafers. Ebbutt, Ralph, 08 1900.
No description available.
|
270 |
Calcium Sulfate Formation and Mitigation when Seawater was Used to Prepare HCl-Based Acids. He, Jia, December 2011.
It has been common practice to use seawater to prepare acid in offshore operations where fresh water is relatively expensive or logistically impossible to use. However, hydrochloric acid reacting with the carbonate formation releases calcium ions into solution, which combine with the sulfate ions in seawater (greater than 3,000 ppm), and calcium sulfate precipitates once the solution exceeds its critical scaling tendency. A few studies have provided evidence of this problem, but how to address it has not been fully examined.
Core flood tests were conducted using Austin Chalk cores (1.5 in. x 6 in. and 1.5 in. x 20 in.) with a permeability of 5 md to investigate the effectiveness of a scale inhibitor. A synthetic seawater was prepared according to the composition of seawater in the Arabian Gulf. Calcium, sulfate, and scale inhibitor concentrations were analyzed in the core effluent samples. Solids collected from the core effluent samples were analyzed using the X-ray photoelectron spectroscopy (XPS) technique, and thermodynamic calculations using OLI Analyzer software were conducted to identify the critical scaling tendency of calcium sulfate at different temperatures.
Results showed that calcium sulfate precipitation occurred when seawater was used in any stage of matrix acidizing, including the preflush, the post-flush, or the main acid stage. Injection rate was the most important parameter affecting calcium sulfate precipitation: permeability reduction was significant at low flow rates, while at high rates wormhole breakthrough reduced the severity of the problem.
More CaSO4 precipitated at high temperatures, accounting for the more significant permeability reduction in those cores. The critical scaling tendency values at the temperatures studied, calculated by OLI ScaleChem 4.0.3, were believed to be 2.1, 2.0, and 1.2, respectively.
A scale inhibitor (a sulfonated terpolymer) was found to be compatible with hydrochloric acid systems and able to tolerate high concentrations of calcium (30,000 mg/L). Analysis of the core effluent indicated that the new treatment successfully eliminated calcium sulfate scale deposition. The required concentration of scale inhibitor ranged from 20 to 250 ppm, depending on the scaling tendency of calcium sulfate.
This work confirms that preparing hydrochloric acid solutions with seawater damages the permeability of carbonate cores. It is therefore recommended to use fresh water instead of seawater to prepare HCl acids whenever possible. If fresh water is not available, a proper scale inhibitor should be added to the acid to avoid calcium sulfate precipitation.
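As a simplified illustration of the scaling-tendency concept used in this work, the sketch below compares a concentration-based ion product for CaSO4 against an assumed solubility product. Every number is an assumption, and the calculation deliberately omits the activity-coefficient and temperature corrections that a rigorous model such as OLI applies, which is precisely why such software is needed at seawater ionic strengths.

```python
# Simplified CaSO4 scaling-tendency estimate: concentration ion product over an
# assumed solubility product. Real scaling calculations (e.g., in OLI) use ion
# activities and temperature-dependent Ksp; every value here is an assumption,
# so this concentration-based ratio overstates the true activity-based tendency.
MW_CA, MW_SO4 = 40.08, 96.06   # g/mol

ca_mg_l = 30000.0              # calcium in spent acid, mg/L (level cited in the abstract)
so4_mg_l = 3000.0              # sulfate in seawater, mg/L (order cited in the abstract)
ksp_assumed = 2.4e-5           # assumed CaSO4 solubility product, (mol/L)^2

ca = ca_mg_l / 1000.0 / MW_CA        # mol/L
so4 = so4_mg_l / 1000.0 / MW_SO4     # mol/L

scaling_tendency = (ca * so4) / ksp_assumed
print(f"concentration-based scaling tendency ~ {scaling_tendency:.0f}")
# Precipitation is expected once the activity-based tendency exceeds its
# critical value (the 2.1, 2.0, and 1.2 values reported above).
```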
|