101
The effects of high intensity interval training on pulmonary function / Dunham, Cali A. January 1900
Master of Science / Department of Kinesiology / Craig A. Harms / High-intensity interval training (HIT) has been utilized as a time-efficient strategy to induce numerous physiological adaptations and performance improvements usually associated with "traditional" endurance training (ET). It is not known, however, whether HIT might also improve pulmonary function. We therefore hypothesized that HIT would increase respiratory muscle strength and expiratory flow rates. Fifteen healthy subjects were randomly assigned to an ET group (n = 7) or a HIT group (n = 8). All subjects performed an incremental test to exhaustion (VO2max) on a cycle ergometer before and after training. Standard pulmonary function tests, maximum inspiratory pressure (PImax), maximum expiratory pressure (PEmax), and maximal flow volume loops were performed before training and after each week of training. HIT subjects performed a four-week training program on a cycle ergometer at 90% of the final workload from their VO2max test, while ET subjects exercised at 60-70% of that final workload. All subjects trained three days/week. The HIT group performed five one-minute bouts with three-minute recovery periods, and the ET group cycled continuously for 45 minutes at a constant workload. A five-mile time trial was performed before training, after two weeks of training, and after four weeks of training. Both groups showed similar (p<0.05) increases in VO2max (~8-10%) and improvements in time trials following training (HIT 6.5 ± 1.3%, ET 4.4 ± 1.8%), with no difference (p>0.05) between groups. Both groups increased (p<0.05) PImax post-training (ET ~25%, HIT ~43%), with values significantly higher for HIT than ET. There was no change (p>0.05) in expiratory flow rates with training in either group. These data suggest that whole-body exercise training is effective in increasing inspiratory muscle strength, with HIT leading to greater improvements than ET. Also, HIT offers a time-efficient alternative to ET for improving aerobic capacity and performance.
102
Runtime detection and prevention for Structured Query Language injection attacks / Shafie, Emad January 2013
The use of Internet services and web applications has grown rapidly with user demand. At the same time, the number of web application vulnerabilities has increased as a result of mistakes during development, where some developers gave security a lower priority than aspects such as usability. An SQL (Structured Query Language) injection is a common vulnerability in web applications, as it allows a hacker or illegitimate user to gain access to the web application's database and thereby damage or change the information held in it. This thesis proposes a new framework for the detection and prevention of new and common types of SQL injection attacks. The programme of research is divided into several work packages, beginning with the problem of web application vulnerabilities in general and SQL injection in particular, together with a discussion of existing approaches. The remaining work packages follow a constructive research approach. The framework considers both existing and new SQL injection attacks and consists of three checking components: the first checks user input for existing attacks, the second checks for new types of attacks, and the third blocks unexpected responses from the database engine. Additionally, the framework keeps track of an ongoing attack by recording and investigating user behaviour. The framework is based on AnaTempura, a runtime verification tool for Interval Temporal Logic properties. Existing attacks and good/bad user behaviours are specified using Interval Temporal Logic, and the detection of new SQL injection attacks is done by the database observer component. Moreover, this thesis discusses a case study in which various types of user behaviour are specified in Interval Temporal Logic, and shows how these can be detected. The implementation of each component is provided and explained in detail, showing its input, output, and processing, and the functionality of each checking component is evaluated using a case study; the user behaviour component is evaluated using sample attacks and normal user inputs. The conclusion chapter summarizes the thesis and discusses limitations and future work. This research has made the following contributions:
• A new framework for the detection and prevention of SQL injection attacks.
• Runtime detection: a runtime verification technique based on Interval Temporal Logic to detect various types of SQL injection attacks.
• Database observer: detection of possible new injection attacks by monitoring database transactions.
• User behaviour: investigation of related SQL injection attacks using user input, providing early warning against SQL injection attacks.
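To make the first checking component concrete, the sketch below screens raw user input against known injection patterns. The pattern list and function names are illustrative assumptions; the thesis specifies its checks as Interval Temporal Logic formulae run under AnaTempura rather than as ad hoc pattern-matching code.

```python
# Minimal sketch of an input-checking component that screens user
# input against known SQL injection patterns. The patterns and names
# are illustrative; the thesis expresses its checks in Interval
# Temporal Logic and runs them under AnaTempura.
import re

KNOWN_ATTACK_PATTERNS = [
    r"(?i)\bunion\b.+\bselect\b",         # UNION-based injection
    r"(?i)\bor\b\s+'?\d+'?\s*=\s*'?\d+",  # tautologies like OR 1=1
    r"--|/\*",                            # comment sequences
    r"(?i);\s*(drop|delete|update)\b",    # piggy-backed queries
]

def check_input(user_input: str) -> bool:
    """Return True if the input matches a known attack pattern."""
    return any(re.search(p, user_input) for p in KNOWN_ATTACK_PATTERNS)

for s in ("alice", "x' OR '1'='1", "1; DROP TABLE users --"):
    print(repr(s), "->", "attack" if check_input(s) else "ok")
```

A static list like this covers only the "existing attacks" component; the thesis's second and third components (new-attack detection and database-response blocking) work at runtime precisely because such lists cannot anticipate novel attacks.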
103
Behaviour-based virus analysis and detection / Al Amro, Sulaiman January 2013
Every day, the growing number of viruses causes major damage to the computer systems that many antivirus products have been developed to protect. Regrettably, existing antivirus products do not provide a full solution to the problems associated with viruses. One of the main reasons is that these products typically use signature-based detection, so the rapid growth in the number of viruses means that many signatures have to be added to their signature databases each day. These signatures then have to be stored in the computer system, where they consume increasing memory space. Moreover, a large database also slows the search for signatures and, hence, degrades the performance of the system. As the number of viruses continues to grow, ever more space will be needed in the future. There is thus an urgent need for a novel and robust detection technique. One of the most encouraging recent developments in virus research is the use of logical formulae, which provides an alternative to classic virus detection methods. The proposed research uses temporal logic and behaviour-based detection to detect viruses. Interval Temporal Logic (ITL) will be used to generate virus specifications, properties and formulae based on the analysis of the behaviour of computer viruses, in order to detect them. Tempura, the executable subset of ITL, will be used to check whether good or bad behaviour occurs, with the help of ITL descriptions and system traces. The process will also use AnaTempura, an integrated workbench tool for ITL that supports our system specifications. AnaTempura will offer validation and verification of the ITL specifications and provide runtime testing of these specifications.
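As a concrete illustration of behaviour-based detection, the sketch below scans a system trace for an ordered self-replication pattern. The event names are hypothetical, and the subsequence check is only a rough stand-in for the ITL formulae that Tempura/AnaTempura would evaluate over real traces.

```python
# Minimal sketch of behaviour-based detection: scan a system trace
# for a self-replication pattern (read own image, create an
# executable, write to it). Event names are hypothetical; the thesis
# expresses such behaviours as ITL formulae checked by AnaTempura.
SUSPICIOUS_SEQUENCE = ["read_self", "create_exe", "write_exe"]

def matches_behaviour(trace, pattern=SUSPICIOUS_SEQUENCE):
    """True if the events in `pattern` occur in order (gaps allowed),
    roughly ITL's 'eventually A; then B; then C' over a trace."""
    it = iter(trace)
    return all(any(ev == step for ev in it) for step in pattern)

benign = ["open_doc", "read_self", "close_doc"]
viral  = ["read_self", "open_doc", "create_exe", "write_exe", "exit"]
print(matches_behaviour(benign))  # False
print(matches_behaviour(viral))   # True
```

Because the check targets what the program does rather than how its bytes look, it is unaffected by the signature-database growth problem described above.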
104
Accuracy of Computer Simulations that use Common Pseudo-random Number Generators / Dusitsin, Krid; Kosbar, Kurt 10 1900
International Telemetering Conference Proceedings / October 26-29, 1998 / Town & Country Resort Hotel and Convention Center, San Diego, California / In computer simulations of communication systems, linear congruential generators and shift registers are typically used to model noise and data sources. These generators are often assumed to be close to ideal (i.e., delta-correlated) and an insignificant source of error in the simulation results. However, the samples generated by these algorithms have non-ideal autocorrelation functions, which may cause a non-uniform distribution in the data or noise signals. This error may cause the simulation bit-error-rate (BER) to be artificially high or low. In this paper, the problem is described through the use of confidence intervals. Tests are performed on several pseudo-random generators to assess which ones are acceptable for computer simulation.
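The following sketch shows the kind of generator and test under discussion: a linear congruential generator and a lag-k sample autocorrelation estimate, which should be near zero for a delta-correlated source. The LCG constants are the widely used Numerical Recipes values, not necessarily those examined in the paper.

```python
# Minimal sketch: a linear congruential generator and a lag-k
# autocorrelation check. LCG parameters are the common "Numerical
# Recipes" constants, assumed here for illustration only.
def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    """Yield n uniform(0,1) samples from an LCG."""
    x = seed
    for _ in range(n):
        x = (a * x + c) % m
        yield x / m

def autocorr(samples, lag):
    """Sample autocorrelation at a given lag; ~0 for an ideal source."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    cov = sum((samples[i] - mean) * (samples[i + lag] - mean)
              for i in range(n - lag)) / (n - lag)
    return cov / var

u = list(lcg(seed=12345, n=100_000))
for k in (1, 2, 10):
    print(f"lag-{k} autocorrelation: {autocorr(u, k):+.5f}")
```

Residual structure in these lag correlations is exactly what can bias a simulated BER up or down, which is why the paper frames acceptability in terms of confidence intervals.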
105
Association between Use of a Specialized Diagnostic Assessment Unit and the Diagnostic Interval in Ontario Breast Cancer Patients / Jiang, Li 18 November 2013
Background: The amount of time that it takes to get a breast cancer diagnosis is very important to patients. The Ontario diagnostic assessment unit (DAU) is designed to improve the quality and timeliness of care during a breast cancer diagnosis. This study described and examined the association between the length of the diagnostic interval and DAU use in Ontario, Canada. Methods: This was a retrospective cohort study of all breast cancer patients diagnosed between Jan 1st, 2011 and Dec 31st, 2011 in Ontario, Canada. DAU use and diagnostic intervals were described. The association between DAU use and the diagnostic interval was examined separately in a cohort of 2499 screen-detected patients and a cohort of 4381 symptomatic patients. Study data sources included administrative databases available at the Institute for Clinical Evaluative Sciences (ICES) and Cancer Care Ontario (CCO). The diagnostic interval was defined as the time from the index contact to the cancer diagnosis. DAU use was determined from the payment record within the organized screening program as well as the hospital where patients were diagnosed. Multivariate median regressions were used to control for possible confounders. Results: On average, Ontario breast cancer patients waited 4.6 weeks to be diagnosed. Forty-eight percent were diagnosed in a DAU and 52% through the usual care route. Among screen-detected patients, DAUs met national timeliness targets at a higher rate than usual care (79.1% vs. 70.2%, p<0.001), and DAU use was significantly associated with an 8.3-day decrease in the time to diagnosis (95% CI: 6.5-10.2) after controlling for potential confounders. Among symptomatic patients, DAUs also achieved the Canadian timeliness targets at a higher rate than usual care (71.7% vs. 58.1%, p<0.001), and DAU use significantly reduced the time to diagnosis by 10 days (95% CI: 7.8-11.9) after controlling for possible confounders. Conclusions: We observed considerable variation in breast cancer diagnostic intervals and DAU use in Ontario. Use of Ontario DAUs was associated with improved diagnostic timeliness for breast cancer patients. / Thesis (Master, Community Health & Epidemiology) -- Queen's University, 2013-11-18
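As an illustration of the median-regression analysis described above, the sketch below fits a 0.5-quantile regression to synthetic data. All variable names and the data-generating process are hypothetical, not the study's actual covariates or records.

```python
# Illustrative sketch of median regression for a skewed outcome such
# as a diagnostic interval. Synthetic data; variable names (diag_days,
# dau, age) are hypothetical stand-ins for the study's covariates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "dau": rng.integers(0, 2, n),   # 1 = diagnosed in a DAU
    "age": rng.normal(60, 10, n),
})
# Right-skewed intervals, shorter on average for DAU patients
df["diag_days"] = rng.exponential(30, n) - 8 * df["dau"] + 0.1 * df["age"]

# Median (0.5-quantile) regression controlling for age; the dau
# coefficient estimates the adjusted median difference in days
fit = smf.quantreg("diag_days ~ dau + age", df).fit(q=0.5)
print(fit.summary())
```

Median regression is a natural choice here because wait-time distributions are heavily right-skewed, so the median is a more representative summary than the mean.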
106
A Bayesian method to improve sampling in weapons testing / Floropoulos, Theodore C. 12 1900
Approved for public release; distribution is unlimited / This thesis describes a Bayesian method for determining the number of samples needed to estimate a proportion or probability with 95% confidence when prior bounds are placed on that proportion. It uses the Uniform [a,b] distribution as the prior and develops a computer program and tables for finding the sample size. Tables and examples are also given to compare these results with other approaches to finding sample size. The improvement this method offers is fewer samples, and consequently a lower cost in weapons testing to meet a desired confidence level for a proportion or probability. / http://archive.org/details/bayesianmethodto00flor / Lieutenant Commander, Hellenic Navy
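A minimal sketch of the idea follows: average the width of the 95% posterior interval for a proportion bounded a priori by a Uniform [a,b] prior, and take the smallest n meeting a target width. The grid-and-simulation details are illustrative assumptions, not the thesis's exact algorithm or tables.

```python
# Sketch: smallest n for which a proportion with a Uniform[a,b] prior
# is estimated to a target precision with 95% posterior probability.
# Grid/simulation details are illustrative, not the thesis's method.
import numpy as np
from scipy import stats

def mean_interval_width(n, a, b, rng, sims=2000):
    """Average 95% credible-interval width over the Uniform[a,b]
    prior and binomial sampling (a preposterior calculation)."""
    grid = np.linspace(a, b, 400)
    widths = []
    for _ in range(sims):
        p = rng.uniform(a, b)
        k = rng.binomial(n, p)
        # Posterior on [a,b] is a truncated Beta(k+1, n-k+1)
        dens = stats.beta.pdf(grid, k + 1, n - k + 1)
        cdf = np.cumsum(dens); cdf /= cdf[-1]
        lo = grid[np.searchsorted(cdf, 0.025)]
        hi = grid[np.searchsorted(cdf, 0.975)]
        widths.append(hi - lo)
    return float(np.mean(widths))

rng = np.random.default_rng(1)
target = 0.10                       # desired interval width
for n in (25, 50, 100, 200, 400):
    w = mean_interval_width(n, a=0.6, b=0.9, rng=rng)
    print(n, round(w, 3))
    if w <= target:
        break
```

The saving comes from the truncation: mass outside [a,b] is excluded a priori, so fewer observations are needed than a bound-free analysis would require.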
107
Improving Energy-Efficiency of Multicores using First-Order Modeling / Spiliopoulos, Vasileios January 2016
In recent decades, power consumption has become one of the most critical resources in a computer system. In the form of the electricity bill in data centers, battery life in mobile devices, or thermal constraints in desktops and laptops, power consumption imposes several limitations on today's processors, and improving power and energy efficiency is one of the most urgent research topics in Computer Architecture. Dynamic Voltage and Frequency Scaling (DVFS) and Cache Resizing are among the most popular energy-saving techniques. Previous work, however, has focused on developing heuristics and trial-and-error methods that yield acceptable savings but fail to provide insight into how these techniques affect the power and performance of a computer system. In contrast, this Thesis proposes the use of first-order modeling to improve the energy efficiency of computer systems. A first-order model needs to be (i) accurate enough to efficiently drive DVFS and Cache Resizing decisions, and (ii) simple enough to eliminate the overhead of collecting the required model inputs. We show that such models can be constructed and successfully applied in modern systems. For DVFS, we propose to scale frequency down to exploit applications' memory slack, i.e., periods that the processor spends waiting for data to be fetched from main memory. In such cases, the processor frequency can be scaled down to save energy without an inordinate performance penalty. Our DVFS models can detect slack and predict the impact of DVFS on both power and performance with great accuracy. Cache Resizing, on the other hand, relies on the fact that many applications do not benefit from the vast amount of cache that modern processors are equipped with. In such cases, the cache can be resized to save static energy at a limited performance cost. Since both techniques relate to the memory behavior of applications, we propose a unified model to manage the two techniques in tandem and maximize energy efficiency through synergistic DVFS and Cache Resizing. Finally, our experience with DVFS in real systems motivated us to contribute to the integration of DVFS into the gem5 simulator. Unlike other simulators that ignore the role of the OS in DVFS, we extend gem5 with the hardware and software components that allow the existing Linux DVFS infrastructure to be seamlessly integrated into the simulator.
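The sketch below illustrates the flavor of such a first-order DVFS model: execution time splits into a clock-scaled compute part and a frequency-independent memory-stall part (the slack), and a normalized CV^2*f term supplies dynamic power. The constants and the linear voltage-frequency relation are illustrative assumptions, not the thesis's calibrated models.

```python
# First-order DVFS sketch: compute time scales with the clock while
# memory stalls ("slack") do not, so memory-bound code loses little
# performance at lower frequency. Constants and the linear V-f
# relation are illustrative assumptions only.
def exec_time(f, f0, t_compute0, t_mem):
    """Runtime at frequency f, given baseline compute time at f0
    and a frequency-independent memory-stall time."""
    return t_compute0 * (f0 / f) + t_mem

def energy(f, f0, t_compute0, t_mem, p_static=0.3):
    v = f / f0                  # assume voltage tracks frequency
    p_dyn = v**2 * (f / f0)     # normalized CV^2*f dynamic power
    return (p_dyn + p_static) * exec_time(f, f0, t_compute0, t_mem)

# Memory-bound example: at f0, half the runtime is memory stall
f0 = 3.0
for f in (1.5, 2.0, 2.5, 3.0):
    t = exec_time(f, f0, t_compute0=1.0, t_mem=1.0)
    e = energy(f, f0, t_compute0=1.0, t_mem=1.0)
    print(f"{f:.1f} GHz: time {t:.2f}, energy {e:.2f} (normalized)")
```

Running the loop shows the trade-off the thesis exploits: halving the frequency only grows runtime by 50% for this memory-bound case, while dynamic power falls roughly cubically.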
108
Aspects of Universality in Function Iteration / Taylor, John (John Allen) 12 1900
This work deals with some aspects of universal topological and metric dynamic behavior of iterated maps of the interval.
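For readers unfamiliar with the area, the logistic family is the textbook example of an iterated interval map exhibiting the universal period-doubling behavior studied in such work; the short sketch below is illustrative only and not drawn from the thesis.

```python
# Illustrative only: the logistic map x -> r*x*(1-x) is the canonical
# iterated map of the interval. Its orbits settle into a fixed point,
# then 2- and 4-cycles, then chaos as r grows.
def orbit(r, x0, n, burn_in=500):
    """Iterate the logistic map and return n post-transient points."""
    x = x0
    for _ in range(burn_in):
        x = r * x * (1 - x)
    pts = []
    for _ in range(n):
        x = r * x * (1 - x)
        pts.append(x)
    return pts

for r in (2.8, 3.2, 3.5, 3.9):  # fixed point, 2-cycle, 4-cycle, chaos
    print(r, sorted(set(round(p, 4) for p in orbit(r, 0.5, 8))))
```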
109
The performance and robustness of confidence intervals for the median of a symmetric distribution constructed assuming sampling from a Cauchy distribution / Cao, Jennifer Yue January 1900
Master of Science / Department of Statistics / Paul Nelson / Trimmed means are robust estimators of location for distributions having heavy tails. Theory and simulation indicate that little efficiency is lost under normality when appropriately trimmed means are used, and that their use with data from heavy-tailed distributions can improve performance. This report applies the principle of equivariance to trimmed means sampled from a Cauchy distribution to form a discrepancy function of the data and parameters whose distribution is free of the unknown median and scale parameter. Quantiles of this discrepancy function are estimated via asymptotic normality and simulation and used to construct confidence intervals for the median of a Cauchy distribution. A nonparametric approach based on the distribution of order statistics is also used to construct confidence intervals. The performance of these intervals, in terms of coverage rate and average length, is investigated via simulation when the data are actually sampled from a Cauchy distribution and when sampling is from normal and logistic distributions. The intervals based on simulation estimation of the quantiles of the discrepancy function perform well across a range of sample sizes and trimming proportions when the data are actually sampled from a Cauchy distribution, and are relatively robust when sampling is from the normal and logistic distributions.
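A minimal sketch of the simulation approach follows: by equivariance, a trimmed mean centered at the median and scaled by a scale estimate is pivotal under Cauchy sampling, so its quantiles can be estimated on standard Cauchy samples and inverted into a confidence interval. The 10% trimming and IQR scaling are illustrative choices, not necessarily those used in the report.

```python
# Sketch: calibrate a trimmed-mean discrepancy on standard Cauchy
# samples, then invert it into a CI for the median. The 10% trimming
# and IQR scale estimate are illustrative choices.
import numpy as np
rng = np.random.default_rng(42)

def trimmed_mean(x, prop=0.10):
    x = np.sort(x)
    k = int(len(x) * prop)
    return x[k:len(x) - k].mean()

def discrepancy(x, prop=0.10):
    """(trimmed mean - median)/scale: pivotal under Cauchy sampling."""
    iqr = np.subtract(*np.percentile(x, [75, 25]))
    return trimmed_mean(x, prop) / iqr   # true median is 0 here

n, sims = 50, 20_000
d = np.array([discrepancy(rng.standard_cauchy(n)) for _ in range(sims)])
q_lo, q_hi = np.quantile(d, [0.025, 0.975])

# 95% CI for the median of an observed sample y, by equivariance
y = 3.0 + 2.0 * rng.standard_cauchy(n)   # Cauchy(median=3, scale=2)
iqr_y = np.subtract(*np.percentile(y, [75, 25]))
tm = trimmed_mean(y)
print("95% CI for median:", (tm - q_hi * iqr_y, tm - q_lo * iqr_y))
```

Because the discrepancy's distribution does not depend on the unknown median or scale, the same simulated quantiles serve for any Cauchy sample of the given size, which is the equivariance argument the report exploits.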
110
The effects of high intensity interval training on resting mean arterial pressure and C-reactive protein content in prehypertensive subjects / Skutnik, Benjamin C. January 1900
Master of Science / Department of Kinesiology / Craig A. Harms / Subjects with prehypertension are at risk of developing hypertension (HTN). Hypertension is associated with low-grade systemic inflammation (LGSI). Aerobic exercise training (ET) is a proven means of reducing both blood pressure and LGSI in healthy and diseased subjects. Recently, high-intensity interval training (HIIT) has been shown to elicit cardiovascular and metabolic adaptations similar to ET in healthy and at-risk populations, in a more time-efficient manner. We therefore hypothesized that HIIT would elicit greater reductions in blood pressure and LGSI than ET. Twelve prehypertensive subjects (systolic blood pressure 127.0 ± 8.5 mmHg; diastolic blood pressure 86.2 ± 4.1 mmHg) were randomly assigned to an ET group (n=5) or a HIIT group (n=7). All subjects performed an incremental test to exhaustion (VO2max) on a cycle ergometer prior to, after 4 weeks of, and after 8 weeks of training. Resting heart rate and blood pressure were measured prior to training and three times a week during training. LGSI was measured via high-sensitivity C-reactive protein (hs-CRP) prior to, after 4 weeks of, and after 8 weeks of training. ET subjects performed an eight-week training program at 40% of VO2 reserve determined from the VO2max test, while HIIT subjects exercised at 60% of peak power determined from the VO2max test. The ET group trained four days/week while the HIIT group trained three days/week. ET subjects exercised for 30 minutes continuously at a constant workload and a cadence of 60 rpm, while HIIT subjects performed a protocol with a 1:1 work-to-rest ratio at a constant workload and a cadence of 100 rpm. Both groups showed similar (p<0.05) decreases in mean arterial (ET = -7.3%, HIIT = -4.5%), systolic (ET = -6.6%, HIIT = -8.8%), and diastolic (ET = -9.7%, HIIT = -8.2%) blood pressure. HIIT decreased LGSI (-33.7%) while ET did not change LGSI (p>0.05). VO2max increased ~25% with both HIIT and ET, with no difference (p>0.05) between groups. These data suggest that both HIIT and ET similarly decreased resting blood pressure and increased VO2max, while HIIT was also effective in decreasing LGSI in prehypertensive subjects.