391

Relationship Between Concentric Velocities at Varying Intensity in the Back Squat Using a Wireless Inertial Sensor

Carroll, Kevin M., Sato, Kimitake, Beckham, George K., Triplett, N. Travis, Griggs, Cameron V., Stone, Michael H. 01 January 2017 (has links)
Objectives: The purpose of this study was to examine the relationship of velocities in the back squat between one repetition maximum (1RM) and submaximally loaded repetition maximum (RM) conditions, specifically in regard to what has been described as the minimal velocity threshold (MVT). The MVT describes a minimum concentric velocity that an individual must reach or surpass in order to successfully complete a repetition. Design: To test the presence of an MVT, participants were tested for 1RM and RM back squat ability. The mean concentric velocities (MCV) of the last successful repetition of each condition were then compared. Methods: Fourteen male participants familiar with the back squat volunteered to participate in the current study (age = 25.0 y ± 2.6, height = 178.9 cm ± 8.1, body mass = 88.2 kg ± 15.8). The mean concentric velocity (MCV) during the last successful repetition from each testing condition was considered for the comparison. Results: Results indicated a non-significant negative relationship of MCV between the 1RM and RM conditions (r = -0.135) and no statistical difference between testing conditions (p = 0.266), with a small-to-moderate effect size (d = 0.468). Conclusions: The results of this study suggest that MVT should be further investigated to enhance its use in the practical setting. Additionally, coaches considering using a velocity-based approach for testing athletes should use data from either 1RM or RM conditions, but not both interchangeably. Coaches should be cautious when considering group averages or comparing velocity data between athletes, which may not be appropriate based on our results.
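The comparison described above reduces to three familiar statistics: a Pearson correlation between the 1RM and RM last-repetition velocities, a paired test for a mean difference, and a standardized effect size. The sketch below shows one way to compute them; the velocity arrays are hypothetical placeholders rather than data from the study, and the pooled-SD form of Cohen's d is only one of several reasonable choices.

    import numpy as np
    from scipy import stats

    # Hypothetical last-repetition mean concentric velocities (m/s); not study data.
    mcv_1rm = np.array([0.28, 0.31, 0.25, 0.30, 0.27, 0.33, 0.29, 0.26,
                        0.32, 0.24, 0.30, 0.28, 0.27, 0.31])
    mcv_rm  = np.array([0.34, 0.29, 0.36, 0.31, 0.38, 0.30, 0.35, 0.33,
                        0.28, 0.37, 0.32, 0.34, 0.36, 0.30])

    # Relationship between conditions (Pearson r) and a paired comparison (t-test).
    r, r_p = stats.pearsonr(mcv_1rm, mcv_rm)
    t, t_p = stats.ttest_rel(mcv_1rm, mcv_rm)

    # Cohen's d using the pooled standard deviation of the two conditions.
    pooled_sd = np.sqrt((mcv_1rm.var(ddof=1) + mcv_rm.var(ddof=1)) / 2)
    d = (mcv_1rm.mean() - mcv_rm.mean()) / pooled_sd

    print(f"r = {r:.3f}, paired t-test p = {t_p:.3f}, Cohen's d = {d:.3f}")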
392

Any Effect of Gymnastics Training on Upper-Body and Lower-Body Aerobic and Power Components in National and International Male Gymnasts?

Jemni, Monem, Sands, William A., Friemel, Françoise, Stone, Michael H., Cooke, Carlton B. 01 November 2006 (has links)
Aerobic and anaerobic performance of the upper body (UB) and lower body (LB) were assessed by arm cranking and treadmill tests, respectively, in a comparison of national (N) and international (I) male gymnasts. Force-velocity and Wingate tests were performed using cycle ergometers for both arms and legs. In spite of a significant difference in training volume (4–12 vs. 27–34 h·wk−1 for N and I, respectively), there was no significant difference between N and I in aerobic and anaerobic performance. Upper-body and LB maximal oxygen uptake (V̇O2max) values were 34.44 ± 4.62 and 48.64 ± 4.63 ml·kg−1·min−1 vs. 33.39 ± 4.77 and 49.49 ± 5.47 ml·kg−1·min−1, respectively, for N and I. Both N and I had a high lactate threshold (LT), at 76 and 82% of V̇O2max, respectively. Values for UB and LB force-velocity power (9.75 ± 1.12 and 15.07 ± 4.25 vs. 10.63 ± 0.95 and 15.87 ± 1.25 W·kg−1) and Wingate power output (10.43 ± 0.74 and 10.98 ± 3.06 vs. 9.58 ± 0.60 and 13.46 ± 1.34 W·kg−1) were also consistent for N and I. These findings confirm the consistency of V̇O2max values presented for gymnasts in the last 4 decades, together with an increase in peak power values. Consistent values for aerobic and anaerobic performance suggest that the significant difference in training volume is related to other aspects of performance that distinguish N from I gymnasts. Modern gymnastics training at N and I levels is characterized by a focus on relative strength and peak power. In the present study, the high LT is a reflection of the importance of strength training, which is consistent with research for sports such as wrestling.
393

Characterization of Risk From Airborne Benzene Exposure in the State of Florida

Johnson, Giffe 13 March 2008 (has links)
Environmental airborne benzene is a ubiquitous hazardous air pollutant whose emissions are generated from multiple sources, including industrial emissions, fuel station emissions, and automobile emissions. Chronic occupational exposures to elevated levels of benzene are known to be associated with leukemic cancers, in particular acute myeloid leukemia (AML), though epidemiological evidence regarding environmental exposures and subsequent AML development is lacking. This investigation uses historical airborne monitoring data from six counties in the State of Florida to characterize the environmental cancer risk from airborne benzene concentrations using current Federal and State regulatory analysis methodology, together with a comparative analysis based on occupational epidemiological evidence. Airborne benzene concentrations were collected from 24 air toxics monitoring stations in Broward, Duval, Orange, Miami-Dade, Hillsborough, and Pinellas counties. From 2003 to 2006, 3,794 air samples were collected using 8, 12, and 24 hr sampling periods with sub-ambient pressure canister collectors consistent with EPA benzene methodological protocols 101 and 176. Mean benzene concentrations, by site, ranged from 0.18 to 3.58 ppb. Using risk analysis methodology consistent with the EPA and the Florida Department of Environmental Protection (FLDEP), the resulting cancer risk estimates ranged from 4.37 x 10^-6 to 8.56 x 10^-5, exceeding the FLDEP's acceptable cancer risk level of 1 x 10^-6 for all monitoring sites. Cumulative lifetime exposures were calculated in ppm-years by site, ranging from 0.036 to 0.702 ppm-years. A comparative analysis with the available epidemiological literature revealed that associations between benzene exposure and cancer outcomes were related to cumulative lifetime exposures greatly in excess of 1 ppm-year. The results of this investigation indicate that it is not reasonable to expect additional cancer outcomes in Florida residents as a result of airborne benzene exposures consistent with the measured concentrations, despite the fact that all regulatory risk calculations exceed the acceptable cancer risk level in the State of Florida.
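The screening calculation described above is essentially the product of a long-term average concentration and an inhalation unit risk factor. The sketch below illustrates that arithmetic; the unit risk value, lifetime assumption, and the exact exposure adjustments are illustrative assumptions rather than numbers taken from this dissertation (EPA's inhalation unit risk for benzene is usually cited as a range of roughly 2.2 x 10^-6 to 7.8 x 10^-6 per ug/m3).

    # Illustrative screening-level cancer risk arithmetic; not the dissertation's exact method.
    IUR_PER_UG_M3 = 7.8e-6          # assumed inhalation unit risk, upper end of the cited EPA range
    MW_BENZENE = 78.11              # g/mol
    MOLAR_VOLUME_25C = 24.45        # L/mol at 25 C and 1 atm

    def ppb_to_ug_m3(ppb: float) -> float:
        """Convert a benzene concentration from ppb to ug/m3 at 25 C."""
        return ppb * MW_BENZENE / MOLAR_VOLUME_25C

    def lifetime_cancer_risk(mean_ppb: float) -> float:
        """Screening estimate: lifetime risk = chronic concentration x unit risk."""
        return ppb_to_ug_m3(mean_ppb) * IUR_PER_UG_M3

    def cumulative_ppm_years(mean_ppb: float, years: float = 70.0) -> float:
        """Cumulative exposure in ppm-years over an assumed 70-year lifetime."""
        return (mean_ppb / 1000.0) * years

    for site_ppb in (0.18, 3.58):   # the low and high site means reported above
        print(f"{site_ppb} ppb -> risk ~ {lifetime_cancer_risk(site_ppb):.2e}, "
              f"{cumulative_ppm_years(site_ppb):.3f} ppm-years")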
394

Benzene Related Hematological Disorders: Evidence for a Threshold in Animals and Humans

McCluskey, James 16 July 2008 (has links)
Significant benzene exposure has historically been associated with the development of a host of hematological disorders in humans and animals. In particular, benzene is known to cause disturbances of the peripheral blood, aplastic anemia and cancer of the lymphohematopoietic system. In 1928, the first modern report of an association between cancer and benzene exposure was published. This case report was followed by additional reports from around the world. In most instances, ailments resulted from long-term, high-level exposure to benzene found in glues, and through accidental industrial spills. Throughout the 1960s and 1970s, case reports accumulated linking benzene exposure to hematological cancers, particularly among leather workers in Turkey and Italy. At the time, often only qualitative measures of benzene exposure were available, and most exposure information was based upon short-term grab samples and subjective symptoms. However, this situation changed drastically in the mid-1970s, when the first report was published on a little-known industry that manufactured rubber hydrochloride, also known as Pliofilm. This clear film product was made from natural rubber latex, and its processing utilized benzene in multiple stages. It appeared from the outset that there was an unusually large number of acute leukemia cases in this cohort of workers. Since that time, multiple follow-up evaluations of the same cohort have attempted to refine the benzene exposure estimates for these workers. Benzene has subsequently been classified as a human carcinogen by several regulatory bodies, and the allowable 8-hour time-weighted average has been lowered to 1 ppm. In pursuing the goal of protecting workers, regulatory bodies utilize a linear extrapolation, or no-threshold-dose, approach to cancer causation. This methodology assumes that every exposure brings an incremental rise in risk. In this work, the linear extrapolation methodology is tested utilizing the criteria proposed by Sir Bradford Hill. The Hill Criteria are used to critically evaluate the weight of evidence for a threshold dose that can cause hematological cancer in humans following benzene exposure. This evaluation revealed that there is sufficient evidence for a threshold dose and that linear extrapolation is designed to protect against, not to predict, disease.
395

Threshold characteristics of multimode laser oscillators

Khoshnevissan, Mehdi 01 January 1987 (has links)
The threshold characteristics of multimode laser oscillators are considered in detail, and a new model is given for semiconductor diode lasers. Analytical expressions and numerical solutions are obtained for mode amplitudes and overall spectral characteristics of lasers operating above and below threshold. The theoretical results are in agreement with experimental data. Band-to-band absorption is included in the model, and its effect on the mixed broadening is studied.
396

Semiparametric regression analysis of zero-inflated data

Liu, Hai 01 July 2009 (has links)
Zero-inflated data abound in ecological studies as well as in other scientific and quantitative fields. Nonparametric regression with a zero-inflated response may be studied via the zero-inflated generalized additive model (ZIGAM). ZIGAM assumes that the conditional distribution of the response variable belongs to the zero-inflated 1-parameter exponential family, which is a probabilistic mixture of the zero atom and the 1-parameter exponential family, where the zero atom accounts for an excess of zeroes in the data. We propose the constrained zero-inflated generalized additive model (COZIGAM) for analyzing zero-inflated data, with the further assumption that the probability of non-zero-inflation is some monotone function of the (non-zero-inflated) exponential family distribution mean. When the latter assumption obtains, the new approach provides a unified framework for modeling zero-inflated data, which is more parsimonious and efficient than the unconstrained ZIGAM. We develop an iterative algorithm for model estimation based on the penalized likelihood approach, and derive formulas for constructing confidence intervals of the maximum penalized likelihood estimator. Some asymptotic properties, including the consistency of the regression function estimator and the limiting distribution of the parametric estimator, are derived. We also propose a Bayesian model selection criterion for choosing between the unconstrained and the constrained ZIGAMs. We consider several useful extensions of the COZIGAM, including imposing additive-component-specific proportional and partial constraints, and incorporating threshold effects to account for regime-shift phenomena. The new methods are illustrated with both simulated data and real applications. An R package, COZIGAM, has been developed for model fitting and model selection with zero-inflated data.
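For concreteness, the mixture structure described above can be written out. The LaTeX sketch below is a generic zero-inflated formulation with an illustrative constraint link; it is not necessarily the exact parameterization used in the thesis.

    % Generic zero-inflated 1-parameter exponential family (mixture with a zero atom):
    P(Y = 0) = (1 - p) + p\, f(0;\mu), \qquad P(Y = y) = p\, f(y;\mu), \quad y \neq 0,
    % where f(\cdot;\mu) is a 1-parameter exponential family density with mean \mu and
    % p is the probability of non-zero-inflation.
    % Illustrative COZIGAM-style constraint: p is a monotone function of \mu, e.g.
    \operatorname{logit}(p) = \alpha + \delta\, g(\mu), \qquad \delta > 0,
    % with g the GAM link function and (\alpha, \delta) unknown constants estimated from the data.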
397

Understanding Security Threats of Emerging Computing Architectures and Mitigating Performance Bottlenecks of On-Chip Interconnects in Manycore NTC System

Rajamanikkam, Chidhambaranathan 01 May 2019 (has links)
Emerging computing architectures, such as neuromorphic computing and third-party intellectual property (3PIP) cores, have attracted significant attention in the recent past. Neuromorphic computing introduces an unorthodox non-von Neumann architecture that mimics the abstract behavior of neuron activity in the human brain. It can execute complex applications, such as image processing and object recognition, more efficiently in terms of performance and energy than traditional microprocessors. However, the hardware security aspects of neuromorphic computing have received little attention at this nascent stage. 3PIP cores, on the other hand, may contain covertly inserted malicious functional behavior that can inflict a range of harms at the system/application levels. This dissertation examines the impact of various threat models that emerge from neuromorphic architectures and 3PIP cores. Near-Threshold Computing (NTC) serves as an energy-efficient paradigm by aggressively operating all computing resources at a supply voltage close to the threshold voltage, at the cost of performance. Therefore, an STC system is scaled to a many-core NTC system to reclaim the lost performance. However, interconnect performance in a many-core NTC system poses a significant bottleneck that hinders overall system performance. This dissertation analyzes the interconnect performance and proposes a novel technique to boost the interconnect performance of many-core NTC systems.
398

Handover Performance in the Mobile WiMAX Networks

Yu, Yongxue 29 October 2009 (has links)
Mobile terminals allow users to access services while on the move. This unique feature has driven the rapid growth of the mobile network industry, changing it from a new technology into a massive industry in less than two decades. In this thesis, an in-depth study of the handover effects in mobile WiMAX networks is carried out. The mobile WiMAX technology is first presented as a literature study, and then the handover technologies of previous generations are introduced in detail. Further, the hard handover of mobile WiMAX is simulated in Network Simulator-2 (NS-2). In addition, the "ping-pong" effect of handover is investigated, and the call blocking and dropping probabilities are computed using MATLAB. The goal is to find out which parameters have a significant impact on handover performance. The results showed that the handover threshold and hysteresis margin should be selected by considering the trade-off between the "ping-pong" effect and the extra interference caused to neighboring cells by a poor-quality link. The handover latency of mobile WiMAX is below 50 ms at mobile station speeds of up to 20 m/s.
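The threshold/hysteresis trade-off mentioned above can be illustrated with a toy handover decision rule: hand over only when a neighboring cell's signal exceeds the serving cell's by a hysteresis margin for a sustained time-to-trigger. The parameter names and values below are illustrative assumptions, not those used in the thesis.

    # Toy hard-handover decision rule illustrating the threshold/hysteresis trade-off.
    # A small hysteresis reacts quickly but invites "ping-pong" handovers; a large one
    # delays handover and prolongs time spent on a poor-quality link.

    HYSTERESIS_DB = 3.0      # neighbor must beat the serving cell by this margin (assumption)
    TIME_TO_TRIGGER = 3      # consecutive samples the condition must hold (assumption)

    def handover_decisions(serving_rss, neighbor_rss):
        """Return the sample indices at which a handover would be triggered."""
        triggers, streak = [], 0
        for i, (s, n) in enumerate(zip(serving_rss, neighbor_rss)):
            streak = streak + 1 if n > s + HYSTERESIS_DB else 0
            if streak >= TIME_TO_TRIGGER:
                triggers.append(i)
                streak = 0       # reset after triggering (cell role swap not modeled here)
        return triggers

    # Hypothetical RSS traces (dBm) as the mobile moves from one cell toward another.
    serving  = [-70, -72, -75, -78, -80, -83, -85, -88, -90, -92]
    neighbor = [-95, -90, -86, -82, -79, -77, -75, -73, -71, -70]
    print(handover_decisions(serving, neighbor))   # -> [7]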
399

Agent Ordering and Nogood Repairs in Distributed Constraint Solving

Zhou, Lingzhong, n/a January 2006 (has links)
The distributed constraint satisfaction problem is a general formalization used to represent problems in distributed multi-agent systems. A large body of problems in artificial intelligence and computer science can be easily formulated as distributed constraint satisfaction problems. In this thesis we study agent ordering, effects of nogoods, search efficiency and threshold repairing in distributed constraint satisfaction problems and their variants. A summary of contributions is as follows: 1. We present a new algorithm, Dynamic Agent Ordering. A distinctive feature of this algorithm is that it uses the degree of unsatisfiability as a guiding parameter to dynamically determine agent ordering during the search. We show through an empirical study that our algorithm performs better than the existing approaches. In our approach, the independence of agents is guaranteed and agents without neighbouring relationships can run concurrently and asynchronously. (Part of this work was published at the Australian AI Conference (80).) 2. We extend the Dynamic Agent Ordering algorithm by incorporating a novel technique called nogood repairing. This results in a dramatic reduction in the nogoods being stored and in communication costs. In an empirical study, we show that this approach outperforms an equivalent static ordering algorithm and a current state-of-the-art technique in terms of execution time, memory usage and communication cost. (Part of this work was published at the FLAIRS Conference (81).) 3. Further, we introduce a new algorithm, Over-constrained Dynamic Agent Ordering, that breaks new ground in handling multiple variables per agent in distributed over-constrained satisfaction problems. The algorithm also uses the degree of unsatisfiability as a measure for relaxing constraints, and hence as a way to guide the search toward the best optimal solution(s). By applying our Threshold Repair method, we can solve a distributed constraint satisfaction problem without knowing whether the problem is under- or over-constrained. In an experimental study, we show that the new algorithm compares favourably to an implementation of asynchronous weak commitment search adapted to handle over-constrained problems. (Part of this work was published at the Canadian AI Conference (79).)
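As a rough illustration of the "degree of unsatisfiability" idea used for ordering, the sketch below scores each agent by the fraction of its constraints that its current assignment violates and orders agents from most to least unsatisfied. This is a simplified, centralized caricature written for illustration; the names and structure are assumptions, not the thesis's actual distributed algorithm.

    # Simplified illustration: order agents by their degree of unsatisfiability, i.e. the
    # fraction of constraints touching the agent's variables that are currently violated.
    # Higher degree -> earlier in the ordering.

    def degree_of_unsatisfiability(agent_vars, assignment, constraints):
        """constraints: list of (scope, predicate), where scope is a tuple of variables."""
        relevant = [c for c in constraints if any(v in agent_vars for v in c[0])]
        if not relevant:
            return 0.0
        violated = sum(1 for scope, pred in relevant
                       if not pred(*(assignment[v] for v in scope)))
        return violated / len(relevant)

    def order_agents(agents, assignment, constraints):
        """agents: dict mapping agent name -> set of variables it owns."""
        return sorted(agents,
                      key=lambda a: degree_of_unsatisfiability(agents[a], assignment, constraints),
                      reverse=True)

    # Tiny hypothetical example: two agents, three variables, "not equal" constraints.
    agents = {"A1": {"x", "y"}, "A2": {"z"}}
    assignment = {"x": 1, "y": 1, "z": 2}
    constraints = [(("x", "y"), lambda a, b: a != b),   # violated
                   (("y", "z"), lambda a, b: a != b)]   # satisfied
    print(order_agents(agents, assignment, constraints))   # -> ['A1', 'A2']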
400

A Method for Efficient Transmission of XML Data across a Network

Ridgewell, Alexander Graham, n/a January 2007 (has links)
Extensible Markup Language (XML) is a simple, very flexible text format derived from SGML (ISO 8879), which is a well-defined, public standard. It uses plain text to encode a hierarchical set of information with verbose tags, allowing an XML document to be understood without any special reader. The use of schemas in XML also provides a well-defined contract describing what a single XML document means. The self-contained nature of XML and the strong contract provided by its schemas make it useful as an archival storage format and as a means of communicating across system or organizational boundaries. As such, XML is increasingly used by businesses throughout the world. These businesses use XML as a means of storing, transmitting and (with the use of style sheets) displaying information. The simple, well-defined structure of XML does present some problems when it is used by businesses and similar organizations. As it is an open, plain-text-based standard, care must be taken when looking at security. The use of plain text with verbose tags also results in XML documents that are far larger than other means of storing the same information. This thesis focuses on the effect of the large size of XML when it is used to communicate across a network. This large size can often increase the time taken to transmit the document, and we were interested to see how it could be minimized. We investigated the ways that are used to control the size of XML documents and how they are transmitted, carefully examining the alternatives by implementing solutions for transmitting XML documents. We then presented a new method, called dynamic adaptive threshold transmission (DATT), which, in comparison with other existing methods and under the discussed conditions, offers significant improvements in transmission times and network transmission efficiency.
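The abstract does not describe DATT's internals, so the sketch below only illustrates the general family of ideas it alludes to: deciding, per document and against a size threshold, whether compressing the XML payload is worthwhile before sending it across the network. The threshold value and function names are illustrative assumptions, not a description of DATT itself.

    import zlib

    # Illustrative only: compress an XML payload before transmission when it is large
    # enough for the compression overhead to pay off. A generic threshold heuristic,
    # not the DATT method described in the thesis.

    SIZE_THRESHOLD_BYTES = 4096   # assumed cut-off; small documents are sent as-is

    def prepare_for_transmission(xml_text: str) -> tuple[bytes, bool]:
        """Return (payload, compressed_flag) for the given XML document."""
        raw = xml_text.encode("utf-8")
        if len(raw) < SIZE_THRESHOLD_BYTES:
            return raw, False
        compressed = zlib.compress(raw, 6)
        # Only use the compressed form if it is actually smaller.
        return (compressed, True) if len(compressed) < len(raw) else (raw, False)

    doc = "<order id='1'>" + "<item sku='A'>widget</item>" * 500 + "</order>"
    payload, compressed = prepare_for_transmission(doc)
    print(len(doc.encode("utf-8")), "->", len(payload), "bytes, compressed =", compressed)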
