61. Monitoring wellness, training load and neuromuscular performance: Implications for assessing athlete training status (Lombard, Wayne, 03 March 2022)
Background: Athletes training for peak performance have periods of systematic overload followed by recovery. The balance between overload and recovery is important to avoid unexpected fatigue or underperformance. The relationship between overload and recovery is unique for each athlete. Thus, programmes designed to monitor fitness and fatigue should consider inter-athlete differences. Aim: The broad aim of the PhD thesis was to assess the relationships between various tools for monitoring fitness and fatigue in elite-level athletes. Subjective and objective training/match demands, questionnaires to assess wellness and readiness-to-train, as well as countermovement jump variables to assess neuromuscular performance, were investigated within 4 inter-related studies. Methods: Four inter-related studies were designed to determine: 1) the validity and reliability of countermovement jump variables measured on a force plate in the laboratory; 2) the relationships between countermovement jump variables, responses to a wellness and readiness-to-train questionnaire and exercise-induced fatigue in the laboratory; 3) the relationships between training load, responses to a wellness and readiness-to-train questionnaire and neuromuscular performance in elite-level female field hockey athletes measured in a "real-world" situation; and 4) the relationships between these same variables for each athlete and the whole team before, during and after international match play. Primary findings: The findings for each inter-related study were as follows: 1) Maximum force, rate of force development, jump height, flight time and time to maximum force, as measured on a force plate during a countermovement jump, were valid and reliable. The typical error of measurement was defined for each variable. The validity and reliability were best in participants who had more strength training experience. In most cases the precision of the variables was sufficient to detect "small" changes. 2) Subjective measures (wellness questionnaire) were more sensitive to acute exercise-induced fatigue than objective measures of neuromuscular performance. 3) The relationships between variables differed between players. Multiple variables should be collected to better understand a player's subjective and objective fitness and fatigue status in response to subjective and objective measures of match and/or training demands. 4) Pre-, intra- and post-match data should be collected to better understand individual player responses between matches. Variables such as jump height, rating of perceived exertion, total distance during the match, bodyload (a derived measure of the total external mechanical stress from accelerations, decelerations and changes of direction) and subjective wellness should be considered when monitoring athlete training status. Conclusions: Firstly, there is no set standard battery of tools that can be used to monitor the fitness and fatigue of athletes, as the relationship between variables is not consistent between athletes. Variables such as jump height, rating of perceived exertion, total distance, bodyload and wellness responses should be considered in a monitoring system. Secondly, this thesis proposes the novel concept of "monitoring specificity", which suggests that different tools, based on their responsiveness, should be used at an individual level. Thirdly, identifying which athletes are most sensitive to certain variables will reduce the "noise" within a team's monitoring system and enable better-informed decisions about each athlete's fitness/fatigue status.
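For context on the reliability findings in study 1: the typical error of measurement and the threshold for a "small" change are usually derived directly from test-retest data. The sketch below illustrates that calculation with hypothetical jump heights (not thesis data) and assumes the common 0.2 × between-athlete SD definition of the smallest worthwhile change.

```python
import statistics

# Hypothetical test-retest countermovement-jump heights (cm) for six athletes.
test_1 = [38.2, 41.5, 35.9, 44.1, 39.8, 36.7]
test_2 = [37.6, 42.3, 36.4, 43.2, 40.5, 35.9]

# Typical error of measurement: SD of the test-retest differences divided by sqrt(2).
diffs = [b - a for a, b in zip(test_1, test_2)]
typical_error = statistics.stdev(diffs) / (2 ** 0.5)

# Smallest worthwhile change: commonly taken as 0.2 x the between-athlete SD.
between_sd = statistics.stdev([(a + b) / 2 for a, b in zip(test_1, test_2)])
swc = 0.2 * between_sd

print(f"Typical error: {typical_error:.2f} cm")
print(f"Smallest worthwhile change: {swc:.2f} cm")
print("Can detect 'small' changes" if typical_error < swc else "Noise exceeds the SWC")
```

A variable whose typical error is smaller than the smallest worthwhile change has, in this sense, sufficient precision to detect "small" changes.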
62. Knowledge, attitudes and behaviours of top-level junior (under-19) rugby union coaches towards training the tackle (Sarembock, Martin, January 2014)
Background: The tackle in rugby union is a dynamic, high-impact contact situation that occurs frequently during matches and exposes players to a high risk of injury and muscle damage. The inability to tackle will result in opposition players gaining territory and possibly scoring points. Indeed, the ability to effectively engage in tackle contact has been associated with team success. While the risk of injury may always be present during these physical contests between the ball-carrier and tackler, coaching of proper techniques and skills may reduce the risk of injury and at the same time improve performance. With that said, little is known about the knowledge, attitudes and behaviours of rugby union coaches towards coaching the tackle. Therefore, the aim of this study was to assess coaches' knowledge, attitudes and behaviours towards coaching the tackle. Methods: The top 8 rugby-playing schools (Premier A Division) in the Western Province Rugby Union participated in the study (representing 100% of the entire population of top-level junior schools in the region). A questionnaire was used to assess coaches' knowledge, attitudes and reported behaviour. Tackle training behaviour was also observed over a period of 4 weeks at the start of the season. Results: Sixty-two percent of coaches rated proper tackle technique as very important for reducing the risk of injury and 75% of coaches rated proper tackle technique as very important for improving performance. The tackle was practiced in 16% (n=15) of the total practice sessions (n=96). Coaches did not emphasise safety during the tackle sessions. Tackle training was over-reported by 75% (n=5) of coaches during the 4-week observational period. Discussion/Conclusion: The majority of coaches are aware of the high risk of injury associated with the tackle. Most coaches believe that tackle technique can improve tackle performance and safety during the tackle event. Coaches develop new methods mostly through resources such as coaching colleagues and watching televised and live rugby matches. During the observed training period, however, only 15 tackle training sessions were recorded. It may be important to identify how much tackle training should occur during the pre-season and competition phases of the season to adequately prepare players for competition without increasing the risk of injury. The latest research on ways to reduce the risk of injury and improve performance in the tackle should also be disseminated through the appropriate channels that coaches are known to use. Tackle training guidelines should be based on scientific evidence, and these guidelines should outline how coaches need to design their training to meet their team requirements. Further research should identify which coaching behaviours can be used to effectively train tackle safety and tackle performance during training sessions. Keywords: Rugby union, tackling, coaching, injury prevention, attitude, knowledge, behaviour
63. Injury in elite rugby players during the Super 15 Rugby tournament (Thomson, Alan, January 2014)
Professional rugby union is a contact sport with a high risk of injury. The Super Rugby competition is a particularly demanding 16-week Southern Hemisphere tournament in which 15 teams compete and play international-level matches every week, which may be associated with an even higher risk of injury. The main objectives of this dissertation were 1) to review the epidemiology and risk factors of injuries in professional rugby union, with specific reference to the Super Rugby tournament (Part 1), and 2) to document the incidence and nature of time-loss injuries during the 2012 Super Rugby tournament (Part 2). Part 1: In this component of the dissertation, a comprehensive review of injuries during Super Rugby was undertaken. A search revealed only 3 studies that had been conducted during this competition; therefore, additional data were included from other studies on rugby union, where appropriate. Part 2: This component of the dissertation consists of a prospective cohort study conducted during the 2012 Super Rugby tournament, in which teams from Australia, New Zealand and South Africa participated. Participants consisted of 152 players from five South African teams. Team physicians collected daily injury data through a secure, web-based electronic platform. Data included the size of the squad, the type of day, main player position, whether it was a training or match injury, hours of play (training and matches), the time of the match injury, the mechanism of the injury, the main anatomical location of the injury, the specific anatomical structure involved, the type of injury, and the severity of the injury (days lost).
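As background to Part 2: rugby injury surveillance studies of this kind typically express time-loss injuries per 1000 player-hours of match or training exposure. The sketch below shows that standard calculation; the injury count and exposure figures are hypothetical, not data from this dissertation.

```python
# Minimal sketch of the standard surveillance metric: time-loss injuries
# expressed per 1000 player-hours of exposure. All values are hypothetical.

def incidence_per_1000_hours(injuries: int, exposure_hours: float) -> float:
    """Injury incidence = injuries / exposure hours x 1000."""
    return injuries / exposure_hours * 1000

# Example match exposure: 15 players on the field x 80-minute match x team-matches played.
team_matches = 16
match_exposure_hours = 15 * (80 / 60) * team_matches  # 320 player-hours

rate = incidence_per_1000_hours(injuries=28, exposure_hours=match_exposure_hours)
print(f"{rate:.1f} injuries per 1000 player-match-hours")  # 87.5 (hypothetical)
```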
64. Epidemiology and prevention of rugby injuries amongst schoolboy, senior club and provincial rugby players in the Western Cape (Upton, Patrick Anthony Howard, January 2000)
This thesis comprises a series of independent investigations examining rugby injuries occurring to players from under-14 to senior provincial level in the Cape Province (now the Western Cape). The first two studies report data aimed at gaining a more detailed understanding of rugby injuries in specific populations or under specific conditions, whilst the remainder of the thesis reports injury data from both a retrospective and a prospective epidemiological survey involving the same 3990 boys from 25 high schools. Following publication of data showing a progressive rise in the number of spinal cord injuries in the Western Cape, coupled with a sustained media attack on the attitudes of the (then) South African Rugby Board, certain experimental law changes were introduced to South African schoolboy rugby in 1990 and 1991. The purpose of the law changes was either to make the game safer or to make it more open and flowing, or both. Accordingly, the studies described in chapters 4-8 set out to analyse the effects of these law changes on the incidence and nature of rugby injuries. This was accomplished by comparing data with a similar study conducted in 1983 and 1984 in the same 25 schools (Roux, 1992). The study reported in chapter 2 determined whether the use of neoprene (thermal) pants might reduce the risk of hamstring injury amongst 60 senior club rugby players, all of whom had previously sustained a hamstring muscle tear. The rationale was that the few seasons prior to this 1992 study had been characterised by an increasing use of thermal or neoprene pants by rugby players, a practice which seemed to have evolved spontaneously and without any scientific assessment of its value. We concluded that the wearing of thermal pants can reduce the risk of hamstring injury during rugby; however, other risk factors for injury are probably more important. These include levels of pre-season physical fitness, correct warm-up and stretching procedures before activity and adequate rehabilitation before returning to activity following injury. The objective of the study reported in chapter 3 was to determine the influence of pre-season strength and endurance training on the risk of injury in rugby players from two South African provincial teams during the 1992 rugby season. Players from one province followed a supervised, scientifically designed physical training programme, while those from the other did not follow a structured programme. The findings of the study, the first to demonstrate the relationship between pre-season preparation and early-season injury, showed that inadequate pre-season endurance training is a major contributor to the high injury rate at the beginning of the season amongst provincial rugby players. Further, strength and endurance training are interrelated as risk factors: compared to players with adequate strength and endurance training, those with adequate strength training and insufficient endurance training are at greatest risk of injury, followed by players with insufficient strength and endurance training. It was also shown that contact practices 2 days after an inter-provincial match contributed more to an increased number of injuries than to success; that "niggling" injuries may develop into more serious injuries if players attempt to "play through" them; and that the lack of structured treatment and rehabilitation of an injury places players at risk of being re-injured.
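The neoprene-pants study in chapter 2 is, in essence, a comparison of injury risk between an exposed and an unexposed cohort. A minimal sketch of the underlying relative-risk calculation is shown below; the 2 × 2 counts are hypothetical and are not taken from the thesis.

```python
# Sketch of a relative-risk (risk ratio) calculation of the kind underlying the
# thermal-pants comparison. The counts below are hypothetical illustration only.

def relative_risk(injured_exposed: int, total_exposed: int,
                  injured_unexposed: int, total_unexposed: int) -> float:
    """Risk ratio: incidence proportion in wearers vs non-wearers."""
    risk_exposed = injured_exposed / total_exposed
    risk_unexposed = injured_unexposed / total_unexposed
    return risk_exposed / risk_unexposed

rr = relative_risk(injured_exposed=4, total_exposed=30,
                   injured_unexposed=9, total_unexposed=30)
print(f"Relative risk of hamstring injury with thermal pants: {rr:.2f}")
# A value below 1.0 would be consistent with a protective effect.
```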
65. The Test-Retest Reliability of Single Leg Jump Performance Using the Drift Protocol in Division I Baseball Pitchers (Bergquist, Amy, 01 January 2022)
The purpose of this study was to determine the test-retest reliability of the Drift protocol with and without utilizing an arm swing, and to establish whether it is an effective assessment tool for the single leg vertical jump. Thirteen male Division I baseball pitchers (18-35 yrs) completed four testing visits in which a single leg hopping protocol was assessed. Visit one consisted of the consent process and familiarization with the Drift protocol. Visits two, three, and four involved two trials of the protocol, with and without an arm swing. The protocol consisted of five single leg hops on both the stride and trail legs. The jump variables contact time and flight time demonstrated both "moderate to good" relative reliability (ICC = 0.633–0.847) and acceptable absolute reliability (CV = 3.6–14.1%). Compared to the trail leg, stride leg trials displayed greater relative and absolute reliability for contact time and flight time, as well as jump power (ICC = 0.692–0.847; CV = 3.6–14.1%). Further, arm swing trials demonstrated more acceptable relative and absolute reliability than trials without an arm swing for contact time and flight time, as well as jump height and jump power (ICC = 0.574–0.837; CV = 3.6–9.3%). Significant main effects of arm swing were found for jump height, jump power, ground contact time, jump flight time, and the average area covered (p < 0.001). Significant main effects of arm swing were also found for the asymmetry variables jump height, jump power and flight time (p < 0.05). The current data suggest that when evaluating single leg jump performance with the Drift protocol, the variables jump height, jump power, contact time, and flight time may be more reliable than the remaining average drift variables and total area covered in baseball pitchers.
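For readers unfamiliar with the two reliability statistics reported here, the sketch below shows how a two-way ICC(3,1) (relative reliability) and a coefficient of variation based on the typical error (absolute reliability) can be computed from test-retest data. The flight-time values are hypothetical, and the thesis may have used a different ICC model.

```python
import numpy as np

# Hypothetical flight times (s): rows = pitchers, columns = visit 1 and visit 2.
data = np.array([
    [0.512, 0.498],
    [0.547, 0.553],
    [0.601, 0.589],
    [0.478, 0.471],
    [0.533, 0.540],
])
n, k = data.shape
grand = data.mean()

# Two-way ANOVA sums of squares for a consistency ICC(3,1).
ss_rows = k * ((data.mean(axis=1) - grand) ** 2).sum()   # between-subjects
ss_cols = n * ((data.mean(axis=0) - grand) ** 2).sum()   # between-trials
ss_total = ((data - grand) ** 2).sum()
ms_rows = ss_rows / (n - 1)
ms_error = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
icc_3_1 = (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)

# Absolute reliability: typical error from trial-to-trial differences, as a CV%.
typical_error = np.std(data[:, 1] - data[:, 0], ddof=1) / np.sqrt(2)
cv_percent = typical_error / grand * 100

print(f"ICC(3,1) = {icc_3_1:.3f}, CV = {cv_percent:.1f}%")
```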
66. Medical complications during a community-based mass participation endurance running event – an investigation of the epidemiology and risk factors associated with medical complications, with recommendations for risk mitigation (Schwabe, Karen, 16 September 2021)
Background: The epidemiology and risk factors associated with medical complications, including life-threatening complications, during distance running events have not been well described. The aims of this research were to document the incidence of medical complications (study 1), determine risk factors associated with medical complications (studies 2 and 3), and develop and apply a pre-race medical screening tool to determine the prevalence of chronic disease in race entrants, using a risk stratification model (study 4). Design: Prospective studies. Setting: Two Oceans Marathon races (2008-2011) (studies 1-3) and race entrants (2012) (study 4). Participants: Studies 1-3: 65 865 race starters; 21.1 km runners (n = 39 511) and 56 km runners (n = 26 354). Study 4: 15 778 race entrants. Methods: Study 1: In all 4 years, race-day medical complications were recorded and subdivided by severity (serious/life-threatening/death), organ system and final diagnosis. Studies 2 and 3: Independent risk factors associated with all medical complications, severity and organ system involvement were determined in 21.1 km and 56 km runners using multivariate modeling. Study 4: A pre-race medical screening tool was developed, based on international pre-exercise medical screening guidelines, and administered to all race entrants (2012). The prevalence (%) of runners in four risk categories was determined. Results: The incidence (per 1000 race starters) of all and of serious/life-threatening medical complications was 8.27 and 0.56 respectively (study 1). Risk factors associated with medical complications were less race experience (56 km), slower running pace (56 km) and older age in female runners (21.1 km) (studies 2 and 3). 16.8% of runners were identified as requiring medical evaluation for suspected cardiac disease, with 3.4% reporting existing cardiovascular disease (CVD; very high risk) and 13.4% reporting multiple CVD risk factors (high risk) (study 4). Conclusion: The incidence of all and of serious/life-threatening medical complications across the 21.1 km and 56 km races is 1/121 and 1/1786 race starters respectively. Race experience, running pace and sex are risk factors for medical complications, and 16.8% of runners have suspected underlying cardiovascular disease. These data formed the basis for the implementation of pre-race medical screening and risk stratification, and the research lays the foundation for a future educational intervention programme to reduce the risk of medical complications in distance running and other endurance events.
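The conclusion restates the study 1 incidences as "1 in N" risks. The short sketch below reproduces that conversion using the incidences reported in the abstract (8.27 and 0.56 per 1000 race starters).

```python
# Convert an incidence per 1000 race starters to a "1 in N starters" figure.
def one_in_n(incidence_per_1000: float) -> int:
    return round(1000 / incidence_per_1000)

all_complications = 8.27   # per 1000 starters (all medical complications)
serious = 0.56             # per 1000 starters (serious/life-threatening)

print(f"All complications: 1 in {one_in_n(all_complications)} starters")   # 1 in 121
print(f"Serious complications: 1 in {one_in_n(serious)} starters")         # 1 in 1786
```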
67. High Risk Environmental Conditions Attenuate Distance, Speed, and Performance Efficiency Index in NCAA D1 Female Soccer Players (Furtado Mesa, Maxine, 01 January 2021)
PURPOSE: To evaluate the effects of environmental conditions on running performance and the performance efficiency index (Effindex). METHODS: Performance data recorded using Polar Pro sensors from eight collegiate female soccer players in nine matches during the 2019 competitive season were analyzed. Effindex and running performance, including total distance covered relative to minutes played (TDREL) and distance covered in five speed thresholds, were examined for indications of fatigue with rising environmental conditions, including ambient temperature and relative humidity. Matches were separated into three groups based on environmental condition risk: low (Low-Risk; n = 2 matches), moderate (Moderate-Risk; n = 3 matches), or high (High-Risk; n = 4 matches). Speed thresholds were grouped as follows: walking (WALKREL: 0.83–1.94 m/s), jogging (JOGREL: 1.94–3.05 m/s), low-speed running (LSRREL: 3.06–4.16 m/s), high-speed running (HSRREL: 4.17–5.27 m/s), and sprinting (SPRINTREL: 5.28+ m/s). RESULTS: TDREL was significantly lower in High-Risk conditions. WALKREL, JOGREL, LSRREL, HSRREL, SPRINTREL, and Effindex were significantly greater in Low-Risk conditions than in Moderate-Risk conditions. WALKREL, HSRREL, SPRINTREL, and Effindex were significantly greater in Low-Risk conditions than in High-Risk conditions. CONCLUSIONS: High-Risk environmental conditions significantly affect performance in female collegiate soccer players. Cooling and timing strategies are advised to mitigate decrements in performance.
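To make the speed-threshold analysis concrete, the sketch below shows one way per-sample sensor speeds could be binned into the zones listed above and normalised to minutes played. The sample values, the 10 Hz sampling rate, and the function name are illustrative assumptions, not details of the Polar Pro workflow used in the study.

```python
# Zone boundaries (m/s) taken from the abstract; sampling rate is an assumption.
SPEED_ZONES = {
    "WALK":   (0.83, 1.94),
    "JOG":    (1.94, 3.05),
    "LSR":    (3.06, 4.16),
    "HSR":    (4.17, 5.27),
    "SPRINT": (5.28, float("inf")),
}
SAMPLE_RATE_HZ = 10

def relative_distance_by_zone(speeds_m_s, minutes_played):
    """Distance (m) accumulated in each zone, divided by minutes played (m/min)."""
    dt = 1 / SAMPLE_RATE_HZ
    totals = {zone: 0.0 for zone in SPEED_ZONES}
    for v in speeds_m_s:
        for zone, (lo, hi) in SPEED_ZONES.items():
            if lo <= v < hi:
                totals[zone] += v * dt   # distance = speed x sample duration
                break
    return {zone: d / minutes_played for zone, d in totals.items()}

# Hypothetical snippet of instantaneous speeds (m/s) for a player with 90 minutes played.
print(relative_distance_by_zone([1.2, 2.5, 3.8, 4.6, 5.6], minutes_played=90))
```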
68. Effects of Post-Exercise Recovery Drink Composition on Subsequent Performance in Masters Class Athletes (Goldstein, Erica, 01 January 2022)
Carbohydrate (CHO) and carbohydrate-protein coingestion (CHO-P) have been shown to be equally effective for enhancing glycogen resynthesis and subsequent same-day performance when CHO intake is suboptimal (≤0.8 g/kg). Few studies have specifically examined the effect of isocaloric CHO vs CHO-P consumption on subsequent high-intensity aerobic performance with limited time to recover (≤2 hours) in masters class endurance athletes. Participants (n = 22) were assigned to consume one of three beverages during a 2-hour recovery period: PLA (electrolytes and water), CHO (1.2 g/kg bm), or CHO-P (0.8 g/kg bm CHO + 0.4 g/kg bm PRO). All beverages were standardized to one liter (~32 oz) of total fluid volume regardless of treatment group. One liter of a standard ready-to-drink sports beverage contains ~58 g of CHO; if a participant required more than 58 g of CHO, additional CHO powder was weighed in grams on a digital food scale and added to the existing liter of fluid to reach the total amount of CHO needed. During Visit 1, participants completed graded exercise testing (VO2peak; cycle ergometer). Familiarization (Visit 2) consisted of 5 × 4 min intervals at 70-80% of peak power output (PPO, watts) with 2 min of active recovery at 50 W, followed by a time-to-exhaustion (TTE) test at 90% PPO. The same high-intensity interval protocol with TTE was conducted pre- and post-beverage consumption during Visit 3. The ANCOVA indicated a significant difference among the group means for the posttest TTE values (F(2,18) = 6.702, p = .007, η² = .427) after adjusting for the pretest differences. Both CHO and CHO-P were effective in promoting an increase in TTE performance with limited time to recover in this sample of masters class endurance athletes.
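The beverage preparation described above is simple arithmetic on body mass. The sketch below illustrates it for a hypothetical 80 kg athlete; the dosing constants come from the abstract, while the function and its simplified handling of protein are illustrative assumptions.

```python
BASE_CHO_PER_LITRE = 58.0   # g CHO already in 1 L of the ready-to-drink beverage

def recovery_beverage(body_mass_kg: float, condition: str) -> dict:
    """Total CHO/PRO doses scaled to body mass, plus CHO powder added beyond the base drink."""
    doses = {
        "CHO":   {"cho_g_per_kg": 1.2, "pro_g_per_kg": 0.0},
        "CHO-P": {"cho_g_per_kg": 0.8, "pro_g_per_kg": 0.4},
        "PLA":   {"cho_g_per_kg": 0.0, "pro_g_per_kg": 0.0},
    }[condition]
    cho_total = doses["cho_g_per_kg"] * body_mass_kg
    pro_total = doses["pro_g_per_kg"] * body_mass_kg
    added_cho_powder = max(0.0, cho_total - BASE_CHO_PER_LITRE)
    return {"total CHO (g)": cho_total,
            "total PRO (g)": pro_total,
            "CHO powder added to 1 L (g)": round(added_cho_powder, 1)}

print(recovery_beverage(80, "CHO"))     # 96 g CHO -> 38 g powder added to the litre
print(recovery_beverage(80, "CHO-P"))   # 64 g CHO + 32 g PRO -> 6 g CHO powder added
```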
69. The Acute Effects of Continuous and Intermittent Blood Flow Restriction on Sprint Interval Performance and Muscle Oxygen Responses (Wizenberg, Aaron, 01 January 2022)
The purpose of this investigation was to examine the effects of intermittent and continuous blood flow restriction (BFR) during sprint interval training (SIT) on muscle mitochondrial function and perceived effort. Fifteen men volunteered to participate in this investigation and were randomly assigned to complete SIT with continuous BFR (CBFR), intermittent BFR (IBFR), and no BFR (NoBFR). Each SIT session consisted of 2 × 30 s maximal sprints on a cycle ergometer with a resistance of 7.5% of body mass. Peak power (PP), total work (TW), ratings of perceived exertion (RPE), sprint decrement score (Sdec), and muscle oxygen responses were measured during each sprint. Before (pretest) and after (posttest) the sprints, muscle mitochondrial function was assessed. There were similar reductions in TW from Sprint 1 to Sprint 2 (17,835.6 ± 966.2 to 12,687.2 ± 675.2 J) for all 3 conditions, and TW (collapsed across time) was lower for CBFR (14,320.7 ± 769.1 J) than for IBFR (15,548.0 ± 840.5 J) and NoBFR (15,915.4 ± 771.5 J). PP decreased to a greater extent from Sprint 1 to Sprint 2 during CBFR (25.5 ± 11.9%) and IBFR (23.4 ± 9.3%) than during NoBFR (13.4 ± 8.6%). There were no differences among conditions in Sdec (84.3 ± 1.7%, 86.1 ± 1.5%, and 87.2 ± 1.1% for CBFR, IBFR and NoBFR, respectively) or in RPE, which increased from Sprint 1 (8.5 ± 0.3) to Sprint 2 (9.7 ± 0.1). Muscle oxygen responses increased across time and were similar for IBFR and NoBFR, while changes in deoxyhemoglobin and total hemoglobin were greater for CBFR. Collectively, the findings of the present study indicated that applying BFR to maximal aerobic exercise can elicit acute physiological adaptations that may be superior with CBFR relative to IBFR and NoBFR.
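The Sdec values reported above (~84-87%) are consistent with expressing total work across the two sprints as a percentage of the "ideal" total (the best sprint repeated each time). A sketch of that calculation, using the grand-mean work values from the abstract, is shown below; the exact formula used in the study may differ.

```python
# Sprint decrement expressed as percentage of ideal total work maintained.
def sprint_decrement(work_per_sprint_j):
    """Total work across sprints relative to the best sprint repeated every time (%)."""
    total = sum(work_per_sprint_j)
    ideal = max(work_per_sprint_j) * len(work_per_sprint_j)
    return total / ideal * 100

sdec = sprint_decrement([17835.6, 12687.2])   # Sprint 1 and Sprint 2 grand means (J)
print(f"Sdec = {sdec:.1f}%")                  # ~85.6%, in line with the 84-87% reported
```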
70. Sex Based Differences in Muscle Quality Recovery Following One Week of Lower Limb Immobilization and Subsequent Retraining (Girts, Ryan, 01 January 2022)
The present project represents the second part of a two-phase clinical trial designed to comprehensively examine the effects of knee joint immobilization and recovery on skeletal muscle size, strength, and central nervous system plasticity in healthy males and females. The purpose of this study was to examine differences between sexes in the recovery of muscle quality, size, and strength in response to a resistance training-based rehabilitation program following one week of knee joint immobilization. Twenty-seven participants (males: n = 16, age = 22 ± 3 years, BM = 81.3 ± 14.8 kg, BMI = 25.0 ± 3.4 kg/m²; females: n = 11, age = 20 ± 1 years, BM = 61.3 ± 9.4 kg, BMI = 23.3 ± 2.1 kg/m²) underwent one week of knee joint immobilization followed by twice-weekly resistance training sessions designed to re-strengthen the left knee joint. Retraining sessions were conducted until participants could reproduce their pre-immobilization isometric maximal voluntary contraction (MVC) peak torque. Assessments of muscle quality, size, and strength were conducted prior to immobilization (Pre-Immobilization), immediately after immobilization (Post-Immobilization), and after retraining was finished (Post-Retraining). Results suggested that both sexes experienced negative changes in MVC peak torque, specific torque, echo intensity, and the extracellular-to-intracellular water (ECW/ICW) ratio, with females experiencing greater decrements in MVC peak torque and specific torque. The number of retraining sessions required was similar for males (median = 1, mean = 2.13) and females (median = 2, mean = 2.91). Following retraining, specific torque was the only "muscle quality" indicator that had fully recovered. This is the first study to examine sex differences in the recovery of muscle quality indices in response to a retraining program following lower-limb immobilization. The findings may have important implications for the development of evidence-based, sex-specific rehabilitation approaches following short-term knee joint immobilization.
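One of the "muscle quality" indices referred to above, specific torque, is commonly computed as MVC peak torque normalised to a measure of muscle size. The sketch below illustrates that calculation with hypothetical values; the exact size measure and normalisation used in this study may differ.

```python
# Specific torque: peak torque per unit of muscle size (here, ultrasound
# cross-sectional area). All values are hypothetical illustration only.

def specific_torque(peak_torque_nm: float, muscle_csa_cm2: float) -> float:
    """Isometric MVC peak torque normalised to muscle cross-sectional area (Nm/cm^2)."""
    return peak_torque_nm / muscle_csa_cm2

pre  = specific_torque(peak_torque_nm=240.0, muscle_csa_cm2=85.0)
post = specific_torque(peak_torque_nm=215.0, muscle_csa_cm2=83.0)
print(f"Pre: {pre:.2f} Nm/cm^2, Post-immobilization: {post:.2f} Nm/cm^2 "
      f"({(post - pre) / pre * 100:+.1f}%)")
```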