Motivation and behaviour change in Parkrun participants in Western Cape, South Africa. Chivunze, Edgar. January 2020.
Background: Participation in physical activity is a cost-effective way to reduce the risk of over 25 chronic diseases. Despite the well-documented dangers of physical inactivity, more than a quarter of the South African population remains inactive. One initiative aimed at increasing engagement in physical activity is parkrun, a free weekly 5 km running/walking event. The number of parkrun participants in South Africa has increased since its inception. An understanding of the motivations for participation and of health-related behaviour change is important for organisers and public health professionals seeking to increase participation in this weekly mass-participation event. Aim: The aim of this study was to describe the motivations for participation in parkrun and the physical activity-related behaviour changes among parkrun participants registered in the Western Cape Province of South Africa. Specific objectives: The specific objectives of this study were: to identify the demographic characteristics of parkrun participants in the Western Cape Province of South Africa; to describe the motivations for participating in parkruns in the Western Cape Province of South Africa; and to investigate physical activity-related behaviour changes resulting from participation in parkruns in South Africa's Western Cape Province, based on pre- and post-participation physical activity levels. Methods: A cross-sectional study was performed on 1787 parkrun participants registered at 40 parkrun sites in the Western Cape Province of South Africa. Participants from 37 of these sites were invited via the parkrun South Africa mailing list to complete an online survey. Participants from the remaining three sites completed paper-based questionnaires at the parkrun sites.
The questionnaire included sections on demographic characteristics, including employment status, gym membership and educational level; physical activity programmes before joining parkrun; and changes in physical activity after joining parkrun. Results: The median age of participants was 50 years (IQR: 38-59). Female participants formed 53.3% of the sample. Approximately 80% of participants were educated to diploma or degree level (Technikon/College/University), and participants reported high employment rates (71%). Fifty-one percent of the sample were gym members. A total of 64.8% reported having very good to excellent health. A total of 86.1% reported health/fitness as the biggest motivation for participation in parkrun, and a further 71.8% of the sample were motivated by enjoyment. A safe environment (58.7%), earning Discovery Health Vitality Points (46.4%), stress relief (40.8%), cost (40.4%) and socialisation (39.4%) were other common motivations. After joining parkrun, 24% of participants took up new physical activity programmes, with a further 24% of participants increasing their weekly volume of physical activity. More female participants (50.9%) than male participants (44.7%) increased their physical activity levels or took up new physical activity programmes (χ² = 7.331, p = 0.007). Running was the most widely adopted physical activity, attracting 18.2% of the sample as new runners. Conclusion: We found that parkrun in the Western Cape is mostly taken up by participants in their sixth decade of life, with half of them being overweight. Most participants were physically active before joining parkrun, with more than half exceeding recommended global physical activity levels. These results are consistent with previous studies in Australia and the UK. We also found health/fitness to be the biggest motivation for parkrun participation, followed by enjoyment and the safe environment provided at parkrun sites.
Running and walking were the most common activities taken up by participants after joining parkrun. Further prospective studies are recommended to establish cause-and-effect relationships and to describe health-related physical activity behaviour changes in detail.
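The sex difference in behaviour change reported in this abstract was tested with a chi-squared test (χ² = 7.331, p = 0.007). As a rough illustration of that test, the 2×2 contingency table below is reconstructed from the reported percentages (n = 1787; 53.3% female; 50.9% of females and 44.7% of males changed their activity); these counts are approximations, not the study's raw data:

```python
from scipy.stats import chi2_contingency

# Approximate counts reconstructed from the reported percentages;
# the study's actual raw table is not given in the abstract.
females_changed, females_unchanged = 485, 467
males_changed, males_unchanged = 373, 462

table = [[females_changed, females_unchanged],
         [males_changed, males_unchanged]]

# correction=False: the reported statistic appears to be uncorrected
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")  # roughly chi2 = 7.0, p = 0.008
```

The reconstructed statistic (≈7.0) lands close to the reported 7.331; rounding in the published percentages accounts for the gap.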
A comparison of muscle damage, soreness, morphology, T2 changes and running performance following an ultramarathon race. Van Niekerk, Wanda. January 2016.
Background: Exercise-induced muscle damage collectively describes the response to strenuous or unaccustomed exercise. It is well established that endurance running causes muscle damage. Indirect indicators of muscle damage include the loss of muscle strength, increased blood levels of muscle proteins such as creatine kinase, and delayed onset muscle soreness. Magnetic resonance imaging has been used to gain insight into the underlying mechanisms associated with exercise-induced muscle damage. The most common approach has focused on changes in transverse (T2) relaxation times after exercise. Given that inflammation and oedema are proposed as reasons for the changes in T2 times, there may also be changes in morphological measurements such as muscle volume and peak cross-sectional area. Few studies have utilised MRI morphological measurements to assess the effects of exercise-induced muscle damage, and there is a lack of evidence regarding changes in muscle morphology after endurance running. Aim: The aim of this study was to investigate changes in transverse (T2) relaxation times and muscle morphology in endurance runners after a 90 km ultramarathon race.
Specific objectives: (a) To determine the time course of recovery of muscle pain and plasma creatine kinase activity after a 90 km ultramarathon race; (b) to determine changes in 5 km time trial performance in an experimental group of endurance runners who took part in a 90 km ultramarathon race compared to a control group of endurance runners who did not; (c) to compare changes in muscle morphology (volume and average cross-sectional area) and T2 relaxation times of the quadriceps and hamstrings between the experimental and control groups; and (d) to evaluate potential relationships between indicators of muscle damage (plasma creatine kinase levels and muscle pain measurements), morphological muscle changes, and T2 relaxation times in the experimental and control groups. Methods: This was a descriptive, correlational study involving secondary analysis of previously collected data. No new participants were recruited. Participants were allocated to groups based on whether they took part in a 90 km ultramarathon. The experimental group (n = 11) completed the ultramarathon. The control group (n = 11) consisted of endurance runners who ran a minimum of 60 km per week but did not take part in the ultramarathon. Magnetic resonance images were taken seven days before and 10-15 days after the ultramarathon as part of an earlier study. The image analysis included the digital segmentation and reconstruction of the rectus femoris, combined quadriceps and combined hamstrings muscle groups. Muscle volume, peak cross-sectional area and T2 relaxation times were calculated.
These measurements were correlated with muscle pain and plasma creatine kinase activity measurements obtained during the initial study. Results: There was a significant difference in hamstrings muscle volume between the experimental and control groups, with the experimental group having a significantly lower muscle volume (p = 0.03). There was also a significant positive relationship between T2 relaxation time and plasma CK activity (r = 0.74; p = 0.04). Conclusion: Changes in muscle morphology in endurance runners are evident after a 90 km ultramarathon. The significant relationship between T2 relaxation times and plasma creatine kinase activity suggests that T2 relaxation time may be used as a non-invasive direct indicator of exercise-induced muscle damage.
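The reported association between T2 relaxation time and plasma CK activity (r = 0.74; p = 0.04) is a standard Pearson correlation. A minimal sketch of the calculation, using entirely hypothetical paired measurements (the study's raw data are not reproduced in the abstract):

```python
from scipy.stats import pearsonr

# Hypothetical paired observations, one per runner:
# T2 relaxation time (ms) and plasma CK activity (U/L).
t2_ms = [35.0, 38.0, 40.0, 42.0, 45.0, 47.0, 50.0, 52.0]
ck_ul = [210.0, 340.0, 410.0, 620.0, 800.0, 950.0, 1300.0, 1480.0]

r, p = pearsonr(t2_ms, ck_ul)
print(f"r = {r:.2f}, p = {p:.4f}")
```

With real data the same two-line call yields the r and p values quoted in the Results; the invented arrays above simply illustrate a strong positive relationship.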
Matching the density of the rugby playing population to the medical services available in the Eastern Cape, South Africa. Moore, Simon. January 2017.
Background: Rugby Union is a popular contact sport played worldwide. The physical demands of the game are characterised by short-duration, high-intensity bouts of activity, with collisions between players, often while running fast. The head, neck, upper limb and lower limb are common sites of injury. Although catastrophic injuries are rare in rugby, they do occur. Immediate action must be taken within a 4-hour window after a catastrophic injury to minimise the resulting damage. This implies that a well-functioning medical infrastructure should be available to anticipate injuries of this nature and provide treatment for the best possible outcome. Currently there is no information system or map in South Africa describing the medical infrastructure in relation to the places where clubs and schools practise and play matches. Such a system may assist in the early and immediate transfer of injured players to the appropriate treatment facility, minimising the damaging effects caused by delays in medical treatment. Therefore, the aims of this study were to: (i) investigate and report on the location, distance and travel time from rugby playing/training venues in the Eastern Cape to the nearest specialist hospital where a player may receive adequate treatment for a catastrophic injury; and (ii) report on the safety equipment available at these playing venues to facilitate this transport in a safe manner. Methods: All the clubs (n = 403) and schools (n = 264) that played rugby in the Eastern Cape were accounted for in the study. However, only 15 clubs and 35 schools were included in the analysis, as these had their own facilities for training and playing matches. Distances between clubs/schools and the nearest public, private and specialised hospital (able to treat catastrophic injuries) were measured.
In addition, driving time was estimated between the clubs/schools and the nearest specialised hospital to determine whether an injured player could be transported within four hours to receive medical treatment for a catastrophic injury. Medical safety equipment was also audited (according to information provided by SA RUGBY) for each club and school to identify whether they met the minimum safety standards set by SA RUGBY. Results: Twenty schools were identified as being less than one hour away from the nearest hospital equipped to deal with catastrophic rugby injuries; nine schools were between 1-2 hours away and six schools were between 2-3 hours away. All schools were within 100 km driving distance of the nearest public hospital; 28 schools were within 100 km driving distance of the nearest private hospital. For seven schools, the nearest private hospital was between 100 and 150 km away. Fourteen schools had spinal boards, eleven had neck braces, ten had harnesses, nine had change rooms, five had floodlights, and twenty-two had trained first aiders. Six schools were located 2-3 hours away and were at higher risk due to a lack of first aid equipment. Ten clubs were less than an hour away from the nearest hospital equipped to treat catastrophic injuries; two clubs were between 1-2 hours away, two were between 2-3 hours away and one was between 3-4 hours away. All clubs were within 100 km driving distance of the nearest public hospital. Nine clubs were within 100 km driving distance of the nearest private hospital, three clubs were between 100 and 150 km from the nearest private hospital and three were over 150 km away. Twelve clubs had spinal boards, eleven had neck braces, ten had harnesses, ten had change rooms, seven had floodlights and twelve had trained first aiders.
One club was classified as high risk, as it was located 2-3 hours away from the nearest hospital equipped to manage a catastrophic injury and had no first aid equipment. Discussion/Conclusion: No clubs or schools included in the study were more than four hours away from a hospital equipped to deal with a catastrophic rugby injury. Therefore, any player who suffers a catastrophic injury should be able to reach treatment within the 4-hour window period. Another finding was that not all clubs or schools possessed the minimum equipment required to host training or a rugby match. SA RUGBY can take appropriate action with these clubs and schools to ensure that they maintain the safest possible practice and do not put their players at increased risk.
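The study's banding of venues by driving time against the 4-hour treatment window can be sketched as a simple classification. The band labels below are illustrative, not the study's own terminology:

```python
def drive_time_band(hours: float) -> str:
    """Classify a venue's driving time to the nearest specialised
    hospital against the 4-hour catastrophic-injury window."""
    if hours < 1:
        return "under 1 hour"
    if hours < 2:
        return "1-2 hours"
    if hours < 3:
        return "2-3 hours"
    if hours <= 4:
        return "3-4 hours (within window)"
    return "over 4 hours (exceeds window)"

print(drive_time_band(2.5))  # 2-3 hours
```

Applied to the audit data, every club and school in the analysis fell into one of the first four bands, consistent with the conclusion that no venue exceeded the 4-hour window.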
Cross-sectional study to determine whether there are central nervous system changes in rugby players who have sustained recurrent ankle injuries. Rawlinson, Alice Jane. January 2017.
Background: Rugby is a popular game played around the world and has one of the highest recorded injury rates in sport. The literature identifies the ankle as one of the most commonly injured sites in sport, and this trend carries through in rugby too, with lateral ankle sprains predominating. Recurrent ankle injuries are commonly reported in the literature and carry a high economic and social burden. Many intrinsic and extrinsic risk factors are credited with causing lateral ankle injuries, but to date the literature does not show conclusive evidence for the management and prevention of recurrent injuries. A new area of research that has not previously been explored is the neurological influence on recurrent injury. Central processing is a recognised form of learning seen in adults and children during normal development and training, and more recently acknowledged in injury settings. This phenomenon has also been seen in abnormal states of development such as neglect and chronic pain. Aim: The purpose of this study was to investigate whether there are changes in the central nervous system of rugby players with recurrent ankle injuries. Methods: An experimental and a control group were used for this cross-sectional study. Participants were recruited from the Golden Lions Rugby Union. Forty-six players in total were recruited: the control group consisted of 22 players, and the recurrent injury group of 24 players. A Medical and Sports History Questionnaire was administered, as well as a battery of four physical test procedures. The questionnaire asked participants to provide information on demographics, playing position, training and playing history, current general health, current and previous injury history, and specifically ankle injury history.
The four testing procedures were: body image testing, laterality testing, two-point discrimination testing and pressure-pain threshold testing. Results: Between-group and within-group comparisons were made for the control and recurrent injury groups. The Medical and Sports History Questionnaire indicated that the recurrent injury group participated in a significantly shorter preseason training period than the control group. The laterality testing within-group analysis showed a significant difference: the injured side had a slower recognition time [1.4 (1.3-1.6)] compared to the uninjured side [1.3 (1.15-1.5); p < 0.01]. Pressure-pain threshold testing produced a significant difference for the control group at the ATFL and PTFL test sites; the PTFL site also demonstrated a significant difference in the between-group comparison. The two-point discrimination tests performed on both the recurrent injury group and the control group using within-group comparison showed significant differences over the anterior talofibular ligament between the affected and non-affected limbs. The between-group test results were also significant for the injured versus control side at the ATFL site. The affected side showed a poorer ability to differentiate between one and two points, requiring a greater separation before two points were distinguished from one. Similarly, body image testing showed a significant within-group difference in total area drawn for the recurrent injury group only: the drawing of the affected foot was significantly larger than that of the unaffected side. The control group showed no differences between sides.
Conclusion: The study suggests a relationship between central nervous system changes and recurrent ankle injuries in this sample of professional rugby players. The data indicate that preseason length is a factor to be considered in recurrent ankle injuries. The clinical tests focused specifically on central nervous system changes also produced some illuminating results. The recurrent injury group demonstrated significant differences between injured and uninjured sides in both two-point discrimination testing of the ATFL and in the body image drawing of the foot and ankle, whereas the control group yielded no differences between sides for these same tests. The significant results from pressure-pain threshold and laterality testing further indicate central nervous system involvement in recurrent injury.
Gastrocnemius muscle structure and function in habitually resistance-trained marathon runners and traditionally running-trained marathon runners: a comparative analysis. Ellis, Tracy. January 2017.
Background: Marathon running involves running long distances and is associated with a high prevalence of running-related injuries. The calf has been identified as one of the most commonly injured structures during running. Running training overloads muscle and stimulates physiological adaptation, creating a training response. Specific adaptations in the metabolic and physiological function of a muscle may be further achieved through specificity of exercise training. Resistance training programmes are commonly implemented to enhance specific muscle strength and endurance, and are effective methods of enhancing performance and preventing injury. While evidence-based guidelines for resistance training exist, it is unclear whether runners routinely incorporate evidence-based resistance training into marathon training programmes. If runners are performing habitual resistance training, it is also unknown whether that training is of sufficient magnitude or intensity to induce dose-related responses in calf muscle structure or function. Aim: The aim of this study was to evaluate gastrocnemius muscle structure and function in marathon runners who performed habitual resistance training in addition to regular endurance training, compared to marathon runners who performed traditional endurance running training only. Specific objectives: • To describe the demographic and training characteristics of habitually resistance-trained marathon runners and traditionally running-trained marathon runners. • To determine whether there were differences in gastrocnemius endurance, power and flexibility between habitually resistance-trained marathon runners and traditionally running-trained marathon runners. • To evaluate whether there were differences in gastrocnemius muscle structure and architecture in habitually resistance-trained marathon runners compared to traditionally running-trained marathon runners.
• To establish whether there were any differences in the number of calf injuries sustained by habitually resistance-trained marathon runners and traditionally running-trained marathon runners. Methods: Healthy male runners between 20 and 50 years of age were included in the study. Participants were required to have completed at least one marathon in the 12-month period prior to the study. Runners forming the "traditionally running-trained" group were required to be participating in regular endurance running training only. Runners in the "habitually resistance-trained" group were required to be performing resistance training in addition to regular endurance running training. Runners with any injury at the time of recruitment, or who reported a calf injury within the six-month period prior to the study, were excluded, as were participants with any medical abnormalities detected during screening. Eight marathon runners participating in habitual resistance training plus standard running training and eleven marathon runners participating in traditional running training only were recruited. Runners who met the criteria attended two testing sessions at least three days apart. During the first session, informed consent was obtained and the Physical Activity Readiness Questionnaire (PAR-Q) was completed to ensure participants could safely complete physical testing. A questionnaire was completed to determine relevant training and injury history. Body mass, height and the sum of seven skinfolds were recorded. Muscle architecture measurements, including fascicle length, pennation angle, thickness and volume, were performed using ultrasound imaging. Participants were then familiarised with the physical testing procedures. In the second testing session, calf muscle flexibility and endurance were assessed, and isokinetic testing was performed for the left and right triceps surae.
Results: There were no significant differences in descriptive characteristics between groups. Participants in the habitually resistance-trained group performed an average of two hours (range 0.5-2.5 hours) of resistance training per week, over one to four sessions. Participants combined upper and lower body training in the form of circuit training, body weight training, and core and proprioceptive training. Resistance training sessions were performed at varied load intensities (light to high) according to an estimated 1RM. Participants in the habitually resistance-trained group had completed a significantly greater number of 21.1 km races than the traditionally running-trained group (p < 0.05), but there were no other differences in running training or competition history between groups. There were also no significant differences in the number of reported injuries between groups. Average pennation angle was significantly greater in the habitually resistance-trained group than in the traditionally running-trained group (p < 0.05). No other significant differences in architectural measurements were identified, and there were no significant differences in calf muscle flexibility, strength, power or endurance between the two groups. However, the small sample size limits the interpretation of the study findings. Conclusion: Wide variability in habitual resistance training patterns was identified. While pennation angle was significantly greater in the habitually resistance-trained group, no differences were identified in any other architectural measurement, or in calf muscle strength, power, endurance or flexibility, between groups. However, one of the key findings emerging from this study is the variability of resistance training practices in endurance runners, and that these practices were not aligned with current evidence-based guidelines for resistance training.
Resistance training has a critical role in enhancing endurance running performance and in injury prevention and rehabilitation. Future research should investigate the knowledge, attitudes and practices of endurance runners regarding resistance training, to facilitate the development of appropriate education interventions and to effectively disseminate evidence-based training guidelines to lay communities.
Prevalence and risk factors of chronic diseases of lifestyles in endurance runners. Language, Sarah. 19 February 2019.
Background: Chronic diseases of lifestyle (CDL) are associated with high rates of morbidity and mortality in South Africa. Although the prevalence of CDL has been established in the general population, there is limited research on the prevalence of, and risk factors for, CDL in individuals taking part in regular physical activity. Endurance running is a popular sport with growing levels of participation. Anecdotally, many individuals who participate in endurance running do not undergo formal pre-participation cardiovascular screening. It is also unclear whether endurance runners are meeting the World Health Organisation's recommended weekly moderate to vigorous intensity physical activity hours, or whether they have other risk factors for CDL. It is therefore important to establish the prevalence and risk factors of CDL in this active population. Aim and objectives: The aim of this study was to determine the prevalence of CDL and the associated risk factors in endurance runners in South Africa. The specific objectives were: (a) to determine the presence of risk factors for the development of CDL, including body mass index (BMI), waist circumference, body fat percentage, blood pressure, blood glucose, blood cholesterol, smoking history, dietary intake and weekly physical activity time, in South African endurance runners; (b) to determine the presence of non-modifiable risk factors for the development of CDL, namely age and income, in South African endurance runners; (c) to determine whether South African endurance runners are fulfilling the World Health Organisation's recommended weekly moderate to vigorous intensity physical activity hours; and (d) to assess whether there are any relationships between running characteristics, namely weekly training hours, running speed and level of competition, and the risk factors for CDL. Methods: This study had an analytical, cross-sectional design.
Two hundred participants between 18 and 69 years of age, who reported endurance running as their main sport and had run at least three kilometres twice a week for the past year, were included in the study. Participants were excluded if they were pregnant or within six months post-partum, had an injury requiring a minimum of two weeks' rest, or did not complete the questionnaire or physical testing component of the testing process. Participants were recruited through local running clubs and running races in Nelspruit, Mpumalanga and Cape Town, Western Cape. All participants gave written informed consent and completed a questionnaire covering socio-demographic characteristics, running training characteristics, the International Physical Activity Questionnaire (short form), the modified Borg scale of perceived exertion, and the five-a-day community evaluation tool. Body mass, stature, skinfolds and waist circumference were assessed. Blood pressure was measured using an automatic blood pressure monitor. A finger-prick test was used to determine random blood glucose and cholesterol concentrations. Participants were requested to fast for three hours prior to testing to standardise the test in a non-fasted state (20). Results: One hundred and twenty-four (62%) participants were found to have at least one risk factor for CDL. A high BMI was the most common risk factor (n = 90; 45%). Nineteen participants (9.5%) did not meet the recommended duration of 150 minutes of physical activity per week. Seven percent of female participants (n = 7) smoked, which is equivalent to the female population average for South Africa. Multiple risk factors were identified in fifty-seven (28.5%) participants, ranging from two risk factors (n = 37; 18.5%) to six risk factors (n = 1; 0.5%). The majority of participants had no prior medical diagnosis of CDL or of risk factors for CDL. The overall self-reported prevalence of a medically diagnosed CDL was 5.5% (n = 11).
Type 2 diabetes was the most commonly diagnosed CDL (n = 6; 3%). Waist circumference, systolic blood pressure and cholesterol were significantly elevated in the older age group. There were no significant differences in risk factors for CDL according to income status. Female runners had significantly higher average sitting times than male runners. In addition, participants with a BMI ≥ 25 kg.m-2 had significantly slower 10 km running speeds and lower average weekly training distances than participants with a BMI within the normal range. Conclusion: A high prevalence of risk factors for CDL was identified in South African endurance runners. The majority of endurance runners in this sample are fulfilling the World Health Organisation's recommended weekly moderate to vigorous intensity hours. However, the endurance runners in this study remain at risk of developing a CDL due to the presence of other risk factors. The knowledge and awareness of risk factors for CDL among South African endurance runners need to be further investigated. Healthcare professionals need to improve the prevention and management of risk factors for CDL through education and the promotion of healthy lifestyles. A stronger emphasis on the prevention of risk factors for CDL in South African endurance runners is needed.
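The BMI threshold used in this abstract (≥ 25 kg.m-2, the conventional overweight cut-off) is simply mass divided by height squared. A minimal sketch, with illustrative values rather than study data:

```python
def bmi(mass_kg: float, height_m: float) -> float:
    """Body mass index: mass (kg) divided by height (m) squared."""
    return mass_kg / height_m ** 2

def has_bmi_risk_factor(mass_kg: float, height_m: float) -> bool:
    """Flag the BMI risk factor using the >= 25 kg.m^-2 cut-off
    applied in the study."""
    return bmi(mass_kg, height_m) >= 25.0

# Illustrative example: an 80 kg runner of 1.75 m
print(round(bmi(80.0, 1.75), 1))        # 26.1
print(has_bmi_risk_factor(80.0, 1.75))  # True
```

This is the classification under which 45% of the sample (n = 90) carried the BMI risk factor.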
Sport-related Concussion Incidence and Mechanism of Injury in Male and Female Players at the South African Youth Week Rugby Tournaments: 2011-2018. Cardis, Sheenagh. 20 July 2022.
Background: Rugby is a popular international sport for male and female youth and adult players (6). Injury incidence, including sport-related concussion (SRC), is high in youth rugby (7, 8). This is concerning, as youth are more vulnerable to SRC and take longer to recover from it than adults (9, 10). Females are also more susceptible to sustaining a SRC, take longer to recover, and have a higher incidence of SRC complications than males (11-15). Most research has focused on SRC in adult male players; there are fewer studies on youth, in particular female youth. Further research into SRC in youth male and female players is thus required. Aim: The aim of this study was to determine the incidence and mechanism of SRC among youth male and female rugby players at the 2011 to 2018 and 2015 to 2018 South African Rugby Union Youth Week Tournaments, respectively. Specific objectives: (a) To determine the incidence of SRC among boys U13-U18 and girls U16-U18 players; (b) to describe the SRC mechanism of injury in boys U13-U18 and girls U16-U18 players; (c) to determine whether a difference in SRC incidence exists between boys U13-U18 and girls U16-U18 players, and also between age groups; (d) to determine whether a difference in the mechanism of SRC exists between boys U13-U18 and girls U16-U18 players, and also between age groups; and (e) to describe factors associated with SRC in boys U13-U18 and girls U16-U18 players. Methods: The study had a retrospective, epidemiological design and reviewed SRC injury data collected at the 2011-2018 South African Rugby Union Youth Week Rugby Tournaments. SRC injury data for boys were collected at the 2011-2018 tournaments; data for girls were collected only at the 2015-2018 tournaments, as the girls' tournaments were only introduced in 2015. Results: Data from 266 SRC events were analysed in the study.
Overall SRC incidence was 7.0 SRC per 1000 match playing hours (95% CI, 6.2-7.8). Overall SRC incidence for boys was 6.9 SRC per 1000 match playing hours (95% CI, 6.0-7.8) and for girls 7.9 SRC per 1000 match playing hours (95% CI, 5.3-9.9). There was no significant difference in SRC incidence between boys and girls. SRC incidence from 2011-2018 was 10.7 (95% CI, 8.2-13.1), 7.5 (95% CI, 5.5-9.6) and 5.3 (95% CI, 3.4-6.5) SRC per 1000 match playing hours for the boys U13, U16 and U18 age groups respectively. SRC incidence from 2015-2018 was 7.2 (95% CI, 3.7-10.2) and 7.9 (95% CI, 4.7-10.9) SRC per 1000 match playing hours for the girls U16 and U18 age groups respectively. There was a significantly higher incidence of SRC in the boys U13 age group when compared with the boys U18 age group (IRR 2.0; 95% CI, 1.5-2.7; p=0.00014): boys U13 players were twice as likely to sustain a SRC as their U18 counterparts. The tackle (65%) and ruck (20%) were responsible for the majority of SRC. Boys U13 players were significantly more likely to sustain a SRC from a tackle than boys U18 players (p=0.01), and boys U16 players had a significantly greater incidence of SRC resulting from the ruck than boys U18 players (p=0.02). Overall, the most common primary mechanisms of SRC were front-on tackles (27%) and collisions (18%). Boys U16 players had significantly higher rates of SRC due to front-on tackles and due to collisions than boys U18 players (p=0.00007 for both). Similarly, boys U13 players had a significantly higher incidence of SRC due to collisions than boys U18 players (p=0.003). Factors associated with SRC incidence were tournament day and the use of headgear. SRC was more likely to occur on day two than on day four (p=0.0008), day five (p=0.0002) or day six (p<0.001).
Players who did not wear headgear were more likely to sustain a concussion than those who did (p<0.001). Conclusion: Overall SRC incidence at the 2011 to 2018 South African Rugby Union Youth Week Tournaments was 7.0 SRC per 1000 match playing hours. This study is unique in reporting SRC incidence for youth female players; the overall SRC incidence for the girls U16 and U18 groups was 7.7 SRC per 1000 match playing hours. As no significant differences were found in the incidence, injury event and mechanism of SRC between male and female players, similar injury prevention strategies can be implemented for both groups. Injury prevention strategies should focus on teaching safe contact technique in the tackle and ruck, with particular attention to boys U13 players, in whom SRC incidence was highest. Strategies should also focus on teaching boys U13 and U16 players how to avoid collisions, and teaching boys U16 players how to execute safe front-on tackles and rucks. Further research should identify which aspects of the tackle and ruck result in SRC so that more tailored and specific injury prevention strategies can be implemented.
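The incidence rates and the U13 vs U18 incidence rate ratio reported above follow from standard Poisson formulas applied to event counts and exposure hours. A minimal sketch in Python, using hypothetical counts chosen so the IRR comes out at 2.0 (the thesis's raw data are not given in this abstract):

```python
import math

def incidence_per_1000h(events, exposure_hours):
    """Point estimate and normal-approximation 95% CI for an incidence
    rate expressed per 1000 match playing hours (Poisson counts)."""
    rate = events / exposure_hours * 1000
    se = math.sqrt(events) / exposure_hours * 1000  # SE of a Poisson count
    return rate, rate - 1.96 * se, rate + 1.96 * se

def incidence_rate_ratio(events_a, hours_a, events_b, hours_b):
    """IRR of group A relative to group B, with a 95% CI on the log scale."""
    irr = (events_a / hours_a) / (events_b / hours_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(irr) - 1.96 * se_log)
    hi = math.exp(math.log(irr) + 1.96 * se_log)
    return irr, lo, hi

# Hypothetical counts: 80 SRC in 7500 U13 hours vs 40 SRC in 7500 U18 hours.
rate, lo, hi = incidence_per_1000h(80, 7500)
irr, irr_lo, irr_hi = incidence_rate_ratio(80, 7500, 40, 7500)  # irr = 2.0
```

The same two functions reproduce every rate and ratio quoted in the abstract once the true event counts and match exposure are substituted.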
|
48 |
Training loads, injury profiles and illness in elite South African rugby players Barnes, Curt 12 July 2022 (has links) (PDF)
Background Professional Rugby Union is a popular international team sport and has one of the highest reported incidences of injury and illness across sporting codes. The Super Rugby tournament is played annually between professional Rugby Union teams and is one of the most competitive sports tournaments in the world. The demanding nature of the tournament has been associated with high rates of injury and illness, but the relationship between training loads and injury and illness profiles is unclear. The Super Rugby tournament is therefore a platform to further investigate injury, illness and training load patterns within Rugby Union, and epidemiological data on training loads, injury profiles and illness patterns assist the development of preventative measures. Aim The aim of this study was to assess the relationships between training loads, injury profiles and illness rates in elite South African rugby players competing in the 2017 Super Rugby tournament. Specific objectives (a) To determine the incidence of training and match injuries during pre-season training, and early and late competition, during the 2017 Super Rugby tournament; (b) To determine the incidence of illness during pre-season training, and early and late competition, during the 2017 Super Rugby tournament; (c) To determine the anatomical site, type, mechanism and time-loss of injuries sustained during pre-season training, and early and late competition, during the 2017 Super Rugby tournament; and (d) To determine potential associations between internal and external training loads and injury and illness, respectively. Methods A descriptive, observational surveillance study was conducted on the 2017 Super Rugby tournament. Thirty-nine adult participants were recruited from one South African team over a complete season, including pre-season, early and late competition. Data were collected by the team medical personnel, who routinely collected data on a daily basis.
Training load data included squad size, training or match day, the duration of training or matches, and internal and external training load measures for training and matches. Injury data included the participants' ages, injury counts, the type of injury, the main and specific anatomical location, and the mechanism and severity of injury. Illness data included illness counts, the bodily system affected, symptoms and cause of illness, the specific diagnosis and time-loss. Results The overall incidence of injury was 12.8 per 1000 player hours. The majority (48.8%) of injuries occurred in the early competition phase. The incidence of match injuries (241.0 per 1000 player hours) was significantly higher than that of training injuries (3.3 per 1000 player hours). The lower limb sustained the greatest proportion of injuries (62.5%), and muscle or tendon injuries accounted for 64.9% of all injuries. The tackle accounted for 28.8% of all injuries, and 37.5% of all injuries were of 'moderate' severity. The proportion of players who sustained a time-loss injury was 76.9% (n = 30), and 25.6% (n = 10) of players sustained a time-loss injury severe enough to prevent eight days or more of participation in training or matches. The overall incidence of illness was 1.8 per 1000 player days. The proportion of players who acquired an illness was 28.3% (n = 11). Acute respiratory tract infections (28.6%) were the most common specific diagnosis, and a large majority of illnesses (64.3%) did not result in time-loss. A significant negative correlation between injury and internal training loads was detected in the pre-season phase (r = -0.34, p = 0.03). There were no significant correlations between external training load and injury incidence, and none between internal or external training loads and illness incidence. No significant odds ratios were demonstrated between internal and external acute-to-chronic ratios and injury or illness risk.
Conclusion The incidence of match injuries in this study was significantly higher than previously reported incidence rates in the Super Rugby tournament. The profiles of match and training injuries, and the anatomical location, type, mechanism and severity of injuries, are similar to previous studies. Illness rates were significantly lower than reported in previous studies. Internal training load and injury were significantly correlated in the pre-season phase. Further studies are required to determine the relationship between training loads and injury and illness over consecutive seasons and in multiple teams.
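The internal and external acute-to-chronic ratios examined above are typically computed as a rolling ratio of recent to longer-term average load. A minimal sketch, assuming the commonly used 7-day/28-day windows and hypothetical session-load values (the abstract does not specify the windows or load metric used in the study):

```python
def acute_chronic_ratio(daily_loads):
    """Acute:chronic workload ratio: mean load of the most recent 7 days
    divided by the mean load of the most recent 28 days."""
    if len(daily_loads) < 28:
        raise ValueError("need at least 28 days of load data")
    acute = sum(daily_loads[-7:]) / 7
    chronic = sum(daily_loads[-28:]) / 28
    return acute / chronic

# Hypothetical loads (arbitrary units): a stable 3-week block
# followed by a spike in the final week.
loads = [400] * 21 + [600] * 7
ratio = acute_chronic_ratio(loads)  # 600 / 450, i.e. about 1.33
```

In practice this ratio would be recomputed daily per player for each internal (e.g. session-RPE) and external (e.g. distance) load measure before testing associations with injury or illness.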
|
49 |
Does the use of upper leg compression garments aid performance and reduce post-race Delayed Onset Muscle Soreness (DOMS)? Kabongo, Ken 21 October 2022 (has links) (PDF)
Introduction: Despite the lack of scientific knowledge on the physiological and biomechanical effects of wearing compression garments, there has been an increase in the use of these garments in endurance running. The purpose of this study was to compare performance, pain and thigh circumference changes in endurance runners who wore upper leg compression garments with those of runners who did not, within the same marathon race. Methods: A randomised controlled intervention study was conducted in endurance runners (n=18) participating in the 2019 Winelands Marathon (42.2 km). The compression garment group (n=10) ran the race wearing upper leg compression garments while the control group (n=8) did not; the compression garment group wore the garments during the marathon only. Various outcome measures of perceived exercise-induced muscle damage (EIMD) and running performance were assessed three days before, immediately after and two days after the race. Three days prior to the race, mid-thigh circumference was measured. Immediately post-race, mid-thigh circumference, Visual Analogue Scale (VAS) pain ratings and a Likert scale for determination of muscle soreness were assessed, and race performance times were recorded. Two days post-race, the mid-thigh circumference, VAS pain rating and Likert scale measurements were repeated. Results: VAS pain ratings for the hamstring (compression garment 2.50 vs control group 4.00; p=0.04), knee flexion (2.50 vs 5.00; p=0.02) and hip extension (2.50 vs 4.00; p=0.04) differed significantly between the compression garment and control groups immediately post-race.
VAS pain ratings for the hamstring (compression garment 0.00 vs control group 1.00; p=0.04), knee flexion (1.00 vs 2.00; p=0.02) and hip extension (1.00 vs 2.50; p=0.04) also differed significantly between the groups two days post-race. There were no statistically significant differences in any other outcome measures (i.e. the Likert scale for determination of muscle soreness, mid-thigh circumference and race performance) between the compression garment and control groups. Conclusion: The use of upper leg compression garments is a recovery ergogenic aid that improves post-race VAS pain ratings, and the results suggest a protective effect on the hamstring muscles during the recovery phase. However, since the differences were minor and there were no statistically significant differences in race performance or thigh circumference, the practical advantage for a runner recovering from a marathon is likely to be small.
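The abstract does not name the statistical test behind the group comparisons above; for small samples of ordinal VAS data, a nonparametric rank test such as the Mann-Whitney U is a common choice. A self-contained sketch of the U statistic with average ranks for ties, applied to hypothetical VAS values (not the study's data):

```python
def mann_whitney_u(x, y):
    """Mann-Whitney U statistic (smaller of U1, U2) for two
    independent samples, assigning average ranks to tied values."""
    combined = sorted((v, g) for g, vals in ((0, x), (1, y)) for v in vals)
    values = [v for v, _ in combined]
    ranked = []  # (rank, group) pairs
    i = 0
    while i < len(values):
        j = i
        while j < len(values) and values[j] == values[i]:
            j += 1
        avg = (i + 1 + j) / 2  # average of ranks i+1 .. j
        for k in range(i, j):
            ranked.append((avg, combined[k][1]))
        i = j
    r1 = sum(rank for rank, g in ranked if g == 0)
    n1, n2 = len(x), len(y)
    u1 = r1 - n1 * (n1 + 1) / 2
    return min(u1, n1 * n2 - u1)

# Hypothetical VAS pain ratings: garment group vs control group.
u = mann_whitney_u([2.5, 2.0, 3.0], [4.0, 5.0, 4.5])  # 0.0: no overlap
```

A p-value would then be read from the exact U distribution (or a normal approximation for larger samples); the sketch stops at the statistic itself.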
|
50 |
Identifying risk factors contributing to the development of shoulder pain and injury in male, adolescent water polo players Jameson, Yale 20 October 2022 (has links) (PDF)
Water polo is a fast-growing adolescent sport that consists of swimming, defending and overhead shooting in an aquatic environment. The high demands placed on the shoulder by these tasks are proposed to cause the high injury incidence reported in the sport. The novelty of this research rests in its clinically valuable contribution to understanding shoulder injury aetiology in adolescent water polo players as overhead throwing athletes. The overall aim of this thesis was to explore the musculoskeletal profile of the male adolescent water polo player's shoulder and the intrinsic factors associated with shoulder injury risk. An overview of the literature (Chapter 2) explores the biomechanics of water polo, including swimming and overhead throwing; the musculoskeletal adaptations of overhead throwing in water polo compared to other overhead sports; and the epidemiology of shoulder injury in water polo players relative to other overhead sports. Due to the absence of a consensus-based definition of injury in water polo, comparison of existing quality epidemiological studies in the sport was limited. Additionally, although a limited number of studies have proposed potential risk factors for shoulder injury in water polo players, significant correlations are yet to be found. As with other overhead sports, the water polo shoulder is prone to injury due to the generation of high force during a modified upright swimming posture, a repetitive swimming stroke and overhead throwing at high velocities. Male adolescent water polo players were recruited for this study. Chapter 3 describes the adolescent water polo player's shoulder musculoskeletal profile and its association with shoulder injury prevalence throughout a single water polo season. The musculoskeletal variables included pain provocation, range of motion, strength, flexibility and shoulder stability tests, which have been used previously in overhead athletes to investigate injury prevention and performance.
There were three steps in the data collection process. Firstly, informed consent and assent; demographic, competition, training and injury history; and a shoulder-specific functional questionnaire were obtained from participants. Secondly, a battery of pre-season musculoskeletal tests was performed, including anthropometry, pain provocation, glenohumeral and upward scapular range of motion, glenohumeral and scapular muscle strength, glenohumeral flexibility and shoulder stability measurements. Thirdly, at the end of the season participants completed an injury report and training load questionnaire. Participants who experienced shoulder pain, with or without medical management, were categorised into the injured group, and those who did not were categorised as uninjured. Chapter 3 documents the adolescent water polo players' shoulder musculoskeletal profile, shoulder injury prevalence and the association between these intrinsic risk factors and injury. Specifically, adolescent water polo players presented with significant side-to-side asymmetry in the lower trapezius (p = 0.01), upward scapular rotation ROM at 90° glenohumeral elevation (p = 0.03), glenohumeral internal and external rotation ROM (p = 0.01), glenohumeral internal and external rotation strength (p = 0.05 and p = 0.01 respectively) and the pectoralis minor index (p = 0.01). Twenty-four participants (49%) sustained a shoulder injury during the season, with the dominant shoulder more commonly affected (54.2%). The most common aggravating factors were throwing (41.7%) and shooting (20.8%). Although the injured group had significantly lower scores on the pre-season shoulder-specific functional questionnaire (p = 0.01) and significantly greater upward scapular rotation at 90° glenohumeral elevation on the dominant shoulder (p = 0.01) than the uninjured group, no factors were significantly associated with increased injury risk.
In conclusion, the findings suggest that male adolescent water polo players are a high-risk population for shoulder injury. It is suggested that improving players', coaches' and parents' health literacy, particularly regarding the shoulder, and incorporating preventative exercises into pre-season conditioning programmes, targeting modifiable risk factors and side-to-side asymmetry, may reduce the prevalence of shoulder injury in this sporting population. While this research contributes to the epidemiology of shoulder injuries in water polo players, further research is needed to continue to report on injury incidence and associated risk factors, particularly training and workload characteristics, in the water polo population.
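Side-to-side asymmetry of the kind reported in Chapter 3 is often expressed as a percentage difference relative to the dominant side; the exact index used in the thesis is not given in this abstract. A minimal sketch under that assumption, with hypothetical strength values:

```python
def asymmetry_index(dominant, non_dominant):
    """Percentage side-to-side asymmetry relative to the dominant side.
    Positive values mean the dominant side measures higher."""
    return (dominant - non_dominant) / dominant * 100

# Hypothetical external rotation strength (N): dominant vs non-dominant.
asym = asymmetry_index(180, 162)  # 10.0 % deficit on the non-dominant side
```

The same one-line index applies to any paired measure in the test battery (rotation ROM, scapular muscle strength, pectoralis minor length), which makes it a convenient screening quantity for pre-season profiling.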
|