1

Incidence of musculoskeletal injuries in professional dancers

Brooker, Heather January 2020 (has links)
Background: Professional ballet dancers rely on high levels of discipline, perfection and mobility to achieve the fluid, controlled lines of movement presented on the stage. Dancers undergo long hours of strenuous, repetitive training, which increases the risk of developing overuse or traumatic injuries and may compromise the longevity of dancers' careers. Relevant research, particularly in the South African context, is needed to provide recommendations on the intrinsic and extrinsic factors contributing to musculoskeletal injuries in professional ballet dancers. Aim: The aim of this study was to determine the incidence of musculoskeletal injuries and their associated risk factors over a three-month period in adult female professional ballet dancers in South Africa. Specific Objectives: The specific objectives of this study were: • To determine the incidence of traumatic and overuse injuries per 1000 dance hours over a three-month training and performance period in South African female professional ballet dancers; • To determine the relationships between a) Functional Lower Extremity Evaluation (FLEE) scores and injury incidence; b) intrinsic factors (amenorrhoea; body mass index; skinfold measurements; caloric intake) and injury incidence; and c) extrinsic factors (training hours; performance hours) and injury incidence respectively, in South African female professional ballet dancers. Methods: This study had a prospective, descriptive design. Eighteen female dancers were recruited from professional dance companies in the Gauteng, Western Cape and North West provinces of South Africa. Data were collected over a three-month period and included a subjective questionnaire, three-day food diary, skinfold measurements and the Functional Lower Extremity Evaluation (FLEE). Injuries were reported using an injury reporting form over the three-month period. Results: Participants had an average age of 22.1 ± 3.0 years. 
The dancers had an average BMI of 21.4 ± 2.1 kg.m⁻², LBM of 41.7 ± 4.9 kg and body fat percentage of 24.7% ± 2.9%. Injury incidence was 3.3 injuries per 1000 dance hours, with a total of 4605.58 hours reported overall. Of the 15 injuries reported, 13 occurred in the lower limb, with eight in the ankle and foot. Overuse injuries accounted for 93.3% of the total injuries, with only one traumatic injury reported. None of the descriptive characteristics was associated with increased injury risk. The average caloric intake of 1810.0 ± 503.7 calories, while lower than what is recommended for female athletes, also showed no significant relationship to injury. There were also no significant associations between pre-injury FLEE measurements or training loads and injury incidence over the course of the study. Conclusion: An overall injury incidence of 3.3 injuries per 1000 dance hours was found in professional female ballet dancers in South Africa, which is higher than the injury incidences identified in previous studies in high-income countries. With regard to injury profile, overuse injuries were far more prevalent than traumatic injuries in this population (93.3% versus 6.7% of reported injuries). We were unable to identify any intrinsic or extrinsic risk factors associated with injury incidence; however, we recognise the limitations of the small sample size in this study. With a high injury incidence and inconclusive results on injury risk factors, there is a clear need for further research in the field of injury prevention in professional ballet dancing. This study also identified a strong need for further research in South African dance companies to facilitate injury prevention and management in South Africa.
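The incidence rate reported above is a simple exposure-normalised calculation: injuries divided by total exposure hours, scaled to 1000 hours. A minimal sketch using the counts reported in this abstract (the function name is illustrative):

```python
def injuries_per_1000_hours(n_injuries: int, total_hours: float) -> float:
    """Exposure-normalised injury incidence: injuries per 1000 hours of activity."""
    return n_injuries / total_hours * 1000

# Counts reported in the abstract: 15 injuries over 4605.58 dance hours.
incidence = injuries_per_1000_hours(15, 4605.58)
print(round(incidence, 1))  # → 3.3
```

Normalising by exposure is what makes incidence comparable across studies with different training and performance volumes.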
2

The relationship between performance (tournament progression), daily stress and perceived exertion in male participants of professional squash tournaments

Montanus, Munro January 2016 (has links)
Squash is a popular sport that is played by over 15 million people in 120 countries. It requires extreme levels of fitness and skill to play proficiently. Because squash is a high-impact, fast sport that relies on consistency, strength and skill, players often experience stress. This stress is mainly due to the intensity of the matches, but also due to the short duration of the tournaments, which places a lot of pressure on the participants to do well. Stress in sport has been shown to be a critical component in the performance of an individual athlete as well as in team sports. Stress in sport may be categorised as competitive, organisational or acute. Not being able to cope with stress may have varied effects on athletes. These include increased anxiety and aggression; decreased enjoyment and self-esteem; and, most importantly, a decrease in performance expectations and performance difficulties. Furthermore, if an athlete believes he or she cannot resolve the demands of the competitive environment, negative physical and emotional responses can affect performance. The ability to compete in the presence of different stressors is thus necessary for an athlete to perform at his or her best. Aim and objectives: The specific objectives were to establish whether a) anthropometric and demographic characteristics, b) daily stress as measured by the Daily Analysis of Life Demands for Athletes (DALDA) and c) rate of perceived exertion (RPE) as measured by the Borg Scale were associated with competition performance as measured by winning/losing games in national squash tournaments.
3

Investigation of the impact of compression garments on endurance running performance and exercise induced muscle damage in the lower leg

Geldenhuys, Alda Grethe January 2018 (has links)
Introduction: Compression garment utilisation is very popular among runners despite the relative lack of consensus in the literature regarding a beneficial impact. Methods: A randomised controlled experimental study was conducted in healthy, uninjured endurance runners (n=41) participating in the Old Mutual Two Oceans 56 km race. The experimental group (n=20) trained for six weeks and participated in the race wearing below-knee compression garments while the control group (n=21) did not. Participants were tested on four occasions for various markers of exercise induced muscle damage (EIMD) and running performance. Six weeks prior to the race, baseline ultrasound scans of the medial gastrocnemius and mid-calf and figure-of-8 ankle circumference measurements were performed. Shortly prior to the race, these measurements were repeated in addition to a countermovement jump (CMJ) test. Immediately following the race, circumference measurements and CMJ testing were repeated in addition to pain ratings on the visual analogue scale (VAS). Race performance times were also obtained. Two days following the race, the ultrasound scans, circumference measurements and VAS pain ratings were repeated. Results: Ankle circumference measurements increased significantly less (p=0.01, Cohen's d=0.9) in the experimental group from immediately after the race until two days post-race compared to the control group. There were no further statistically significant changes over time in any other objective outcome measure (i.e. mean mid-calf circumference, medial gastrocnemius mean muscle thickness and mean pennation angle, mean CMJ height and estimated peak power output), nor in race performance, between the experimental and control groups. Selected pain ratings were statistically significantly worse in the experimental group. Muscle thickness and pennation angles were significantly greater in the control group compared to the experimental group two days following the race. 
Conclusion: There were limited indications of a beneficial impact of compression garments, with minor improvements in ankle circumference measurements, but no further significant effects related to EIMD were detected. Furthermore, no ergogenic impact was detected. Based on the results of the study, there is limited evidence to support the continued utilisation of commercially available below-knee compression garments during running.
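The effect size quoted for the ankle-circumference finding (Cohen's d = 0.9) is the standardised mean difference between the two groups. A minimal sketch of the pooled-SD form of Cohen's d, with illustrative numbers only (not the study's raw data):

```python
import math

def cohens_d(mean1: float, mean2: float, sd1: float, sd2: float,
             n1: int, n2: int) -> float:
    """Cohen's d for two independent groups, using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Illustrative values: a 0.45 cm group difference with 0.5 cm SDs, for groups
# of 20 and 21 runners (matching the study's group sizes, not its measurements).
d = cohens_d(0.45, 0.0, 0.5, 0.5, 20, 21)
print(round(d, 1))  # → 0.9
```

By the usual convention (0.2 small, 0.5 medium, 0.8 large), d = 0.9 is a large effect, which is why the authors highlight it despite the otherwise null findings.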
4

Motivation and behaviour change in Parkrun participants in Western Cape, South Africa

Chivunze, Edgar January 2020 (has links)
Background: Participation in physical activity is a cost-effective way to reduce the risk of over 25 chronic diseases. Despite the many dangers of physical inactivity, more than a quarter of the South African population remains inactive. One initiative aimed at increasing engagement in physical activity is parkrun, a free weekly 5 km running/walking based activity. There has been an increase in the number of parkrun participants in South Africa since its inception. An understanding of the motivation for participation and health related behaviour change is important for organisers and public health professionals to increase participation in this weekly mass participation event. Aim: The aim of this study was to describe the motivations for participation in parkrun and physical activity related behaviour changes among parkrun participants registered in the Western Cape Province of South Africa. Specific objectives: The specific objectives of this study were: to identify demographic characteristics of parkrun participants in the Western Cape Province of South Africa; to describe the motivations for participating in parkruns in the Western Cape Province of South Africa; and to investigate physical activity related behaviour changes as a result of participating in parkruns in South Africa's Western Cape Province, based on pre- and post-participation physical activity levels. Methods: A cross-sectional study was performed on 1787 parkrun participants registered at 40 parkrun sites in the Western Cape Province of South Africa. Participants from 37 of these sites were invited to participate in an online survey via the parkrun South Africa mailing list. Participants from the remaining three parkrun sites responded on paper-based questionnaires at the parkrun sites. 
The questionnaire included sections on demographic characteristics, including employment status, gym membership and educational level; physical activity programmes before joining parkrun; and changes in physical activity after joining parkrun. Results: The median age of participants was 50 (IQR: 38-59). Female participants formed 53.3% of the sample. Approximately 80% of participants were educated to diploma or degree level (Technikons/College/University), and participants reported high employment rates (71%). Fifty-one percent of the sample were gym members. A total of 64.8% reported having very good to excellent health. A total of 86.1% reported health/fitness as the biggest motivation for participation in parkrun. Another 71.8% of the sample were motivated by enjoyment. A safe environment (58.7%), earning Discovery Health Vitality Points (46.4%), stress relief (40.8%), cost (40.4%) and socialisation (39.4%) were other common motivations among the sample. After joining parkrun, 24% of participants took up new physical activity programmes, with a further 24% of participants increasing their weekly volume of physical activity. More female participants (50.9%) than male participants (44.7%) increased their physical activity levels or took up new physical activity programmes (χ² = 7.331, p = 0.007). Running was the most widely adopted new physical activity, attracting 18.2% of the sample as new runners. Conclusion: We found that parkrun in the Western Cape is mostly taken up by participants in their sixth decade of life, with half of them being overweight. Most participants were physically active before joining parkrun, with more than half exceeding recommended global physical activity levels. Similar results were described in previous studies in Australia and the UK. We also found health/fitness to be the biggest motivation for parkrun participation, followed by enjoyment and the safe environment provided at parkrun sites. 
Running and walking are the most common activities taken up by participants after joining parkrun. Further prospective studies are recommended to determine cause and effect models and to describe health related physical activity behaviour changes in detail.
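The sex difference reported above was tested with a Pearson chi-square test on a 2×2 contingency table (sex × whether activity increased). A minimal sketch of that statistic, using hypothetical counts back-calculated from the reported percentages rather than the study's raw table:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    (no continuity correction)."""
    (a, b), (c, d) = table
    n = a + b + c + d
    rows = [a + b, c + d]
    cols = [a + c, b + d]
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            observed = table[i][j]
            expected = rows[i] * cols[j] / n
            chi2 += (observed - expected) ** 2 / expected
    return chi2

# Hypothetical counts reconstructed from the abstract's percentages:
# roughly 485 of 952 women and 373 of 835 men increased their activity.
stat = chi_square_2x2([(485, 467), (373, 462)])
print(round(stat, 2))  # close to the study's reported chi-square of 7.331
```

The small gap between this reconstruction and the reported 7.331 comes from rounding the published percentages back to counts.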
5

A comparison of muscle damage, soreness, morphology, T2 changes and running performance following an ultramarathon race

Van Niekerk, Wanda January 2016 (has links)
Background: Exercise induced muscle damage collectively describes the response to strenuous or unaccustomed exercise. It is well-established that endurance running causes muscle damage. Indirect indicators of muscle damage include the loss of muscle strength, increased levels of muscle proteins, such as creatine kinase, in the blood and delayed onset muscle soreness. Magnetic resonance imaging has been used to gain insight into the underlying mechanisms associated with exercise induced muscle damage. The most common approach has focused on changes in transverse (T2) relaxation times after exercise. Given that inflammation and oedema are proposed as reasons for the changes in T2 times, there may be changes in morphological measurements such as muscle volume and peak cross sectional area. Few studies have utilised MRI morphological measurements to assess the effects of exercise induced muscle damage, and there is a lack of evidence regarding changes in muscle morphology after endurance running. Aim: The aim of this study was to investigate changes in transverse (T2) relaxation times and muscle morphology in endurance runners after a 90 km ultramarathon race. 
Specific objectives: (a) To determine the time course of recovery of muscle pain and plasma creatine kinase activity after a 90 km ultramarathon race; (b) to determine changes in 5 km time trial performance in an experimental group of endurance runners that took part in a 90 km ultramarathon race compared to a control group of endurance runners that did not; (c) to compare changes in muscle morphology (volume and average cross sectional area) and T2 relaxation times of the quadriceps and hamstrings between the experimental and control groups; and (d) to evaluate potential relationships between indicators of muscle damage (plasma creatine kinase levels and muscle pain measurements), morphological muscle changes, and T2 relaxation times in both groups. Methods: This was a descriptive, correlational study that involved secondary analysis of previously collected data. No new participants were recruited for the study. Participants were allocated to groups based on whether they took part in a 90 km ultramarathon. The experimental group (n = 11) completed a 90 km ultramarathon. The control group (n = 11) consisted of endurance runners who ran a minimum of 60 km.wk⁻¹ but did not take part in the ultramarathon. Magnetic resonance images were taken seven days before and 10-15 days after the ultramarathon as part of an earlier study. The magnetic resonance image analysis included the digital segmentation and reconstruction of the rectus femoris, combined quadriceps and combined hamstrings muscle groups. Muscle volume and peak cross sectional area were calculated, as well as T2 relaxation times. 
These measurements were correlated with muscle pain and plasma creatine kinase activity measurements obtained during the initial study. Results: There was a significant difference in hamstrings muscle volume between the experimental and control groups: the experimental group had a significantly lower muscle volume than the control group (p = 0.03). There was also a significant positive relationship between T2 relaxation time and plasma CK activity (r = 0.74; p = 0.04). Conclusion: Changes in muscle morphology in endurance runners are evident after a 90 km ultramarathon. The significant relationship between T2 relaxation times and plasma creatine kinase activity suggests that T2 relaxation time may be used as a non-invasive indicator of exercise induced muscle damage.
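The relationship quoted above (r = 0.74) is a Pearson product-moment correlation between T2 relaxation time and plasma CK activity. A minimal sketch of the coefficient, with illustrative values only (not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two
    equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative values: T2 relaxation times (ms) against plasma CK activity (U/L)
# for five hypothetical runners; both series rise together, giving a strong
# positive correlation.
t2 = [32.1, 33.4, 35.0, 36.2, 38.5]
ck = [180, 260, 310, 420, 510]
print(round(pearson_r(t2, ck), 2))
```

With only 11 runners per group, as in this study, a correlation of this size can still carry a p-value near the 0.05 boundary, which matches the reported p = 0.04.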
6

Matching the density of the rugby playing population to the medical services available in the Eastern Cape, South Africa

Moore, Simon January 2017 (has links)
Background: Rugby Union is a popular contact sport played worldwide. The physical demands of the game are characterised by short duration, high intensity bouts of activity, with collisions between players, often while running fast. The head, neck, upper limb and lower limb are common sites for injury. Although catastrophic injuries are rare in rugby, they do occur. Immediate action (within a 4-hour window) must be taken after a catastrophic injury to minimise the damage incurred. This implies that a well-functioning medical infrastructure should be available to anticipate injuries of this nature and provide treatment for the best possible outcome. Currently there is no system or map in South Africa describing the medical infrastructure in relation to the places where clubs and schools practise and play matches. Such a system may assist in the early and immediate transfer of injured players to the appropriate treatment facility, which would minimise the damaging effects caused by delays in medical treatment. Therefore the aims of this study were to: (i) investigate and report on the location, distance and travel time from rugby playing/training venues in the Eastern Cape to the nearest specialist hospital where a player may be able to receive adequate treatment for a catastrophic injury, and (ii) report on the safety equipment available at these playing venues to facilitate this transport in a safe manner. Methods: All the clubs (n=403) and schools (n=264) that played rugby in the Eastern Cape were accounted for in the study. However, only 15 clubs and 35 schools were included in the analysis, as they had their own facilities for training and playing matches. Distances between clubs/schools and the nearest public, private and specialised hospital (able to treat catastrophic injuries) were measured. 
In addition, driving time was estimated between the clubs/schools and the nearest specialised hospital to determine whether an injured player could be transported within four hours to receive medical treatment for a catastrophic injury. Medical safety equipment was also audited (according to information provided by SA RUGBY) for each club and school to identify whether they were meeting the minimum safety standards set by SA RUGBY. Results: Twenty schools were identified as being less than one hour away from the nearest hospital equipped to deal with catastrophic rugby injuries; nine schools were between 1-2 hours away and six schools were between 2-3 hours away. All schools were within 100 km driving distance of the nearest public hospital; 28 schools were within 100 km driving distance of the nearest private hospital. For seven schools, the nearest private hospital was between 100 and 150 km away. Fourteen schools had spinal boards, eleven had neck braces, ten had harnesses, nine had change rooms, five had floodlights, and twenty-two had trained first aiders. Six schools located 2-3 hours away were at higher risk due to a lack of first aid equipment. Ten clubs were less than an hour away from the nearest hospital equipped to treat catastrophic injuries; two clubs were between 1-2 hours away, two were between 2-3 hours away and one was between 3-4 hours away. All clubs were within 100 km driving distance of the nearest public hospital. Nine clubs were within 100 km driving distance of the nearest private hospital, three clubs were based between 100 and 150 km from the nearest private hospital and three were based over 150 km away. Twelve clubs had a spinal board, eleven clubs had neck braces, ten clubs had harnesses, ten clubs had change rooms, seven clubs had floodlights and twelve clubs had trained first aiders. 
One club was classified as high risk, as it was located 2-3 hours away from the nearest hospital equipped to manage a catastrophic injury and had no first aid equipment. Discussion/Conclusion: No clubs or schools included in the study were more than four hours away from a hospital equipped to deal with a catastrophic rugby injury. Therefore, any player who suffers a catastrophic injury should be able to reach treatment within the 4-hour window period. Another finding was that not all clubs or schools possessed the minimum equipment required to host training or a rugby match. SA RUGBY can take appropriate action with these clubs and schools to ensure that they maintain the safest possible practice and do not put their players at increased risk.
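The screening logic described above amounts to banding each venue by driving time to the nearest specialised hospital and flagging venues that combine a long transfer with missing first aid equipment. A minimal sketch of that classification (the field names and example venues are illustrative, not the study's data):

```python
from dataclasses import dataclass

@dataclass
class Venue:
    name: str
    drive_hours: float   # estimated driving time to nearest specialised hospital
    has_first_aid: bool  # meets the minimum first aid equipment standard

def time_band(hours: float) -> str:
    """Band a venue by driving time, mirroring the <1h / 1-2h / 2-3h / 3-4h
    groupings reported in the Results."""
    if hours < 1:
        return "<1h"
    if hours < 2:
        return "1-2h"
    if hours < 3:
        return "2-3h"
    if hours < 4:
        return "3-4h"
    return ">4h (outside the treatment window)"

def high_risk(venue: Venue) -> bool:
    """Flag venues two or more hours away that also lack first aid equipment,
    as in the study's high-risk classification."""
    return venue.drive_hours >= 2 and not venue.has_first_aid

venues = [Venue("Club A", 0.8, True), Venue("Club B", 2.4, False)]
for v in venues:
    print(v.name, time_band(v.drive_hours), "HIGH RISK" if high_risk(v) else "ok")
```

A registry built this way would let SA RUGBY re-run the audit whenever a venue's equipment status or nearest-hospital data changes.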
7

Cross sectional study to determine whether there are central nervous system changes in rugby players who have sustained recurrent ankle injuries

Rawlinson, Alice Jane January 2017 (has links)
Background: Rugby is a popular game played around the world and has one of the highest recorded injury rates in sport. The literature identifies the ankle as one of the most commonly injured areas in sport, and this trend carries through to rugby, with lateral ankle sprains predominating. Recurrent ankle injuries are commonly reported in the literature and carry a high economic and social burden. Many intrinsic and extrinsic risk factors are credited with causing lateral ankle injuries, but to date the literature does not show conclusive evidence for the management and prevention of recurrent injuries. A new area of research that has not previously been explored is the neurological influence on recurrent injury. Central processing is a recognised form of learning seen in adults and children during normal development and training, and more recently acknowledged in injury settings. This phenomenon has also been seen in abnormal states of development such as neglect and chronic pain. Aim: The purpose of this study was to investigate whether there are changes in the central nervous system of rugby players with recurrent ankle injuries. Methods: Experimental and control groups were used for this cross sectional study. Participants were recruited from the Golden Lions Rugby Union. Forty-six players in total were recruited. The control group consisted of 22 players, and the recurrent injury group consisted of 24 players. A Medical and Sports History Questionnaire was administered, as well as a battery of four physical test procedures. The questionnaire asked participants to provide information regarding demographics, playing position, training and playing history, current general health, current and previous injury history, and specifically ankle injury history. 
The four testing procedures were: body image testing, laterality testing, two point discrimination testing and pressure-pain threshold testing. Results: Between group and within group comparisons were made for the control and recurrent injury groups. The Medical and Sports History Questionnaire results indicated that the recurrent injury group participated in a significantly shorter preseason training period compared to the control group. The laterality testing within group analysis showed a significant difference: the injured side had a slower recognition time [1.4 (1.3-1.6)] compared to the uninjured side [1.3 (1.15-1.5); p<0.01]. Pressure pain threshold testing produced a significant difference for the control group at the ATFL and PTFL test sites. The PTFL site also demonstrated a significant difference in the between group comparison analysis. The two point discrimination tests performed on both the recurrent injury group and the control group using within group comparison showed significant differences at the anterior talofibular ligament (ATFL) between the affected and non-affected limbs. The between group test results were also significant for the injured versus control side at the ATFL site. The affected side showed a poorer ability to differentiate between one and two points, requiring a greater distance before two points were distinguished from one. Similarly, body image testing showed significant differences in the within group comparison of total area drawn for the recurrent injury group only. In the recurrent injury group, the drawing of the affected foot was significantly larger than the drawing of the unaffected side. The control group showed no differences between sides. 
Conclusion: The study suggests a relationship between central nervous system changes and recurrent ankle injuries in this sample of professional rugby players. The data indicate that preseason length is a factor to be considered in recurrent ankle injuries. The clinical tests focussed specifically on central nervous system changes also produced some illuminating results. The recurrent injury group demonstrated significant differences between injured and uninjured sides in both two point discrimination testing of the ATFL and in the body image drawing of the foot and ankle. The control group, in contrast, did not yield any differences between sides for these same tests. The significant pressure pain and laterality testing results also indicate central nervous system involvement in recurrent injury.
8

Gastrocnemius muscle structure and function in habitually resistance-trained marathon runners and traditionally running-trained marathon runners: a comparative analysis

Ellis, Tracy January 2017 (has links)
Background: Marathon running involves running long distances and is associated with a high prevalence of running-related injuries. The calf has been identified as one of the most commonly injured structures during running. Running training causes an overload on muscle and stimulates a physiological adaptation to create a training response. Specific adaptations in metabolic and physiological function of a muscle may be further achieved through specificity of exercise training. Resistance training programmes are commonly implemented to enhance specific muscle strength and endurance, and are effective methods of enhancing performance and preventing injury. While evidence-based guidelines for resistance training exist, it is unclear whether runners are routinely incorporating evidence-based resistance training into marathon training programmes. If runners are performing habitual resistance training, it is also unknown whether the resistance training is of sufficient magnitude or intensity to induce dose-related responses in calf muscle structure or function. Aim: The aim of this study was to evaluate gastrocnemius muscle structure and function in marathon runners who performed habitual resistance training in addition to regular endurance training, compared to marathon runners who performed traditional endurance running training only. Specific Objectives: • To describe the demographic and training characteristics of habitually resistance-trained marathon runners and traditionally running-trained marathon runners. • To determine if there were differences in gastrocnemius endurance, power and flexibility between habitually resistance-trained marathon runners and traditionally running-trained marathon runners. • To evaluate if there were differences in the gastrocnemius muscle structure and architecture in habitually resistance-trained marathon runners compared to traditionally running-trained marathon runners. 
• To establish if there were any differences in the number of calf injuries sustained in habitually resistance-trained marathon runners and traditionally running-trained marathon runners. Methods: Healthy male runners aged between 20 and 50 years were included in the study. Participants were required to have completed at least one marathon in the 12-month period prior to the study. Runners forming the "traditionally running-trained" group were required to be participating in regular endurance running training only. Runners in the "habitually resistance-trained" group were required to be performing resistance training in addition to regular endurance running training. Runners with any injury at the time of recruitment or runners who reported a calf injury within the six-month period prior to the study were excluded. Participants with any medical abnormalities detected during screening were also excluded from the study. Eight marathon runners participating in habitual resistance training plus standard running training and eleven marathon runners participating in traditional running training only were recruited for this study. Runners who met the criteria attended two testing sessions at least three days apart. During the first session, informed consent was obtained and the Physical Activity Readiness Questionnaire (PAR-Q) was completed to ensure participants could safely complete physical testing. A questionnaire was completed to determine relevant training and injury history. Body mass, height and the sum of seven skinfolds were recorded. Muscle architecture measurements, including fascicle length, pennation angle, thickness and volume, were performed via ultrasound imaging. Participants were then familiarised with the physical testing procedures. In the second testing session, calf muscle flexibility and endurance were assessed; and isokinetic testing was performed for the left and right triceps surae. 
Results: There were no significant differences in descriptive characteristics between groups. Participants in the habitually resistance-trained group performed an average of two hours (range 0.5-2.5 hours) of resistance training per week, across one to four sessions. Participants combined upper and lower body training in the form of circuit training, body weight training, core and proprioceptive training. Resistance training sessions were performed at a varied load intensity (light to high) according to an estimated 1RM. Participants in the habitually resistance-trained group had completed a significantly greater number of 21.1 km races compared to the traditionally running-trained group (p < 0.05), but there were no other differences in running training or competition history between groups. There were also no significant differences in the number of reported injuries between groups. Average pennation angle was significantly greater in the habitually resistance-trained group compared to the traditionally running-trained group (p < 0.05). No other significant differences in architectural measurements were identified. There were no significant differences in calf muscle flexibility, strength, power or endurance between the two groups. However, the small sample size limits the interpretation of the study findings. Conclusion: Wide variability in habitual resistance training patterns was identified. While pennation angle was significantly greater in the habitually resistance-trained group, no differences in any other architectural measurements, or in calf muscle strength, power, endurance or flexibility, were identified between groups. However, one of the key findings emerging from this study is the variable resistance training practice of endurance runners, and that these practices were not aligned to current evidence-based guidelines for resistance training. 
Resistance training has a critical role in enhancing endurance running performance, injury prevention and rehabilitation. Future research should investigate the knowledge, attitudes and practices of endurance runners regarding resistance training, to facilitate the development of appropriate education interventions and the effective dissemination of evidence-based training guidelines to lay communities.

Prevalence and risk factors of chronic diseases of lifestyle in endurance runners

Language, Sarah 19 February 2019 (has links)
Background: Chronic diseases of lifestyle (CDL) are associated with high rates of morbidity and mortality in South Africa. Although the prevalence of CDL has been established in the general population, there is limited research regarding the prevalence of, and risk factors for, CDL in individuals taking part in regular physical activity. Endurance running is a popular sport with growing levels of participation. Anecdotally, many individuals who participate in endurance running do not undergo formal pre-participation cardiovascular screening. It is also unclear whether endurance runners are meeting the World Health Organization's recommended weekly moderate to vigorous intensity physical activity hours, or whether they have other risk factors for CDL. It is therefore important to establish the prevalence and risk factors of CDL in this active population. Aim and Objectives: The aim of this study was to determine the prevalence of CDL and the associated risk factors in endurance runners in South Africa. The specific objectives of the study were: (a) to determine the presence of modifiable risk factors for the development of CDL, including body mass index (BMI), waist circumference, body fat percentage, blood pressure, blood glucose, blood cholesterol, smoking history, dietary intake and weekly physical activity time, in South African endurance runners; (b) to determine the presence of non-modifiable risk factors for the development of CDL, namely age and income, in South African endurance runners; (c) to determine whether South African endurance runners are fulfilling the World Health Organization's recommended weekly moderate to vigorous intensity physical activity hours; and (d) to assess whether there are any relationships between running characteristics, namely weekly training hours, running speed and level of competition, and the risk factors for CDL. Methods: This study had an analytical, cross-sectional design. 
Two hundred participants between the ages of 18 and 69 years, who reported endurance running as their main sport and had run at least three kilometres twice a week for the past year, were included in the study. Participants were excluded if they were pregnant or within six months post-partum, had an injury that required a minimum of two weeks' rest, or did not complete the questionnaire or physical testing components of the testing process. Participants were recruited through local running clubs and running races in the areas of Nelspruit, Mpumalanga and Cape Town, Western Cape. All participants gave written informed consent and completed a questionnaire covering socio-demographic characteristics, running training characteristics, the International Physical Activity Questionnaire (short form), the modified Borg scale of perceived exertion, and the five-a-day community evaluation tool. Body mass, stature, skinfolds and waist circumference were assessed. Blood pressure was measured using an automatic blood pressure monitor. A finger prick test was used to determine random blood glucose and cholesterol concentrations. Participants were requested to fast for three hours prior to testing to standardise the test in a non-fasted state (20). Results: One hundred and twenty-four (62%) participants were found to have at least one risk factor for CDL. A high BMI was the most common risk factor for CDL (n=90; 45%). Nineteen participants (9.5%) did not meet the recommended duration of 150 minutes of physical activity per week. Seven percent of female participants (n=7) smoked, which is equivalent to the female population average of South Africa. Multiple risk factors were identified in fifty-seven (28.5%) participants, ranging from two risk factors (n=37; 18.5%) to six risk factors (n=1; 0.5%). The majority of participants had no prior medical diagnosis of CDL or risk factors for CDL. The overall self-reported prevalence of a medically diagnosed CDL was 5.5% (n=11). 
Type 2 diabetes was the most commonly diagnosed CDL (n=6; 3%). Waist circumference, systolic blood pressure and cholesterol were significantly elevated in the older age group. There were no significant differences in risk factors for CDL according to income status. Female runners had significantly higher average sitting times compared to male runners. In addition, participants with a BMI ≥ 25 kg.m-2 had significantly slower 10 km running speeds and lower average weekly training distances, compared to participants with a BMI within normal ranges. Conclusion: A high prevalence of risk factors for CDL was identified in South African endurance runners. The majority of endurance runners in this sample were fulfilling the World Health Organization's recommended weekly moderate to vigorous intensity hours. However, the endurance runners in this study remain at risk of developing a CDL due to the presence of other risk factors. The knowledge and awareness of risk factors for CDL among South African endurance runners needs to be investigated further. Health care professionals need to improve the prevention and management of risk factors for CDL through education and the promotion of healthy lifestyles. A stronger emphasis on the prevention of risk factors for CDL in South African endurance runners is needed.
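The high-BMI risk factor reported above rests on the standard BMI calculation (body mass divided by height squared) and the conventional WHO adult cut-offs, under which BMI ≥ 25 kg.m-2 is classed as overweight. A minimal sketch of this screening logic follows; the function names are illustrative, and the thresholds are the standard WHO cut-offs rather than figures taken from the study's protocol:

```python
def bmi(mass_kg: float, height_m: float) -> float:
    """Body mass index: mass divided by height squared (kg per m^2)."""
    return mass_kg / (height_m ** 2)

def bmi_category(value: float) -> str:
    """Classify an adult BMI value using the standard WHO cut-offs."""
    if value < 18.5:
        return "underweight"
    if value < 25.0:
        return "normal"
    if value < 30.0:
        return "overweight"
    return "obese"

# Illustrative runner: 80 kg at 1.75 m gives a BMI of about 26.1
print(bmi_category(bmi(80, 1.75)))  # prints "overweight"
```

In a screening study such as this one, the category boundaries do the real work: a participant is flagged for the BMI risk factor exactly when the computed value crosses the 25 kg.m-2 threshold.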

Sport-related Concussion Incidence and Mechanism of Injury in Male and Female Players at the South African Youth Week Rugby Tournaments: 2011-2018

Cardis, Sheenagh 20 July 2022 (has links) (PDF)
Background: Rugby is a popular international sport for male and female youth and adult players (6). Injury incidence, including sport-related concussion (SRC), is high in youth rugby (7, 8). This is concerning, as youth players are more vulnerable to SRC and take longer to recover from SRC than adults (9, 10). Females are also more susceptible to sustaining a SRC, take longer to recover from SRC and have a higher incidence of SRC complications than males (11-15). Most research has focused on SRC in adult male players; there are fewer studies on youth players, in particular female youth. Further research into SRC in male and female youth players is thus required. Aim: The aim of this study was to determine the incidence and mechanism of SRC among youth male and female rugby players at the 2011 to 2018 and 2015 to 2018 South African Rugby Union Youth Week Tournaments respectively. Specific objectives: a) To determine the incidence of SRC among boys U13-U18 and girls U16-U18 players; b) To describe the SRC mechanism of injury in boys U13-U18 and girls U16-U18 players; c) To determine whether a difference in SRC incidence exists between boys U13-U18 and girls U16-U18 players, and between age groups; d) To determine whether a difference in the mechanism of SRC exists between boys U13-U18 and girls U16-U18 players, and between age groups; and e) To describe factors associated with SRC in boys U13-U18 and girls U16-U18 players. Methods: The study had a retrospective, epidemiological design, reviewing SRC injury data collected at the South African Rugby Union Youth Week Rugby Tournaments. SRC injury data for boys were collected at the 2011-2018 tournaments; SRC injury data for girls were collected only at the 2015-2018 tournaments, as the girls' tournaments were only introduced in 2015. Results: Data from 266 SRC events were analysed in the study. 
Overall SRC incidence was 7.0 SRC per 1000 match playing hours (95% CI, 6.2-7.8). Overall SRC incidence for boys was 6.9 SRC per 1000 match playing hours (95% CI, 6.0-7.8). Overall SRC incidence for girls was 7.9 SRC per 1000 match playing hours (95% CI, 5.3-9.9). There was no significant difference in SRC incidence between boys and girls. SRC incidence from 2011-2018 was 10.7 (95% CI, 8.2-13.1), 7.5 (95% CI, 5.5-9.6) and 5.3 (95% CI, 3.4-6.5) SRC per 1000 match playing hours for the boys U13, U16 and U18 age groups respectively. SRC incidence from 2015-2018 was 7.2 (95% CI, 3.7-10.2) and 7.9 (95% CI, 4.7-10.9) SRC per 1000 match playing hours for the girls U16 and U18 age groups respectively. There was a significantly higher incidence of SRC in the boys U13 age group compared to the boys U18 age group (IRR 2.0; 95% CI, 1.5-2.7; p=0.00014): boys U13 players were twice as likely to sustain a SRC as their U18 counterparts. The tackle (65%) and ruck (20%) were responsible for the majority of SRC. Boys U13 players were significantly more likely to sustain a SRC from a tackle than boys U18 players (p=0.01). Boys U16 players had a significantly greater incidence of SRC resulting from the ruck than boys U18 players (p=0.02). Overall, the most common primary mechanisms of SRC were front-on tackles (27%) and collisions (18%). Boys U16 players had a significantly higher rate of SRC due to front-on tackles than boys U18 players (p=0.00007), and a significantly higher rate of SRC caused by collisions than boys U18 players (p=0.00007). Similarly, boys U13 players had a significantly higher incidence of SRC due to collisions than boys U18 players (p=0.003). Factors associated with SRC incidence were tournament day and the use of headgear. SRC was more likely to occur on day two than on day four (p=0.0008), day five (p=0.0002) or day six (p<0.001). 
Players who did not wear headgear were more likely to sustain a concussion than those who did (p<0.001). Conclusion: Overall SRC incidence at the 2011 to 2018 South African Youth Week Rugby Tournaments was 7.0 SRC per 1000 match playing hours. This study is unique in that it reports SRC incidence for youth female players: the overall SRC incidence for the girls U16 and U18 groups was 7.9 SRC per 1000 match playing hours. As no significant differences were found in the incidence, injury event and mechanism of SRC between male and female players, similar injury prevention strategies can be implemented in these groups. Injury prevention strategies should focus on teaching safe contact technique in the tackle and ruck. Particular attention should be given to teaching safe contact technique to boys U13 players, as SRC incidence was highest in this group. Injury prevention strategies should also focus on teaching boys U13 and U16 players how to avoid collisions, and teaching boys U16 players how to execute safe front-on tackles and rucks. Further research should focus on identifying which aspects of the tackle and ruck result in SRC, so that more tailored and specific injury prevention strategies can be implemented.
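The incidence figures above follow the exposure-based definition standard in rugby injury surveillance: injuries divided by total match exposure, scaled to 1000 playing hours, with group comparisons expressed as an incidence rate ratio (IRR). A minimal sketch of both calculations follows; the counts and exposure hours used in the example are illustrative assumptions, not figures from the study, and the CI uses a simple normal approximation on the log scale rather than whatever method the thesis applied:

```python
import math

def incidence_per_1000_hours(n_injuries: int, exposure_hours: float) -> float:
    """Injury incidence rate per 1000 match playing hours."""
    return 1000.0 * n_injuries / exposure_hours

def incidence_rate_ratio(n_a: int, hours_a: float, n_b: int, hours_b: float):
    """IRR of group A versus group B, with an approximate 95% CI
    computed via a normal approximation on the log scale."""
    irr = (n_a / hours_a) / (n_b / hours_b)
    se_log = math.sqrt(1.0 / n_a + 1.0 / n_b)
    lower = irr * math.exp(-1.96 * se_log)
    upper = irr * math.exp(1.96 * se_log)
    return irr, lower, upper

# Hypothetical example: 70 concussions over 10000 match hours
print(incidence_per_1000_hours(70, 10000))  # prints 7.0

# Hypothetical two-group comparison: 20 vs 10 injuries over equal exposure
print(incidence_rate_ratio(20, 1000, 10, 1000))
```

With equal exposure and twice the injury count, the IRR is exactly 2.0, matching the interpretation in the abstract that one group is "twice as likely" to sustain a SRC; the width of the CI then depends only on the two injury counts.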
