
Monitoring malaria vector densities and behaviours in Tanzania

Govella, Nicodem January 2010 (has links)
Malaria remains the most important parasite-related public health problem globally, with the majority of the burden occurring in sub-Saharan Africa. Increased political and financial support has resulted in rapid scale-up of malaria prevention measures, so that disease burden has been substantially reduced in many African countries. However, behavioural change by malaria vector populations, so that a greater proportion of human exposure to bites occurs outdoors, threatens to undermine the impact of malaria control with existing front-line interventions such as insecticide-treated nets (ITNs) and indoor residual spraying (IRS), because both act indoors. Also, progress towards lower transmission levels poses substantive entomological monitoring challenges because most standard methods fail to detect low levels of vector density and malaria transmission. The overall goal of this study was to enhance understanding of the potential and limitations of ITNs for reducing malaria transmission by outdoor-biting mosquitoes, and to develop a safe, sensitive, practical and effective malaria vector surveillance tool that enables sustained entomological monitoring of intervention impact. An existing mathematical model was adapted to examine the possibility that ITNs can achieve community suppression of malaria transmission exposure, even when mosquitoes avoid them by feeding on people while they are outdoors. Simulations indicated that ITNs may provide useful levels of community suppression of malaria transmission, even when outdoor biting rates exceed indoor biting rates and slightly more than half of bites occur at times and places where using ITNs is not feasible. This suggests that ITNs should not be deprioritized as a malaria control tool simply because local vector species prefer to feed outdoors. Nevertheless, complementary interventions that target outdoor- and early-biting mosquitoes should be prioritized, especially for going beyond malaria control to achieve elimination.
Crossover and Latin square experimental designs were used to compare the sensitivity of multiple trapping techniques for catching malaria vectors, under conditions of both high and low mosquito density, in rural Kilombero and urban Dar es Salaam, respectively. A new tent-style trapping device called the Ifakara Tent Trap was successfully developed and proved to be safe and more efficacious than any other commonly used alternative to human landing catch (HLC) for catching Anopheles gambiae s.l. in the low-transmission setting of urban Dar es Salaam. Its sampling efficiency appeared to be independent of vector density in a rural setting with high mosquito abundance, but increased as mosquito densities decreased in an urban area of low mosquito density, where it exceeded that of HLC at the lowest densities. This density-dependence of the trap implies that this tool may have particular potential for monitoring malaria in low-transmission settings. It was also demonstrated to be effective when used by unsupervised community members under programmatic conditions, and it is currently the only technique used for routine adult mosquito surveillance by the Urban Malaria Control Programme of Dar es Salaam. However, it cannot be used to determine how bites upon humans are distributed between indoor and outdoor exposure components.
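The community-level argument in the first abstract can be illustrated with a deliberately simplified calculation (this is not the thesis's adapted model, and the parameter values are hypothetical): even when slightly less than half of a person's exposure occurs indoors where a net can act, residual exposure for a net user is still substantially reduced.

```python
# Toy sketch of residual exposure for an ITN user. Assumptions (not from the
# thesis): pi_indoors is the proportion of bites a person would receive
# indoors while nets are in use, and protection is the proportional
# reduction in those indoor bites conferred by the net.
def relative_exposure(pi_indoors, protection):
    """Fraction of baseline biting exposure remaining for a net user."""
    return 1.0 - pi_indoors * protection

# With only 45% of exposure preventable by a 90%-protective net,
# a user's exposure still falls by roughly 40%:
residual = relative_exposure(pi_indoors=0.45, protection=0.9)  # about 0.6
```

Community-wide suppression would additionally depend on coverage and on mass (mosquito-killing) effects, which this single-user sketch deliberately omits.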

Optimizing impact assessment of entomological intervention for malaria control in an operational setting in Zambia

Chanda, Emmanuel January 2011 (has links)
The study aimed to optimally assess the impact of indoor residual spraying (IRS) and insecticide-treated nets (ITNs) on vector species abundance, their infectivity and resistance status, and on Plasmodium falciparum prevalence, malaria deaths and case fatality rates in the human population. Malaria prevalence surveys were conducted and routine surveillance data were retrospectively analyzed. The average P. falciparum prevalence in children between the ages of 1 and 14 years was below 10% across the study period. The intervention effect was more pronounced in IRS areas than in ITN localities, but with an incremental protective effect of their combined use. Age-specific comparison showed a greater intervention effect in children below 5 years than in older children aged 5 to 14 years. While the average number of deaths and case fatality rates in children under the age of five declined precipitously, the reductions were more significant in IRS districts than in ITN districts. Results indicate the need for supplementing parasite prevalence survey data with routine surveillance data in low-transmission-intensity areas and demonstrate the significance of evidence-based, age-specific deployment of interventions. To monitor vector species abundance and infectivity, mosquitoes were collected daily using exit window traps. Three major vectors (An. gambiae s.s., An. arabiensis and An. funestus s.s.) and three potential vectors of malaria (An. nili, An. rivulorum and An. funestus-like species) were identified. Overall, the biggest impact of IRS and ITNs was on An. gambiae s.s. and An. funestus abundance. No An. gambiae s.s. was collected in IRS localities, confirming that An. gambiae s.s. and An. funestus are characteristically more amenable to control by IRS and ITNs than An. arabiensis. The transmission potential for all malaria vectors, as expressed by the calculated transmission index, was zero, as none of the trapped mosquitoes tested positive for P. 
falciparum sporozoites. The identification of An. nili, An. rivulorum and An. funestus-like species necessitates further research to determine their role in malaria transmission in the country. The low numbers of mosquitoes collected also indicate a compromise in the efficiency of exit window traps in low-transmission settings, suggesting the need for their replacement with a more robust collection tool such as the CDC light trap. While the persistence of An. arabiensis suggests the presence of resistance segregating in this population, or that this outdoor species is not in contact with IRS or ITNs, it could also imply that it is the one species perpetuating malaria transmission in these meso- to hypo-endemic areas. To determine the impact of interventions on the insecticide resistance status of malaria vectors, susceptibility assays using the WHO standard protocol were conducted in 17 localities. High levels of resistance to pyrethroids and DDT were detected in both An. gambiae s.l. and An. funestus s.l., but with 100% susceptibility to malathion and bendiocarb. The level of resistance was significantly higher in IRS areas than in ITN areas. These findings indicate that resistance has been selected for following extensive vector control. Resistance to both DDT and deltamethrin was detected in IRS localities and in ITN areas with intense cotton growing, suggesting selection due to either historical use of DDT, gene flow or cross-resistance. All An. gambiae s.s. were molecular S-forms and only the West African (leu-phe) kdr allele was detected. Complete susceptibility to the organophosphates and carbamates provides a possibility to switch to these alternative insecticide classes for IRS. The detected increases in malaria prevalence in localities with high insecticide resistance levels indicate vector control failure. 
These findings point to the need for information on the underlying biochemical and molecular resistance mechanisms to enable the design of an effective resistance management strategy, and for assessment of the impact of resistance on interventions. The results indicate that the impact of malaria control can be optimally assessed by using a combination of epidemiological (routine surveillance and prevalence data) and entomological indicators, in the context of a malaria decision support system, to enhance policy formulation for objective implementation of malaria control interventions and rational use of available resources.
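The WHO susceptibility assays mentioned in this abstract classify a tube-test result by the percentage mosquito mortality observed after exposure. A minimal sketch of that interpretation follows; the thresholds are the commonly cited WHO criteria and are an assumption here, so they should be confirmed against the specific protocol version used.

```python
# Interpret a WHO susceptibility tube-test result from percentage mortality.
# Thresholds (98% and 90%) follow commonly cited WHO criteria; treat them as
# an assumption, not a quotation from the study.
def classify_susceptibility(pct_mortality):
    if pct_mortality >= 98:
        return "susceptible"
    if pct_mortality >= 90:
        return "possible resistance, to be confirmed"
    return "resistant"

# e.g. 100% mortality to malathion or bendiocarb -> "susceptible",
# while low mortality to DDT or pyrethroids -> "resistant"
```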

The clinical development of rectal microbicides for HIV prevention

McGowan, Ian January 2013 (has links)
Introduction: Individuals practising unprotected receptive anal intercourse are at particularly high risk of HIV infection. Men who have sex with men in the developed and developing world continue to have disproportionate and increasing levels of HIV infection. The last few years have seen important progress in demonstrating the efficacy of oral pre-exposure prophylaxis, vaginal microbicides, and treatment as prevention, but there has also been significant progress in the development of rectal microbicides. The purpose of this thesis is to summarise the status of rectal microbicide research, to identify opportunities, challenges, and future directions in this important field of HIV prevention, and to describe the results of a recently completed Phase 1 rectal microbicide study (MTN-007). Methods: MTN-007, a Phase 1, randomised, partially blinded, rectal safety study, was undertaken to determine whether a reduced-glycerin formulation of tenofovir 1% gel was safe and acceptable to men and women with a history of practising receptive anal intercourse. The study was conducted at three clinical trial sites in the United States (Pittsburgh, Pennsylvania; Boston, Massachusetts; and Birmingham, Alabama). Study participants were randomised to one of three gel arms (tenofovir gel, a hydroxyethyl cellulose placebo gel, or a 2% nonoxynol-9 gel) or a no-treatment arm and received a total of eight daily rectal doses of the study product. In addition to collecting conventional clinical safety and acceptability data, the study also included extensive mucosal safety assays to determine whether product administration was associated with changes in mucosal biology that might predispose to increased risk of HIV acquisition associated with unprotected receptive anal intercourse. Results: Sixty-five participants (45 men and 20 women) were recruited into the study. There were no significant differences in the numbers of ≥ Grade 2 adverse events across the arms of the study. 
Likelihood of future product use (acceptability) was 87% (reduced-glycerin formulation of tenofovir 1% gel), 93% (hydroxyethyl cellulose placebo gel), and 63% (nonoxynol-9 gel). Faecal calprotectin, rectal microflora, and epithelial sloughing did not differ across treatment arms during the study. Suggestive evidence of differences was seen in histology, mucosal gene expression, protein expression, and T cell phenotype. These changes were mostly confined to comparisons between the nonoxynol-9 gel and the other study arms. Microarray analysis of the mucosal transcriptome provided preliminary evidence that topical application of tenofovir 1% gel was associated with decreased mitochondrial function within the rectal mucosa. Conclusions: The MTN-007 study demonstrated that, using conventional criteria, tenofovir gel is safe and acceptable and should be advanced to Phase 2 development as a potential rectal microbicide. However, microarray analysis of mucosal tissue suggested that use of tenofovir gel may modulate mucosal mitochondrial function. This observation will require further evaluation in future studies.

Unravelling the epidemiology of norovirus outbreaks in hospitals

Harris, John January 2014 (has links)
Norovirus is the commonest cause of outbreaks of gastrointestinal disease in the UK. Most reported outbreaks occur in healthcare settings, such as hospitals and nursing homes, and can cause severe disruption through ward closures, cancelled operations and staff sickness. Previous studies estimated these outbreaks cost the NHS around £115 million a year. Despite these studies, some questions remain. What is the burden of norovirus in hospitals: how many outbreaks occur and how many people are hospitalised each year as a result of norovirus infection? Do published reports of outbreaks provide evidence of what works in infection control? Can the factors facilitating norovirus transmission during outbreaks in hospitals be identified? These questions were answered through a series of inter-linked studies that explored mortality, morbidity, transmission pathways and aspects of infection control. The introduction of a new surveillance system provided greater insights into the heavy burden that norovirus imposes on English hospitals. In the years 2009-2011, 3,980 reports of outbreaks of suspected and confirmed norovirus were received. There was little difference in the epidemiology of outbreaks from one season to the next. On average, outbreaks were associated with 13,000 patients and 3,400 staff becoming ill, 8,900 days of ward closure and the loss of over 15,500 bed-days annually. Analysis of mortality data demonstrated a clear association between norovirus infection and mortality in the elderly (65 years and over), with an estimated 80 deaths per year in this age group. The number of deaths increased in years where norovirus activity was higher, but this was not associated with increased pathogenicity of the virus. Norovirus was the only pathogen that had a significant association with mortality in the regression models. 
Modelling of routine hospital admission data demonstrated that norovirus accounted for around 3,000 admissions a year to English hospitals, two thirds of which were in the elderly. A review of published papers did not provide clear evidence for the effectiveness of infection control measures; however, this was largely because the reporting of outbreaks was poor, and the introduction of more rigorous reporting protocols would improve this. Analysis of 3,500 outbreaks of norovirus demonstrated that closing a ward or bay promptly (within three days of the first person becoming ill) is beneficial. The duration of the outbreak and the total duration of disruption were shorter, and fewer patients overall were affected, if closure occurred promptly. When closure occurred 7 or more days after the first onset date, outbreaks were twice as long as those where closure was prompt. The duration of outbreak also increased with ward size and in outbreaks occurring in winter. Outbreaks were longer if they occurred on care-of-the-elderly wards. A strategy of prompt closure is beneficial, particularly in larger wards and during winter. The time between the first two cases of each outbreak was used to estimate the serial interval for norovirus in a hospital setting, which was 1.86 days. This distribution and the dates of illness onset were used to calculate epidemic trees for each outbreak. A permutation test found strong evidence that proximity was a significant driver of outbreaks (p < 0.001). Patients occupying the same bay as patients with symptomatic norovirus infection are at increased risk of becoming infected by these patients compared with patients elsewhere in the same ward. In summary, there is a demonstrable association with mortality in older people, and around 3,000 admissions to hospital each year. Over 3,900 outbreaks were reported in three years (2009-2011). 
On average 13,000 patients were affected each year leading to 8,900 days of ward closures. Vomiting appears to be an important driver of outbreaks. Acting quickly by closing affected areas appears to be beneficial in controlling outbreaks caused by norovirus. This is especially the case in larger wards during the winter.
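The proximity finding in this abstract rests on a permutation test: do successive cases lie closer together in ward space under the observed onset order than under random reorderings? A minimal sketch of such a test follows; the one-dimensional bed positions, the test statistic, and the onset order are all hypothetical simplifications, not the study's actual epidemic-tree method.

```python
import random

# Hypothetical setup: each patient has a 1-D "bed position" on the ward, and
# the outbreak's onset order is a permutation of patient indices. The test
# statistic is the mean distance between successive cases.
def mean_successive_distance(positions, order):
    return sum(abs(positions[order[i + 1]] - positions[order[i]])
               for i in range(len(order) - 1)) / (len(order) - 1)

def permutation_test(positions, onset_order, n_perm=10_000, seed=1):
    """One-sided p-value: how often does a random onset order produce
    successive-case distances at least as small as those observed?"""
    observed = mean_successive_distance(positions, onset_order)
    rng = random.Random(seed)
    perm = list(onset_order)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(perm)
        if mean_successive_distance(positions, perm) <= observed:
            count += 1
    return count / n_perm  # a small p-value suggests proximity drives spread
```

If cases appear in strict bed-by-bed order, almost no random reordering matches the observed clustering and the p-value is very small, mirroring the p < 0.001 reported above.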

Community-based surveillance and control of malaria vectors in urban Dar es Salaam, Tanzania

Chaki, Prosper Pius January 2012 (has links)
Recent increases in political and funding commitments to malaria control have resulted in rapid scale-up of indoor residual spraying (IRS) and long-lasting insecticidal nets (LLINs) as priority vector control interventions. Despite this increasing coverage and the consequent substantial reductions of malaria burden, residual malaria transmission, by outdoor-biting mosquitoes in particular, necessitates complementary vector control strategies such as larval source management. More sensitive and scalable entomological surveillance tools are required to monitor the resultant lower transmission levels that persist across much of the tropics. The Urban Malaria Control Program (UMCP) in Dar es Salaam, Tanzania, implements a large-scale community-based (CB) larviciding programme with the aim of demonstrating the operational feasibility of integrating larval control into routine municipal services, while utilizing community-owned resource personnel (CORPs) for its implementation. The goal of this study was to gain a better understanding of community participation in larval-stage vector surveillance and control, and to develop a practical, safe and affordable prototype for routine programmatic adult mosquito surveillance. Qualitative methods, involving a set of unstructured interviews with CORPs, were used to investigate their performance and demographic characteristics, and their perceptions of and reasons for participating in the UMCP. Ethnographic and historical resources were used to examine how ‘participation in’ and ‘responsibility for’ larval control are inter-articulated through scientific protocols, development practices, and the specific political history of Tanzania. Cross-sectional surveys were later used to assess the effectiveness of operational, community-based larval habitat surveillance systems within the UMCP by estimating the respective detection coverage and sensitivity levels achieved by CORPs. 
Additionally, an intensive and extensive CB system for routine, longitudinal, programmatic surveillance of mosquitoes using the Ifakara Tent Trap (ITT) was developed and evaluated in comparison with quality assurance (QA) surveys using either the ITT or human landing catches (HLC), and with malaria parasite prevalence from the cross-sectional surveys. Overall, CORPs’ individual detection coverage and sensitivity levels were poor, influenced by the CORP’s unfamiliarity with the area, habitat type, fencing and inclusion within the larviciding roll-out. These indicators were particularly low among CORPs recruited through programme management staff, compared to those recruited by local government officials or health committees, and among staff living outside their areas of responsibility. The CORPs perceived their role to be professional rather than voluntary, with participation being a de facto form of employment. In spite of these challenges, the central coordination role played by the city council, coupled with catalytic donor funding and technical support from expert research partners, enabled institutionalization of strengthened management and planning and improved community mobilization. Capacity to exploit national and international funding systems was enhanced and a sustainable implementation program was ultimately established with funding from the Ministry of Health and Social Welfare, overseen by the National Malaria Control Programme and implemented by the City and Municipal Councils. Management of this program is currently supported by a spatially extensive and temporally intensive community-based longitudinal adult mosquito vector surveillance system with predictive power for parasite infection risk.

Febrile illnesses at the Colombo North Teaching Hospital in Sri Lanka (The Ragama Fever Study)

Bailey, Mark S. January 2012 (has links)
Acute undifferentiated febrile illnesses in the tropics and sub-tropics are caused by a wide range of infectious diseases that often have indistinguishable clinical features. In developing countries there may also be insufficient microbiology facilities to identify these infections, leading to missed diagnoses, inefficient use of healthcare resources, over-use of empirical treatments, a lack of information on antimicrobial resistance and inaccurate epidemiological data for guiding prevention strategies. These problems occur in Sri Lanka, but a prospective, systematic, representative and comprehensive study of febrile illnesses had never been performed. The Ragama Fever Study was performed at a major hospital in western Sri Lanka that served both urban and rural areas. Its aims were to identify the causes of febrile illnesses in a large sample of patients admitted to the hospital over a 1-year period, to develop clinical prediction rules that could distinguish between the most common infectious diseases, and to assist in the evaluation of rapid (point-of-care) diagnostic tests that were appropriate to this setting. 617 (86.7%) of 711 febrile patients admitted to a quarter of the hospital medical wards were recruited. 56.4% had confirmed infections with organisms identified, including dengue (22.2%), chikungunya (16.7%), leptospirosis (5.2%), various bacteraemias (4.2%), Q fever (2.9%), rickettsial infections (2.3%), tuberculosis (1.1%) and urinary tract infections (0.8%). 7.6% had confirmed infections with no organisms identified, including cellulitis (2.4%), respiratory tract infections with radiographic changes (2.1%) and pulmonary tuberculosis with radiographic changes (1.6%). 4.1% had confirmed non-infectious diseases and 37.2% had unconfirmed diseases, including “viral fever” (13.3%), undifferentiated fever (7.8%), respiratory tract infections (6.8%), urinary tract infections (3.4%), leptospirosis (2.8%) and gastroenteritis (1.0%). 
Clinical prediction rules for identifying dengue fever and chikungunya were developed using imputation, multiple logistic regression, scoring algorithms and receiver operating characteristic (ROC) curve analysis. The dengue fever rule had sensitivity = 49.6%, specificity = 93.9%, positive predictive value (PPV) = 70.8% and negative predictive value (NPV) = 86.1%. The chikungunya rule had sensitivity = 35.0%, specificity = 95.0%, PPV = 60.0% and NPV = 87.1%. ROC curve analysis could not identify any probability cut-offs that would produce clinical prediction rules with acceptable combinations of both sensitivity and specificity. A commercial (Panbio) rapid serology test for dengue fever showed sensitivity = 43.4%, specificity = 88.8%, PPV = 54.6% and NPV = 83.5% on samples from admission, and significantly better diagnostic performance on follow-up. When repeated in conjunction with a Panbio rapid NS1 antigen detection test, the diagnostic performance improved, with sensitivity = 89.9%, specificity = 75.0%, PPV = 69.0% and NPV = 92.3% on admission. This study confirmed the wide range of infections that present as febrile illnesses in Sri Lanka and showed the limitations of clinical prediction rules and rapid diagnostic tests in identifying these on admission. I hope that it will provide a foundation for further work on these important topics.
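The accuracy measures quoted throughout this abstract all derive from a 2x2 table of test result against reference diagnosis. A minimal sketch of that computation follows; the counts are invented for illustration and are not taken from the study.

```python
# Compute the four standard diagnostic-accuracy measures from a 2x2
# confusion matrix. The example counts below are hypothetical.
def diagnostic_accuracy(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),  # true positives / all with disease
        "specificity": tn / (tn + fp),  # true negatives / all without disease
        "ppv": tp / (tp + fp),          # precision among test-positives
        "npv": tn / (tn + fn),          # among test-negatives
    }

measures = diagnostic_accuracy(tp=60, fp=15, fn=40, tn=185)
```

Note that, unlike sensitivity and specificity, PPV and NPV depend on disease prevalence in the sample, which is why the same test can look better on follow-up samples than on admission.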

HIV infection and stroke in Malawian adults

Benjamin, Laura January 2014 (has links)
There is an increased incidence of young people with stroke (age ≤45 years) in human immunodeficiency virus (HIV)-endemic countries; this has been largely attributed to hypertension. However, hospital-based surveys in countries like Malawi and South Africa have shown that the prevalence of hypertension in these young people is lower than expected, but HIV infection is substantially higher, implicating HIV as a risk factor. For many years a link between HIV and stroke has been postulated, but the relationship is uncertain. Whilst HIV may be a risk factor for stroke directly, through mechanisms linked with HIV-associated vasculopathy, or indirectly, through opportunistic infections, the drugs that treat HIV infection may also increase the risk of stroke because of their metabolic effects. Many studies, almost all retrospective, have failed to separate the direct effect of HIV infection from the indirect effects, including combined antiretroviral therapy, on cerebrovascular risk. HIV infection increases the risk of stroke mimics such as intracranial toxoplasma infection. The Recognition of Stroke in the Emergency Room (ROSIER) score is commonly used to screen for stroke and triage patients for computed tomography (CT) of the brain. However, the accuracy of the ROSIER score and CT brain imaging in reliably differentiating a stroke diagnosis from a stroke mimic in people with HIV infection is uncertain. I found that the ROSIER score and CT brain imaging had poor diagnostic accuracy in an HIV-positive population. Therefore, in my thesis, every patient with an acute neurological symptom was fully assessed for a stroke as part of the screening process, and confirmation was by magnetic resonance brain imaging. I subsequently investigated the risk factors and aetiology of stroke through a prospective case-control study in an HIV-endemic country. Through this work, I showed that HIV infection is associated with cerebrovascular disease. 
Although hypertension was the leading risk factor in the population overall, HIV infection and its treatment were the second most important, and the most important in younger patients. Unexpectedly, I found that starting combined antiretroviral therapy (cART) in a subgroup of people living with HIV infection independently increased the risk of stroke. In this cohort, ischaemic stroke was the predominant stroke type, and opportunistic infections accounted for less than a third of these cases. The heterogeneity of HIV stroke, with respect to risk factors for stroke, the degree of immunosuppression and HIV activity, and prior or current opportunistic infection, has made it difficult to generalise epidemiological findings in some studies to populations at large. My study resolves some of this ambiguity. I speculate that HIV-related stroke evolves with the introduction of cART and then transitions into an ageing population, accelerating atherosclerotic stroke and potentially contributing to an anticipated stroke epidemic in countries like Malawi.

A multi-sited ethnography of patient and public involvement in epilepsy research

Deja, Elizabeth January 2014 (has links)
Contemporary health policy and funding bodies are placing increasing emphasis on patient and public involvement (PPI) in healthcare and health research, advocating PPI in all stages of the research process. Currently, however, there is limited empirical evidence critiquing different approaches to PPI or exploring its associated benefits and challenges. Without this information researchers and patient/public representatives cannot make informed decisions about best practice. The principal aim of this thesis was to generate a detailed understanding of the implementation of PPI in health research. To accomplish this broad aim, I focused on a specific health condition, epilepsy, and the research structures underlying health research in the UK, namely, research networks. I achieved this using a multi-sited, ethnographic approach, incorporating multiple qualitative data collection methods, including 47 interviews, 35 observations, fieldnotes and document analysis. My in-depth thematic analysis of the data found that PPI is conceptualised in terms of ‘meaningful’ and ‘tokenistic’ involvement by those engaged in the process, rather than how it is depicted in the current models of involvement. Having first explored these terms I identified five components that can help to ensure that PPI is meaningful and not tokenistic. Having compared and contrasted multiple approaches to PPI I conclude that there is not one single ‘best approach’ for implementing PPI. Rather, to achieve high ‘quality’ PPI there is a need to incorporate seven methodological factors that overarch approaches and ensure that there is an alignment of approach and purpose. Both the professionals and the patient/public representatives within my research appeared to be highly aware of the moral and political motivations of PPI, but were primarily motivated by pragmatic or consequentialist reasons. 
Professionals were motivated almost exclusively by the goal of improving the applicability or relevance of the research. This goal was important for representatives too, but they were also motivated by a range of personal reasons, including the wish to feel they were making a difference; the opportunity to learn about epilepsy and epilepsy research; and the opportunity to interact with others. The perceived benefits of PPI were also identified and discussed in depth, and appeared to be largely congruent with those reported in the literature. However, my work has identified some challenges and barriers around PPI that have not previously been explored, including: adverse emotional effects; organisational practicalities; concerns about ‘representativeness’ and ‘tokenism’; the ‘blurring’ of roles; and the erosion of patient-clinician boundaries. I conclude by recommending that there should be an increased focus on appropriate, ‘meaningful’ involvement rather than endeavouring to implement PPI in all stages of the research process, as currently advocated in policy documents. The insights into the challenges of PPI that my work has provided will allow them to be addressed from the outset, improving the PPI experience and consequently the likelihood of PPI being successfully implemented.

Integration of a maternal psychosocial well-being component into an early child-development intervention

Zafar, Shamsa January 2014 (has links)
Maternal psychosocial well-being (MPW) is a comprehensive concept that covers the psychological (e.g., depression, distress, anxiety, coping, mental health) and social (e.g., family and community support, empowerment, relationships, culture) aspects of motherhood. High rates of poor maternal mental health, with maternal depression the most prevalent condition, have been reported in low- and middle-income countries, including Pakistan. Though evidence-based interventions exist to address maternal depression, these have not been translated into policy because of various implementation barriers. Integration of these interventions into existing maternal and child health (MCH) programmes has been suggested as a strategy to provide accessible care to mothers. In the current study, we developed and integrated a cognitive behavioural therapy–based MPW intervention (the 5 pillars approach) into a child nutrition and development programme. Following qualitative research with community health workers (CHWs) and families, CHWs were trained in (1) empathic listening, (2) family engagement, (3) guided discovery using pictures, (4) behavioural activation, and (5) problem solving. A qualitative feasibility study in one area demonstrated that CHWs were able to apply these skills effectively in their work, and the approach was found to be useful by CHWs and mothers. This work provides vital information on the lessons learnt in the implementation of a maternal psychosocial well-being intervention for universal use. The facilitating factors included mothers being the central focus of the intervention, utilizing existing local CHWs whom the mothers trust, simple training and regular supervision, and an approach that facilitates, rather than adds to, the CHWs’ work.

A comparison of access to medical care for insured and uninsured expatriates in Saudi Arabia

Alkhamis, Abdulwahab January 2013 (has links)
Background: Saudi Arabia is one of the Gulf Cooperation Council (GCC) countries, which share common characteristics such as high-income governments, dominant expatriate populations, and under-developed healthcare systems, including healthcare financing. The dominance of the expatriate working population raises the question of how to find a mechanism that ensures expatriates have appropriate access to medical care whilst employers bear the responsibility for healthcare expenses. Saudi Arabia is one of the few GCC countries to have reformed its private healthcare system through Compulsory Employment-Based Health Insurance (CEBHI). The CEBHI was designed to mitigate some of the disadvantages of the employer-sponsored insurance scheme previously implemented in the United States, and this is the first study to investigate the impact of this form of private health insurance on access to medical care in a country such as Saudi Arabia. The main aim of the study was to explore the influence of health insurance on access to medical care, in order to assist the Saudi Government in its deliberations about making CEBHI compulsory for all people (citizens and expatriates) within Saudi Arabia. This aim was investigated through the following objectives: 1) to review health financing in Saudi Arabia and compare it with other GCC countries and elsewhere in the world; 2) to compare the access to medical care of insured and uninsured expatriates in Saudi Arabia; 3) to develop a framework for understanding the complex relationship between health insurance and access to healthcare; and 4) to make policy-relevant recommendations regarding the key question of whether compulsory health insurance in Saudi Arabia should be expanded. Methods: Two methods were used to tackle the study objectives. 
Firstly, a framework for country-level analysis of healthcare financing arrangements was used to compare and analyse national expenditure on healthcare within the GCC and other developing/developed countries. Secondly, a logistic regression analysis of data from a cross-sectional survey was undertaken to investigate the impact of health insurance on access to medical care, considering the main workplace and personal characteristics of the expatriates. Three access measures, access to usual medical care (Access 1), inability to access medical care (Access 2), and utilization of medical care (Access 3), were used to evaluate access to medical care for the expatriate population. Prior to the implementation of CEBHI, the expatriate population accessed medical care through a variety of different avenues. These modes of access were used to classify the expatriate population into four groups. Two of these groups were insured but had a different Previous Method of Paying for Healthcare (PMPHC) (Group B = insured, not paid, and Group D = insured and paid), and two groups were not insured but also had different PMPHC (Group A = not insured, not paid, and Group C = not insured, but paid). A multistage stratified cluster sampling design was used, with samples selected proportionately from each sector and company size. The total sample size was 3,278. A simple conceptual framework for studying access to medical care was developed to guide the multivariate regression techniques, and greatly assisted interpretation of the results. Results: The GCC characteristics impact the healthcare financing strategies of GCC countries in three ways. First, GCC governments provide the majority share of the health budget, similar to high-income countries. Second, GCC countries use different strategies to control expatriate costs, but some of these strategies lead to increased out-of-pocket expenses, which is a characteristic of low-income countries.
Third, healthcare financing systems in GCC countries are still being developed, as they finance most of their public services, including healthcare services, with revenue from natural resources (i.e. oil or gas). Additionally, some of their healthcare indicators are comparable with those of below upper-middle-income countries. In addition, after CEBHI, private expenditure did not change but remained around 22.4%, which does not reflect the large number of people having access to medical care through the private sector only. However, there was a shift within private sector expenditure from out-of-pocket (OOP) payments to private insurance expenditure. OOP expenditure decreased from 32.3% in 2006 to 28.4% in 2008, and private insurance expenditure increased as a percentage of private sector expenditure from 26.2% in 2006 to 36.7% in 2008. Analysis of the data from the survey demonstrates that health insurance is strongly associated with access to medical care, as measured by the three different access measures. Compared to uninsured workers, being enrolled in CEBHI increased the odds of an expatriate's access to usual medical care and utilisation of medical care by more than 10 times (95% CI 8.709-12.299) and 2.3 times (95% CI 1.946-2.750) respectively. However, the influence of PMPHC is greater than the influence of insurance alone on reducing the inability to access medical care: health insurance reduced the inability to access medical services by 42% (95% CI 0.515-0.995), whereas PMPHC reduced the inability to access medical services by more than 65% (95% CI 0.273-0.436). Therefore, the impact of health insurance on access to medical care is much greater for those expatriates who previously had healthcare costs met by their employer than for those who had not. These impacts remained when the odds ratios were adjusted for both workplace and personal characteristics.
Conclusion: CEBHI has a clear positive impact on reducing out-of-pocket payments and increasing private insurance expenditure. However, overall private healthcare expenditure has increased insignificantly. This indicates that the main impact of CEBHI on private expenditure is the change in the mode of payment from out-of-pocket payments to private insurance expenditure, while the actual impact on total private sector expenditure remains minor. Access to medical care is influenced by health insurance; PMPHC also plays a contributory role in the influence of health insurance on access to medical care. Workplace and personal characteristics play a small part in mediating the influence of health insurance on access to medical care. A framework was developed for understanding the complex relationship between health insurance and access to healthcare, which will be useful for further investigations of the influence of health insurance on access to medical care. Both long- and short-term recommendations are proposed for increasing the expatriate population's access to medical care, whilst reducing the burden on healthcare financing.
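The odds ratios and 95% confidence intervals reported in the results above come from logistic regression coefficients. A minimal sketch of how such figures are derived (the coefficient and standard error below are hypothetical illustrations, not values from the thesis data):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient (beta) and its standard
    error (se) into an odds ratio with a 95% confidence interval, using
    the normal approximation on the log-odds scale."""
    or_point = math.exp(beta)           # point estimate of the odds ratio
    lower = math.exp(beta - z * se)     # lower bound of the 95% CI
    upper = math.exp(beta + z * se)     # upper bound of the 95% CI
    return or_point, lower, upper

# Hypothetical coefficient for an insurance indicator variable
or_point, lo, hi = odds_ratio_ci(beta=2.34, se=0.088)
print(f"OR = {or_point:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

An odds ratio above 1 (with a CI excluding 1) indicates increased odds of access for the insured group; one below 1, as with the inability-to-access measure, indicates reduced odds.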
