71

Do estate-level characteristics generate unsafety? : Examining neighborhood and estate characteristics influence on perceived residential safety in Gothenburg

Frisk Garcia, Madeleine January 2023 (has links)
Do estate and neighborhood characteristics influence perceptions of safety? Using data from a survey of residents living in municipal housing in Gothenburg, this paper argues that the spatial and social characteristics of a neighborhood far outweigh its socioeconomic and demographic composition in accounting for residents' perceived safety. The dataset consists of survey data on residents' perception of safety from 2013-2014 and 2016-2021 in Gothenburg, linked with sociodemographic data at the estate level. This allows us to examine the effects of neighborhood and estate characteristics on perceived safety. We compare two different indices of safety, conceptualize safety as residential safety, and analyze it using statistical models. The study employs a combined estate and year fixed-effects model with standard errors clustered at the estate level to strengthen causal identification and the robustness of the results. The study finds strong support that social and spatial neighborhood characteristics, such as contact with neighbors and the level of street lighting, influence individuals' perception of safety. Weaker support is also found for an effect of the socioeconomic composition of the area and the estate. These findings indicate that a neighborhood's social cohesion and spatial organization are important factors in increasing residential safety.
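As a rough illustration of the kind of estate-and-year fixed-effects specification with estate-clustered standard errors described above (not the author's code; the file name and variable names such as safety_index, neighbor_contact and street_lighting are hypothetical), a sketch using the linearmodels package:

```python
# Sketch: two-way (estate and year) fixed-effects model with estate-clustered
# standard errors, as described in the abstract. File and variable names are
# hypothetical placeholders.
import pandas as pd
from linearmodels.panel import PanelOLS

df = pd.read_csv("estate_panel.csv")          # hypothetical estate-year panel
df = df.set_index(["estate_id", "year"])      # entity and time index

mod = PanelOLS.from_formula(
    "safety_index ~ neighbor_contact + street_lighting + share_low_income"
    " + EntityEffects + TimeEffects",
    data=df,
)
res = mod.fit(cov_type="clustered", cluster_entity=True)  # SEs clustered by estate
print(res.summary)
```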
72

Performance of the Kenward-Roger Method when the Covariance Structure is Selected Using AIC and BIC

Gomez, Elisa Valderas 17 May 2004 (has links) (PDF)
Linear mixed models are frequently used to analyze data with random effects and/or repeated measures. A common approach to such analyses requires choosing a covariance structure. Information criteria, such as AIC and BIC, are often used by statisticians to help with this task. However, these criteria do not always point to the true covariance structure, and therefore the wrong covariance structure is sometimes chosen. Once this step is complete, Wald statistics are used to test fixed effects. Degrees of freedom for these statistics are not known. However, there are approximation methods, such as Kenward and Roger (KR) and Satterthwaite (SW), that have been shown to work well in some situations. Schaalje et al. (2002) concluded that the KR method would perform at least as well as or better than the SW method in many cases, assuming that the covariance structure was known. On the other hand, Keselman et al. (1999) concluded that the performance of the SW method when the covariance structure was selected using AIC was poor for negative pairings of treatment sizes and covariance matrices and small sample sizes. Our study used simulations to investigate Type I error rates in tests of fixed effects using Wald statistics with the KR adjustment method, incorporating selection of the covariance structure using AIC and BIC. Performance of the AIC and BIC criteria in selecting the true covariance structure was also studied. The MIXED procedure (SAS v. 9) was used to analyze each simulated data set. Type I error rates from the best AIC and BIC models were always higher than target values. However, Type I error rates obtained by using the BIC criterion were better than those obtained by using the AIC criterion. Type I error rates for the correct models were often adequate, depending on the sample size and the complexity of the covariance structure. Performance of AIC and BIC was poor. This could be a consequence of small sample sizes and the high number of covariance structures these criteria had to choose from.
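The study itself used SAS PROC MIXED. Purely as an illustration of the covariance-structure-selection step, the sketch below fits two candidate random-effects structures with statsmodels and compares rough AIC/BIC values; statsmodels offers no Kenward-Roger adjustment, and the data file and column names are hypothetical:

```python
# Illustration only: compare candidate covariance structures for repeated-measures
# data by AIC/BIC, loosely analogous to the PROC MIXED workflow in the abstract.
# statsmodels' MixedLM supports only a limited set of structures and has no
# Kenward-Roger correction; file and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("repeated_measures.csv")   # columns: subject, time, treatment, y

candidates = {
    # random intercept only (compound-symmetry-like marginal covariance)
    "random_intercept": dict(re_formula="1"),
    # random intercept + slope (unstructured 2x2 random-effects covariance)
    "random_slope": dict(re_formula="1 + time"),
}

for name, opts in candidates.items():
    model = smf.mixedlm("y ~ treatment * time", df, groups=df["subject"], **opts)
    res = model.fit(reml=False)              # ML so likelihoods are comparable
    k = res.params.shape[0]                  # rough count of estimated parameters
    aic = 2 * k - 2 * res.llf
    bic = np.log(len(df)) * k - 2 * res.llf
    print(f"{name}: AIC={aic:.1f}  BIC={bic:.1f}")
```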
73

Antibiotic consumption and its determinants in India

Fazaludeen Koya, Muhammed Shaffi 30 August 2022 (has links)
BACKGROUND: India, one of the largest consumers of antibiotics in the world and a country with a high burden of antibiotic resistance, does not have a formal antibiotic surveillance system. Apart from small hospital- or community-based studies, no formal studies exist on sub-national differences in antibiotic use in India. Informed by the WHO Global Action Plan, India developed a national action plan; however, only two states have state action plans so far. It is therefore important to understand existing antibiotic consumption patterns, sub-national differences and trends over time, and the determinants of antibiotic use, so that evidence-informed action plans and programs can be developed in India. AIM: To understand the changing landscape of antibiotic use in India and contribute to relevant policy and programmatic interventions that can improve the appropriate use of antibiotics in the country. Specific objectives included examining systemic antibiotic consumption at the national level, analyzing geographical and temporal variations across states between 2011 and 2019, and understanding the determinants of antibiotic consumption. Additionally, we examined Kerala as a case study to understand the use and availability of data in designing, implementing, and monitoring the state antibiotic action plan. METHODOLOGY: First, we conducted a cross-sectional analysis of antibiotic use in 2019 using the WHO Access-Watch-Reserve (AWaRe) classification and Defined Daily Doses (DDD) metrics at the national level across product type (fixed-dose combinations [FDCs] and single formulations [SFs]), essentiality (listed in the national list of essential medicines [NLEM] or not listed), and central regulatory approval status (approved and unapproved). Second, we analyzed trends in consumption rates and patterns at the national and state levels and for groups of states at different levels of health achievement (‘high focus’ [HF] and ‘non-high focus’ [nHF]), and compared the appropriateness of use between states and state groups. Third, using a cross-sectional time-series (panel) dataset on antibiotic use, per-capita GDP, per-capita government spending on health, girls' tertiary education enrollment ratio, measles vaccination coverage, and lower respiratory tract infection incidence for the period 2011-2019, we conducted a quasi-experimental fixed-effects analysis to identify the critical determinants of antibiotic use. Finally, we conducted key-informant interviews and document analysis to understand the use of data in policy formulation, implementation, monitoring, and evaluation of the Kerala state action plan. RESULTS: India's per-capita private-sector antibiotic consumption rate was lower than global rates, but the country has a high consumption rate of broad-spectrum antibiotics, FDCs discouraged by WHO, FDCs containing formulations outside the NLEM, and unapproved formulations. The overall rate increased from 2011 to 2016 and decreased between 2016 and 2019, registering a net decrease of 3.6%. State consumption rates varied widely, with HF states reporting lower rates. Inappropriate use increased over the years: the share of Access antibiotics decreased (13.1%), and the Access-to-Watch ratio declined (from 0.59 to 0.49). HF and nHF states showed convergence in the share of Access antibiotics and the Access-to-Watch ratio, while they showed divergence in the use of WHO-discouraged FDCs.
The most critical independent determinant of antibiotic use was government spending on health: for every US$12.9 increase in per-capita government spending on health, antibiotic use decreased by 461.4 doses per 1,000 population per year after adjusting for other factors. Economic progress (an increase in per-capita GDP) and social progress (an increase in girls' higher education) were also found to reduce antibiotic use independently. The qualitative case study showed that stakeholders understand and express interest in generating and using data for decision-making, and the action plan document mentions some basic monitoring plans. However, a monitoring and evaluation framework is missing, there is a lack of engagement with the private sector, and key government policymakers lack an understanding of the importance of using data for surveillance and policy implementation. CONCLUSION AND IMPLICATIONS: There is significant and increasing inappropriate antibiotic use in India's private sector, which accounts for 85-90% of total antibiotic use. Increased government spending on health is critical in reducing private-sector antibiotic use. The dearth of data on public-sector use is a significant challenge in understanding the total consumption rate. Developing a monitoring and evaluation system through stakeholder engagement is necessary for Indian states to inform, monitor, and evaluate effective antibiotic action plans. Global efforts are needed to improve the science and methods for measuring antibiotic use. / 2023-08-30T00:00:00Z
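The consumption rates above are expressed in defined daily doses (DDD) per 1,000 population. As a minimal sketch of that arithmetic (not the study's pipeline; the quantities and the assumed WHO DDD value below are placeholders):

```python
# Rough illustration of the DDD metric: defined daily doses per 1,000
# inhabitants per day. All figures below are placeholders, not study data.
def ddd_per_1000_per_day(total_grams_sold: float,
                         who_ddd_grams: float,
                         population: int,
                         days: int = 365) -> float:
    """Convert total quantity sold into DDDs per 1,000 inhabitants per day."""
    total_ddd = total_grams_sold / who_ddd_grams
    return total_ddd / population / days * 1000

# e.g. a hypothetical 900 tonnes of an antibiotic with an assumed WHO DDD of
# 1.5 g, for a population of 1.35 billion over one year
rate = ddd_per_1000_per_day(900e6, 1.5, 1_350_000_000, 365)
print(f"{rate:.2f} DDD per 1000 inhabitants per day")
```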
74

Evaluating Renewable Energy Employment Impacts from Renewable Energy Policies

Frey, Noah 10 November 2022 (has links)
No description available.
75

High School Dropouts, Higher Education Dreams, and Achievement: A Six-Year Study of a High-Stakes Test in Brazil

Miranda, Eveline 12 December 2022 (has links) (PDF)
Rumberger (2020) observed that "dropping out of school has economic and social consequences both for dropouts themselves and for the country as a whole" (p. 151). Every year, many Brazilians drop out of school due to work, early pregnancy, marriage, drug use, crime, and other factors. Dropping out can also stem from learning challenges, poor attendance, discipline problems, or a lack of access to high school institutions. Dropouts can experience depression and anxiety and are more likely to attempt suicide. The present dissertation includes two papers about dropouts. The first paper uses fixed-effects regression to identify the main characteristics of dropouts who left high school before completing it and registered for the Brazilian National Exam (ENEM). The results show that dropouts who take the ENEM tend to be male, come from low-income families, are younger (less than 17 years old), and are less likely to own computers. When analyzing the 2015 and 2016 data set, which included dropouts who took the ENEM to receive high school certification, the results show that they are more likely to have dropped out during their basic education (1st to 9th grade). In the second paper, I evaluated differences in achievement between dropout registrants and current students, and between dropout registrants and graduates, with each comparison using the same data set (ENEM) restricted to 2015 and 2016 because of the larger number of predictive variables of dropout available for those years. The results indicate that dropout registrants did worse than all groups in essay writing but performed similarly to current students in math and language in 2016. When comparing the achievement of dropout registrants and graduates, the results show more pronounced differences; in essay writing, the effect size varied from 0.22 SD to 0.35 SD.
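The effect sizes reported above are standardized mean differences. A minimal sketch of that calculation as a pooled-SD Cohen's d, with placeholder score arrays rather than ENEM data:

```python
# Sketch: standardized mean difference (effect size in SD units) computed as a
# pooled-SD Cohen's d. The score arrays are simulated placeholders.
import numpy as np

def cohens_d(x: np.ndarray, y: np.ndarray) -> float:
    """Cohen's d using a pooled standard deviation."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    return (x.mean() - y.mean()) / np.sqrt(pooled_var)

rng = np.random.default_rng(0)
graduates = rng.normal(520, 100, 5000)           # hypothetical essay scores
dropout_registrants = rng.normal(490, 100, 800)  # hypothetical essay scores
print(f"d = {cohens_d(graduates, dropout_registrants):.2f}")
```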
76

Measures of University Research Output

Zharova, Alona 14 February 2018 (has links)
New Public Management helps universities and research institutions to perform in a highly competitive research environment. Decision making in the face of uncertainty, for example the distribution of funds for research needs and purposes, urges research policy makers and university managers to understand the relationships between the dimensions of research performance and the resulting or incoming grants. Thus, it is important to accurately reflect the variables of scientific knowledge production on the level of individuals, research groups and universities. Chapter 2 of this thesis introduces an analysis on the level of individuals. The data are taken from the three widely used ranking systems in the economic and business sciences among German-speaking countries: Handelsblatt (HB), Research Papers in Economics (RePEc, here RP) and Google Scholar (GS). It proposes a framework for collating ranking data for comparison purposes. Chapter 3 provides empirical evidence on the level of research groups using data from a Collaborative Research Center (CRC) on financial inputs and research output from 2005 to 2016. First, suitable performance indicators are discussed. Second, the main properties of the data are described using visualization techniques. Finally, a time fixed-effects panel data model and a fixed-effects Poisson model are used to analyze the interdependency between financial inputs and research outputs. Chapter 4 examines the interdependence structure between third-party expenses (TPE), publications, citations and academic age using university data on individual performance in different scientific areas. A panel vector autoregressive model with exogenous variables (PVARX), impulse response functions and a forecast error variance decomposition help to capture the relationships in the system. To summarize, the chapter addresses the possible implications for policy and decision making and proposes recommendations for university research management.
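As a sketch of the kind of fixed-effects Poisson regression mentioned for Chapter 3 (not the author's code; the data file and variable names are hypothetical), count outcomes such as publications can be regressed on funding with group and year dummies:

```python
# Sketch: fixed-effects Poisson regression for a count outcome (publications)
# as a function of financial inputs, via project and year dummies.
# File and variable names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("crc_projects.csv")   # columns: project, year, publications, funding

model = smf.poisson(
    "publications ~ funding + C(project) + C(year)",  # project and year effects
    data=df,
)
res = model.fit(cov_type="cluster", cov_kwds={"groups": df["project"]})
print(res.summary())
```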
77

Economic Inequality, Demographics and Violent Crime : A Cross-National Panel Analysis of Homicide Rates, 2010-18

Li, Minyi, Delladona, Abner January 2022 (has links)
Violent crime has many long-lasting negative consequences for society. This thesis explores the relationship between economic inequality and violent crime, represented by the level of intentional homicides in forty-nine countries over the nine-year period 2010-2018. We draw on several theories and representative works in criminology, sociology, psychology, and economics that provide important perspectives on the subject and offer a theoretical foundation for the analysis. Previous research has usually pointed to a positive association between inequality and crime rates, albeit with some notable outliers. Our objective was to provide an updated view of the subject, employing recent data and statistical methods. We use fixed-effects estimators to account for time-invariant determinants, provide random-effects estimators for comparison, and apply a generalized method of moments (GMM) model to account for possible inertia in the dependent variable. Economic inequality, in the form of income inequality, appears to cause more harm than might be suspected at first, influencing intentional homicide levels in a society. It is the duty of public and private bodies to foster policies that aim to reduce this trend and thus diminish the societal costs associated with it.
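A minimal sketch of the fixed-effects and random-effects estimators described above, assuming a hypothetical country-year panel and variable names (the dynamic GMM step is not shown):

```python
# Sketch: country-level panel of homicide rates estimated with fixed-effects
# and random-effects estimators using linearmodels. File and variable names
# are hypothetical placeholders.
import pandas as pd
from linearmodels.panel import PanelOLS, RandomEffects

df = pd.read_csv("homicide_panel.csv").set_index(["country", "year"])

formula = "homicide_rate ~ 1 + gini + gdp_per_capita + urban_share + youth_share"
fe = PanelOLS.from_formula(formula + " + EntityEffects", data=df).fit(
    cov_type="clustered", cluster_entity=True
)
re = RandomEffects.from_formula(formula, data=df).fit()

# compare the inequality coefficient across the two estimators
print(fe.params["gini"], re.params["gini"])
```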
78

The Effects of Options Markets on the Underlying Markets: Quasi-Experimental Evidence

Mason, Brenden James January 2018 (has links)
This dissertation consists of three essays in applied financial economics. The unifying theme is the use of financial regulation as quasi-experiments to understand the interrelationship between derivatives and the underlying assets. The first two essays use different quasi-experimental econometric techniques to answer the same research question: how does option listing affect the return volatility of the underlying stock? This question is difficult to answer empirically because being listed on an options exchange is not random. Volatility is one of the dimensions along which the options exchanges make their listing decisions. This selection bias confounds any causal effect that option listing may have. What is more, the options exchanges may list along unobservable dimensions. Such omitted variable bias can also confound any causal effect of option listing. My first essay overcomes these two biases by exploiting the exogenous variation in option listing that is created by the SEC-imposed option listing standards. Specifically, the SEC mandates that a stock must meet certain criteria in the underlying market before it can trade on an options exchange. For example, a stock needs to trade a total of 2.4 million shares over the previous 12 months before it can be listed. Since 2.4 million is an arbitrary number, stocks that are "just above" the 2.4 million threshold will be identical to stocks that are "just below" it, the sole difference being their probability of option listing. Accordingly, I use the 2.4 million threshold as an instrument for option listing in a fuzzy regression discontinuity design. I find that option listing causes a modest decrease in underlying volatility, a result that corroborates many previous empirical studies. My second essay attempts to estimate the effect of option listing for stocks that are "far away from" the 2.4 million threshold. I overcome the aforementioned omitted variable bias by fully exploiting the panel nature of the data. I control for the unobserved heterogeneity across stocks by implementing a two-way fixed effects model. Unlike most previous studies, I control for individual-level fixed effects at the firm level rather than at the industry level. My results show that option listing is associated with a decrease in volatility. Importantly, these results are only statistically significant in a model with firm-level fixed effects; they are insignificant with industry-level fixed effects. My third essay is a policy evaluation of the SEC's Penny Pilot Program, a mandated decrease of the option tick size for various equity options classes. Several financial professionals claimed that this decrease would drive institutional investors out of the exchange-traded options market, channeling them into the opaque, over-the-counter (OTC) options market. I empirically test an implication of this hypothesis: if institutional investors have fled the exchange-traded options market for the OTC market, then it may take longer for information to be impounded into a stock's price. Using the "price delay" measure of Hou and Moskowitz (2005), I test whether stocks become less price efficient as a result of being included in the Penny Pilot Program. I perform this test using firm-level fixed effects on all classes that were included in the program. I confirm these results with synthetic control experiments for the classes included in Phase I of the Penny Pilot Program.
Generally, I find no change in price efficiency of the underlying stocks, which suggests that the decrease in option tick size did not materially erode the price discovery that takes place in the exchange-traded equity options market. I also find evidence that the decrease in option tick size caused an increase in short selling for the piloted stocks. / Economics
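A minimal sketch of the fuzzy regression discontinuity idea in the first essay, instrumenting option listing with an above-threshold indicator in a 2SLS setup; the data file, variable names and bandwidth are hypothetical, not the author's specification:

```python
# Sketch of a fuzzy regression discontinuity: the 2.4M-share eligibility
# threshold instruments option listing in a 2SLS / IV setup within a crude
# local bandwidth. File, variable names and bandwidth are placeholders.
import pandas as pd
from linearmodels.iv import IV2SLS

df = pd.read_csv("listing_sample.csv")
df["running"] = df["volume_12m"] - 2_400_000        # centered running variable
df["above"] = (df["running"] >= 0).astype(int)      # instrument: eligibility
local = df[df["running"].abs() < 500_000]           # crude local bandwidth

# volatility ~ running variable + [endogenous listing ~ eligibility instrument]
model = IV2SLS.from_formula(
    "volatility ~ 1 + running + [listed ~ above]", data=local
)
res = model.fit(cov_type="robust")
print(res.summary)
```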
79

How Well Can Two-Wave Models Recover the Three-Wave Second Order Latent Model Parameters?

Du, Chenguang 14 June 2021 (has links)
Although previous studies on structural equation modeling (SEM) have indicated that the second-order latent growth model (SOLGM) is a more appropriate approach to longitudinal intervention effects, its application still requires researchers to collect at least three waves of data (e.g., a randomized pretest, posttest, and follow-up design). However, in some circumstances researchers can only collect two-wave data because of resource limitations. With only two-wave data, the SOLGM cannot be identified, and researchers often choose alternative SEM models to fit two-wave data. Recent studies show that the two-wave longitudinal common factor model (2W-LCFM) and latent change score model (2W-LCSM) can perform well for comparing latent change between groups. However, there is still little empirical evidence about how accurately these two-wave models can estimate the group effects of latent change obtained by the three-wave SOLGM (3W-SOLGM). The main purpose of this dissertation, therefore, is to examine to what extent the fixed effects of the three-wave SOLGM can be recovered from the parameter estimates of the two-wave LCFM and LCSM under different simulation conditions. A supplementary study (study 2) using the three-wave LCFM was conducted to help justify the logic of the different model comparisons in the main study (study 1). The data-generating model in both studies is the 3W-SOLGM, and there are in total five simulation factors (sample size, group differences in intercept and slope, the covariance between the slope and intercept, the size of the time-specific residual, and the change pattern of the time-specific residual). Three main types of evaluation indices were used to assess the quality of estimation (bias/relative bias, standard error, and power/Type I error rate). The results of the supplementary study show that the performance of the 3W-LCFM and 3W-LCSM is equivalent, which further justifies the model comparisons in the main study. The point estimates for the fixed-effect parameters obtained from the two-wave models are unbiased or identical to the ones from the three-wave model. However, using two-wave models can reduce estimation precision and statistical power when the time-specific residual variance is large and its change pattern is heteroscedastic (non-constant). Finally, two real datasets were used to illustrate the simulation results. / Doctor of Philosophy / Collecting and analyzing longitudinal data is a very important approach to understanding development in the real world. Ideally, researchers interested in a longitudinal framework would prefer collecting data at more than two points in time because it provides a deeper understanding of developmental processes. In real scenarios, however, data may only be collected at two time points. With only two-wave data, the second-order latent growth model (SOLGM) cannot be used. The current dissertation compared the performance of two-wave models (the longitudinal common factor model and the latent change score model) with the three-wave SOLGM in order to better understand how the estimation quality of two-wave models compares to that of the three-wave model. The results show that, on average, the estimates from the two-wave models are identical to those from the three-wave model, so in a real data analysis with only one sample, the point estimate from a two-wave model should be very close to that of the three-wave model.
But this estimate may not be as accurate as the one obtained by the three-wave model when the latent variable has large variability at the first or last time point. Such a latent variable is more likely to exist as a state-like construct in the real world. Therefore, the current study provides a reference framework for substantive researchers who only have access to two-wave data but are still interested in estimating the growth effect that would be obtained by the three-wave SOLGM.
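As a toy illustration of one of the findings, that two-wave group comparisons lose power as time-specific residual variance grows, the sketch below computes the power of a simple two-sample comparison of change scores; it is an observed-score analogue under assumed values, not the dissertation's SEM simulation:

```python
# Toy power calculation (not the dissertation's SEM simulation): power to
# detect a group difference in two-wave change scores shrinks as the
# time-specific residual variance grows. All numbers are hypothetical.
import numpy as np
from scipy import stats

def power_two_wave(diff: float, resid_sd: float, n_per_group: int,
                   latent_change_sd: float = 1.0, alpha: float = 0.05) -> float:
    # change-score variance = latent change variance + 2 * residual variance
    sd_change = np.sqrt(latent_change_sd**2 + 2 * resid_sd**2)
    se = sd_change * np.sqrt(2 / n_per_group)      # SE of the group difference
    z_crit = stats.norm.ppf(1 - alpha / 2)
    ncp = diff / se
    return 1 - stats.norm.cdf(z_crit - ncp) + stats.norm.cdf(-z_crit - ncp)

for resid_sd in (0.5, 1.0, 2.0):
    print(resid_sd, round(power_two_wave(0.3, resid_sd, 150), 3))
```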
80

Firm performance, corporate governance and executive compensation in Pakistan

Sheikh, M.F., Shah, S.Z.A., Akbar, Saeed 12 June 2019 (has links)
This study examines the effects of firm performance and corporate governance on chief executive officer (CEO) compensation in an emerging market, Pakistan. Using a more robust Generalized Method of Moments (GMM) estimation approach for a sample of non-financial firms listed on the Karachi Stock Exchange (KSE) over the period 2005 to 2012, we find that both current and previous-year accounting performance have a positive influence on CEO compensation. However, stock market performance does not appear to have a positive impact on executive compensation. We further find that ownership concentration is positively related to CEO compensation, indicating some form of collusion between management and the largest shareholder to extract personal benefits. Inconsistent with agency theory, CEO duality appears to have a negative influence, while board size and board independence have no convincing relationship with CEO compensation, indicating board ineffectiveness in reducing CEO entrenchment. The results of the dynamic GMM model suggest that CEO pay is highly persistent and takes time to adjust to its long-run equilibrium.
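The paper uses a dynamic (system) GMM estimator. As a rough, simpler analogue only, the sketch below implements an Anderson-Hsiao-type instrumental-variable estimator, first-differencing a pay equation and instrumenting the lagged change in pay with the second lag of its level; the data file and variable names are hypothetical:

```python
# Sketch only: an Anderson-Hsiao-type estimator for a dynamic pay equation,
# as a simpler stand-in for the system GMM used in the paper. The pay equation
# is first-differenced and the lagged difference of pay is instrumented with
# the second lag of its level. File and variable names are placeholders.
import pandas as pd
from linearmodels.iv import IV2SLS

df = pd.read_csv("ceo_pay_panel.csv").sort_values(["firm", "year"])

df["d_pay"] = df.groupby("firm")["ln_pay"].diff()
df["lag_d_pay"] = df.groupby("firm")["d_pay"].shift(1)
df["lag2_pay"] = df.groupby("firm")["ln_pay"].shift(2)       # instrument (level)
df["d_roa"] = df.groupby("firm")["roa"].diff()
df["d_own_conc"] = df.groupby("firm")["ownership_concentration"].diff()

sample = df.dropna(subset=["d_pay", "lag_d_pay", "lag2_pay", "d_roa", "d_own_conc"])
model = IV2SLS.from_formula(
    "d_pay ~ 1 + d_roa + d_own_conc + [lag_d_pay ~ lag2_pay]", data=sample
)
res = model.fit(cov_type="clustered", clusters=sample["firm"])
print(res.summary)
```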
