221

Overexpression of TLR2 and TLR4 Susceptibility to Serum Deprivation-Induced Apoptosis in CHO Cells

Fan, Wei, Ha, Tuanzhu, Li, Yan, Ozment-Skelton, Tammy, Williams, David L., Kelley, Jim, Browder, I. William, Li, Chuanfu 25 November 2005 (has links)
We examined the effect of overexpression of TLR2 and TLR4 on apoptosis. TLR2- and TLR4-transfected CHO cells were subjected to serum deprivation for 0, 24, and 48 h; untransfected CHO cells served as controls. Survival at 24 and 48 h was 80.4% and 66.8% in CHO cells, 73.8% and 47.6% in TLR2/CHO cells, and 70.5% and 53.0% in TLR4/CHO cells, respectively. Flow cytometry suggested that apoptotic cells were 7.17% and 32.91% in control CHO cells, 29.0% and 64.6% in TLR2/CHO cells, and 41.4% and 64.6% in TLR4/CHO cells, respectively. The levels of FasL and caspase-8 activity in TLR2/CHO and TLR4/CHO cells were significantly higher than those in CHO cells. Transfection of dominant-negative FADD into TLR2/CHO and TLR4/CHO cells significantly reduced apoptosis. Our results suggest that overexpression of TLR2 and TLR4 in CHO cells sensitizes the cells to serum deprivation-induced apoptosis and that the mechanism involves the death receptor-mediated signaling pathway.
222

Explaining “Everyday Crime”: A Test of Anomie and Relative Deprivation Theory

Itashiki, Michael Robert 12 1900 (has links)
Every day, individuals commit acts that are considered immoral, unethical, even criminal, often to gain material advantage. Many people consider cheating on taxes, cheating on tests, claiming false benefits, or avoiding transport fare to be wrong, but they do them anyway. While some of these acts may not be formally illegal, they are, at best, considered morally dubious and are labeled “everyday crime.” Anomie theory holds that individuals make decisions based on socialized values, which separately may be contradictory but together balance each other out, producing behavior considered “normal” by society. When one holds an imbalanced set of values, decisions made on that set may produce deviant behavior, such as everyday crime. Relative deprivation (RD) theory holds that individuals who perceive their own deprivation, relative to someone else, will feel frustration and injustice, and may attempt to ameliorate that feeling with deviant behavior. Data from the 2006 World Values Survey were analyzed using logistic regression, testing both constructs concurrently. An individual was 1.55 times more likely to justify everyday crime for each calculated unit of anomie, and 1.10 times more likely for each calculated unit of relative deprivation. It was concluded from this study that anomie and relative deprivation were both associated with the tendency towards everyday crime.
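As a rough sketch of how the reported odds ratios arise from a logistic regression, the Python snippet below exponentiates fitted coefficients to obtain the odds change per unit of each predictor. The data, variable names, and use of statsmodels are hypothetical stand-ins for the World Values Survey analysis described above, not the author's actual code.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data standing in for the 2006 World Values Survey variables.
rng = np.random.default_rng(42)
n = 1000
anomie = rng.normal(size=n)               # calculated anomie score per respondent
rel_deprivation = rng.normal(size=n)      # calculated relative deprivation score
logit_p = -1.0 + 0.44 * anomie + 0.10 * rel_deprivation
justifies_crime = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Fit a logistic regression of everyday-crime justification on both constructs.
X = sm.add_constant(np.column_stack([anomie, rel_deprivation]))
model = sm.Logit(justifies_crime, X).fit(disp=False)

# Exponentiated coefficients are odds ratios per one-unit increase in each predictor,
# analogous to the 1.55 (anomie) and 1.10 (relative deprivation) reported above.
odds_ratios = np.exp(model.params[1:])
print(dict(zip(["anomie", "relative_deprivation"], odds_ratios)))
```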
223

Naturalistic Partial Sleep Deprivation Leads to Greater Next-Day Anxiety: The Moderating Role of Baseline Anxiety and Depression

Bean, Christian Alexander Ledwin 09 April 2020 (has links)
No description available.
224

A Short Window Granger Causality Approach to Identify Brain Functional Pattern Associated with Changes of Performance Induced by Sleep Deprivation

Li, Muyuan 01 January 2014 (has links)
The comprehensive effect of sleep deprivation on biological and behavioral functions largely remains unknown. There is evidence that human sleep must be of sufficient duration and physiological continuity to ensure neurocognitive performance while we are awake. Insufficient sleep leads to a high risk of human error and related accidents, injuries, or even fatal outcomes. However, in modern society, more and more people suffer from sleep deprivation because of increasing social, academic, or occupational demands. It is important to study the effect of sleep deprivation not only on task performance, but also on neurocognitive functions. Recent research exploring brain effective connectivity has demonstrated directed influences among pairs of brain areas, which may bring important insight into how the brain works to support neurocognitive function. This research aimed to identify the brain effective connectivity pattern associated with changes in a task-performance measure, response time, following sleep deprivation. Experiments were conducted by colleagues at the Neuroergonomics Department at Jagiellonian University, Krakow, Poland. Ten healthy young women, with an average age of 23 years, performed visual spatial sustained-attention tasks under two conditions: (1) the rest-wakeful (RW) condition, where participants had their usual sleep, and (2) the sleep-deprived (SD) condition, where participants had 3 hours less sleep than usual for 7 nights (amounting to 21 h of sleep debt). Measures included eye tracking performance and functional magnetic resonance imaging (fMRI). In each condition, each subject's eye position was monitored through 13 sessions, each with 46 trials, while fMRI data were recorded. There were two task performance measures, accuracy and response time. Accuracy measured the proportion of correct responses across all trials in each session. Response time measured the average number of milliseconds until participants gazed at the target stimuli in each session. An experimental session could be treated as a short window: by splitting the long trials of fMRI data into consecutive windows, Granger causality was applied to short trials of fMRI data. This procedure helped to calculate pairwise causal influences while capturing the time-varying property of brain causal interaction. Causal influence results were then averaged across sessions to create one matrix for each participant. This matrix was averaged within each condition to formulate a model of brain effective connectivity, which also served as a basis of comparison. In conclusion, a significant effect of sleep deprivation was found on response time and brain effective connectivity. In addition, the change in brain effective connectivity after sleep deprivation was linked to the change in response time. First, an analysis of variance (ANOVA) showed a significant difference in response time between the RW condition and the SD condition; no significant change in accuracy was found. A paired t-test showed that response time on the visual spatial sustained-attention task was significantly shorter under sleep deprivation. Second, Granger causality analysis demonstrated a reduction of bidirectional connectivity and an increase of directed influences from low-level brain areas to high-level brain areas after sleep deprivation.
This observation suggests that sleep deprivation enhanced the effective connectivity engaged in processing salient stimuli but inhibited the effective connectivity involved in biasing attentional selection toward the task and in maintaining self-awareness during the daytime. Furthermore, in the SD condition, attention in the visual spatial task appeared to be driven by a bottom-up modulation mechanism. Third, a relationship was found between brain effective connectivity and response time. Decreases in Granger causal influence in two directions, from the medial frontal lobe to the subcortical gray nuclei and from the medial parietal lobe to the subcortical gray nuclei, were associated with shorter response times in the SD condition. Additionally, an increase in Granger causal influence from the medial parietal lobe to the cerebellum was associated with longer response times in the SD condition.
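The short-window Granger causality procedure summarized above can be sketched compactly. The Python snippet below is a minimal illustration under stated assumptions (synthetic data, statsmodels' grangercausalitytests, a small region count, and maxlag=2); it is not the thesis pipeline, but it shows how pairwise causal influences could be computed per session window and then averaged into one matrix per participant.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

def pairwise_granger(window, maxlag=2):
    """Return an (n_regions x n_regions) matrix of Granger F-statistics
    for one short window of fMRI data shaped (time points, regions)."""
    n_regions = window.shape[1]
    influence = np.zeros((n_regions, n_regions))
    for src in range(n_regions):
        for dst in range(n_regions):
            if src == dst:
                continue
            # Column order is [effect, cause]: tests whether 'src' Granger-causes 'dst'.
            data = np.column_stack([window[:, dst], window[:, src]])
            result = grangercausalitytests(data, maxlag=maxlag, verbose=False)
            influence[src, dst] = result[maxlag][0]["ssr_ftest"][0]  # F-statistic at maxlag
    return influence

# Hypothetical example: 13 session windows of 46 time points over 5 brain regions.
rng = np.random.default_rng(0)
windows = [rng.standard_normal((46, 5)) for _ in range(13)]
per_session = [pairwise_granger(w) for w in windows]
subject_matrix = np.mean(per_session, axis=0)  # one averaged causal-influence matrix per subject
print(subject_matrix.round(2))
```

Averaging these subject-level matrices within the RW and SD conditions would then give the two condition-level effective-connectivity models compared in the analysis.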
225

THE EFFECT OF AREA-LEVEL HEALTHCARE ACCESS AND DEPRIVATION ON COLORECTAL CANCER INCIDENCE IN PENNSYLVANIA FROM 2008 TO 2017

Snead, Ryan, 0000-0003-2876-7003 08 1900 (has links)
Background and Purpose: Colorectal cancer (CRC) is the third most common cancer and the second leading cause of cancer death, with lower survival rates at later stages. Adherence to CRC screening can prevent the development of cancerous polyps and reduce incidence. Area-level characteristics, such as access to healthcare and deprivation, can create barriers to timely screening, increasing the risk of developing CRC. The degree to which area-level characteristics versus individual-level characteristics are responsible for CRC outcomes, including incidence and stage at diagnosis, is not well understood. Specifically, deficits in the use of spatial statistical techniques have led to a lack of clarity in the current literature. This study aimed to overcome these deficiencies by identifying and utilizing the optimal measurement for area-level access to healthcare and deprivation, employing robust spatiotemporal and multilevel analytic methods to assess their effects on CRC incidence and late-stage diagnosis in Pennsylvania (PA) at the block group-level from 2008 to 2017. The results of this research will more accurately map areas of high predicted CRC relative risk for targeted public health interventions to reduce the burden of CRC over time. The following three study aims were used to address the research problem:
Aim 1: Identify the best predictive measure of access to healthcare for estimating CRC incidence risk at the block group-level in PA from 2008 to 2017. Q1: What is the best measure of access to care for estimating risk of CRC incidence? H1.1: The most comprehensive measurement, the multi-modal two-step floating catchment area (2SFCA) measure, is optimal for predicting CRC incidence compared to unidimensional distance, availability, and other 2SFCA measures. H1.2: Weighting access to healthcare measures for individual insurance coverage improves predictive performance for CRC incidence.
Aim 2: Ascertain the relative risk from area-level deprivation on CRC incidence at the block group-level in PA from 2008 to 2017. Q2: How does area-level deprivation affect CRC incidence? H2.1: Weighted Quantile Sum (WQS) regression will demonstrate the relative importance of an extensive array of SES variables for CRC incidence. H2.2: Higher deprivation will be positively associated with risk of CRC incidence.
Aim 3: Determine the individual-level likelihood of being diagnosed with late-stage CRC based on place of residence across PA from 2008 to 2017. Q3: How does place of residence affect the likelihood of late-stage CRC after adjusting for individual-level characteristics and covariates? H3.1: PA residents living in areas of worse deprivation and low access to care have a higher likelihood of being diagnosed with late-stage CRC. H3.2: The likelihood of late-stage CRC varies significantly by individual characteristics.
Methods: This research used ecologic and cross-sectional study designs to perform secondary data analysis of the cancer registry and publicly available data. The geographic units were block groups in PA (N = 9,740), accessed from the US Census Bureau. The sample included screening age-eligible PA residents, 45-75 years, diagnosed with a primary incident case of CRC from 2008 to 2017 (N = 34,250), identified via the PA Cancer Registry. Out-of-state residents at diagnosis and high-risk individuals were excluded. Nine block groups were uninhabited, with no population, and were thus excluded.
Primary exposure variables (i.e., area-level access to healthcare and deprivation) were calculated using the PA Cancer Registry, a provider database, the US Census Bureau's polygon and network shapefiles, and the American Community Survey. Ecologic covariates (see below) were derived from the American Community Survey, the Behavioral Risk Factor Surveillance System, and the USDA's Rural-Urban Commuting Areas. The PA Cancer Registry provided individual data for patient demographics, tumor characteristics, and insurance coverage. Exploratory spatial, temporal, and spatiotemporal analyses of the CRC data were performed before Aims 1 to 3.
Aim 1: CRC cases were aggregated by block group to represent a count of CRC incidence. Area-level access to healthcare measures were calculated using providers' addresses, population-weighted block group centroids, and road/rail networks (i.e., driving, walking, and public transit). Measures included great-circle distance, driving distance to the nearest provider by miles/time, physician-to-population ratio, enhanced 2SFCA, variable 2SFCA, and multi-modal 2SFCA. Four catchment sizes, in 15-minute increments, were tested (range = 15-60 minutes). A version of each 2SFCA measure weighted for insurance coverage was also calculated. Predictive performance was assessed with model fit statistics from 29 hierarchical Bayesian spatiotemporal Poisson regression models. All models included CRC screening adherence, rurality, age, race, education level, unemployment, and poverty level.
Aim 2: CRC cases were aggregated by block group to represent a count of CRC incidence. Area-level deprivation indicators (n = 39) were calculated from the American Community Survey's five-year pooled estimates for demographic, social, economic, and housing characteristics and represented at the census tract or block group-level. Weighted Quantile Sum regression generated an area-level deprivation index, weighting each indicator by its relative relationship with CRC incidence. A hierarchical Bayesian spatiotemporal Poisson regression with conditional autoregressive priors and a first-order autoregressive time series process was used to estimate the relative risk of CRC. The ecologic covariates included in the model were area-level access to healthcare from Aim 1, CRC screening adherence, rurality, age, and sex.
Aim 3: Three binary outcome variables represented localized vs. regional, distant, and regional and distant CRC at diagnosis. Aims 1 and 2's area-level access to healthcare and deprivation measurements were used as this study's primary exposure variables. The data were split into three time periods (2008-2009, 2010-2013, and 2014-2017) to analyze CRC screening (CRCS) coverage mandates from the Affordable Care Act for private insurers in 2010 and Medicare in 2014. Using binomially distributed outcomes, three two-level generalized linear mixed models using hierarchical Bayesian methods with conditional autoregressive priors were run for each time period.
Results: There were 34,250 eligible incident cases, with 0-6 cases per block group (N = 9,731) each year and an average of 3.5 cases per block group for the pooled study period. From 2008 to 2017, the pooled CRC incidence rate was 7.45 cases per 1,000 for 45- to 75-year-olds in PA. Scan statistics found that the highest CRC burden was in Philadelphia (northeast, west, and south), Pittsburgh, and rural areas in southwest PA (e.g., Westmoreland County and Fayette County) and northcentral PA (e.g., Lycoming County, Clinton County, and Centre County).
In PA, yearly crude CRC rates decreased slightly over the ten years (0.80 to 0.72, Δ = -0.08), though this was not empirically tested.
Aim 1: The best fitting model used the multi-modal 2SFCA, which included aggregated physician-to-population ratios within 45 minutes of the provider facility for population-weighted block group centroids via driving, walking, and public transit of the same distance. Access was generally worst in rural areas and best in urban/suburban areas. Block groups with access one standard deviation above the state median had 27% decreased CRC risk. Weighting for insurance coverage improved a measure's predictive ability for shorter travel times (i.e., 15 and 30 minutes).
Aim 2: Of the 39-indicator deprivation index, nine indicators were statistically significant and three were related to SES (i.e., median household income, the percent of the block group without a high school degree, or living in a house without heating). However, the most important significant indicators belonged to the geography and income domains, collectively representing 71% of the relative influence of the index. The area-level deprivation index was significant and positively associated with CRC incidence at the block group-level in PA from 2008 to 2017 (RR: 1.33, 95% CI: 1.32–1.34).
Aim 3: After accounting for individual age, race, and insurance coverage, the relationship between area-level access to healthcare and deprivation and late-stage CRC became non-significant. While no area-level effects were significant, several individual-level features had consistent significant findings across outcomes and time periods. At the individual level, having government insurance and being uninsured had significant positive relationships for all outcomes and time periods. Age and race had significant inverse relationships with late-stage CRC diagnosis.
Conclusions: In summary, this study addressed the limitations of previous research by employing innovative measurement techniques, such as the multi-modal 2SFCA and Weighted Quantile Sum regression, and rigorous spatiotemporal methods to assess the impact of area-level access to healthcare and deprivation on CRC incidence and late-stage diagnosis. The findings highlight the importance of considering walking and public transit access to healthcare in relation to CRC incidence. Additionally, the study demonstrated the effectiveness of the WQS method in calculating an accurate area-level deprivation index, which enhanced the prediction of CRC incidence and identified high-risk areas for targeted interventions. However, individual-level characteristics, particularly insurance coverage, were found to be more influential than area-level effects in predicting the stage at which CRC was diagnosed. Regardless, using inferences and similar methods from this dissertation improves disease mapping and resource allocation for CRCS outreach, supports evidence for policy, and helps guide the development of tailored public health interventions to ultimately reduce the burden of CRC. / Epidemiology
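The two-step floating catchment area (2SFCA) family of measures referenced in Aim 1 above can be illustrated in a few lines. The sketch below implements only the basic, unweighted 2SFCA with made-up travel times and capacities; the enhanced, variable, multi-modal, and insurance-weighted variants used in the dissertation add distance decay, adaptive catchments, multiple travel modes, and coverage weights on top of this skeleton.

```python
import numpy as np

def two_step_fca(travel_time, provider_capacity, population, threshold=45.0):
    """Basic 2SFCA accessibility score for each block group.

    travel_time[i, j] = minutes from block group i to provider j.
    """
    # Catchment membership: can block group i reach provider j within the threshold?
    within = (travel_time <= threshold).astype(float)

    # Step 1: provider-to-population ratio over each provider's catchment.
    catchment_pop = within.T @ population
    ratio = np.divide(provider_capacity, catchment_pop,
                      out=np.zeros_like(provider_capacity, dtype=float),
                      where=catchment_pop > 0)

    # Step 2: for each block group, sum the ratios of all reachable providers.
    return within @ ratio

# Hypothetical example: 4 block groups, 2 providers, 45-minute catchments.
travel_time = np.array([[10.0, 50.0],
                        [30.0, 20.0],
                        [70.0, 40.0],
                        [15.0, 90.0]])
provider_capacity = np.array([3.0, 1.0])          # e.g., physicians at each facility
population = np.array([1200.0, 800.0, 500.0, 1500.0])
print(two_step_fca(travel_time, provider_capacity, population))
```

Step 1 assigns each provider a capacity-to-demand ratio over the population it can reach; step 2 sums those ratios for every provider a block group can reach, yielding the accessibility score per block group.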
226

Can extraversion buffer against sleep deprivation’s negative effect on social motivation? : An experimental study

Thurezon, Malin January 2023 (has links)
No description available.
227

Removing Soluble Phosphorus from Tertiary Municipal Wastewater Using Phosphorus-Deprived, Filamentous Microalgae

Ahern, Aloysia 01 September 2022 (has links) (PDF)
Harmful algal blooms (HABs) can be detrimental to ecosystems, human health, and economies. The low levels of phosphorus remaining in the effluent of municipal wastewater treatment plants can contribute to HAB formation. To achieve more complete phosphorus removal, an effluent treatment method has been proposed that uses phosphorus-deprived, filamentous microalgae to quickly assimilate soluble phosphorus to low concentrations. This study investigated two parameters that influence the feasibility of such a system: (1) the biomass growth productivity of algal cultures during the phosphorus deprivation period and (2) the relationship between the duration of this period and the phosphorus uptake rate by the biomass when contacted with the water to be treated. A single strain of filamentous algae, Tribonema minus, was used. Two experiments lasting 8-9 days compared the biomass productivity of cultures of T. minus grown in phosphorus-replete and -deplete media. While no significant difference in productivity was observed between treatments, further studies should be done to confirm this finding. In addition, 39 uptake contact experiments were conducted. The soluble phosphorus uptake rate was measured for algae deprived of phosphorus for 0 to 12 days of growth. The highest observed uptake rate was 3.83 mg P/g VSS-h, during the first three hours of contact, by biomass that had been phosphorus-deprived for 12 days. The 3-h uptake rate increased with deprivation duration at 0.34 mg P/g VSS-h per day of deprivation (R2 = 0.81). Additional development efforts seem justified based on these results.
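The final relationship reported above is a regression slope with an accompanying R² rather than a unitless correlation coefficient. The short sketch below, with made-up numbers rather than the study's data, shows how such a per-day slope and R² are obtained from a simple linear fit.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical measurements: days of P deprivation vs. 3-h uptake rate (mg P/g VSS-h).
days_deprived = np.array([0, 2, 4, 6, 8, 10, 12], dtype=float)
uptake_rate = np.array([0.3, 0.9, 1.5, 2.0, 2.9, 3.3, 3.8])

fit = linregress(days_deprived, uptake_rate)
slope_per_day = fit.slope        # analogous to the reported 0.34 mg P/g VSS-h per day
r_squared = fit.rvalue ** 2      # analogous to the reported R2 = 0.81
print(slope_per_day, r_squared)
```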
228

Examining the Role of Protests in South Korean Democratization

Bass, Abigail J 01 January 2021 (has links)
This research examines how relative deprivation theory can be applied to study the success of protest movements and their subsequent impact on the process of democratization of the South Korean state. This study hopes to provide a more comprehensive approach to how the role of protests in the development of a democratic state is explained within the field of political science. Utilizing both a quantitative and qualitative research design, this work applied a case study analysis as well as a supplemental data analysis regarding the success of Korean protest movements and their impact on democratization and on global views of democratization. For the case study analysis, I focused on four protest movements in South Korea and applied relative deprivation theory in each case. Then, I defined five metrics for protest success based on my previous analysis and used these metrics to conduct a comparative analysis regarding the short- and long-term success of each protest movement. For the data analysis, I utilized Systemic Peace's Polity Project Series V dataset in order to quantify changes in the qualities of the regime over time, on a scale ranging from highly authoritarian to highly democratic regime qualities. Based on this mixed-mode analysis, I find that protest movements that were linked to progressive deprivation led to the most successful shifts toward democratic regime qualities in the long term. This project is significant to the field as it will address criticisms in previously discounted protest theory as well as explore the changing narrative of democratization in the modern world and dispel historical misconceptions of political culture in East Asia, focusing on Korea.
229

REGULATION OF NON-PHOTIC PHASE-RESETTING OF THE MAMMALIAN CIRCADIAN CLOCK

Grossman, Gregory H. 20 November 2006 (has links)
No description available.
230

The Effects of 53 Hours of Sleep Deprivation on the Thermoregulatory, Hormonal, Metabolic, and Cognitive Responses of Young Adult Males to Multiple Bouts of Acute Cold Exposure

Pierce, Katherine E. 11 December 2008 (has links)
No description available.
