61

Computational methods for analyzing dioxin-like compounds and identifying potential aryl hydrocarbon receptor ligands : multivariate studies based on human and rodent in vitro data

Larsson, Malin January 2017 (has links)
Polychlorinated dibenzo-p-dioxins/dibenzofurans (PCDD/Fs) and polychlorinated biphenyls (PCBs) are omnipresent and persistent environmental pollutants. In particular, 29 congeners are of special concern, and these are usually referred to as dioxin-like compounds (DLCs). In the European Union, the risks associated with DLCs in food products are estimated by a weighted sum of the DLCs’ concentrations. These weights, also called toxic equivalency factors (TEFs), compare the DLCs’ potencies to the most toxic congener, 2,3,7,8-tetrachloro-dibenzo-p-dioxin (2378-TCDD). The toxicological effects of PCDD/Fs and PCBs are diverse, ranging from chloracne and immunological effects in humans to severe weight loss, thymic atrophy, hepatotoxicity, immunotoxicity, endocrine disruption, and carcinogenesis in rodents. Here, the molecular structures of DLCs were used as the basis to study the congeneric differences in in vitro data from both human and rodent cell responses related to the aryl hydrocarbon receptor (AhR). Based on molecular orbital densities and partial charges, we developed new ways to describe DLCs, which proved to be useful in quantitative structure-activity relationship modeling. This thesis also provides a new approach, the calculation of the consensus toxicity factor (CTF), to condense information from a battery of screening tests. The current TEFs used to estimate the risk of DLCs in food are primarily based on in vivo information from rat and mouse experiments. Our CTFs, based on human cell responses, show clear differences compared to the current TEFs. For instance, the CTF of 23478-PeCDF is as high as the CTF for 2378-TCDD, and the CTF of PCB 126 is 30 times lower than the corresponding TEF. Both of these DLCs are common congeners in fish in the Baltic Sea. Due to the severe effects of DLCs and their impact on environmental and human health, it is crucial to determine if other compounds have similar effects. To find such compounds, we developed a virtual screening protocol and applied it to a set of 6,445 industrial chemicals. This protocol included a presumed 3D representation of AhR and the structural and chemical properties of known AhR ligands. This screening resulted in a priority list of 28 chemicals that we identified as potential AhR ligands.
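A rough sketch of the TEF-weighted sum (the toxic equivalent, TEQ) mentioned in the abstract is given below. The TEF values are only illustrative (roughly in line with the WHO 2005 scheme) and the concentrations are invented example numbers, not data from the thesis; a CTF-based estimate would use the same weighted sum with CTFs in place of TEFs.

```python
# Illustrative TEQ calculation for a mixture of dioxin-like compounds.
# TEF values are illustrative (approximately WHO 2005); concentrations are
# hypothetical example numbers, not measurements from the thesis.

tef = {
    "2378-TCDD": 1.0,      # reference congener
    "23478-PeCDF": 0.3,
    "PCB 126": 0.1,
}

# hypothetical measured concentrations in a food sample (pg/g fat)
concentration = {
    "2378-TCDD": 0.5,
    "23478-PeCDF": 1.2,
    "PCB 126": 8.0,
}

def toxic_equivalent(conc, weights):
    """Weighted sum: TEQ = sum_i concentration_i * TEF_i."""
    return sum(conc[c] * weights[c] for c in conc)

print(f"TEQ = {toxic_equivalent(concentration, tef):.2f} pg TEQ/g fat")
```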
62

The Effect of Texas Charter High Schools on Diploma Graduation and General Educational Development (Ged) Attainment

Maloney, Catherine 08 1900 (has links)
This dissertation is a study of the effect of Texas's charter high schools on diploma graduation and General Educational Development (GED) attainment. Utilizing data from the Texas Schools Project at the University of Texas at Dallas, the study follows a cohort of Texas students enrolled as 10th graders in the fall of 1999 and tracks their graduation outcomes through the summer of 2002, when they were expected to have completed high school. The analysis uses case study research and probit regression techniques to estimate the effect of charter school attendance on graduation and GED outcomes as well as the effect of individual charter school characteristics on charter students' graduation outcomes. The study's results indicate that charter school attendance has a strong negative effect on diploma graduation and a strong positive effect on GED attainment. In addition, the study finds that charter schools that offer vocational training, open entry/exit enrollment options, and charters that are operated in multiple sites or "chain" charters have positive effects on charter students' diploma graduation outcomes. Charters that offer accelerated instruction demonstrate a negative effect on diploma graduation. The study finds that charter school graduation outcomes improve as charters gain experience and that racially isolated minority charter schools experience reduced graduation outcomes. The study's results also indicate that Texas's charter high schools may be providing district schools with a means through which to offload students who may be difficult to educate. The analysis finds that districts may be pushing low-performing high school students with attendance and discipline problems into charter schools in order to avoid the effort of educating them and to improve district performance on accountability measures related to standardized test scores and graduation rates. This finding suggests that competition from charter high schools will not provide much incentive for districts to improve their programs, undermining a central premise of school choice initiatives.
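As a minimal sketch of the kind of probit estimation described above, the snippet below fits a probit model of diploma graduation on a charter-attendance indicator and two control variables. The data, variable names, and coefficients are synthetic placeholders and are not the Texas Schools Project data or the dissertation's specification.

```python
# Minimal probit sketch: effect of charter attendance on graduation,
# using synthetic data (not the Texas Schools Project data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "charter": rng.integers(0, 2, n),       # attended a charter high school (hypothetical flag)
    "econ_disadv": rng.integers(0, 2, n),   # economically disadvantaged (hypothetical flag)
    "test_z": rng.normal(size=n),           # standardized prior test score (hypothetical)
})
# synthetic outcome: graduation depends on covariates plus noise
latent = 0.8 + 0.5 * df["test_z"] - 0.3 * df["econ_disadv"] - 0.4 * df["charter"]
df["graduated"] = (latent + rng.normal(size=n) > 0).astype(int)

X = sm.add_constant(df[["charter", "econ_disadv", "test_z"]])
result = sm.Probit(df["graduated"], X).fit(disp=False)
print(result.summary())
print(result.get_margeff().summary())  # average marginal effect of each regressor
```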
63

Factors that Impact African American High School Equivalency (HSE) Students' Pursuit of Higher Education

Chandler-Melton, Jamiyla 01 January 2016 (has links)
African Americans account for a disproportionately low percentage of students who pursue a college education in comparison to European Americans. Indeed, a considerable number of African American High School Equivalency (HSE) students are not enrolling in college once they earn their HSE diploma. The purpose of this qualitative case study was to examine 3 African American HSE students' perceptions about factors that influenced their pursuit of higher education at the selected HSE study site. These 3 students were selected because of their ethnicity, enrollment in the HSE program, academic underpreparedness, lack of pursuit of higher education, and strong feelings about the phenomenon under study. The theoretical framework was based on Vygotsky's sociocultural theory of human learning. The research question focused on assessing African American HSE students' lack of pursuit of higher education. Semistructured focus group interview and individual interview data were thematically analyzed using open coding. Findings revealed that participants believed that the lack of high school credentials, family background, intrinsic motivation and educational values, sociocultural influences, teacher and peer influence, and socioeconomic factors affected their pursuit of higher education. A professional development project was developed based on the study findings to provide HSE educators with training on the HSE exam, the Common Core State Standards, and best practices to enrich the academic achievement of African American HSE students at the study site. The results have implications for positive social change among African American HSE students by emphasizing the importance of higher education for educational, sociocultural, professional, and personal advancement.
64

Designing Interaction Equivalency in Distance Education

Salamati, Zahra January 2012 (has links)
The fundamental advancement of information technology has given rise to the distance education industry and has contributed to the popularity of distance education. However, to employ innovative and advanced tools, universities need financial resources, and these resources are not easy to obtain. The interaction equivalency theorem can be a good solution for overcoming these financial problems, but designers are reluctant to apply it because they believe that educational quality will decrease due to the lack of teacher interaction. This study demonstrated that students' perception of interaction equivalency is positive as long as they have a high level of interdependency with other students; without this level of interdependency, students are not motivated to continue their courses. By providing a techno-pedagogical design and an IS design theory in support of interaction equivalency, this study helps e-learning practitioners who want to design an acceptable distance education system with limited financial resources. / Program: Magisterutbildning i informatik
65

Design of Low-Power Reduction-Trees in Parallel Multipliers

Oskuii, Saeeid Tahmasbi January 2008 (has links)
Multiplications occur frequently in digital signal processing systems, communication systems, and other application-specific integrated circuits. Multipliers, being relatively complex units, are deciding factors for the overall speed, area, and power consumption of digital computers. The diversity of application areas for multipliers and the ubiquity of multiplication in digital systems give rise to a variety of requirements for speed, area, power consumption, and other specifications. Traditionally, speed, area, and hardware resources have been the major design factors and concerns in digital design. However, the design paradigm shift over the past decade has brought dynamic power and static power into play as well.

In many situations, the overall performance of a system is decided by the speed of its multiplier. In this thesis, parallel multipliers are addressed because of their speed superiority. Parallel multipliers are combinational circuits and can be subject to any standard combinational logic optimization. However, the complex structure of multipliers imposes a number of difficulties for electronic design automation (EDA) tools, as they simply cannot consider the multipliers as a whole; i.e., EDA tools have to limit the optimizations to a small portion of the circuit and perform logic optimizations. On the other hand, multipliers are arithmetic circuits, and considering arithmetic relations in the structure of multipliers can be extremely useful and can lead to better optimization results. The different structures obtained using the different arithmetically equivalent solutions have the same functionality but exhibit different temporal and physical behavior. Arithmetic equivalencies have earlier been used mainly to optimize for area, speed, and hardware resources.

In this thesis, a design methodology is proposed for reducing dynamic and static power dissipation in the partial product reduction tree of parallel multipliers. Basically, using information about the input pattern that is going to be applied to the multiplier (such as static probabilities and spatiotemporal correlations), the reduction tree is optimized. The optimization is obtained by selecting power-efficient configurations through a search among the permutations of partial products for each reduction stage. Probabilistic power estimation methods are introduced for leakage and dynamic power estimation, and these estimates are used to guide the optimizers towards minimum power consumption. Optimization methods, utilizing the arithmetic equivalencies in the partial product reduction trees, are proposed in order to reduce the dynamic power, the static power, or the total power, which is a combination of dynamic and static power. The energy saving is achieved without any noticeable area or speed overhead compared to random reduction trees. The optimization algorithms are extended to include spatiotemporal correlations between primary inputs. As another extension, the cost function is considered as a weighted sum of dynamic power and static power; this can be extended further to include speed merits and interconnection power. Through a number of experiments, the effectiveness of the optimization methods is shown. The average number of transitions obtained from simulation is reduced significantly (up to 35% in some cases) using the proposed optimizations.

The proposed methods are in general applicable to arbitrary multi-operand adder trees. As an example, the optimization is applied to the summation tree of a class of elementary function generators which is implemented using summation of weighted bit-products. Accurate transistor-level power estimations show up to 25% reduction in dynamic power compared to the original designs.

Power estimation is an important step of the optimization algorithm. A probabilistic gate-level power estimator is developed which uses a novel set of simple waveforms as its kernel, and the transition density of each circuit node is estimated. This power estimator makes it possible to use a global glitch-filtering technique that models the removal of glitches in more detail. It produces error-free estimates for tree-structured circuits. For circuits with reconvergent fanout, experimental results using the ISCAS85 benchmarks show that this method generally provides significantly better estimates of the transition density than previous techniques.
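The sketch below illustrates, in a generic textbook-style way, how static input probabilities can be propagated to the AND-gate partial products of a multiplier and turned into a simple switching-activity estimate; it assumes temporally independent, spatially uncorrelated inputs and is not the probabilistic estimator or reduction-tree optimizer developed in the thesis.

```python
# Generic probability-based switching-activity estimate for multiplier
# partial products, assuming independent inputs (not the thesis's estimator).

def partial_product_probabilities(p_a, p_b):
    """P(pp_ij = 1) = P(a_i = 1) * P(b_j = 1) for pp_ij = a_i AND b_j."""
    return [[pa * pb for pb in p_b] for pa in p_a]

def switching_activity(p_one):
    """Expected toggles per cycle under temporal independence: 2 * p * (1 - p)."""
    return 2.0 * p_one * (1.0 - p_one)

# assumed static one-probabilities of the operand bits (hypothetical input pattern)
p_a = [0.5, 0.5, 0.3, 0.1]   # operand A, LSB first
p_b = [0.5, 0.4, 0.2, 0.1]   # operand B, LSB first

pp = partial_product_probabilities(p_a, p_b)
total_activity = sum(switching_activity(p) for row in pp for p in row)
print(f"estimated total partial-product switching activity: {total_activity:.3f}")
```

Estimates of this kind are what allow an optimizer to compare candidate reduction-tree configurations for a given input pattern before committing to one.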
67

Častice v slovenčine a v češtine. Systémová a korpusovolingvistická analýza / Particles in Slovak and Czech. System and Corpus Analysis

Šimková, Mária January 2015 (has links)
When it first entered grammars, this youngest word class aroused great interest and discussion; in some countries (e.g. Germany) particles have been an object of systematic research. However, many other languages still lack a comprehensive description of particles as a class in its own right, which also makes them suitable material for comparative research. Differences in the functioning and theoretical treatment of particles appear in typologically different languages, but they can also emerge in related languages, even in the case of Slovak and Czech. Lexicographical and grammatical descriptions of these languages provide only small sets of particles (roughly amounting to 400 in Slovak, exceeding 200 in Czech), which authors usually divide into small groups and further into even smaller subgroups. Owing to their specific features as well as their paradigmatic and syntagmatic relations with other language or speech phenomena, even a single particle, a couple of particles, or a narrowly defined group of particles can become the object of individual scientific and research projects. Step by step, our thesis presents the development of attitudes towards particles as an independent word class in general and in Russian linguistics in particular, grammatical descriptions of particles in Slovak, Czech and other...
68

Využití metody náhrady přírodních zdrojů ("resource equivalency method") pro hodnocení náhrady škod způsobených na ekosystémech člověkem / Use of the resource equivalency method for assessing compensation for damage caused to ecosystems by humans

MUNDOKOVÁ, Mariana January 2011 (has links)
The economic evaluation of the costs of remediating montane spruce forest attacked by bark beetle in different stages of decline (plots with living mature trees; plots with dead standing trees, where the wood remains in the ecosystem; and plots with damaged stands that were clear-cut; ten model plots in total) was carried out in the Šumava National Park (Modrava model area) using the resource equivalency method. Microclimatic characteristics (temperature and humidity development) measured by dataloggers and communities of epigeic beetles (pitfall traps) were used as environmental metrics. The results indicate that natural remediation of the declined forest is economically the most profitable option. The microclimatic characteristics of plots with dead standing trees are most similar to those of the living forest. Species diversity, beetle activity, and the frequency of relict species and species indicating virgin forest are higher in plots with dead standing trees. Based on these data we can conclude that natural remediation of montane spruce forest is the most acceptable approach from both a biological and an economic point of view (regeneration of the ecosystem services of the montane spruce forest).
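Resource equivalency analyses typically express damage as discounted losses of ecosystem services over time and compare remediation options by how quickly (and at what cost) they restore those services. The sketch below is a generic illustration of that bookkeeping with entirely hypothetical recovery trajectories and discount rate; it is not the Šumava calculation from the thesis.

```python
# Generic discounted service-year bookkeeping in the spirit of resource
# equivalency analysis. All trajectories and the discount rate are hypothetical.

def discounted_service_years_lost(service_levels, discount_rate=0.03, base_year=0):
    """Sum of per-year service shortfalls (1.0 = intact forest), discounted to base_year."""
    return sum(
        (1.0 - s) / (1.0 + discount_rate) ** (year - base_year)
        for year, s in enumerate(service_levels)
    )

# hypothetical recovery trajectories of relative service level (0..1) over 10 years
natural_regeneration = [0.4, 0.45, 0.5, 0.55, 0.6, 0.65, 0.7, 0.75, 0.8, 0.85]
clear_cut_replanting = [0.1, 0.15, 0.2, 0.3, 0.4, 0.5, 0.55, 0.6, 0.65, 0.7]

for name, trajectory in [("natural regeneration", natural_regeneration),
                         ("clear-cut + replanting", clear_cut_replanting)]:
    loss = discounted_service_years_lost(trajectory)
    print(f"{name}: discounted service-year loss = {loss:.2f}")
```

The option with the smaller discounted loss requires less compensatory remediation, which is the sense in which natural regeneration can come out as the economically preferable choice.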
69

Application of dermal microdialysis and tape stripping methods to determine the bioavailability and/or bioequivalence of topical ketoprofen formulations

Tettey-Amlalo, Ralph Nii Okai January 2008 (has links)
The widespread acceptance of topical formulations intended for local and/or regional activity has prompted renewed interest in developing a model to determine the bioavailability of drugs in order to establish bioequivalence as a means of evaluating the formulation performance of multisource products and also for use during formulation development. Current in vivo techniques, such as blister suction and skin biopsy, used to determine the bioavailability and/or bioequivalence of topical formulations are either too invasive to generate appropriate concentration-time profiles or require large numbers of study subjects, thereby making the study expensive and time-consuming. Moreover, there are currently no sampling techniques that can demonstrate dermal bioavailability and/or bioequivalence of topical formulations intended for local and/or regional activity. Dermal microdialysis is a relatively new application of microdialysis that permits continuous monitoring of endogenous and/or exogenous solutes in the interstitial fluid. The technique involves the implantation of semi-permeable membranes which are perfused with an isotonic medium at extremely slow flow rates and the collection of microlitre sample volumes containing diffused drugs. Tape stripping, a relatively older technique, has been extensively used in comparative bioavailability studies of various topical formulations. However, due to shortcomings arising from reproducibility and inter-subject variation, among others, the published FDA guidance outlining the initial protocol was subsequently withdrawn. The incorporation of transepidermal water loss into tape stripping has garnered renewed interest and has been used for the determination of drug bioavailability from a number of topical formulations. Hence, the primary objective of this research was to develop and evaluate microdialysis sampling and tape stripping techniques, including the incorporation of transepidermal water loss measurements, to assess the dermal bioavailability of ketoprofen from topical gel formulations and to develop models for bioequivalence assessment. A rapid UPLC-MS/MS method with the requisite sensitivity for the analysis of samples generated from dermal microdialysis was developed and validated to accommodate the microlitre sample volumes collected. An HPLC-UV method was developed and validated for the analysis of samples generated from the in vitro microdialysis and in vivo tape stripping studies. The work presented herein contributes to a growing body of scientific knowledge seeking to develop a model for the determination of bioequivalence of pharmaceutically equivalent topical formulations intended for local and/or regional activity in human subjects.
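A conventional way to compare a test and a reference formulation is to compute an exposure metric (for example, the area under a dermal concentration-time profile) for each subject and check whether the 90% confidence interval of the test/reference geometric mean ratio falls within the customary 80-125% acceptance range. The sketch below illustrates that calculation with entirely hypothetical profiles and a simplified paired design; it is not the statistical model used in the thesis.

```python
# Illustrative AUC-based bioequivalence check with hypothetical dermal
# concentration-time profiles (simplified paired design, made-up numbers).
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

t = np.array([0.5, 1, 2, 3, 4, 6, 8])            # sampling times, hours
# hypothetical per-subject reference profiles (rows = subjects), ng/mL
ref = np.array([[2, 5, 9, 8, 6, 4, 2],
                [3, 6, 10, 9, 7, 4, 3],
                [2, 4, 8, 8, 5, 3, 2]], dtype=float)
test = ref * np.array([[1.05], [0.95], [1.10]])  # test formulation, slightly different exposure

auc_ref = trapezoid(ref, t, axis=1)
auc_test = trapezoid(test, t, axis=1)

# paired analysis on log-transformed AUCs
diff = np.log(auc_test) - np.log(auc_ref)
mean, sem = diff.mean(), stats.sem(diff)
lo, hi = stats.t.interval(0.90, len(diff) - 1, loc=mean, scale=sem)
ratio_ci = np.exp([lo, hi]) * 100
print(f"90% CI of test/reference AUC ratio: {ratio_ci[0]:.1f}%-{ratio_ci[1]:.1f}%")
print("bioequivalent (80-125% criterion):", 80 <= ratio_ci[0] and ratio_ci[1] <= 125)
```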
70

Computer Modeling the Incursion Patterns of Marine Invasive Species

Johnston, Matthew W. 26 February 2015 (has links)
Abstract Not Available.
