501 |
’Difficulties’ of integrative evaluation practices: instances of language and context as/in contested space(s)
Low, Marylin Grace 11 1900 (has links)
Although language is a medium of learning, most educational institutions typically
teach and therefore evaluate language separately from content. In second language contexts,
recent attention has been given to language/content integration through content-based
language instruction. Yet, questions of integrative evaluation (evaluating language and
content as one) remain uncertain and difficult. This inquiry explores difficulties invoked
when teachers engage in practices of integrative evaluation of English language learners'
writing at an international college for Japanese nationals in Canada.
Are these difficulties technical problems? Technical rationality has been critiqued
by a number of thinkers. Those interested in action research practices contrast technical
rationality with what they call reflective rationality and argue for contextualizing, rather than
simplifying, difficult situations. Some with hermeneutic interests argue for an attunement to,
rather than concealment of, the difficulties of life in the classroom. Others interested in writing
instruction are critical of conventional approaches to writing pedagogy as reductionistic
and deterministic.
There are a number of instances of difficulty in teachers' integrative evaluation
practices. Prior to agreeing on a prompt, many teachers explore texts as interpretive, social
literacy but, in their uncertainty of how to mark such a text, they return to a question for
which there is a 'correct' and 'controlled' response. Once the prompt and evaluative criteria
are established, discordant orientations to evaluation, literacy, and language/content
integration complicate teachers' uncertainty. For example, teachers sometimes acknowledge
functional views of language/content integration, yet they are vague and uncertain about how
to mark in an integrated way. When teachers read texts prior to judgment, they comment
that the texts are difficult to interpret and then impose their own 'straightforward' readings
on the texts to reduce and simplify the difficulties.
These instances raise serious concerns in practices of evaluation, literacy and
language/content integration, especially when technical forms of evaluation are paradoxically
aligned with social and integrated texts. A turn to hermeneutics troubles a technical hold and
invites further inquiry into tensioned moments of integrative evaluation as difficult, living
practices. / Education, Faculty of / Language and Literacy Education (LLED), Department of / Graduate
|
502 |
MIMO channel modelling for indoor wireless communications
Maharaj, Bodhaswar Tikanath Jugpershad 29 July 2008 (has links)
This thesis investigates multiple-input-multiple-output (MIMO) channel modelling for a wideband indoor environment. Initially, the theoretical basis of geometric modelling for a typical indoor environment is examined, and a space-time model is formulated. The transmit and receive antenna correlation is then separated and expressed in terms of antenna element spacing, the scattering parameter, the mean angle of arrival and the number of antenna elements employed. The effect of these parameters on the capacity for this environment is then analyzed. Next, the wideband indoor channel operating at centre frequencies of 2.4 GHz and 5.2 GHz is investigated. The concept of MIMO frequency scaling is introduced and applied to the data obtained in the measurement campaign undertaken at the University of Pretoria. Issues of frequency scaling of capacity, spatial correlation and the joint RX/TX double-directional channel response for this indoor environment are investigated. The maximum entropy (ME) approach to MIMO channel modelling is investigated, and a new basis is developed for the determination of the covariance matrix when only the RX/TX covariance is known. Finally, results comparing this model with the established Kronecker model and its application to the joint RX/TX spatial power spectra, using a beamformer, are evaluated. Conclusions are then drawn and future research opportunities are highlighted. / Thesis (PhD)--University of Pretoria, 2008. / Electrical, Electronic and Computer Engineering / unrestricted
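The capacity quantity analyzed in this abstract can be illustrated with a small generic sketch, not the thesis's geometric model: Shannon capacity of a 2x2 MIMO link, showing that a fully correlated (rank-one) channel carries fewer bits than an uncorrelated one. The channel matrices and SNR below are hypothetical.

```python
import math

def capacity_bits(H, snr):
    """Shannon capacity C = log2 det(I + (snr/nt) H H^H), in bits/s/Hz,
    written out explicitly for a 2x2 complex channel matrix H (nt = 2)."""
    g = snr / 2.0  # equal power split over nt = 2 transmit antennas
    # Entries of M = I + g * H H^H; M is Hermitian, so det(M) is real:
    a = 1.0 + g * (abs(H[0][0])**2 + abs(H[0][1])**2)
    d = 1.0 + g * (abs(H[1][0])**2 + abs(H[1][1])**2)
    b = g * (H[0][0] * H[1][0].conjugate() + H[0][1] * H[1][1].conjugate())
    return math.log2(a * d - abs(b)**2)  # det of [[a, b], [b*, d]]

# Hypothetical channels: identity (uncorrelated) vs rank-one (fully correlated)
H_iid = [[1 + 0j, 0j], [0j, 1 + 0j]]
H_corr = [[1 + 0j, 1 + 0j], [1 + 0j, 1 + 0j]]
```

Correlation between antenna elements collapses the channel toward rank one, which is exactly the capacity penalty the separated TX/RX correlation expressions let one study parameter by parameter.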
|
503 |
Correlação entre a produção gasosa de água, hidroxila, monóxido de carbono e a magnitude heliocêntrica do Cometa C/1995 O1 (Hale-Bopp) / Correlation between the gas production of water, hydroxyl, carbon monoxide and the heliocentric magnitude of the Comet C/1995 O1 (Hale-Bopp)
Guilherme Gastaldello Pinheiro Serrano 16 May 2011 (has links)
The purpose of the present work is to study the correlation between the gas production rates and heliocentric magnitudes of comet C/1995 O1 (Hale-Bopp), in the pre-perihelion phase as well as in the post-perihelion phase. The evolution of the magnitudes and of the gas production rates of H2O (water), OH (hydroxyl radical) and CO (carbon monoxide), as the comet approached and then receded from the Sun, is analyzed. For this analysis, we used 11,734 visual magnitude estimates, extracted from the ICQ (International Comet Quarterly), and 88 observations of carbon monoxide (Biver, private communication (2007); DiSanti et al. (2001); Jewitt et al. (1996)), covering the range of heliocentric distances from rh = 7.464 AU (in the pre-perihelion phase) to rh = 14.070 AU (in the post-perihelion phase). It is shown that the activity of Hale-Bopp (average surface temperature ~ 110 K) beyond 6.3 AU from the Sun is controlled by CO emission (sublimation temperature ~ 24 K) rather than by H2O emission (sublimation temperature ~ 152 K). This result is consistent with the millimeter-wave observations of Biver et al. (1996) and Jewitt et al. (1996), made at 6.5 AU.
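A heliocentric dependence of a gas production rate is conventionally summarized by fitting a power law Q ∝ rh^b in log-log space. The generic least-squares sketch below illustrates that step; the distances and production rates are hypothetical numbers chosen to follow an exact power law, not the thesis's measurements.

```python
import math

def power_law_fit(r_h, Q):
    """Least-squares fit of log10 Q = a + b * log10 r_h, so that
    Q ~ 10**a * r_h**b characterizes the heliocentric dependence."""
    xs = [math.log10(r) for r in r_h]
    ys = [math.log10(q) for q in Q]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx)**2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical data following Q = 1e29 * r_h**-2 exactly
r = [2.0, 4.0, 8.0]
Q = [1e29 * x**-2 for x in r]
a, b = power_law_fit(r, Q)
```

Comparing the fitted exponent b for different species (CO vs H2O) over the same distance range is one simple way to see which volatile controls the activity.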
|
504 |
Design and synthesis of new scaffolds as antiproliferative agents and potential hsp90 inhibitors
Adegoke, Yusuf Adeyemi January 2020 (has links)
Doctor Pharmaceuticae - DPharm / Natural products have been an important source of drugs and novel lead compounds in drug
discovery. Their unique scaffolds have led to the synthesis of derivatives that continue to give rise
to medicinally relevant agents. Thus, natural product-inspired drugs represent a significant
proportion of drugs on the market, with several more in development. Cancer is among the
leading public health problems and a prominent cause of death globally. Chemotherapy has been
important in the management of this disease, even though side effects that arise from a lack of
selectivity are still an issue.
|
505 |
Validation of theoretical approach to measure biodiversity using plant species data
Neloy, Md Naim Ud Dwla January 2020 (has links)
Measuring biodiversity is important for serving our ecology well and keeping the environment sound. Biodiversity is the variety of life at different levels: an ecosystem, the life forms at a site, a landscape. It is measured as a combination of species richness and evenness, and separate formulas, indices and equations are widely used at each level. The Swedish Environmental Protection Agency aims to establish an index that combines landscape functionality and landscape heterogeneity. For the landscape functionality assessment, the BBCI (Biotope Biodiversity Capacity Index) is to be used; a high BBCI indicates high biodiversity for a biotope. However, how well empirically estimated species richness matches the BBCI has not been evaluated. The aim of this paper is to examine the relationship between empirically estimated biodiversity and the BBCI. The relationship between the Shannon diversity index and the BBCI was also examined. Empirical data were collected from 15 selected landscapes using Artportalen.se and sorted for further calculation. The results showed a strong positive relationship between empirically estimated biodiversity and the BBCI. The Shannon diversity index and the BBCI also demonstrated a positive correlation. The BBCI could explain 60%-69% of the species richness data and 17%-22% of the Shannon diversity index, supporting the theoretical approach to measuring biodiversity.
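The Shannon diversity index used in the comparison is a short computation over species proportions. A minimal generic sketch (the textbook formula with hypothetical abundance counts, not the Artportalen.se dataset):

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over species
    abundance counts; higher for richer, more even communities."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical communities: perfectly even vs strongly dominated
even = [10, 10, 10, 10]      # H' = ln(4), the maximum for 4 species
uneven = [37, 1, 1, 1]       # same richness, much lower evenness
```

The index separates richness from evenness: both communities have four species, but the dominated one scores lower, which is why richness alone and the Shannon index can correlate differently with an index like the BBCI.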
|
506 |
Statistics for motion of microparticles in a plasma
Mukhopadhyay, Amit Kumar 01 July 2014 (has links)
I report experimental and numerical studies of microparticle motion in a dusty plasma. These microparticles are negatively charged and are levitated in a plasma consisting of electrons, ions and neutral gas atoms. The microparticles repel each other, and are confined by the electric fields in the plasma. The neutral gas damps the microparticle motion, and also exerts random forces on them.
I investigate and characterize microparticle motion. In order to do this, I study velocity distributions of microparticles and correlations of their motion. To perform such a study, I develop new experimental and analysis techniques. My thesis consists of four separate projects.
In the first project, the battle between deterministic and random motion of microparticles is investigated. Two-particle velocity distributions and correlations have previously been studied only in theory. I performed an experiment with a very simple one-dimensional (1D) system of two microparticles in a plasma. My study of velocity correlations involves just two microparticles, which is the simplest system that allows interactions. A study of such a simple system provides insight into the motions of the microparticles. It allowed for the experimental measurement of two-particle distributions and correlations. For such a system, it is shown that the motion of the microparticles is dominated by deterministic or oscillatory effects.
In the second project, two experiments with just two microparticles are performed to isolate the effects of ion wakes. The two experiments differ in the alignment of the two microparticles: they are aligned either perpendicular or parallel to the ion flow. To have different alignments, the sheath is shaped differently in the two experiments. I demonstrate that microparticle motion is more correlated when they are aligned along the ion flow, rather than perpendicular to the ion flow.
In the third project, I develop a model with some key assumptions to compare with the experiments in the first two projects. My model includes all significant forces: gravity, electrical forces due to the curved sheath and interparticle interaction, and gas forces. The model does not agree with either experiment.
In the last project, I study non-Gaussian statistics by analyzing data for microparticle motion from an experiment performed under microgravity conditions. Microparticle motion is studied in a very thin region of microparticles in a three-dimensional dust cloud. The microparticle velocity distributions exhibit non-Gaussian characteristics.
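The two diagnostics running through these projects, velocity correlation between a pair of particles and a non-Gaussianity check on a velocity distribution, reduce to a Pearson coefficient and an excess kurtosis. A generic sketch with hypothetical velocity series, not the experimental data:

```python
import math

def pearson(x, y):
    """Pearson correlation of two velocity time series; -1 for
    perfectly anti-correlated (anti-phase) motion, +1 for in-phase."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((a - mx)**2 for a in x))
    sy = math.sqrt(sum((b - my)**2 for b in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def excess_kurtosis(v):
    """Fourth-moment test: 0 for a Gaussian distribution; a nonzero
    value flags non-Gaussian velocity statistics."""
    n = len(v)
    m = sum(v) / n
    s2 = sum((a - m)**2 for a in v) / n
    return (sum((a - m)**4 for a in v) / n) / s2**2 - 3.0

# Hypothetical anti-phase pair: the second particle mirrors the first
v1 = [0.0, 1.0, 0.0, -1.0]
v2 = [-a for a in v1]
```

A strongly negative (or positive) Pearson coefficient signals interaction-dominated, oscillatory motion, while pure Brownian motion would drive it toward zero.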
|
507 |
High-dimensional statistical data integration
January 2019 (has links)
archives@tulane.edu / Modern biomedical studies often collect multiple types of high-dimensional data on a common set of objects. A representative model for the integrative analysis of multiple data types is to decompose each data matrix into a low-rank common-source matrix generated by latent factors shared across all data types, a low-rank distinctive-source matrix corresponding to each data type, and an additive noise matrix. We propose a novel decomposition method, called the decomposition-based generalized canonical correlation analysis, which appropriately defines those matrices by imposing a desirable orthogonality constraint on distinctive latent factors that aims to sufficiently capture the common latent factors. To further delineate the common and distinctive patterns between two data types, we propose another new decomposition method, called the common and distinctive pattern analysis. This method takes into account the common and distinctive information between the coefficient matrices of the common latent factors. We develop consistent estimation approaches for both proposed decompositions under high-dimensional settings, and demonstrate their finite-sample performance via extensive simulations. We illustrate the superiority of the proposed methods over the state of the art with real-world data examples obtained from The Cancer Genome Atlas and the Human Connectome Project. / 1 / Zhe Qu
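The orthogonality constraint on distinctive latent factors amounts to projecting the shared factor out of each candidate distinctive factor. The Gram-Schmidt-style sketch below illustrates only that single step on hypothetical score vectors; it is not the proposed estimator.

```python
def dot(u, v):
    """Inner product of two score vectors over the same samples."""
    return sum(a * b for a, b in zip(u, v))

def project_out(g, f):
    """Remove from g its component along the shared factor f, so the
    returned distinctive factor is orthogonal to f (dot product zero)."""
    c = dot(g, f) / dot(f, f)
    return [gi - c * fi for gi, fi in zip(g, f)]

# Hypothetical factors scored over 4 samples
f = [1.0, -1.0, 1.0, -1.0]   # shared (common) latent factor
g = [1.0, 2.0, 3.0, 4.0]     # raw data-type-specific factor
d = project_out(g, f)        # distinctive factor, orthogonal to f
```

Enforcing this orthogonality is what keeps variance explained by the common factors from leaking into the distinctive-source matrices.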
|
508 |
An Analysis of Using Error Metrics to Determine the Accuracy of Modeled Historical Streamflow on a Global Scale
Jackson, Elise Katherine 01 April 2018 (has links)
Streamflow data is used throughout the world in applications such as flooding, agriculture, and urban planning. Understanding daily and seasonal patterns in streamflow is important for decision makers, so that they can accurately predict and react to seasonal changes in streamflow for the region. This understanding of daily and seasonal patterns has historically been achieved through interpretation of observed historical data at stream reaches throughout the individual regions. Developing countries have limited and sporadic observed stream and rain gage data, making it difficult for stakeholders to manage their water resources to their fullest potential. In areas where observed historical data is not readily available, the European Reanalysis Interim (ERA-Interim) data provided by the European Centre for Medium-Range Weather Forecasts (ECMWF) can be used as a surrogate. The ERA-Interim data can be compared to historic observed flow to determine the accuracy of the ERA-Interim data using statistical measures such as the correlation coefficient, the mean difference, the root mean square error, R2 coefficients and spectral angle metrics. These different statistical measures determine different aspects of the predicted data's accuracy. These metrics measure correlation, errors in magnitude, errors in timing, and errors in shape. This thesis presents a suite of tests that can be used to determine the accuracy and correlation of the ERA-Interim data compared to the observed data, the accuracy of the ERA-Interim data in capturing the overall events, and the accuracy of the data in capturing the magnitude of events. From these tests, and the cases presented in this thesis, we can conclude that the ERA-Interim is a sufficient model for simulating historic data on a global scale. It is able to capture the seasonality of the historical data, the magnitude of the events, and the overall timing of the events sufficiently to be used as a surrogate dataset.
The suite of tests can also be applied to other applications, to make comparing two datasets of flow data a quicker and easier process.
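The metrics named in the abstract are standard and can be sketched generically. The function below computes correlation, mean difference (bias), RMSE, and the spectral angle between two flow series using textbook formulas; the series in the example are hypothetical, not ERA-Interim or gage data.

```python
import math

def error_metrics(obs, sim):
    """Correlation, bias, RMSE, and spectral angle (radians) between an
    observed and a simulated flow series of equal length."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    so = math.sqrt(sum((o - mo)**2 for o in obs))
    ss = math.sqrt(sum((s - ms)**2 for s in sim))
    r = sum((o - mo) * (s - ms) for o, s in zip(obs, sim)) / (so * ss)
    bias = sum(s - o for o, s in zip(obs, sim)) / n
    rmse = math.sqrt(sum((s - o)**2 for o, s in zip(obs, sim)) / n)
    # Spectral angle: angle between the two series viewed as vectors
    cosang = sum(o * s for o, s in zip(obs, sim)) / (
        math.sqrt(sum(o * o for o in obs)) * math.sqrt(sum(s * s for s in sim)))
    sa = math.acos(max(-1.0, min(1.0, cosang)))  # clamp for float safety
    return {"r": r, "bias": bias, "rmse": rmse, "spectral_angle": sa}
```

Each metric isolates a different failure mode: r catches timing/shape agreement, bias and RMSE catch magnitude errors, and the spectral angle is insensitive to a pure scaling of the series.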
|
509 |
Development of Measurement Methods for Application to a Wind Tunnel Test of an Advanced Transport Model
Ehrmann, Robert S 01 August 2010 (has links)
California Polytechnic State University, San Luis Obispo is currently working towards developing a Computational Fluid Dynamics (CFD) database for future code validation efforts. Cal Poly will complete a wind tunnel test on the Advanced Model for Extreme Lift and Improved Aeroacoustics (AMELIA) in the National Full-Scale Aerodynamics Complex (NFAC) 40 foot by 80 foot wind tunnel at NASA Ames Research Center in the summer of 2011. The development of two measurement techniques is discussed in this work, both with the objective of making measurements on AMELIA for CFD validation.
First, the work on the application of the Fringe-Imaging Skin Friction (FISF) technique to AMELIA is discussed. The FISF technique measures the skin friction magnitude and direction by applying oil droplets on a surface, exposing them to flow, measuring their thickness, and correlating their thickness to the local skin friction. The technique has the unique ability to obtain global skin friction measurements. A two foot, nickel plated, blended wing section test article has been manufactured specifically for FISF. The model is illuminated with mercury vapor lamps and imaged with a Canon 50D with a 546 nm bandpass filter. Various tests are applied to the wing in order to further characterize uncertainties associated with the FISF technique. Human repeatability has uncertainties of ±2.3% of fringe spacing and ±2.0° in skin friction vector direction, while image post processing yields ±25% variation in skin friction coefficient. A method for measuring photogrammetry uncertainty is developed. The effect of filter variation and test repeatability was found to be negligible. A validation against a Preston tube was found to have 1.8% accuracy.
Second, the validation of a micro flow measurement device is investigated. Anemometers have always had limited capability in making near-wall measurements, driving the design of new devices capable of measurements with increased wall proximity. Utilizing a thermocouple boundary layer rake, wall measurements within 0.0025 inches of the surface have been made. A Cross Correlation Rake (CCR) has the advantage of not requiring calibration while obtaining the same proximity and resolution as the thermocouple boundary layer rake. The flow device utilizes time-of-flight measurements computed via cross correlation to calculate wall velocity profiles. The CCR was designed to be applied to AMELIA to measure flow velocities above a flap in a transonic flow regime. The validation of the CCR was unsuccessful. Due to the fragile construction of the CCR, only one data point at 0.10589 inches from the surface was available for validation. The subsonic wind tunnel’s variable frequency drive generated noise which could not be filtered or shielded, requiring the use of a flow bench for validation testing. Since velocity measurements could not be made in the flow bench, a comparison of a fast and slow velocity was made. The CCR was not able to detect the difference between the two flow velocities. Currently, the CCR cannot be applied on AMELIA due to the unsuccessful validation of the device.
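The time-of-flight idea behind the CCR, estimating velocity from the cross-correlation lag between two streamwise sensors, can be sketched generically. The signals, sensor spacing, and sample period below are hypothetical, not the device's actual processing chain.

```python
def best_lag(a, b, max_lag):
    """Lag (in samples) at which series b best matches a delayed copy
    of series a, found by maximizing the raw cross-correlation."""
    def xcorr(lag):
        return sum(a[i] * b[i + lag] for i in range(len(a) - lag))
    return max(range(max_lag + 1), key=xcorr)

def velocity(a, b, spacing, dt, max_lag):
    """Convective velocity = sensor spacing / time-of-flight, where the
    time-of-flight is the best cross-correlation lag times dt."""
    lag = best_lag(a, b, max_lag)
    return spacing / (lag * dt) if lag else float("inf")

# Hypothetical signals: a disturbance passing sensor 2 three samples late
a = [0.0] * 10; a[2] = 1.0
b = [0.0] * 10; b[5] = 1.0
```

No calibration is needed because the method only requires the lag, the known sensor spacing, and the sample clock, which is the advantage the abstract attributes to the CCR over a thermocouple rake.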
|
510 |
VALUE-AT-RISK ESTIMATION USING GARCH MODELS FOR THE CHINESE MAINLAND STOCK MARKET
Zhou, Dongya January 2020 (has links)
With the acceleration of economic globalization, the immature Chinese mainland stock market has become gradually associated with the stock markets of other countries. This paper predicts the return rate of the Chinese mainland stock market using several models from the GARCH family, tests their predictive ability by calculating Value-at-Risk, and captures the dynamic correlation between five other countries or regions and mainland China with a DCC-GARCH model. The results indicate that the E-ARMA-GARCH model fits best, due to the significant heteroscedasticity and leverage effect of the Chinese mainland stock market, which has the strongest positive correlation with Hong Kong and the weakest correlation with the United States.
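A one-step-ahead Value-at-Risk from a plain GARCH(1,1) variance recursion can be sketched in a few lines. This is a generic illustration under a normal-innovation assumption, not the paper's E-ARMA-GARCH or DCC specification; the parameters are assumed to have been estimated already, and the numbers in the test are hypothetical.

```python
import math

def garch_var(returns, omega, alpha, beta, z=1.645):
    """One-step-ahead 95% VaR from a GARCH(1,1) recursion
    sigma2_t = omega + alpha * r_{t-1}**2 + beta * sigma2_{t-1},
    with z = 1.645 the 95% quantile of the standard normal."""
    s2 = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    for r in returns:
        s2 = omega + alpha * r * r + beta * s2
    return z * math.sqrt(s2)
```

Backtesting then amounts to counting how often realized losses exceed this VaR; a well-specified model should be breached about 5% of the time.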
|