111
Adequacy and Equity: How the Texas Supreme Court's Perceptions Have Changed Over the Past 50 Years
Ford, Daniel William 05 1900 (has links)
The purpose of this study is to identify state court cases involving public school finance, specifically those related to adequacy and equity in funding. The results address how the constitutionality of school finance systems has been challenged in state courts across the United States, including Texas, over the last 50 years. The study further shows how decisions in previous cases influenced the Supreme Court of Texas's decision in the Texas Taxpayer & Student Fairness litigation.
112
Organizational Complexity, Plan Adequacy, and Nursing Home Resiliency: A Contingency Perspective
Boyce, Cherie 01 January 2015 (has links)
Some social and organizational behavior scientists measure resiliency through anecdotal qualitative research, such as personality analyses and stories of life experience, and empirical evidence remains limited for identifying measurable indicators of resiliency. A testable contingency model was therefore needed to clarify the resiliency factors pertinent to organizational performance. Two essential resiliency factors were (1) a written plan and (2) affiliation with a disaster network. This contingency study demonstrated a quantifiable, correlational effect between organizational complexity, disaster plan adequacy, and organizational resiliency. The unit of analysis, the skilled nursing facility, proved vulnerable, justifying the need for a written emergency management plan and affiliation with a disaster network. The main purpose of this research was to verify the significance of emergency management plans within a contingency framework drawing on complexity theory, resource dependency, systems theory, and network theory. Distinct sample moments were used to quantify the causal relationships between organizational complexity (A), plan adequacy (B), and resiliency (C). Primary and secondary research data were collected from the public health and emergency management sectors within the State of Florida.
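As a purely illustrative aside, the A (complexity) to B (plan adequacy) to C (resiliency) contingency logic described above can be sketched with two ordinary least squares regressions on simulated facility-level data. All variable names, sample sizes, and effect sizes below are hypothetical and are not taken from the study.

```python
# A purely illustrative sketch of the A -> B -> C contingency logic described above:
# plan adequacy (B) regressed on organizational complexity (A), then resiliency (C)
# regressed on both, using simulated facility-level data. All names and effect sizes
# are hypothetical and are not taken from the study.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 120                                             # hypothetical number of facilities
complexity = rng.normal(size=n)                     # A: organizational complexity score
plan_adequacy = 0.5 * complexity + rng.normal(scale=0.8, size=n)        # B
resiliency = (0.3 * complexity + 0.6 * plan_adequacy
              + rng.normal(scale=0.7, size=n))                          # C

model_b = sm.OLS(plan_adequacy, sm.add_constant(complexity)).fit()      # B ~ A
X = sm.add_constant(np.column_stack([complexity, plan_adequacy]))
model_c = sm.OLS(resiliency, X).fit()                                   # C ~ A + B

print(model_b.params)   # effect of complexity on plan adequacy
print(model_c.params)   # effects of complexity and plan adequacy on resiliency
```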
113
Evaluation of Nutritional Adequacy and Symptom Improvement During Implementation of the Low-FODMAP Diet in Individuals with Irritable Bowel Syndrome
Richards, Julie Ann 14 August 2018 (has links)
No description available.
114
THE TECHNICAL ADEQUACY OF STANDARDS-DERIVED CURRICULUM-BASED MEASURES FOR READING COMPREHENSION AND MATH COMPUTATION IN MIDDLE SCHOOL
URSHEL, CARRIE L. 03 April 2006 (has links)
No description available.
115
The Development and Field Testing of an Instrument for Measuring Citizens' Attitudes toward Public School Funding in Terms of Equity, Adequacy, and Accountability
Park, YoongSoo 16 April 2010 (has links)
No description available.
116
On modeling the volatility in speculative prices
Hou, Zhijie 12 June 2014 (has links)
Following the Probabilistic Reduction (PR) approach, this paper proposes the Student's t Autoregressive (St-AR) model, the Student's t Vector Autoregressive (St-VAR) model, and their heterogeneous versions as alternatives to the various ARCH-type models for capturing univariate and multivariate volatility. The St-AR and St-VAR models differ from those volatility models in that they give rise to internally consistent statistical models that do not rely on ad hoc specifications and parameter restrictions, but instead model the conditional mean and conditional variance jointly.
The univariate modeling is illustrated using the Real Effective Exchange Rate (REER) indices of three mainstream Asian currencies (the RMB, the Hong Kong dollar, and the Taiwan dollar), while the multivariate volatility modeling is applied to investigate the relationship between the REER indices and stock price indices in mainland China, as well as the relationship between stock prices in mainland China and Hong Kong. Following the PR methodology, the information gained from Mis-Specification (M-S) testing leads to respecification strategies that move from the original Normal-(V)AR models to the St-(V)AR models. The results from formal M-S tests and forecasting performance indicate that the St-(V)AR models provide a more appropriate way to model volatility for certain types of speculative price data. / Ph. D.
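As an illustrative aside, the idea of jointly modeling the conditional mean and scale with Student's t errors can be sketched as a simple AR(1) model estimated by maximum likelihood. This is only a minimal sketch under assumed parameter values; it is not the dissertation's St-AR specification, and the simulated series stands in for actual REER or price data.

```python
# A minimal sketch (not the dissertation's St-AR specification) of jointly estimating
# the conditional mean and scale of an AR(1) model with Student's t errors by maximum
# likelihood, rather than imposing ad hoc ARCH-type parameter restrictions.
import numpy as np
from scipy import optimize, stats

def neg_loglik(params, y):
    a0, a1, log_scale, log_df = params
    scale, df = np.exp(log_scale), np.exp(log_df) + 2.0   # keep scale > 0 and df > 2
    resid = y[1:] - (a0 + a1 * y[:-1])                    # AR(1) conditional mean
    return -np.sum(stats.t.logpdf(resid, df=df, scale=scale))

# Simulated stand-in for a speculative price (or REER) return series
rng = np.random.default_rng(1)
eps = stats.t.rvs(df=5, scale=0.01, size=500, random_state=rng)
y = np.zeros(500)
for i in range(1, 500):
    y[i] = 0.3 * y[i - 1] + eps[i]                        # AR(1) with t innovations

res = optimize.minimize(neg_loglik, x0=[0.0, 0.1, np.log(0.01), np.log(3.0)],
                        args=(y,), method="Nelder-Mead")
a0_hat, a1_hat, log_scale_hat, log_df_hat = res.x
print("AR coefficient:", a1_hat, "scale:", np.exp(log_scale_hat),
      "degrees of freedom:", np.exp(log_df_hat) + 2.0)
```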
117
Generalized Principal Component Analysis
Solat, Karo 05 June 2018 (has links)
The primary objective of this dissertation is to extend classical Principal Component Analysis (PCA), which aims to reduce the dimensionality of a large number of Normal interrelated variables, in two directions. The first is to go beyond the static (contemporaneous or synchronous) covariance matrix among these interrelated variables to include certain forms of temporal (over-time) dependence. The second extends the PCA model beyond the Normal multivariate distribution to the Elliptically Symmetric family of distributions, which includes the Normal, the Student's t, the Laplace, and the Pearson type II distributions as special cases. The result of these extensions is called Generalized Principal Component Analysis (GPCA).
The GPCA is illustrated using both Monte Carlo simulations and an empirical study, in an attempt to demonstrate the enhanced reliability of these more general factor models in the context of out-of-sample forecasting. The empirical study examines the predictive capacity of the GPCA method in the context of exchange rate forecasting, showing how the GPCA method dominates forecasts based on existing standard methods, including random walk models, with or without macroeconomic fundamentals. / Ph. D. / Factor models are employed to capture the hidden factors behind the movement among a set of variables. They use the variation and co-variation between these variables to construct a smaller number of latent variables that can explain the variation in the data at hand. Principal component analysis (PCA) is the most popular of these factor models.
I have developed new factor models that reduce the dimensionality of a large data set by extracting a small number of independent latent factors which represent a large proportion of the variability in that data set. These factor models, called Generalized Principal Component Analysis (GPCA), are extensions of classical principal component analysis (PCA) that can account for both contemporaneous and temporal dependence based on non-Gaussian multivariate distributions.
Using Monte Carlo simulations along with an empirical study, I demonstrate the enhanced reliability of my methodology in the context of out-of-sample forecasting. In the empirical study, I examine the predictive power of the GPCA method in the context of exchange rate forecasting. I find that the GPCA method dominates forecasts based on existing standard methods as well as random walk models, with or without macroeconomic fundamentals.
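For readers unfamiliar with the baseline being extended, the sketch below shows classical PCA via an eigendecomposition of the contemporaneous sample covariance matrix on simulated data. It illustrates only the starting point; the GPCA's temporal-dependence and elliptical-distribution extensions are not implemented here, and all data are simulated.

```python
# A minimal sketch of the classical PCA baseline that the GPCA extends: the leading
# principal components of simulated interrelated series are extracted from the
# contemporaneous sample covariance matrix. The temporal-dependence and elliptical-
# distribution extensions of the GPCA are not implemented here.
import numpy as np

rng = np.random.default_rng(2)
factors = rng.normal(size=(300, 2))                       # two hypothetical latent factors
loadings = rng.normal(size=(2, 8))
X = factors @ loadings + 0.3 * rng.normal(size=(300, 8))  # eight observed series

Xc = X - X.mean(axis=0)                                   # center the data
cov = np.cov(Xc, rowvar=False)                            # contemporaneous covariance
eigvals, eigvecs = np.linalg.eigh(cov)                    # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order[:2]]                        # leading two directions
scores = Xc @ components                                  # principal component scores

explained = eigvals[order[:2]] / eigvals.sum()
print("share of variance explained by the first two components:", explained)
```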
118
The Development of Second Language Writing Across the Lexical and Communicative Dimensions of Performance
Denison, George Clinton, 0009-0003-0615-3489 08 1900 (has links)
Enabling learners to successfully use their second language (L2) in meaningful ways is a critical goal of instruction. Ultimately, most learners want to meet the L2 demands of the contexts in which they will use the language. To accomplish this, learners must develop linguistic knowledge and apply it in a manner that is contextually appropriate considering the requirements of the task at hand. In other words, learners must develop their L2 across both linguistic and communicative aspects to use it successfully. However, there is a paucity of L2 research in which linguistic and communicative performance have been simultaneously investigated.
In this study, I investigated the development of English as a Foreign Language (EFL) learners' L2 written production across the lexical and communicative dimensions of performance. The study involved 290 Japanese participants recruited from 20 intact EFL classes at four tertiary educational institutions in Japan, representing a wide range of L2 English proficiency levels and instructional contexts. The study used a non-intervention, repeated-measures design, allowing for the general development of participants' L2 English writing to be examined. Participants' L2 English written responses were collected using four argumentative writing tasks, which were administered at the beginning and end of the first and second semesters in a counterbalanced manner. Although a total of 952 responses were collected, responses shorter than 50 words were removed, leaving a total of 775 responses written by 250 participants. The 775 responses included in the primary analyses constituted a corpus of 89,122 words.
Twenty-four single-word, multi-word, and lexical variation measures were calculated for the responses and subjected to an exploratory factor analysis. Seven latent lexical factors were identified in the data: High-Frequency Trigrams, Lexical Clarity, High-Frequency Bigrams, Lexical Variation, Lexical Breadth, Low-Frequency N-Grams, and Directional Association Strength of N-Grams. In addition, raters scored the responses for functional adequacy (i.e., Content, Comprehensibility, Organization, and Task Completion) and Lexical Appropriateness. The scores were analyzed using many-facet Rasch measurement, which converted the ordinal scores into equal-interval measures that had been adjusted for the influences of task and rater severity. The lexical factor scores and communicative Rasch measures were examined using linear mixed modeling, dominance analysis, and latent growth modeling to investigate (a) if and how lexical development had occurred, (b) if and how communicative development had occurred, (c) the relationships between the lexical and communicative components, and (d) the relationship between lexical and communicative growth.
For the lexical factors, the results indicated that Directional Association Strength of N-Grams scores increased in a linear manner. Directional Association Strength of N-Grams comprised ΔP scores, which indicate the degree to which the first word(s) are predictive of the following word(s) in two- and three-word combinations. Thus, the results indicated that participants’ use of multi-word expressions improved. On the other hand, Lexical Clarity, which comprised imageability, concreteness, meaningfulness, and hypernymy scores, showed quadratic change, with scores improving and then regressing. Thus, the findings provide evidence of differing developmental trends for lexical aspects of L2 writing.
For the communicative measures, the results indicated that Comprehensibility, Organization, and Lexical Appropriateness changed substantially over time. Improvement of Task Completion was dependent on the university context, and little change was observed for Content. Lexical Appropriateness showed the most improvement, with evidence of both linear and quadratic change. Bias interaction analyses also confirmed the presence of linear and quadratic trends for the communicative measures. Thus, the findings provide evidence of differing developmental trends for communicative aspects of L2 writing.
For the relationships between the lexical and communicative components, the results indicated that two lexical factors were of key importance: Lexical Variation and Directional Association Strength of N-Grams. Lexical Variation was found to predict Content, Organization, and Task Completion; and Directional Association Strength of N-Grams was found to predict Comprehensibility and Lexical Appropriateness. The findings suggest that L2 performance assessments should not conflate measurement of lexical variation and use of multi-word expressions because they diverge in terms of the communicative outcomes they predict.
The results also indicated a positive relationship between lexical and communicative development. A parallel process latent growth model was constructed that related lexical and communicative growth. The paths tested in the model suggest participants who had lower initial communicative scores were able to increase their lexical scores at faster rates, which in turn leveraged communicative growth. The findings highlight the potential for learners to improve their communicative ability through a targeted focus on multi-word expressions. / Applied Linguistics
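As an illustrative aside, one ingredient of the analysis described above, a repeated-measures linear mixed model of a lexical factor score across the four writing occasions, can be sketched on simulated data as follows. The variable names, effect sizes, and model form are assumptions for illustration and do not reproduce the study's actual models or data.

```python
# A minimal sketch (simulated data, not the study's) of a repeated-measures linear
# mixed model testing growth in a lexical factor score across four writing occasions,
# with random intercepts for participants.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_learners, n_waves = 100, 4
data = pd.DataFrame({
    "learner": np.repeat(np.arange(n_learners), n_waves),
    "time": np.tile(np.arange(n_waves), n_learners),
})
person_effect = rng.normal(scale=0.5, size=n_learners)    # random intercepts
data["lexical_score"] = (0.2 * data["time"]               # assumed linear growth
                         + person_effect[data["learner"].to_numpy()]
                         + rng.normal(scale=0.3, size=len(data)))

model = smf.mixedlm("lexical_score ~ time", data, groups=data["learner"]).fit()
print(model.summary())    # fixed effect of time estimates the average growth rate
```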
119
Intermediate addition multifocals provide safe stair ambulation with adequate 'short-term' reading
Elliott, David; Hotchkiss, John; Scally, Andy J.; Foster, Richard J.; Buckley, John 24 July 2015 (has links)
Yes / A recent randomised controlled trial indicated that providing long-term multifocal wearers with a pair of distance single-vision spectacles for use outside the home reduced falls risk in active older people. However, it also found that participants disliked continually switching between two pairs of glasses and that adherence to the intervention was poor. In this study we determined whether intermediate addition multifocals (which could be worn most of the time inside and outside the home, thus avoiding continual switching) could provide similar gait safety on stairs to distance single-vision spectacles whilst also providing adequate ‘short-term’ reading and near vision.
Methods: Fourteen healthy long-term multifocal wearers completed stair ascent and descent trials over a 3-step staircase wearing intermediate and full addition bifocals and progressive addition lenses (PALs) and single-vision distance spectacles. Gait safety/caution was assessed using foot clearance measurements (toe on ascent, heel on descent) over the step edges and ascent and descent duration. Binocular near visual acuity, critical print size and reading speed were measured using Bailey-Lovie near charts and MNRead charts at 40 cm.
Results: Gait safety/caution measures were worse with full addition bifocals and PALs compared to intermediate addition bifocals and PALs. The intermediate addition PALs provided similar gait ascent/descent measures to those with distance single-vision spectacles. The intermediate addition PALs also provided good reading ability: near word acuity and MNRead critical print size were better with the intermediate addition PALs than with the single-vision lenses (p < 0.0001), with a mean near visual acuity of 0.24 ± 0.13 logMAR (~N5.5), which is satisfactory for most near vision tasks performed for a short period of time.
Conclusions: The better ability to ‘spot read’ with the intermediate addition PALs compared to single-vision spectacles suggests that elderly individuals might better comply with the use of intermediate addition PALs outside the home. The lack of difference in gait parameters between the intermediate addition PALs and distance single-vision spectacles suggests they could usefully be used to help prevent falls in older, well-adapted full addition PAL wearers. A randomised controlled trial to investigate the usefulness of intermediate multifocals in preventing falls seems warranted.
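As a hedged aside on the acuity figures reported above, logMAR values can be converted to decimal acuity with decimal = 10^(-logMAR); the short calculation below is my arithmetic, not taken from the paper.

```python
# Back-of-the-envelope conversion (my arithmetic, not the paper's): decimal acuity
# equals 10 ** (-logMAR), so the reported mean near acuity of 0.24 logMAR corresponds
# to roughly 0.58 decimal acuity at the 40 cm test distance.
mean_logmar, sd_logmar = 0.24, 0.13
print("mean decimal acuity:", round(10 ** -mean_logmar, 2))
print("range within one SD:",
      round(10 ** -(mean_logmar + sd_logmar), 2), "to",
      round(10 ** -(mean_logmar - sd_logmar), 2))
```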
120
De la gestion du ratio de solvabilité bancaire : Étude empirique des ajustements prudentiels relatifs à la juste valeur / Capital Adequacy Ratio Management: An empirical study of prudential fair value adjustments
Kamara, Diéne Mohamed 07 December 2017 (has links)
Through earnings management practices applied to the banking industry, several studies have shown the existence of capital adequacy ratio management (CARM). However, they focus mainly on the manipulation of loan loss provisions (LLP) and generally argue that the capital adequacy ratio is managed in order to avoid the regulatory costs imposed when a bank's ratio falls below the minimum threshold. This thesis examines the management of the capital adequacy ratio through the prudential adjustments, the restatements a bank must apply to move from accounting equity to regulatory capital. The prudential adjustments consist of deductions and prudential filters intended to safeguard the quality of supervisory capital and to mitigate the volatility of equity induced by fair value accounting under IFRS. Adopting a diachronic and instrumental approach, the study is based on a sample of European banks and uses panel data regression methods, together with robustness tests such as the bootstrap and quantile regression. The main contribution of this thesis is to show that the transformation of accounting information into regulatory information passes through the prudential adjustments, which constitute a bridge over which opportunistic management of the capital adequacy ratio can be carried out through variables relating to the quality of capital and the operational performance of the bank. The study also shows that capital management is not exclusive to banks with a ratio close to the minimum. Finally, the results make it possible to stop treating the capital adequacy ratio as a black box and to examine it through its components.
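As an illustrative aside, the kind of panel regression with a quantile-regression robustness check described above can be sketched on simulated bank-year data. The variable names (capital quality, operating performance, prudential adjustments) and the fixed-effects specification below are assumptions for illustration, not the thesis's actual model, and the bootstrap step is omitted for brevity.

```python
# A minimal sketch (simulated bank-year data, hypothetical variable names) of a panel
# regression of prudential adjustments on capital-quality and operating-performance
# proxies with bank fixed effects, plus a median (quantile) regression robustness
# check. This is not the thesis's actual specification.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
banks, years = 40, 8
df = pd.DataFrame({
    "bank": np.repeat(np.arange(banks), years),
    "year": np.tile(np.arange(years), banks),
    "capital_quality": rng.normal(size=banks * years),
    "operating_performance": rng.normal(size=banks * years),
})
bank_effect = rng.normal(scale=0.5, size=banks)
df["prudential_adjustments"] = (0.4 * df["capital_quality"]
                                - 0.2 * df["operating_performance"]
                                + bank_effect[df["bank"].to_numpy()]
                                + rng.normal(scale=0.3, size=len(df)))

# Fixed-effects specification via bank dummies
fe = smf.ols("prudential_adjustments ~ capital_quality + operating_performance + C(bank)",
             data=df).fit()
# Median regression as a robustness check
qr = smf.quantreg("prudential_adjustments ~ capital_quality + operating_performance",
                  data=df).fit(q=0.5)
print(fe.params[["capital_quality", "operating_performance"]])
print(qr.params)
```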