171 |
Consistency techniques for test data generation
Tran Sy, Nguyen, 10 June 2005 (has links)
This thesis presents a new approach for automated test data generation for imperative programs containing integer, Boolean and/or float variables. A test program (with procedure calls) is represented by an Interprocedural Control Flow Graph (ICFG). The classical testing criteria (statement, branch, and path coverage), widely used in unit testing, are extended to the ICFG. Path coverage is the core of our approach. Given a specified path of the ICFG, a path constraint is derived and solved to obtain a test case. The constraint solving is carried out based on a consistency notion. For statement (and branch) coverage, paths reaching a specified node or branch are dynamically constructed. The search for suitable paths is guided by the interprocedural control dependences of the program. The search is also pruned by our consistency filter. Finally, test data are generated by the application of the proposed path coverage algorithm. A prototype system implements our approach for C programs. Experimental results, including complex numerical programs, demonstrate the feasibility of the method and the efficiency of the system, as well as its versatility and flexibility across different classes of problems (integer and/or float variables; arrays, procedures, path coverage, statement coverage).
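As a rough illustration of the constraint-solving step, the sketch below narrows an integer interval for a single input variable until it is bounds-consistent with a path constraint. The program fragment, the narrowing rules, and all names are illustrative assumptions, not the thesis's actual algorithm.

```python
# A minimal sketch of consistency-based test data generation for one path.
# The branch condition and the narrowing rules are illustrative assumptions.

def narrow(lo, hi, constraints):
    """Narrow an integer interval [lo, hi] for a variable x until it is
    bounds-consistent with all constraints of the form (op, c)."""
    for op, c in constraints:
        if op == ">":       # x > c
            lo = max(lo, c + 1)
        elif op == "<":     # x < c
            hi = min(hi, c - 1)
        elif op == ">=":    # x >= c
            lo = max(lo, c)
        elif op == "<=":    # x <= c
            hi = min(hi, c)
    return lo, hi

# Path constraint for taking the branch "if (x > 10 && x <= 20)" as true:
path = [(">", 10), ("<=", 20)]
lo, hi = narrow(-1000, 1000, path)
assert lo <= hi, "path is infeasible"
test_input = lo            # any value in [lo, hi] covers the path
print(test_input)          # -> 11
```

Real path constraints mix several variables and float domains, but the principle is the same: filter the domains until they are consistent, then pick a witness value as the test case.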
|
172 |
Essays on monetary policy and banking regulation
Li, Jingyuan, 15 November 2004 (has links)
A central bank is usually assigned two functions: the control of inflation and the maintenance of a safe banking sector. What are the precise conditions under which trigger strategies from the private sector can solve the time inconsistency problem and induce the central bank to choose zero inflation under a nonstationary natural rate? Can an optimal contract be used together with reputation forces to implement a desired socially optimal monetary policy rule? How can a truth-telling contract be designed to control the risk-taking behaviour of banks? My dissertation attempts to deal with these issues using three primary methodologies: monetary economics, game theory and optimal stochastic control theory.
|
174 |
Semantische Repräsentation, obligatorische Aktivierung und verbale Produktion arithmetischer Fakten / Semantic representation, obligatory activation, and verbal production of arithmetic facts
Domahs, Frank, January 2006 (has links)
The present work addresses the representation and processing of arithmetic facts. This domain of semantic knowledge is particularly well suited as an object of research because, among other reasons, not only its individual components but also the relations between those components can be defined exceptionally well. Cognitive models can therefore be developed with a degree of precision that is hardly attainable in other domains. Most current models agree in describing the representation of arithmetic facts as an associative, network-like structure in declarative memory. Despite this basic agreement, a number of questions remain open. The studies presented here tackle such open questions in three areas:
1) the neuroanatomical correlates, 2) neighbourhood consistency effects in verbal production, and 3) the automatic activation of arithmetic facts.
A combined fMRI and behavioural study examined, for example, the neurofunctional correlates of the acquisition of arithmetic facts in adults. The starting point for this study was the Triple-Code Model of Dehaene and Cohen, as it is the only model that also makes claims about the neuroanatomical correlates of numerical abilities. The Triple-Code Model assumes that the retrieval of arithmetic facts requires a "perisylvian" region of the left hemisphere, involving the basal ganglia and the angular gyrus (Dehaene & Cohen, 1995; Dehaene & Cohen, 1997; Dehaene, Piazza, Pinel, & Cohen, 2003). In the present study, healthy adults practised complex multiplication problems intensively for about a week, so that answering them became increasingly automatic. Solving these trained problems should thus, in contrast to comparable untrained problems, rely increasingly on fact retrieval rather than on the application of procedures and strategies. Untrained problems, by contrast, should place higher demands on executive functions, including working memory, than trained ones. After training, participants answered trained problems markedly faster and more accurately than untrained ones, as expected. They were also examined in the magnetic resonance scanner. This first confirmed that solving multiplication problems is generally supported by a predominantly left-hemispheric network of frontal and parietal areas. The most important finding, however, is a shift of brain activation from more frontal activation patterns towards more parietal activation and, within the parietal lobe, from the intraparietal sulcus to the angular gyrus for trained compared with untrained problems.
This again underlines the central role of working memory and planning for complex untrained calculation. In terms of the Triple-Code Model, the shift within the parietal lobe may indicate a change from quantity-based calculation (intraparietal sulcus) to automatised fact retrieval (left angular gyrus).
Are there neighbourhood consistency effects in the verbal production of arithmetic facts similar to those described in language processing? Such effects are to be expected under the current "triangle model" of the representation of multiplication facts by Verguts & Fias (2004). According to this model, correct answers should be easier to produce when they share digits with as many semantically close wrong answers as possible. Wrong answers may then also be produced with greater probability when they share a digit with the correct answer. According to the triangle model, even the classical problem-size effect in simple multiplication (Zbrodoff & Logan, 2004) would be attributable to the consistency relations of the correct answer with semantically neighbouring wrong answers. A re-analysis of error data from healthy participants (Campbell, 1997) and from a patient (Domahs, Bartha, & Delazer, 2003) indeed found evidence for decade consistency effects in solving simple multiplication problems. The participants and the patient had produced wrong answers sharing the decade digit with the correct result significantly more often than otherwise comparable errors. This supports the assumption that the decade and unit digits of two-digit numbers have separate representations, both in multiplication (Verguts & Fias, 2004) and in number processing in general (Nuerk, Weger, & Willmes, 2001; Nuerk & Willmes, 2005).
In addition, a regression analysis of the error counts provided the first empirical evidence for the hypothesis that the classical problem-size effect in multiplication fact retrieval can be traced back to decade consistency effects: although problem size entered the model as the first predictor, this variable was discarded again as soon as a measure of the neighbourhood consistency of the correct answer was added to the model.
Finally, a further study examined the automatic activation of multiplication facts in healthy participants using a number matching task (Galfano, Rusconi, & Umilta, 2003; Lefevre, Bisanz, & Mrkonjic, 1988; Thibodeau, Lefevre, & Bisanz, 1996). It addressed for the first time how the automatic activation of the actual multiplication results (Thibodeau et al., 1996) relates to the activation of neighbouring wrong answers (Galfano et al., 2003). Presentation at different SOAs was also used to clarify the time course of these activations. Overall, the results can be taken as evidence for the existence and the automatic, obligatory activation of a network of arithmetic facts in healthy, educated adults, in which the correct products are more strongly associated with the factors than neighbouring products (operand errors) are. Products of small problems produce stronger interference than products of large problems, and operand errors of large problems produce stronger interference than operand errors of small problems. Such an activation pattern fits well with the predictions of Siegler's distribution-of-associations model (Lemaire & Siegler, 1995; Siegler, 1988), in which small problems show a narrowly peaked distribution of associations around the correct result, whereas large problems show a broad one.
The present work should thus have shed some more light on hitherto largely neglected aspects of the representation and retrieval of arithmetic facts: the neural correlates of their acquisition, the consequences of their embedding in the base-10 place-value system, and the specific effects of their associative semantic representation on their automatic activation.
References
Campbell, J. I. (1997). On the relation between skilled performance of simple division and multiplication. Journal of Experimental Psychology: Learning, Memory, and Cognition, 23, 1140-1159.
Dehaene, S. & Cohen, L. (1995). Towards an anatomical and functional model of number processing. Mathematical Cognition, 1, 83-120.
Dehaene, S. & Cohen, L. (1997). Cerebral pathways for calculation: double dissociation between rote verbal and quantitative knowledge of arithmetic. Cortex, 33, 219-250.
Dehaene, S., Piazza, M., Pinel, P., & Cohen, L. (2003). Three parietal circuits for number processing. Cognitive Neuropsychology, 20, 487-506.
Domahs, F., Bartha, L., & Delazer, M. (2003). Rehabilitation of arithmetic abilities: Different intervention strategies for multiplication. Brain and Language, 87, 165-166.
Galfano, G., Rusconi, E., & Umilta, C. (2003). Automatic activation of multiplication facts: evidence from the nodes adjacent to the product. Quarterly Journal of Experimental Psychology A, 56, 31-61.
Lefevre, J. A., Bisanz, J., & Mrkonjic, L. (1988). Cognitive arithmetic: evidence for obligatory activation of arithmetic facts. Memory and Cognition, 16, 45-53.
Lemaire, P. & Siegler, R. S. (1995). Four aspects of strategic change: contributions to children's learning of multiplication. Journal of Experimental Psychology: General, 124, 83-97.
Nuerk, H. C., Weger, U., & Willmes, K. (2001). Decade breaks in the mental number line? Putting the tens and units back in different bins. Cognition, 82, B25-B33.
Nuerk, H. C. & Willmes, K. (2005). On the magnitude representations of two-digit numbers. Psychology Science, 47, 52-72.
Siegler, R. S. (1988). Strategy choice procedures and the development of multiplication skill. Journal of Experimental Psychology: General, 117, 258-275.
Thibodeau, M. H., Lefevre, J. A., & Bisanz, J. (1996). The extension of the interference effect to multiplication. Canadian Journal of Experimental Psychology, 50, 393-396.
Verguts, T. & Fias, W. (2004). Neighborhood Effects in Mental Arithmetic. Psychology Science.
Zbrodoff, N. J. & Logan, G. D. (2004). What everyone finds: The problem-size effect. In J. I. D. Campbell (Ed.), Handbook of Mathematical Cognition (pp. 331-345). New York, NY: Psychology Press.

The present thesis deals with the representation and processing of arithmetic facts. This domain of semantic knowledge has gained a substantial amount of interest, as its components as well as their interrelations are well specified. Thus, cognitive models can be developed with a degree of precision that cannot be reached in many other domains. Most recent models agree that arithmetic facts are represented in an associative, network-like structure in declarative memory. Despite this general agreement, many issues remain unresolved. The open questions tackled in the present work address three aspects of arithmetic facts: 1) their neuro-anatomical correlates, 2) neighbourhood consistency effects in their verbal production and 3) their automatic activation.
In a combined behavioural and fMRI study, the neurofunctional correlates of the acquisition of arithmetic facts in adults were examined. This research was based on the Triple-Code Model of Dehaene and Cohen, the only current model that makes explicit assumptions about the neuroanatomical correlates of numerical abilities. The Triple-Code Model assumes that a "perisylvian" region in the left hemisphere, including the basal ganglia and the angular gyrus, is involved in the retrieval of arithmetic facts (Dehaene & Cohen, 1995; Dehaene & Cohen, 1997; Dehaene, Piazza, Pinel, & Cohen, 2003). In the present study, healthy adults trained complex multiplication problems extensively for one week, so that these problems could be solved more and more automatically. It was reasoned that answering the trained problems should increasingly rely on the retrieval of facts from declarative memory, whereas answering untrained problems should rely on the application of strategies and procedures, which impose high demands on executive functions including working memory. After the training, participants, as expected, solved trained problems faster and more accurately than non-trained problems. Participants also underwent a functional magnetic resonance imaging examination. In general, this examination added to the evidence for a mainly left-hemispheric fronto-parietal network being involved in mental multiplication. Crucially, comparing trained with non-trained problems, a shift of activation from frontal to more parietal regions was observed, highlighting the central role of the central executive and working memory for complex calculation. Moreover, within the parietal lobe, a shift of activation from the intraparietal sulcus to the angular gyrus took place.
According to the Triple-Code Model, this shift may indicate a strategy change from quantity-based calculation, relying on the intraparietal sulcus, to fact retrieval, relying on the left angular gyrus.
Are there neighbourhood consistency effects in the verbal production of arithmetic facts similar to those described for language production? According to the "triangle model" of simple multiplication proposed by Verguts & Fias (2004), such effects can be expected. Under this model, correct answers can be given more easily if they share digits with many semantically close wrong answers. Moreover, it can be assumed that wrong answers, too, are more likely to be produced if they share a digit with the correct result. In addition, the triangle model states that the classical problem-size effect in simple multiplication (Zbrodoff & Logan, 2004) can be traced back to neighbourhood consistency between the correct result and semantically close wrong answers. In fact, a re-analysis of error data from a sample of healthy young adults (Campbell, 1997) and a patient with acalculia (Domahs, Bartha, & Delazer, 2003) provided evidence for the existence of decade consistency effects in the verbal production of multiplication results. Healthy participants and the patient produced significantly more wrong answers that shared the decade digit with the correct result than otherwise comparable wrong answers. This result supports the assumption of separate representations of decade and unit digits in two-digit numbers, in multiplication (Verguts & Fias, 2004) and in number processing in general (Nuerk, Weger, & Willmes, 2001; Nuerk & Willmes, 2005). Moreover, an additional regression analysis on the error rates provided the first empirical evidence for the hypothesis that the classical problem-size effect in the retrieval of multiplication facts may be an artefact of neighbourhood consistency: although problem size was the first variable to enter the model, it was excluded once a measure of neighbourhood consistency was included.
Finally, in a further study, the automatic activation of multiplication facts was examined in a number matching task (Galfano, Rusconi, & Umilta, 2003; Lefevre, Bisanz, & Mrkonjic, 1988; Thibodeau, Lefevre, & Bisanz, 1996). This experiment addressed the question of how the automatic activation of actual multiplication results (Thibodeau et al., 1996) relates to the activation of semantically close wrong answers (Galfano et al., 2003). Furthermore, different SOAs were used to reveal the temporal properties of these activations. In general, the results of this study provide evidence for an obligatory and automatic activation of a network of arithmetic facts in healthy educated adults, in which correct results are more strongly associated with the operands than semantically related wrong answers are. Crucially, products of small problems lead to stronger interference effects than products of larger problems, while operand errors of large problems lead to stronger interference effects than operand errors of small problems. Such a pattern of activation is in line with the predictions of Siegler's distribution-of-associations model (Lemaire & Siegler, 1995; Siegler, 1988), which assumes a more peaked distribution of associations between operands and potential results for small than for large multiplication problems.
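The distribution-of-associations account can be illustrated with a small numerical sketch. The strengths and the `associations` helper below are invented for illustration; they are not the study's model code.

```python
# Illustrative sketch (not the study's model) of Siegler's distribution-of-
# associations idea: small problems concentrate associative strength on the
# correct product, large problems spread it over operand-related neighbours.

def associations(peakedness):
    """Return normalised associative strengths for the correct product and
    two operand-error neighbours; higher peakedness = more peaked."""
    raw = {"correct": peakedness, "neighbour-": 1.0, "neighbour+": 1.0}
    total = sum(raw.values())
    return {k: v / total for k, v in raw.items()}

small = associations(peakedness=8.0)   # peaked distribution, e.g. 3 x 4
large = associations(peakedness=2.0)   # flat distribution, e.g. 8 x 7

# If interference in a number matching task tracks associative strength:
assert small["correct"] > large["correct"]        # small-problem products interfere more
assert large["neighbour+"] > small["neighbour+"]  # large-problem operand errors interfere more
```

Under these assumed strengths, the sketch reproduces exactly the interference pattern reported above: products of small problems and operand errors of large problems carry the most activation.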
In sum, the present thesis sheds some light on largely ignored aspects of arithmetic fact retrieval: the neural correlates of its acquisition, the consequences of its embedding in the base-10 place-value system, and the specific effects of its semantic representation on the automatic activation of correct multiplication facts and related results.
|
175 |
Situation cognition and coherence in personality : an individual-centered approach
Krahé, Barbara, January 1990 (has links)
This volume reexamines the long-standing controversy about consistency in personality from a social psychological perspective. Barbara Krahé reconsiders the concept of consistency in terms of the systematic coherence of situation cognition and behaviour across situations. In the first part of the volume she examines recent social psychological models of situation cognition for their ability to clarify the principles underlying the perception of situational similarities. She then advances an individual-centred methodology in which nomothetic hypotheses about cross-situational coherence are tested on the basis of idiographic measurement of situation cognition and behaviour.
In the second part of the volume, a series of empirical studies is reported which apply the individual-centred framework to the analysis of cross-situational coherence in the domain of anxiety-provoking situations. These studies are distinctive in that they extend over several months and use free-response data; they are based on idiographic sampling; and they employ explicit theoretical models to capture the central features of situation perception. The results demonstrate the benefits of integrating idiographic and nomothetic research strategies and exploiting the advantages of both perspectives.
|
176 |
Estimation in partly parametric additive Cox models
Läuter, Henning, January 2003 (has links)
The dependence between survival times and covariates is described, e.g., by proportional hazards models. We consider partly parametric Cox models and discuss the estimation of the parameters of interest. We present the maximum likelihood approach and extend the results of Huang (1999) from linear to nonlinear parameters. Then we investigate the least squares estimation and formulate conditions for the almost sure boundedness and consistency of these estimators.
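For readers unfamiliar with the Cox model, a minimal numerical sketch of the log partial likelihood (one covariate, no ties) is given below. It is a generic textbook construction for illustration, not the estimator studied in the paper.

```python
import math

# Sketch of the Cox proportional hazards log partial likelihood for a single
# covariate with no tied event times (illustrative, not the paper's method).
def cox_log_partial_likelihood(beta, times, events, x):
    """times: survival/censoring times; events: 1 if a failure was observed,
    0 if censored; x: covariate values. Risk set at t_i = {j : t_j >= t_i}."""
    ll = 0.0
    for i, (t_i, d_i) in enumerate(zip(times, events)):
        if d_i == 1:
            # sum of relative hazards exp(beta * x_j) over the risk set
            risk = sum(math.exp(beta * x[j])
                       for j in range(len(times)) if times[j] >= t_i)
            ll += beta * x[i] - math.log(risk)
    return ll

# At beta = 0 each event term is -log(risk set size): -(log 3 + log 2) here.
value = cox_log_partial_likelihood(0.0, [2.0, 3.0, 5.0], [1, 1, 0], [0.1, 0.5, 0.9])
print(round(value, 4))   # -> -1.7918 (= -log 6)
```

Maximising this function in `beta` gives the classical Cox estimate; the partly parametric models considered in the paper extend the linear term `beta * x` to nonlinear parameterisations.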
|
177 |
Low consistency refining of mechanical pulp : process conditions and energy efficiency
Andersson, Stefan, January 2011 (has links)
The thesis is focussed on low consistency (LC) refining of mechanical pulp. The research included evaluations of energy efficiency, development of pulp properties, the influence of fibre concentration on LC refining and effects of rotor position in a two-zoned LC refiner. Trials were made in mill scale in a modern TMP line equipped with an MSD Impressafiner for chip pre-treatment, double disc (DD) first stage refining and a prototype 72-inch TwinFlo LC refiner in the second stage. Tensile index increased by 8 Nm/g and fibre length was reduced by 10 % in LC refining at 140 kWh/adt gross specific refining energy and specific edge load 1.0 J/m. Specific light scattering coefficient did not develop significantly over the LC refiner. The above mentioned TMP line was compared with a two stage single disc high consistency Twin 60 refiner line. The purpose was to evaluate specific energy consumption and pulp properties. The two different process solutions were tested in mill scale, running similar Norway spruce wood supply. At the same tensile index and freeness, the specific energy consumption was 400 kWh/adt lower in the DD-LC concept compared with the SD-SD system. Pulp characteristics of the two refining concepts were compared at tensile index 47 Nm/g. Fibre length was lower after DD-LC refining than after SD-SD refining. Specific light scattering coefficient was higher and shive content much lower for DD-LC pulp. The effects of sulphite chip pre-treatment on second stage LC refining were also evaluated. No apparent differences in fibre properties after LC refining were noticed between treated and untreated pulps. Sulphite chip pre-treatment in combination with LC refining in second stage yielded a pulp without screening and reject refining with tensile index and shives content similar to non-pre-treated final pulp after screening and reject refining.
A pilot scale study was performed to investigate the influence of fibre concentration on pulp properties in LC refining of mechanical pulps. Market CTMP was utilised in all trials and fibre concentrations were controlled by means of adjustments of the pulp consistency and by screen fractionation of the pulp. In addition, various refiner parameters were studied, such as no-load, gap and bar edge length. Pulp with the highest fibre concentration supported a larger refiner gap than pulp with low fibre concentration at a given gross power input. Fibre shortening was lower and tensile index increase was higher for long fibre enriched pulp. The results from this study support the interesting concept of combining main line LC refining and screening, where screen reject is recycled to the LC refiner inlet. It has been observed that the rotor in two-zoned refiners is not always centred, even though pulp flow rate is equal in both refining zones. This leads to unequal plate gaps, which renders unevenly refined pulp. Trials were performed in mill scale, using the 72-inch TwinFlo, to investigate differences in pulp properties and rotor positions by means of altering the pressure difference between the refining zones. It was found that uneven plate gaps can be compensated for in LC refiners with dual refining zones in order to produce homogeneous pulp. Results from the different flow rate adjustments indicated that the control setting with similar plate gap gave the most homogeneous pulp.
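The relation between gross motor power, no-load power and specific refining energy that underlies figures like those above can be sketched as a one-line calculation. The function name and all numbers below are illustrative assumptions, not values taken from the thesis trials.

```python
# Back-of-envelope sketch: net specific refining energy in LC refining.
# All figures are illustrative, not data from the thesis.
def net_specific_energy(gross_power_kw, no_load_kw, production_adt_per_h):
    """Net specific energy (kWh/adt): the share of motor power that actually
    reaches the pulp (gross minus no-load), per air-dry tonne produced."""
    return (gross_power_kw - no_load_kw) / production_adt_per_h

print(net_specific_energy(gross_power_kw=2000.0, no_load_kw=600.0,
                          production_adt_per_h=10.0))   # -> 140.0
```

The no-load term is why it matters in the pilot study: at a given gross power, a pulp that tolerates a larger plate gap can shift the balance between no-load losses and useful refining work.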
|
178 |
Two Sides of the Same Coin : A study of EFL teachers' knowledge regarding the divergences between British and American English, and the challenges which arise from having more than one accepted variety of English in EFL teaching
Jensen, Linda, January 2010 (has links)
Institution: Halmstad University / School of Teacher Education (LUT)
Course: C-level paper, 15 credits
Term: Spring 2010
Title: Two Sides of the Same Coin - A study of EFL teachers' knowledge regarding the divergences between British and American English, and the challenges which arise from having more than one accepted variety of English in EFL teaching
Pages: 41
Writer: Linda Jensen
Purpose: The purpose of this essay is to ascertain whether Swedish EFL teachers have sufficient knowledge of the differences between BrE and AmE, the two major varieties of English. Furthermore, I aim to examine what challenges are created when two models of English, BrE and AmE, are accepted in upper secondary schools in Sweden.
Method: A quantitative web-based survey.
Material: Questionnaire filled in by 59 EFL teachers in upper secondary schools in Halland, Sweden.
Main results: Upper secondary EFL teachers in Halland, Sweden do appear to have a basic knowledge of the differences between BrE and AmE, and as such a majority placed themselves in the correct category. However, there is a lack of consistency, and all the teachers mixed the two varieties to some extent. The challenges that arise from having two accepted varieties in Swedish schools include the question of the consistency rule, dealing with the value system associated with British and American English, and the question of whether Mid-Atlantic English should be accepted as a third educational standard.
Keywords: British English, American English, Mid-Atlantic English, divergence, EFL, consistency, challenges, value system
|
179 |
Shape Descriptors Based On Intersection Consistency And Global Binary Patterns
Sivri, Erdal, 01 September 2012 (has links) (PDF)
Shape description is an important problem in computer vision because most vision tasks that require comparing or matching visual entities rely on shape descriptors. In this thesis, two novel shape descriptors are proposed, namely Intersection Consistency Histogram (ICH) and Global Binary Patterns (GBP). The former is based on a local regularity measure called Intersection Consistency (IC), which determines whether edge pixels in an image patch point towards the center or not. The second method, Global Binary Patterns, represents the shape in binary along horizontal, vertical, diagonal or principal directions. These two methods are extensively analyzed on several databases, and retrieval and running time performances are presented. Moreover, these methods are compared with methods such as Shape Context, Histograms of Oriented Gradients, Local Binary Patterns and Fourier Descriptors. We report that our descriptors perform comparably to these methods.
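One plausible reading of the Intersection Consistency idea described above can be sketched as follows: for each edge pixel, test whether the line through it along its gradient passes near the patch centre. The distance test, tolerance and scoring below are assumptions for illustration, not the thesis's exact definition.

```python
import math

# Toy sketch of an Intersection Consistency style measure (assumed form,
# not the thesis's definition): fraction of edge pixels whose gradient
# line passes within `tol` of the patch centre.
def intersection_consistency(edges, centre, tol=1.0):
    """edges: list of ((x, y), (gx, gy)) edge pixels with gradient vectors."""
    cx, cy = centre
    hits = 0
    for (x, y), (gx, gy) in edges:
        norm = math.hypot(gx, gy)
        if norm == 0:
            continue
        # perpendicular distance from centre to the line through (x, y)
        # with direction (gx, gy)
        dist = abs((cx - x) * gy - (cy - y) * gx) / norm
        if dist <= tol:
            hits += 1
    return hits / len(edges)

# Edge pixels of a circle centred at (0, 0): radial gradients all point
# through the centre, so the measure is maximal.
circle = [((math.cos(a), math.sin(a)), (math.cos(a), math.sin(a)))
          for a in (0.0, 1.0, 2.0, 3.0)]
print(intersection_consistency(circle, (0.0, 0.0)))   # -> 1.0
```

A highly regular patch (circle, blob) scores near 1, while edges with scattered gradient directions score lower, which is the kind of local regularity the ICH descriptor histograms over the image.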
|
180 |
Estimators of semiparametric truncated and censored regression models
Karlsson, Maria, January 2005 (links)
This thesis contributes in several ways to the existing knowledge on estimation of truncated, censored, and left truncated right censored (LTRC) regression models. Three new semiparametric estimators are proposed, allowing for asymmetric error distributions. A bootstrap method for estimation of the covariance matrix of the quadratic mode estimator (QME) is proposed and studied. In addition, finite sample properties of estimators for truncated, censored, and LTRC data are studied in simulation studies and applications with real data. The first paper consists of a simulation study of the QME and other estimators of truncated regression models. The paper contributes results suggesting that the bootstrap technique is potentially useful for estimation of the QME covariance matrix. In the second paper, estimators of truncated and censored semiparametric regression models are proposed. These estimators generalize the QME and the winsorized mean estimator (WME) by allowing asymmetric "trimming" of observations. Consistency and asymptotic normality of the estimators are shown. By combining the two moment restrictions used to derive the estimators in the second paper, a consistent estimator of LTRC regression models is proposed in the third paper. The fourth paper contains an application in which LTRC interpurchase intervals of cars are analysed. Results regarding the interpurchase behaviour of consumers are provided, as are results on estimator properties.
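The bootstrap idea studied in the first paper can be sketched generically: resample the data with replacement, re-estimate, and take the empirical variance of the bootstrap estimates. The sample median below is only a stand-in for the QME, purely for illustration.

```python
import random
import statistics

# Generic nonparametric bootstrap variance sketch (illustrative; the thesis
# applies the idea to the QME covariance matrix, not to the median).
def bootstrap_variance(data, estimator, n_boot=500, seed=1):
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_boot):
        # resample the data with replacement, same sample size
        resample = [rng.choice(data) for _ in data]
        estimates.append(estimator(resample))
    return statistics.variance(estimates)

data = [1.2, 0.7, 2.4, 1.9, 0.3, 1.1, 2.8, 1.5]
var_hat = bootstrap_variance(data, statistics.median)
print(var_hat > 0)   # the bootstrap estimates vary, so the variance is positive
```

For a vector-valued estimator such as the QME, the same resampling loop yields a matrix of bootstrap estimates whose empirical covariance estimates the estimator's covariance matrix.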
|