11

Models of Discrete-Time Stochastic Processes and Associated Complexity Measures / Modelle stochastischer Prozesse in diskreter Zeit und zugehörige Komplexitätsmaße

Löhr, Wolfgang, 24 June 2010 (PDF)
Many complexity measures are defined as the size of a minimal representation in a specific model class. One such complexity measure, which is important because it is widely applied, is statistical complexity. It is defined for discrete-time, stationary stochastic processes within a theory called computational mechanics. Here, a mathematically rigorous, more general version of this theory is presented, and abstract properties of statistical complexity as a function on the space of processes are investigated. In particular, weak-* lower semi-continuity and concavity are shown, and it is argued that these properties should be shared by all sensible complexity measures. Furthermore, a formula for the ergodic decomposition is obtained. The same results are also proven for two other complexity measures that are defined by different model classes, namely process dimension and generative complexity. These two quantities, and also the information theoretic complexity measure called excess entropy, are related to statistical complexity, and this relation is discussed here. It is also shown that computational mechanics can be reformulated in terms of Frank Knight's prediction process, which is of both conceptual and technical interest. In particular, it allows for a unified treatment of different processes and facilitates topological considerations. Continuity of the Markov transition kernel of a discrete version of the prediction process is obtained as a new result.
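For context, the textbook definitions behind the quantities named in this abstract can be sketched as follows; this is the standard computational-mechanics formulation for stationary processes, not the more general measure-theoretic version developed in the thesis. Two pasts are causally equivalent when they predict the same distribution over futures, statistical complexity is the entropy of the resulting causal states, and excess entropy is the past–future mutual information, which never exceeds statistical complexity.

```latex
% Sketch of the standard definitions (stationary case); the thesis works
% with a more general formulation than the one shown here.
\overleftarrow{x} \sim_{\epsilon} \overleftarrow{x}'
  \;\Longleftrightarrow\;
  \Pr\bigl(\overrightarrow{X} \mid \overleftarrow{X}=\overleftarrow{x}\bigr)
  = \Pr\bigl(\overrightarrow{X} \mid \overleftarrow{X}=\overleftarrow{x}'\bigr),
\qquad
C_{\mu} = H\bigl[\epsilon(\overleftarrow{X})\bigr],
\qquad
E = I\bigl[\overleftarrow{X};\overrightarrow{X}\bigr] \le C_{\mu}.
```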
12

Text complexity visualisations : An exploratory study on teachers' interpretations of radar chart visualisations of text complexity / Visualisering av textkomplexitet : En utforskande studie kring lärares tolkningar av radardiagramsvisualiseringar av textkomplexitet

Anderberg, Caroline, January 2022
Finding the appropriate level of text for students with varying reading abilities is an important and demanding task for teachers. Radar chart visualisations of text complexity could potentially be an aid in that process, but they need to be evaluated to see if they are intuitive and if they say something about the complexity of a text. This study explores how visualisations of text complexity, in the format of radar charts, are interpreted, what measures they should include and what information they should contain in order to be intelligible for teachers who work with people who have language and/or reading difficulties. A preliminary study and three focus group sessions were conducted with teachers from special education schools for adults and gymnasium level. Through thematic analysis of the sessions, five themes were generated and it was found that the visualisations are intelligible to some extent, but they need to be adapted to the target group by making sure the measures are relevant, and that the scale, colours, categories and measures are clearly explained. / Det är både en viktig och krävande uppgift för lärare att hitta lämplig textnivå för elever med varierande läsförmågor. Radardiagramsvisualiseringar av textkomplexitet kan potentiellt stötta den processen, men de måste utvärderas för att undersöka om de är intuitiva, vilka mått som bör inkluderas samt om de säger något om komplexiteten av en text. Den här studien utforskar hur visualiseringar av textkomplexitet i form av radardiagram tolkas, vilka mått de bör inkludera samt vilken information de bör innehålla i syfte att vara begripliga för lärare som jobbar med elever med språk- och/eller lässvårigheter. En förundersökning och tre fokusgruppsessioner utfördes, med lärare från särgymnasium och särvuxskolor. Efter tematisk analys av data från fokusgrupperna genererades fem teman. Resultaten visade att visualiseringarna var begripliga till viss del, men de behöver anpassas till målgruppen genom att se till att måtten är relevanta samt att skalan, färgerna, kategorierna och måtten är tydligt förklarade.
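To make the kind of visualisation discussed here concrete, the sketch below draws a radar chart from a few normalised text-complexity scores using matplotlib. The measure names and the values are hypothetical placeholders chosen for illustration; they are not the measures or data used in the study.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical, normalised (0-1) complexity scores for one text.
measures = ["Sentence length", "Word length", "Readability index",
            "Lexical diversity", "Subordinate clauses"]
scores = [0.6, 0.4, 0.7, 0.5, 0.3]

# One axis per measure, evenly spaced around the circle; repeat the first
# point at the end so the polygon closes.
angles = np.linspace(0, 2 * np.pi, len(measures), endpoint=False).tolist()
closed_scores = scores + scores[:1]
closed_angles = angles + angles[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
ax.plot(closed_angles, closed_scores, linewidth=1.5)
ax.fill(closed_angles, closed_scores, alpha=0.25)
ax.set_xticks(angles)
ax.set_xticklabels(measures)
ax.set_ylim(0, 1)
ax.set_title("Text complexity profile (hypothetical example)")
plt.tight_layout()
plt.show()
```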
13

Models of Discrete-Time Stochastic Processes and Associated Complexity Measures

Löhr, Wolfgang, 12 May 2010
Many complexity measures are defined as the size of a minimal representation in a specific model class. One such complexity measure, which is important because it is widely applied, is statistical complexity. It is defined for discrete-time, stationary stochastic processes within a theory called computational mechanics. Here, a mathematically rigorous, more general version of this theory is presented, and abstract properties of statistical complexity as a function on the space of processes are investigated. In particular, weak-* lower semi-continuity and concavity are shown, and it is argued that these properties should be shared by all sensible complexity measures. Furthermore, a formula for the ergodic decomposition is obtained. The same results are also proven for two other complexity measures that are defined by different model classes, namely process dimension and generative complexity. These two quantities, and also the information theoretic complexity measure called excess entropy, are related to statistical complexity, and this relation is discussed here. It is also shown that computational mechanics can be reformulated in terms of Frank Knight's prediction process, which is of both conceptual and technical interest. In particular, it allows for a unified treatment of different processes and facilitates topological considerations. Continuity of the Markov transition kernel of a discrete version of the prediction process is obtained as a new result.
14

How Often do Experts Make Mistakes?

Palix, Nicolas; Lawall, Julia L.; Thomas, Gaël; Muller, Gilles, January 2010
Large open-source software projects involve developers with a wide variety of backgrounds and expertise. Such projects also include many internal APIs that developers must understand and use properly. Depending on their intended purpose, these APIs are used more or less frequently, and by developers with more or less expertise. In this paper, we study the impact of usage patterns and developer expertise on the rate of defects occurring in the use of internal APIs. For this preliminary study, we focus on memory management APIs in the Linux kernel, as their use has been shown to be highly error prone in previous work. We study defect rates and developer expertise to consider, e.g., whether widely used APIs are more defect prone because they are used by less experienced developers, or whether defects in widely used APIs are more likely to be fixed.
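The central quantity in such a study is a per-API defect rate: the fraction of call sites of an API that contain a defect. The sketch below computes and ranks such rates from made-up counts; the numbers (and the choice of APIs) are hypothetical illustrations, not results from the paper.

```python
from dataclasses import dataclass

@dataclass
class ApiUsage:
    name: str
    call_sites: int   # number of uses of the API found in the code base
    defects: int      # number of those uses flagged as defective

def defect_rate(u: ApiUsage) -> float:
    """Fraction of an API's call sites that contain a defect."""
    return u.defects / u.call_sites if u.call_sites else 0.0

# Hypothetical counts for three kernel memory-management APIs (illustration only).
usages = [
    ApiUsage("kmalloc", call_sites=12000, defects=180),
    ApiUsage("kzalloc", call_sites=8000, defects=60),
    ApiUsage("vmalloc", call_sites=900, defects=25),
]

# Rank APIs from most to least defect prone.
for u in sorted(usages, key=defect_rate, reverse=True):
    print(f"{u.name:<10} call sites={u.call_sites:<6} defect rate={defect_rate(u):.2%}")
```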
