231

Towards Dimensionality in Psychosis: A Conceptual Analysis of the Dimensions of Psychosis Symptom Severity

Carmona, Jessica Abigail 01 March 2016
Given the heterogeneity of symptoms allowed in the diagnosis of psychotic disorders, as well as other challenges of categorical diagnosis (e.g., First et al., 2002; Krueger, 1999), the increased specificity brought by dimensional ratings of underlying features is often important. Models using the factorial structure of psychotic symptoms perform as well as or better than traditional categorical models (Allardyce, Suppes, & Van Os, 2007). DSM-5 has provided such a system of ratings to aid clinicians, the Clinician-Rated Dimensions of Psychosis Symptom Severity Scale (PSS; APA, 2013). In this approach, the clinician rates symptom severity in eight domains that emphasize traditional psychotic symptomatology, cognition, and mood. Given its accessibility and the support of the DSM-5, the measure could achieve wide use. However, little is known about the measure and the challenges of applying it in clinical settings. This study is a conceptual analysis of the foundations of the PSS, including its psychometric properties, applications, and demonstrated validity. The PSS is also compared to the widely used Brief Psychiatric Rating Scale – Revised (BPRS-R). The PSS is more concise than other measures, and five of the PSS domains parallel the DSM-5's "Key Features That Define the Psychotic Disorders" (pp. 87-88) (although the brief instructions of the PSS differ at times from DSM-5 definitions, and little in the way of definition is offered in the PSS itself). In contrast, no rationale is given for adding the remaining three domains. The dimensional model of the PSS has similarities to the factor structure typically found for symptomatology in psychotic disorders, but a number of important differences are noted. The data required for making ratings are never defined; the only data mentioned as potentially helpful for rating one of the domains would require extensive testing. Although the PSS appears at first glance to provide anchors for the ratings, these offer almost nothing beyond the adjectives "equivocal," "mild," "moderate," and "severe." Finally, we found that very little research exists on the PSS, no field trial was done, its psychometric properties are largely unknown, and normative data are unavailable. The PSS is brief and provides a quick way to rate the severity of the five key features of psychosis required by DSM-5 diagnoses. Thus, it can serve as a quick quantification of these features. Beyond this its utility is unknown, and it appears to lack the specificity of other rating scales, such as the BPRS-R.
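As a purely illustrative aside (not part of the thesis), the dimensional structure described above can be sketched as a simple data model. The sketch assumes the eight DSM-5 domains (hallucinations, delusions, disorganized speech, abnormal psychomotor behavior, negative symptoms, impaired cognition, depression, mania) and a 0-4 severity scale whose verbal anchors correspond to the adjectives quoted above; all names and field choices here are hypothetical.

```python
from dataclasses import dataclass

# Assumed 0-4 scale: 0 = not present, 1 = equivocal, 2 = mild, 3 = moderate, 4 = severe.
SEVERITY_LABELS = {0: "not present", 1: "equivocal", 2: "mild", 3: "moderate", 4: "severe"}

# Eight domains as listed in DSM-5 (an assumption for illustration; the PSS itself
# gives only brief instructions for each).
PSS_DOMAINS = [
    "hallucinations", "delusions", "disorganized_speech",
    "abnormal_psychomotor_behavior", "negative_symptoms",
    "impaired_cognition", "depression", "mania",
]

@dataclass
class PSSRating:
    """One clinician-rated PSS administration: an integer 0-4 score per domain."""
    scores: dict  # domain name -> integer 0-4

    def __post_init__(self):
        for domain in PSS_DOMAINS:
            score = self.scores.get(domain)
            if score not in SEVERITY_LABELS:
                raise ValueError(f"{domain}: score must be an integer 0-4, got {score!r}")

    def summary(self) -> dict:
        """Map each domain to its verbal anchor, e.g. 'delusions' -> 'moderate'."""
        return {d: SEVERITY_LABELS[self.scores[d]] for d in PSS_DOMAINS}
```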
232

The view from the armchair: a defense of traditional philosophy

Bryson, Anthony Alan 01 December 2009
Traditional philosophy has been under attack from several quarters in recent years. The traditional philosopher views philosophy as an armchair discipline relying, for the most part, on reason and reflection. Some philosophers doubt the legitimacy of this type of inquiry. Their arguments usually occur along two dimensions. Some argue that the primary data source for the armchair philosopher--intuition--does not provide evidence for philosophical theories. Others argue that conceptual analysis, which is the preferred method of inquiry for armchair philosophers, can't yield the results the philosopher is looking for, since concepts like 'knowledge' or 'free will' vary from culture to culture or even between persons within a culture. Finally, some philosophers argue that we should abandon the armchair program because philosophy should be an empirical enterprise continuous with the sciences. I argue that attempts to undermine intuition fail and that one can justify the evidential status of intuition in a non-question-begging way. I then argue that attacks on the belief in shared concepts do not succeed because they often conflate the nature of scientific objects with those of interest to the philosopher. However, if concepts do vary from culture to culture, I show that the philosopher need not abandon the armchair. She can still do conceptual analysis, but it will be only the entry point into the philosophical dialogue. I apply this approach to epistemology, arguing that the central epistemic questions ought to be the existential and the normative. This approach helps to vindicate epistemic internalism.
233

Conceptual change in secondary chemistry: the role of multiple analogical models of atoms and molecules.

Harrison, Allan G. January 1996
Chemistry textbooks and teachers frequently use a variety of metaphors, analogies and models to describe atomic and molecular structures and processes. While it is widely believed that multiple analogical models encourage students to construct appropriate mental models of chemical phenomena, uncritical use of multiple analogical models may actually be responsible for a number of alternative conceptions in chemistry. Students hear and read about electron clouds and shells, atoms that are like miniature solar systems and balls, and molecules that are simultaneously represented by balls-and-sticks, joined spheres, electron-dot and structural diagrams. A strong case has been made that students try to integrate these diverse analogical models, resulting in the generation of unscientific synthetic models. Conceptual change research programs also propose that carefully designed teaching and learning activities can stimulate students to exchange their intuitive and synthetic conceptions for more scientific conceptions.

This thesis investigates the occurrence of students' intuitive and synthetic mental models of atoms and molecules at both a general and a specific level. The first phase of the investigation consisted of semi-structured interviews with 48 Year 8-10 science students. While the data were predominantly qualitative, the interviews also generated simple quantitative data. The second phase was wholly qualitative and involved the researcher as teacher in the Year 11 class. Portfolios were compiled for each student in the class, and six portfolios were interpreted to produce a set of case studies describing the students' learning about atoms, molecules and bonds. These data were derived from transcripts of class discussions and individual interviews; pre-tests, formative tests and post-tests; student essays and worksheets; and analogical teaching events. The data were interpreted from a constructivist viewpoint with attention given to credibility, viability and transferability, and dependability. The desire to collect every piece of useful data was constrained by the ethical need to minimise the disruptive effect of the research on the students' normal learning.

The first, or general, phase of this study investigated the question: With what models of atoms and molecules are lower secondary science students familiar? The interviews about atomic and molecular conceptions held by the Year 8-10 students found, for example, that some students confused atoms with cells because both have a nucleus, while others believed that electron shells enclose and protect the atom. All but two students visualised atoms with large nuclei and close, static electrons. A majority of this student sample were confused by ball-and-stick molecular models and had a strong preference for space-filling molecular models because they were more 'real'.

The second, or specific, phase of this study consisted of an in-depth study of the development of mental models of atoms, molecules and bonds by six Year 11 chemistry students over 40 weeks of instruction. This phase investigated the question: Do systematically presented multiple analogical models help students change their conceptions of atoms, molecules and bonds in favour of the scientific view? The students' prior mental models of an atom were dominated by a solar system model with the electrons in simple shells. A variety of metaphors, analogical models and explanations emphasising the diffuse spaciousness of atoms helped three students restructure their conceptions in favour of the scientific concept. Students were also encouraged to identify the shared and unshared attributes of familiar molecular models and, in time, three students became competent multiple modellers. It is claimed that these three students changed their conceptions of atoms and molecules in the sense that they realised that models are thinking and communicative tools, not reality itself. The significant change in these students' thinking was their recognition that atomic and molecular analogical models are context-dependent.

The phase-two study's preoccupation with conceptual change, or knowledge restructuring, raised an important methodological question: Is a multi-dimensional approach a better way to interpret conceptual change learning, or are the various theoretical perspectives on conceptual change complementary? The study's theoretical framework found that conceptual change learning can be interpreted from epistemological, ontological, motivational, holistic explanatory and developmental perspectives. The collection and analysis of the data showed that student modelling ability and Perry's model of intellectual development were powerful interpretive tools when data needed to be examined from multiple perspectives. The six case studies support the assertion that multi-dimensional interpretive frameworks have superior credibility and viability compared to uni-dimensional studies.

Finally, the research raised several questions requiring further investigation. No direct support was found for the claim that dissatisfaction is central to conceptual change; this issue needs much more study given the popularity of discrepant-event teaching. While a multi-dimensional conceptual change model has been synthesised, this model needs further refinement, as does the issue of how to monitor the status of students' conceptions. A most promising line of pedagogical research is the value of teaching scientific modelling through the use of multiple systematic analogical models.
234

A FRAMEWORK FOR CONCEPTUAL INTEGRATION OF HETEROGENEOUS DATABASES

Srinivasan, Uma, Computer Science & Engineering, Faculty of Engineering, UNSW January 1997
Autonomy of operations combined with decentralised management of data has given rise to a number of heterogeneous databases or information systems within an enterprise. These systems are often incompatible in structure as well as content and hence difficult to integrate. This thesis investigates the problem of heterogeneous database integration, in order to meet the increasing demand for obtaining meaningful information from multiple databases without disturbing local autonomy. In spite of this heterogeneity, the unity of overall purpose within a common application domain provides a degree of semantic similarity, which manifests itself in the form of similar data structures and common usage patterns of existing information systems. This work introduces a conceptual integration approach that exploits the similarity in meta-level information in existing systems and performs metadata mining on database objects to discover a set of concepts common to heterogeneous databases within the same application domain. The conceptual integration approach proposed here utilises the background knowledge available in database structures and usage patterns and generates a set of concepts that serve as a domain abstraction and provide a conceptual layer above existing legacy systems. This conceptual layer is further utilised by an information re-engineering framework that customises and packages information to reflect the unique needs of different user groups within the application domain. The architecture of the information re-engineering framework is based on an object-oriented model that represents the discovered concepts as customised application objects for each distinct user group.
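To make the idea of metadata mining concrete, here is a minimal, hypothetical sketch (not the thesis's actual algorithm) that pairs similarly named attributes from two imaginary legacy schemas into candidate shared concepts; a real implementation would also weigh data types, constraints, and usage patterns, as the abstract indicates.

```python
from difflib import SequenceMatcher
from itertools import product

# Hypothetical attribute lists extracted from the catalogs of two legacy databases.
schema_a = ["patient_name", "date_of_birth", "admission_date", "ward_code"]
schema_b = ["pat_name", "birth_date", "admit_dt", "department"]

def name_similarity(a: str, b: str) -> float:
    """Crude lexical similarity between two attribute names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def candidate_concepts(attrs_a, attrs_b, threshold=0.5):
    """Pair attributes whose names are similar enough to suggest a shared concept."""
    matches = []
    for a, b in product(attrs_a, attrs_b):
        score = name_similarity(a, b)
        if score >= threshold:
            matches.append((a, b, round(score, 2)))
    return sorted(matches, key=lambda m: -m[2])

for a, b, score in candidate_concepts(schema_a, schema_b):
    print(f"possible shared concept: {a} ~ {b} (similarity {score})")
```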
235

The Role of Mental Imagery in Conceptual Designing

Bilda, Zafer January 2006
In design literature, how designers think and how they design have been identified as a reflection of how they interact with their sketches. Sketching in architectural design is still a central concern which shapes our understanding of the design process and the development of new tools. Sketching not only serves as a visual aid to store and retrieve conceptualisations, but as a medium to facilitate more ideas, and to revise and refine these ideas. This thesis examined how mental imagery and sketching are used in designing by conducting a protocol analysis study with six expert architects. Each architect was required to think aloud and design under two different conditions: one in which s/he had access to sketching and one in which s/he was blindfolded and had no access to sketching. At the end of the blindfold condition the architects were required to quickly sketch what they held in their minds. The architects were able to come up with satisfying design solutions, and some reported that using their imagery could be another way of designing. The resulting sketches were assessed by judges and were found to have no significant differences in overall quality. Expert architects were able to construct and maintain the design of a building without having access to sketching. The analysis of the blindfold and sketching design protocols did not demonstrate any differences in the quantity of cognitive actions in perceptual, conceptual, functional and evaluative categories. Each architect’s cognitive structure and designing behaviour in the blindfold activity mimicked her/his cognitive structure and designing behaviour in the sketching activity. The analysis of links between the design ideas demonstrated that architects’ performance in idea development was higher under the blindfold condition than under the sketching condition. It was also found that architects’ blindfold design performance improved when they were more familiar with the site layout. These results imply that expert designers may not need sketching as a medium for their reflective conversation with the situation. This study indicates that constructing internal representations can be a strong tool for designing. Future studies may show that designers may not need sketching for the generation of certain designs during the early phases of conceptual designing.
236

A Probabilistic Approach to Conceptual Sensor Modeling

Sonesson, Mattias January 2005
This report develops a method for probabilistic conceptual sensor modeling. The idea is to generate probabilities for detection, recognition and identification based on a few simple factors. The focus lies on FLIR sensors and thermal radiation, although other wavelength bands are also discussed. The model can be used as a whole, or individual parts can be used to create a simpler model. The core of the model is based on the Johnson criteria, which use resolution as the input parameter. Some extensions that model other factors are also implemented. The report ends with a short discussion of the possibility of using this model for sensors other than FLIR.
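For orientation, the sketch below shows the target transfer probability function commonly associated with the Johnson criteria, with classic literature values for N50 (detection ≈ 1.0, recognition ≈ 4.0, identification ≈ 6.4 cycles across the target's critical dimension); these parameter values and the exact functional form are assumptions taken from the open literature and may differ from the report's parameterization.

```python
# Target transfer probability function (TTPF) commonly associated with the
# Johnson criteria. n_cycles is the number of resolvable cycles across the
# target's critical dimension; N50 is the cycle count giving a 50% probability
# for the task. The N50 values below are classic literature values (assumed here).

N50 = {"detection": 1.0, "recognition": 4.0, "identification": 6.4}

def task_probability(n_cycles: float, task: str) -> float:
    """Probability of accomplishing the discrimination task at a given resolution."""
    ratio = n_cycles / N50[task]
    exponent = 2.7 + 0.7 * ratio          # empirical exponent used in the TTPF
    return ratio**exponent / (1.0 + ratio**exponent)

# Example: a sensor resolving 4 cycles across a vehicle-sized target.
for task in N50:
    print(f"{task}: {task_probability(4.0, task):.2f}")
```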
237

Improving children's understanding of mathematical equivalence

Watchorn, Rebecca P. D. 06 1900
A great majority of children in Canada and the United States from Grades 2-6 fail to solve equivalence problems (e.g., 2 + 4 + 5 = 3 + __) despite having the requisite addition and subtraction skills. The goal of the present study was to determine the relative influence of two variables, instructional focus (procedural or conceptual) and use of manipulatives (with or without), in helping children learn to solve equivalence problems and develop an appropriate understanding of the equal sign. Instruction was provided in four conditions consisting of the combinations of these two variables. Students in Grade 2 (n = 122) and Grade 4 (n = 151) participated in four sessions designed to assess the effectiveness of the four instructional methods for learning and retention. Session 1 included a pretest of equivalence problem solving and three indicators of understanding of the equal sign. In Sessions 2 and 3, instruction was provided in one of the four instructional conditions or a control condition. Students were tested on their skill at solving equivalence problems immediately following instruction and at the beginning of Session 3 to assess what they had retained from Session 2. In Session 4, one month later, children were re-tested on all of the tasks presented in Session 1 to assess whether instruction had a lasting effect. All four instructional groups outperformed the control group in solving equivalence problems, but differences among the instructional groups were minimal. Performance on indicators of understanding, however, favoured students who received conceptually focused instruction. Preliminary evidence was found that children’s understanding of problem structure and attentional skill may be associated with the ability to benefit from instruction on equivalence problems. Children clustered into four groups based on their performance across tasks, a pattern consistent with the view that children’s understanding of the equal sign develops gradually, beginning with learning its definition. These findings suggest that a relatively simple intervention can markedly improve student performance in the area of mathematical equivalence, that these improvements can be maintained over a period of time, and that they show some limited generality to other indicators of children's understanding of equivalence.
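For readers unfamiliar with the task format, the short illustrative script below (not part of the study's materials) generates and solves equivalence problems of the form shown above, where the blank must make both sides of the equal sign sum to the same total.

```python
import random

def solve_equivalence(left_addends, right_known):
    """Return the value of the blank in e.g. 2 + 4 + 5 = 3 + __."""
    return sum(left_addends) - sum(right_known)

def make_problem(rng=random):
    """Generate a Grade 2-4 style equivalence problem with single-digit addends."""
    left = [rng.randint(1, 9) for _ in range(3)]
    right_known = [rng.randint(1, min(9, sum(left) - 1))]
    blank = solve_equivalence(left, right_known)
    text = f"{' + '.join(map(str, left))} = {right_known[0]} + __"
    return text, blank

# The example from the abstract: 2 + 4 + 5 = 3 + __
print(solve_equivalence([2, 4, 5], [3]))   # -> 8

text, answer = make_problem()
print(text, "(answer:", answer, ")")
```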
238

Multidisciplinary Design in Aeronautics, Enhanced by Simulation-Experiment Synergy

Melin, Tomas January 2006
This thesis covers some aspects of current aircraft design, and presents how experiment and simulation are used as tools. Together they give enhanced results compared with employing either one separately. The work presented has been produced using both simulations and experiments. An overview of aircraft design tools is presented, together with a description of their application in research. Participation in two major design projects, HELIX and the Rescue wing, gave an opportunity to combine traditional experimental and computational tools. These projects also served as platforms for developing two new tools, the vortex lattice program Tornado and the DoTrack camera-based wind tunnel measurement system. The HELIX project aimed at exploring new, unconventional high-lift systems, such as blown flaps, flaperons and active vortex generators. The concepts were investigated with an array of conceptual design tools, ranging from handbook methods to high Reynolds number wind tunnels. The research was done in several stages; after each stage, the concepts failing to reach specifications were discontinued. The active vortex generator concept is followed in detail from the first phase of the HELIX project, and was finally evaluated by full computational fluid dynamics (CFD) and wind tunnel testing. The lessons learned in HELIX were applied to the Rescue wing project, in which a kite balloon system for emergency localization was developed. The project is truly multidisciplinary, and experiment and simulation had to be used in close conjunction. The lack of appropriate methods for measurement and analysis of this kind of device meant that new methods had to be developed. Recent experience of academia working closely together with industry has shown substantial benefits to all parties involved. The synergy of computer modeling and simulation with experiment plays an important role in the common collaborative modus operandi of academia and industry. In particular, the later stages of aeronautic educational programmes should actively pursue such collaboration.
239

Modelling the Mind: Conceptual Blending and Modernist Narratives

Copland, Sarah 18 February 2010
This thesis offers a new approach to mind modelling in modernist narratives. Taking Nietzsche’s work as exemplary of modernist ideas about cognition’s relational basis, I argue that conceptual blending theory, a particularly cogent model of a fundamental cognitive process, has roots in modernism. I read inscriptions of relational cognition in modernist narratives as “conceptual blends” that invite cognitive mobility as a central facet of reader response. These blends, which integrate conceptual domains, invite similarity-seeing and difference-seeing, exposing the reader to new conceptual content and new cognitive styles; she is thus better able to negotiate the reading-related complexities of modernist narrative’s formal innovations and the real-world complexities of modernity’s local and global upheavals. Chapter One considers blending’s interrelated rhetorical motivations and cognitive effects in Chiang Yee’s Silent Traveller narratives: bringing together English and Chinese domains, Chiang’s blends defamiliarize his readers’ culturally entrenched assumptions, invite collaborative reading strategies, and thus equip his readers for relating flexibly to a newly globalized world. Moving away from blends in a text’s narration, Chapter Two focuses on blends as textual structuring principles. I read Virginia Woolf’s The Waves as a thinking mind with fundamentally relational cognitive processes; I consider the mobile cognitive operations we perform reading about a text’s mind thinking and thinking along with it. Chapters Three and Four cross the nebulous text-peritext border to examine blends in modernist prefaces. Chapter Three focuses on blends in Joseph Conrad’s and Henry James’s prefaces, relating them, through the reading strategies they invite, to the narratives they accompany. Chapter Four considers allographic prefaces to Arthur Morrison’s Tales of Mean Streets and two of Chiang’s narratives: blends in these prefaces invite the cognitive mobility necessary for reconceptualizing both allographic preface-text and East-West relations. All four chapters treat the modernist narrative text as a textual system whose blends, often interacting and borderless, signal reciprocal, mutually permeable relations among its textual levels. Dialogic relations also underwrite the interaction between these blends and blends the reader performs when engaging with them. Modernist narratives model (bear inscriptions of) cognition’s relational processes in order to model (shape) the reader’s mind.
240

Data Quality By Design: A Goal-oriented Approach

Jiang, Lei 13 August 2010
A successful information system is one that meets its design goals. Expressing these goals and subsequently translating them into a working solution is a major challenge for information systems engineering. This thesis adopts concepts and techniques from goal-oriented (software) requirements engineering research for conceptual database design, with a focus on data quality (DQ) issues. Based on a real-world case study, a goal-oriented process is proposed for database requirements analysis and modeling. It spans from the analysis of high-level stakeholder goals to the detailed design of a conceptual database schema. This process is then extended specifically for dealing with data quality issues: data of low quality may be detected and corrected by performing various quality assurance activities; to support these activities, the schema needs to be revised to accommodate additional data requirements. The extended process therefore focuses on analyzing and modeling quality assurance data requirements. A quality assurance activity supported by a revised schema may involve manual work and/or rely on automatic techniques, which often depend on the specification and enforcement of data quality rules. To address the constraint aspect of conceptual database design, data quality rules are classified according to a number of domain- and application-independent properties. This classification can be used to guide rule designers and to facilitate the building of a rule repository. A quantitative framework is then proposed for measuring and comparing DQ rules according to one of these properties, effectiveness; this framework relies on the derivation of formulas that represent the effectiveness of DQ rules under different probabilistic assumptions. A semi-automatic approach is also presented for deriving these effectiveness formulas.
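The abstract does not reproduce the effectiveness formulas themselves; as a stand-in illustration only, the sketch below computes one plausible effectiveness measure for a DQ rule (the share of erroneous records it catches, set against the review workload and false alarms it generates) under simple independence assumptions that are ours, not necessarily the thesis's.

```python
def rule_effectiveness(error_rate: float, sensitivity: float, false_positive_rate: float) -> dict:
    """
    Illustrative effectiveness measures for a DQ rule under simple probabilistic
    assumptions: each record is erroneous independently with probability
    error_rate; the rule flags an erroneous record with probability sensitivity
    and a clean record with probability false_positive_rate.
    """
    p_flag = error_rate * sensitivity + (1 - error_rate) * false_positive_rate
    precision = (error_rate * sensitivity / p_flag) if p_flag > 0 else 0.0
    return {
        "detected_error_fraction": sensitivity,   # recall over erroneous records
        "flagged_record_fraction": p_flag,        # review workload per record
        "precision_of_flags": precision,          # share of flags that are real errors
    }

# Example: 5% of records erroneous, rule catches 80% of them, 2% false alarms.
print(rule_effectiveness(error_rate=0.05, sensitivity=0.80, false_positive_rate=0.02))
```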
