981

Action, Time and Space in Description Logics

Milicic, Maja 08 September 2008 (has links) (PDF)
Description Logics (DLs) are a family of logic-based knowledge representation (KR) formalisms designed to represent and reason about static conceptual knowledge in a semantically well-understood way. Standard action formalisms, on the other hand, are KR formalisms based on classical logic, designed to model and reason about dynamic systems. The largest part of the present work is dedicated to integrating DLs with action formalisms, with the main goal of obtaining decidable action formalisms with expressiveness significantly beyond propositional. To this end, we offer DL-tailored solutions to the frame and ramification problems. One of the main technical results is that the standard reasoning problems about actions (executability and projection), as well as the plan existence problem, are decidable if one restricts the logic for describing action pre- and post-conditions and the state of the world to decidable Description Logics. A smaller part of the work concerns decidable extensions of Description Logics with concrete datatypes, most importantly those that allow referring to the notions of space and time.
982

Komplexitetsmax

Hoberg, Christer January 2006 (has links)
This study explores the development of an engineering praxis resulting from a cooperation between an engineering company and the KTH Advanced Programme in Reflective Practice, part of the research area of Skill and Technology at the Royal Institute of Technology (KTH) in Stockholm. The dialogue seminar method has been used and taken as a starting point for dialogue in groups of engineers. Drawing on the writings of René Descartes, Ludwik Fleck, Einar Már Gudmundsson and Fridtjof Nansen, the study begins to formulate an approach – Maximum Complexity – to the skill of problem-solving. The approach has been presented previously in short form in Dialogue, Skill and Tacit Knowledge, edited by Bo Göranzon, Maria Hammarén and Richard Ennals.
983

Scheduling of 2-operation jobs on a single machine to minimize the number of tardy jobs [electronic resource] / by Radhika M. Yeleswarapu.

Yeleswarapu, Radhika M. January 2003 (has links)
Title from PDF of title page. / Document formatted into pages; contains 80 pages. / Thesis (M.S.I.E.)--University of South Florida, 2003. / Includes bibliographical references. / Text (Electronic thesis) in PDF format. / ABSTRACT: This study focuses on a unique but commonly occurring manufacturing problem: scheduling customized jobs consisting of two operations on a single multi-purpose machine, with the performance objective of minimizing the number of tardy jobs (jobs that are not completed by their due dates). Each customized job requires one unique operation and one common operation. We consider the static case, in which all jobs have equal weights and the maximum tardiness has no effect on the performance measure. This problem has been proved NP-hard in the literature, so it is practically very difficult to obtain an optimal solution within reasonable computational time. To date, only a pseudo-polynomial algorithm has been given for this problem, with no concrete computational experiments designed to demonstrate its efficiency on different problem instances. / ABSTRACT: We propose a heuristic algorithm that combines Moore-Hodgson's algorithm with other procedures and optimal-schedule properties from the literature. Moore-Hodgson's algorithm efficiently minimizes the number of tardy jobs for the classical single-machine one-operation problem. The performance of the heuristic is evaluated through extensive computational experiments on large, realistically sized data sets, and the results are compared to the solutions obtained by implementing the optimal pseudo-polynomial algorithm. The test data for the computational experiments are generated randomly using MATLAB 6.1.
Future directions for research on the problem and for improving the solutions obtained by the heuristic algorithm are given. / System requirements: World Wide Web browser and PDF reader. / Mode of access: World Wide Web.
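Moore-Hodgson's algorithm, referenced in the abstract above, is standard enough to sketch. The following is an illustrative Python version for the classical single-machine, one-operation problem (1 || ΣUj) — not the thesis's two-operation heuristic: schedule jobs in earliest-due-date order, and whenever the schedule becomes tardy, drop the longest job scheduled so far.

```python
import heapq

def moore_hodgson(jobs):
    """Minimize the number of tardy jobs on a single machine (1 || sum U_j).

    jobs: list of (processing_time, due_date) pairs.
    Returns (on_time, tardy) as lists of processing times.
    """
    on_time = []   # max-heap (negated values) of processing times scheduled so far
    tardy = []
    t = 0          # completion time of the current on-time schedule
    for p, d in sorted(jobs, key=lambda job: job[1]):  # earliest due date first
        heapq.heappush(on_time, -p)
        t += p
        if t > d:  # schedule just became tardy: drop the longest job so far
            longest = -heapq.heappop(on_time)
            t -= longest
            tardy.append(longest)
    return [-p for p in on_time], tardy
```

For example, on the four jobs (p, d) = (2, 3), (3, 4), (1, 5), (4, 6), the algorithm keeps the jobs with processing times 1 and 2 on time and declares the other two tardy, which is optimal for that instance.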
984

Compression Techniques for Boundary Integral Equations - Optimal Complexity Estimates

Dahmen, Wolfgang, Harbrecht, Helmut, Schneider, Reinhold 05 April 2006 (has links) (PDF)
In this paper, matrix compression techniques in the context of wavelet Galerkin schemes for boundary integral equations are developed and analyzed that exhibit optimal complexity in the following sense: the fully discrete scheme produces approximate solutions within the discretization error accuracy offered by the underlying Galerkin method, at a computational expense that is proven to stay proportional to the number of unknowns. Key issues are the second compression, which significantly reduces the near-field complexity, and an additional a posteriori compression. The latter is based on a general result concerning an optimal work balance that applies, in particular, to the quadrature used to compute the compressed stiffness matrix with sufficient accuracy in linear time. The theoretical results are illustrated by a 3D example on a nontrivial domain.
985

Randomness and completeness in computational complexity

Melkebeek, Dieter van. January 2000 (has links)
Revised text of: Ph.D. thesis, Mathematics, University of Chicago, 1999. / Bibliographical notes.
986

Which CBSC-objectives matter? : A multiple case study of corporate managers’ focus in corporate control

Barkman, Daniel, Sörensen, Nils January 2015 (has links)
This study investigates which objectives in the corporate balanced scorecard (CBSC) corporate managers in large unlisted companies focus on within corporate control, and what the explanatory factors for that focus are. The CBSC was proposed to alleviate managers' historically financial focus in control activities. This study contributes by reviewing corporate managers' focus on financial and non-financial CBSC objectives in corporate control. A multiple case study was conducted, consisting of a mutually owned and a governmentally owned company, where data was collected from semi-structured interviews, internal documents and observations. The results indicate that corporate managers from the mutually owned company primarily focused on financial and customer objectives, while corporate managers in the governmentally owned firm primarily focused on financial objectives, complemented with quality objectives. Although their influence was mixed, the perceived complexity of measures, the relationships between objectives, and capital market pressure shaped corporate managers' focus. The conclusion of this study is that financial objectives are prioritised in corporate control because of the influence of these three explanatory factors.
987

From Pond to Plate : The implementation of standards in Global Value Chains

Rein, Johanna, Swanson, Michaela January 2015 (has links)
Increased international trade has sparked a debate on the need to coordinate dispersed activities in Global Value Chains, linking production to end consumers. Implementation of standards has been suggested in the literature on Global Value Chains as a way to coordinate a value chain. We have investigated the value chain of shrimp and prawn production in Bangladesh in order to analyze whether standards imposed by the EU have proven a successful way to coordinate the value chain. The implementation of standards has been studied to capture the coordination in the value chain. A single case study was conducted with interviews from a sample of actors in direct or close connection to the production of shrimp and prawns in Bangladesh. The focus of the study has been on the perspectives of the individuals and on whether and how standards are implemented successfully in a social context. The attempt has been to bridge understandings of the implementation of standards with knowledge of the complex nature of Global Value Chains. The results show that there are multiple challenges to the successful implementation of standards. Hurdles can especially be linked to the ability to follow standards, where a lack of human and financial resources has been found. In addition, the will to follow standards can have an impact when traditional methods stand in the way and immediate financial incentives are not in place.
988

Does the Use of Personally Relevant Stimuli in Semantic Complexity Training Facilitate Improved Functional Communication Performance Compared to Non-Personally Relevant Stimulus Items among Adults with Chronic Aphasia?

Karidas, Stephanie 01 January 2013 (has links)
This study investigated the influence of semantic complexity treatment in individuals with fluent aphasia on discourse performance. Semantic treatment is an effective way to improve semantically based word retrieval problems in aphasia. Treatment focused on the semantic application of the Complexity Account of Treatment Efficacy (CATE) (Thompson, Shapiro, Kiran, & Sobecks, 2003) promotes training of complex items resulting in generalization to less complex, untrained items. In addition, research has shown that the personal relevance of treatment material can increase treatment efficacy. This study investigated the effect of semantic treatment of atypical personally relevant items among individuals with aphasia on discourse performance. Two treatment phases were applied to examine the influence of personally relevant and non-relevant treatment material on discourse performance. In addition, generalization from trained atypical items to untrained typical items was investigated. Methods and procedures were partially replicated from Kiran, Sandberg, & Sebastian (2011) examining semantic complexity within goal-derived (ad hoc) categories. Three participants with fluent aphasia were trained on three semantic tasks including category sorting, semantic feature generation/selection, and Yes/No feature questions. A generative naming task was used for probe data collection every second session. Stimuli consisted of atypical items only. The hypothesis that semantic complexity training of personally relevant items from ad hoc categories will produce greater generalization to associated, untrained items than training of non-relevant items and consequently increase discourse performance was not supported. The findings revealed a failure to replicate the magnitude and type of improvements previously reported for the typicality effect in generative naming. Clinical significance was found for personally relevant and non-relevant discourse performance. 
However, no consistent pattern was found within or across participants. In addition, the effect size for generalization from trained atypical to untrained typical items was not significant. The study's limitations suggest future directions, such as further specifying participant selection (e.g., cognitive abilities), procedural changes, and the inclusion of discourse performance as an outcome measure. Overall, the results of this study provide weak support for replicating semantic treatment of atypical exemplars in ad hoc categories, and hence demonstrate the critical role of replication across labs in identifying key issues in the candidacy, procedures, and outcome measurement of any developing treatment.
989

The Role of the Interruption in Young Adult Epistolary Novels

Herzhauser, Betty J. 01 January 2015 (has links)
Within the genre of young adult literature, a growing trend is the use of epistolary messages exchanged through electronic media between characters. These messages are set apart from the formal text of the narrative, creating a break in the text features and layout of the page. Epistolary texts require a more sophisticated reading method and level of interpretation because the epistolary style blends multiple voices and points of view into the plot, creating complicated narration. The reader must navigate the narrator's path in order to extract meaning from the text. In this hermeneutic study, I examined the text structures of three young adult novels that contained epistolary excerpts. I used ethnographic content analysis (Altheide, 1987) to isolate, analyze, and then contextualize the different epistolary moments within the narrative of each novel. The study was guided by two research questions: 1. What types of text structures and features did authors of selected young adult literature with epistolary interruptions published since 2008 use across the body of the published work? 2. How did the authors of selected young adult literature situate the different text structures of interruption into the flow of the narrative, and what happened after the interruption? I used a coding system that I developed from a case study of the novel Falling for Hamlet by Michelle Ray (2011). Through my analysis I found that the authors used specific verbs to announce an interruption. The interruptions, though few in number, require readers to consider the context of the message (event, setting, speaker, purpose, and tone) as it relates to the message itself and the arc of the plot. In addition, following an interruption, the reader must decide how to incorporate it into the narrative: as adding to the conflict, adding detail, ending a scene, or simply returning to the narrative.
Therefore, the interruptions in epistolary young adult novels incorporate the text and literacy practices of young adults. Such incorporation reflects changes in literacy practices in the early 21st century that may make novels of this style challenging for readers to construct meaning from. The study further draws on Bakhtin's theory of heteroglossia (1980), that a novel does not contain a single language but rather a plurality of languages within a single language, and Dresang's Theory of Radical Change (1999) of connectivity, interactivity, and access. Texts of this nature offer teachers of reading opportunities to guide students through text features to synthesize information in fiction and non-fiction texts.
990

The size and depth of Boolean circuits

Jang, Jing-Tang Keith 27 September 2013 (has links)
We study the relationship between size and depth for Boolean circuits. Over four decades, very few results were obtained for either special or general Boolean circuits. Spira showed in 1971 that any Boolean formula of size s can be simulated in depth O(log s). Spira's result means that an arbitrary Boolean expression can be replaced by an equivalent "balanced" expression that can be evaluated very efficiently in parallel. For general Boolean circuits, the strongest known result is that Boolean circuits of size s can be simulated in depth O(s / log s). We obtain significant improvements over the general bounds for the size versus depth problem for special classes of Boolean circuits. We show that every layered Boolean circuit of size s can be simulated by a layered Boolean circuit of depth O(√(s log s)). For planar circuits and synchronous circuits of size s, we obtain simulations of depth O(√s). Improving any of the above results by polylog factors would immediately improve the bounds for general circuits. We generalize Spira's theorem and show that any Boolean circuit of size s with segregators of size f(s) can be simulated in depth O(f(s) log s). This improves and generalizes a simulation of polynomial-size Boolean circuits of constant treewidth k in depth O(k² log n) by Jansen and Sarma. Since the existence of small balanced separators in a directed acyclic graph implies that the graph also has small segregators, our results also apply to circuits with small separators. Our results imply that the class of languages computed by non-uniform families of polynomial-size circuits that have constant-size segregators equals non-uniform NC¹. As an application of our simulation of circuits in small depth, we show that the Boolean Circuit Value problem for circuits with constant-size segregators (or separators) is in deterministic SPACE(log² n).
Our results also imply that the Planar Circuit Value problem, which is known to be P-complete, is in SPACE(√n · log n). We also show that the Layered Circuit Value and Synchronous Circuit Value problems, which are both P-complete, are in SPACE(√n). Our study of circuits with small separators and segregators led us to obtain space-efficient algorithms for computing balanced graph separators. We extend this approach to obtain space-efficient approximation algorithms for the search and optimization versions of the SUBSET SUM problem, which is one of the most studied NP-complete problems. Finally, we study the relationship between simultaneous time and space bounds on Turing machines and Boolean circuit depth. We observe a new connection between planar circuit size and simultaneous time and space products of input-oblivious Turing machines. We use this to prove quadratic lower bounds on the product of time and space for several explicit functions for input-oblivious Turing machines.
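The SUBSET SUM problem mentioned at the end of the abstract above admits a classic pseudo-polynomial dynamic program. The bitset sketch below illustrates that textbook baseline only — it is not the thesis's space-efficient approximation algorithm, which trades accuracy for sublinear space.

```python
def subset_sum(nums, target):
    """Decide SUBSET SUM by dynamic programming over reachable sums.

    Runs in O(n * target) bit operations; bit i of `reachable` is set
    iff some subset of the numbers seen so far sums to i.
    """
    reachable = 1  # only the empty sum 0 is reachable initially
    for x in nums:
        # Adding x to every previously reachable sum shifts the bitset left.
        reachable |= reachable << x
    return bool((reachable >> target) & 1)
```

With nums = [3, 34, 4, 12, 5, 2], target 9 is reachable (4 + 5) while target 30 is not: the largest sum available without 34 is 26, and any subset containing 34 already exceeds 30.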
