461

Interpretive Functions of Adjectives in English : A Cognitive Approach

Frännhag, Helena January 2010
This thesis presents a theoretical discussion of meaning creation in general, and of the interpretive functions of English adjectives in particular. The discussion rests on a dynamic view of meaning and interpretation, according to which there are no fixed linguistic meanings – not even for single lexemes. Instead of symbolising meaning in a more or less static and ‘eternal’ fashion, linguistic items are assumed to effect the creation of meaning and to shape meaning dynamically in the particular communicative event at hand, from some kind of underlying ‘raw material’ (also referred to as purport and schemas). It is suggested that the interpretive functions of linguistic items – that is, the effects that such items have in the creation of meaning – may be approached in two main ways, namely from the formal and from the semantic point of view respectively. Effects triggered by the form of a certain item are referred to as formal interpretive functions (FIFs), and effects prompted by the meaning created for the form are referred to as semantic interpretive functions (SIFs). FIFs are claimed to be the same for all items – namely to activate, delimit and shape underlying purport and schemas – whereas SIFs are said to differ between items, and also for one and the same item on different occasions of use. It is furthermore suggested that FIFs affect the creation of meaning for the relevant item itself, whereas SIFs affect the creation of meaning for other items, on any level of conceptual organisation. For instance, a form such as tall typically activates and delimits purport and schemas to do with some kind of extension (notably in space), thereby shaping a basic word meaning tall. The meaning thus created may in turn affect other meaning in the larger context. For instance, tall, as created in a default interpretation of a tall man entered the room, affects the meaning of the noun phrase a tall man as a whole, in that it specifies the interpreter’s conception of a certain something that entered a specific room. In this case, the relevant SIF is thus to specify. Other SIFs suggested for adjectives are kind identification, element identification, identity provision and stipulation. The aim of the thesis is two-fold: on the one hand, to outline a suggestive theory of meaning creation and interpretive function in general, and, on the other hand, to present a theoretical discussion of adjective functions in particular, with the ultimate goal of providing a general framework from which more specific models for in-depth empirical research can be obtained.
462

Modelling and forecasting economic time series with single hidden-layer feedforward autoregressive artificial neural networks

Rech, Gianluigi January 2001
This dissertation consists of three essays. In the first essay, A Simple Variable Selection Technique for Nonlinear Models, written in cooperation with Timo Teräsvirta and Rolf Tschernig, I propose a variable selection method based on a polynomial expansion of the unknown regression function and an appropriate model selection criterion. The hypothesis of linearity is tested by a Lagrange multiplier test based on this polynomial expansion. If rejected, a kth-order general polynomial is used as a base for estimating all submodels by ordinary least squares. The combination of regressors leading to the lowest value of the model selection criterion is selected. The second essay, Modelling and Forecasting Economic Time Series with Single Hidden-layer Feedforward Autoregressive Artificial Neural Networks, proposes a unified framework for artificial neural network modelling. Linearity is tested and the selection of regressors performed by the methodology developed in essay I. The number of hidden units is detected by a procedure based on a sequence of Lagrange multiplier (LM) tests. Serial correlation of errors and parameter constancy are checked by LM tests as well. A Monte Carlo study, the two classical series of the lynx and the sunspots, and an application to the monthly S&P 500 index return series are used to demonstrate the performance of the overall procedure. In the third essay, Forecasting with Artificial Neural Network Models (in cooperation with Marcelo Medeiros), the methodology developed in essay II, the most popular methods for artificial neural network estimation, and the linear autoregressive model are compared by forecasting performance on 30 time series from different subject areas. Early stopping, pruning, information criterion pruning, cross-validation pruning, weight decay, and Bayesian regularization are considered. The findings are that 1) the linear models very often outperform the neural network ones and 2) the modelling approach to neural networks developed in this thesis compares well with the other neural network modelling methods considered here. / Diss. Stockholm: Handelshögskolan, 2002. Spikblad (defence announcement sheet) missing.
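The variable-selection step described in the first essay – estimate every submodel by ordinary least squares over a set of candidate regressors and keep the combination that minimises a model selection criterion – can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not the author's code: it searches over the raw candidate variables rather than a full kth-order polynomial expansion, uses BIC as the criterion, and the helper names `bic` and `select_regressors` are hypothetical.

```python
import itertools
import numpy as np

def bic(residuals, n_params, n_obs):
    """Bayesian information criterion for an OLS fit with Gaussian errors."""
    rss = float(residuals @ residuals)
    return n_obs * np.log(rss / n_obs) + n_params * np.log(n_obs)

def select_regressors(y, candidates, max_vars=3):
    """Estimate every submodel (up to max_vars regressors) by OLS and return
    the combination of candidate columns with the lowest BIC."""
    n_obs, k = candidates.shape
    best_score, best_cols = np.inf, ()
    for size in range(1, max_vars + 1):
        for cols in itertools.combinations(range(k), size):
            X = np.column_stack([np.ones(n_obs), candidates[:, cols]])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            score = bic(y - X @ beta, X.shape[1], n_obs)
            if score < best_score:
                best_score, best_cols = score, cols
    return best_cols, best_score

# Toy example: y depends only on candidate variables 0 and 2.
rng = np.random.default_rng(0)
x = rng.standard_normal((200, 5))
y = 0.8 * x[:, 0] - 0.5 * x[:, 2] + 0.1 * rng.standard_normal(200)
print(select_regressors(y, x))   # expected selection: (0, 2)
```

In the essay the candidate set would consist of the terms of the polynomial expansion of the lagged variables, and the linearity hypothesis is first screened by a Lagrange multiplier test; the exhaustive OLS search over submodels is the part sketched here.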
463

Explicit or Symbolic Translation of Linear Temporal Logic to Automata

Rozier, Kristin Yvonne 24 July 2013
Formal verification techniques are growing increasingly vital for the development of safety-critical software and hardware in practice. Techniques such as requirements-based design and model checking for system verification have been successfully used to verify systems for air traffic control, airplane separation assurance, autopilots, CPU logic designs, life-support, medical equipment, and other functions that ensure human safety. Formal behavioral specifications written early in the system-design process and communicated across all design phases increase the efficiency, consistency, and quality of the system under development. We argue that to prevent introducing design or verification errors, it is crucial to test specifications for satisfiability. We advocate for the adaptation of a new sanity check via satisfiability checking for property assurance. Our focus here is on specifications expressed in Linear Temporal Logic (LTL). We demonstrate that LTL satisfiability checking reduces to model checking, and that satisfiability checking of the specification, its complement, and the conjunction of all properties should be performed as a first step in LTL model checking. We report on an experimental investigation of LTL satisfiability checking. We introduce a large set of rigorous benchmarks to enable objective evaluation of LTL-to-automaton algorithms in terms of scalability, performance, correctness, and size of the automata produced. For explicit model checking, we use the Spin model checker; we tested all LTL-to-explicit automaton translation tools that were publicly available when we conducted our study. For symbolic model checking, we use CadenceSMV, NuSMV, and SAL-SMC both for LTL-to-symbolic automaton translation and to perform the satisfiability check. Our experiments result in two major findings. First, scalability, correctness, and other debilitating performance issues afflict most LTL translation tools. Second, for LTL satisfiability checking, the symbolic approach is clearly superior to the explicit approach. Ironically, the explicit approach to LTL-to-automata had been heavily studied, while only one algorithm existed for LTL-to-symbolic automata. Since 1994, there had been essentially no new progress in encoding symbolic automata for BDD-based analysis. Therefore, we introduce a set of 30 symbolic automata encodings. The set consists of novel combinations of existing constructs, such as different LTL formula normal forms, with a novel transition-labeled symbolic automaton form, a new way to encode transitions, and new BDD variable orders based on algorithms for tree decomposition of graphs. An extensive set of experiments demonstrates that these encodings translate to significant, sometimes exponential, improvement over the current standard encoding for symbolic LTL satisfiability checking. Building upon these ideas, we return to the explicit automata domain and focus on the most common type of specifications used in industrial practice: safety properties.
We show that we can exploit the inherent determinism of safety properties to create a set of 26 explicit automata encodings comprising novel aspects including: state numbers versus state labels versus a state look-up table, finite versus infinite acceptance conditions, forward-looking versus backward-looking transition encodings, assignment-based versus BDD-based alphabet representation, state and transition minimization, edge abbreviation, trap-state elimination, and determinization either on-the-fly or up-front using the subset construction. We conduct an extensive experimental evaluation and identify an encoding that offers the best performance in explicit LTL model checking time and is consistently faster than the previous best explicit automaton encoding algorithm.
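The reduction at the heart of this argument – an LTL formula f is satisfiable exactly when its negation fails to hold on a "universal" model that allows every behaviour over the atomic propositions, so a counterexample trace is a witness for f – can be sketched in a few lines. This is an illustrative sketch only, not the thesis tooling: it assumes NuSMV is available on the PATH, that unconstrained boolean variables in an SMV model evolve fully nondeterministically, and that the checker's output contains "is false" when a counterexample exists; the helper name `ltl_satisfiable` is hypothetical.

```python
import subprocess
import tempfile

def ltl_satisfiable(formula: str, atoms: list[str]) -> bool:
    """LTL satisfiability via model checking: check !formula on a universal
    model; a counterexample to !formula is a trace satisfying formula."""
    decls = "\n".join(f"  {a} : boolean;" for a in atoms)
    model = f"MODULE main\nVAR\n{decls}\nLTLSPEC !({formula})\n"
    with tempfile.NamedTemporaryFile("w", suffix=".smv", delete=False) as f:
        f.write(model)
        path = f.name
    result = subprocess.run(["NuSMV", path], capture_output=True, text=True)
    # NuSMV reports '-- specification ... is false' when it finds a counterexample.
    return "is false" in result.stdout

# G(p -> F q) has satisfying traces; (G p) & (F !p) does not.
print(ltl_satisfiable("G (p -> F q)", ["p", "q"]))   # expected: True
print(ltl_satisfiable("(G p) & (F !p)", ["p"]))      # expected: False
```

The thesis's experiments perform essentially this check at much larger scale, with CadenceSMV, NuSMV, and SAL-SMC as the symbolic back ends.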
464

A Robust Traffic Sign Recognition System

Becer, Huseyin Caner 01 February 2011
The traffic sign detection and recognition system is an essential part of driver warning and assistance systems. In this thesis, a traffic sign recognition system is studied. We considered circular, triangular and square Turkish traffic signs. For the detection stage, we have two different approaches. In the first approach, we assume that the detected signs are available. In the second approach, the region of interest (ROI) of the traffic sign image is given, and the traffic sign is extracted from the ROI using a detection algorithm. In the recognition stage, the ring-partitioned method is implemented. In this method, the traffic sign is divided into rings and the normalized fuzzy histogram is used as an image descriptor. The histograms of these rings are compared with the reference histograms. Ring partitions provide robustness to rotation because rotation does not change the histogram of a ring. This is very critical for circular signs because rotation is hard to detect in them. To overcome the illumination problem, a specified grayscale image is used. To apply this method to triangular and square signs, the circumscribed circle of these shapes is extracted. The ring-partitioned method is tested both for the case where the detected signs are available and for the case where the region of interest of the traffic sign is given. The data sets contain about 500 static and video-captured images, all taken in daytime.
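The ring-partitioned descriptor can be sketched as follows. This is a simplified illustration under assumptions not taken from the thesis: plain (rather than fuzzy) intensity histograms, a fixed number of rings bounded by the inscribed circle of the image, and an L1 distance for matching against references; the names `ring_histograms` and `match` are hypothetical.

```python
import numpy as np

def ring_histograms(img, n_rings=4, n_bins=16):
    """Describe a grayscale sign image by one normalised intensity histogram
    per concentric ring around the image centre.  Each ring maps onto itself
    under rotation, so the descriptor is insensitive to in-plane rotation."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    yy, xx = np.mgrid[0:h, 0:w]
    radius = np.hypot(yy - cy, xx - cx)
    r_max = min(cy, cx)                        # keep only the inscribed circle
    parts = []
    for i in range(n_rings):
        mask = (radius >= i * r_max / n_rings) & (radius < (i + 1) * r_max / n_rings)
        hist, _ = np.histogram(img[mask], bins=n_bins, range=(0, 256))
        parts.append(hist / max(hist.sum(), 1))   # normalise each ring
    return np.concatenate(parts)

def match(descriptor, references):
    """Return the label of the reference with the smallest L1 distance."""
    return min(references, key=lambda label: np.abs(descriptor - references[label]).sum())

# Toy usage with random arrays standing in for sign templates and a query image.
rng = np.random.default_rng(1)
references = {"stop": ring_histograms(rng.integers(0, 256, (64, 64))),
              "yield": ring_histograms(rng.integers(0, 256, (64, 64)))}
print(match(ring_histograms(rng.integers(0, 256, (64, 64))), references))
```

For triangular and square signs, the abstract describes computing the descriptor over the circumscribed circle of the shape rather than over the raw image as done here.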
465

Issues in Specifying Requirements for Adaptive Software Systems

Peng, Qian January 2009
This thesis surveys the state of the art in software requirements specification, with a focus on autonomic, self-adapting software systems. Because new requirements arise as the environment changes, requirements models for adaptive software systems may need to change at run-time. Keep All Objectives Satisfied (KAOS) is an effective method for building goal models. Manipulations of the goal model – such as changing, removing, activating and de-activating goals, or introducing new goals – can mediate conflicts among goals in an adaptive software system. At specification time, specifications of the event sequences to be monitored are generated from the requirements specification.
466

Understanding the mechanisms of floor plate specification in the vertebrate midbrain and its functions during development

Bayly, Roy Downer, 1981- 15 October 2009
We have previously shown that the arcuate organization of cell fates within the ventral midbrain critically depends upon the morphogen Sonic Hedgehog (SHH), which is secreted from a signaling center located along the ventral midline, called the floor plate (FP). Thus, it is ultimately the specification of the FP that is responsible for the patterning and specification of ventral midbrain cell fates. Interestingly, we have found that the chick midbrain FP can be divided into medial (MFP) and lateral (LFP) regions on the basis of gene expression, mode of induction and function. Overexpression of SHH alone is sufficient to recapitulate the entire pattern of ventral cell fates, although remarkably it cannot induce MFP, consistent with the observation that the MFP is refractory to any perturbations of HH signaling. In contrast, overexpression of the winged-helix transcription factor FOXA2/HNF3β robustly induced the MFP fate throughout the ventral midbrain, while blocking its activity resulted in the absence of the MFP. Thus, by analyzing the differences between SHH and FOXA2 blockade and overexpression, we were able to attribute functions to each of the LFP and the MFP. Notably, we observed that FOXA2 overexpression caused a bending of the midbrain neuroepithelium that resembled the endogenous median hinge-point observed during neurulation. Additionally, FOXA2 misexpression led to a robust induction of DA progenitors and neurons that was never observed after SHH expression alone. In contrast, we found that all other ventral cell types required HH signaling directly, at a distance and early on in the development of the midbrain when its tissue size is relatively small. Additionally, HH blockade resulted in increased cell scatter of the arcuate territories and in the disruption of the regional boundaries between the ventral midbrain and adjacent tissue. Thus, we bring new insight into the mechanism by which the midbrain FP is specified and ascribe functional roles to its subregions. We propose that while the MFP regulates the production of dopaminergic progenitors and the changes in cell shape required for bending and shaping the neural tube, the LFP appears to be largely responsible for cell survival and the formation of a spatially coherent pattern of midbrain cell fates.
467

Reikalavimų informacinei sistemai specifikavimo veiklos taisyklių pagrindu rezultatų saugykla / Repository for the results of business rules based IS requirements specification

Vyzas, Donatas 16 January 2006
The purpose of this project was to create a requirements repository using a methodology which is being developed in the Department of Information Systems of Kaunas University of Technology. The implementation would raise the methodology to a higher level, enabling its use in practice. Using the implementation, one could either prove or disprove the main idea behind the methodology – the simplification of the stakeholder–analyst dialogue during IS requirements specification. For that purpose, an experiment was conducted. The purpose of the experiment was to specify requirements for a real-world IS using the implemented requirements repository and to compare the results with those of other similar software.
468

Plynose kirtavietėse paliekamų biologinės įvairovės medžių būklės ir išlikimo analizė / Survival and status analysis of biodiversity trees left in clear-cuts

Morkūnas, Vilius 14 January 2009
The master's thesis examines the condition and survival of trees left for biological diversity in clear-cut felling areas. Object of the work – clear-cut felling sites and the biodiversity trees left in them. Aim of the work – to determine the condition, species composition, quantity, distribution and survival of the biodiversity trees left in clear-cuts. Methods of work – the study considered two different aspects: first, the resistance of individual trees to environmental impacts depending on their individual properties; second, the share of dead trees in a clear-cut depending on the parameters of the clear-cut and the distribution of the trees within it. Results of the work – on average 11 trees per hectare are left in clear-cuts, of which 14 per cent die within the first six years. As the size of the clear-cut increases, the overall survival rate decreases. Trees left in groups survive better than trees left singly. Wind has a major effect on survival: two thirds of the dead trees were uprooted or broken by wind. Wind affects most strongly trees with a shallow root system and trees with a smaller diameter-to-height ratio. Survival decreases with increasing distance from the edge of the clear-cut, owing to greater wind exposure. The survival of individual species depends on how well the site suits the species and on the individual parameters of the retained trees. When selecting trees to be left for biological diversity, greater... [to full text]
469

Verslo sistemų modelio analizė, panaudojant agregatinę schemą ir loginį programavimą / Analysis of business systems REA model using aggregate schema and logic programming

Janušauskaitė, Živilė 06 June 2006
This work presents a business process analysis methodology in which business processes created on the basis of the Resource Event Agent (REA) model are represented by means of the Piece-Linear Aggregate (PLA) approach. The aggregate specification is analysed using first-order predicate logic, and its correctness is checked by the resolution method in the logic-programming language Prolog. The work concludes with a concrete example of the analysis of an REA-based business process using the aggregate approach. The novelty of this work is that the PLA model and the software tools created on its basis are used for the first time to analyse business processes defined using the REA formalism. The use of such integrated models allows automated analysis of general and individual properties (completeness, deadlock freeness, termination or cyclic behavior, boundedness) of the defined business processes. The main results achieved are: • a methodology that represents business processes created on the basis of the REA model by means of the Piece-Linear Aggregate approach; • verification and validation of general and individual properties using a system, designed with the PLA and Prolog approaches, that performs the analysis of the aggregate specification; • implementation of internal accounting controls as constraints in relational algebra, SQL and Prolog; • concrete... [to full text]
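The kind of automated property check mentioned above – for instance, deadlock freeness of the specified process – can be illustrated by a small state-space exploration. This is a conceptual sketch only, not the PLA tooling or the Prolog resolution procedure used in the work; the `reachable_deadlocks` helper and the toy order/invoice/payment process are hypothetical.

```python
from collections import deque

def reachable_deadlocks(initial, transitions, is_final=lambda s: False):
    """Breadth-first exploration of a finite transition system.  Returns the
    reachable non-final states that have no enabled transition (deadlocks)."""
    seen, deadlocks, queue = {initial}, [], deque([initial])
    while queue:
        state = queue.popleft()
        successors = list(transitions(state))
        if not successors and not is_final(state):
            deadlocks.append(state)
        for nxt in successors:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return deadlocks

# Toy exchange process: state = (ordered, invoiced, paid, shipped).
# Shipping before invoicing leaves a state with no further enabled action.
def step(state):
    ordered, invoiced, paid, shipped = state
    if ordered and not invoiced and not shipped:
        yield (ordered, True, paid, shipped)       # issue invoice
    if invoiced and not paid:
        yield (ordered, invoiced, True, shipped)   # receive payment
    if ordered and not shipped:
        yield (ordered, invoiced, paid, True)      # ship goods

print(reachable_deadlocks((True, False, False, False), step, is_final=all))
# expected: [(True, False, False, True)] – shipped but never invoiced or paid
```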
470

Valdymo taisyklėmis ribojamų komponentų sąsajų specifikavimo metodika / Interface specification technique capturing control flow rules

Balandytė, Milda 31 May 2004
This work presents the development of a Human Resource Management System based on an analysis of modern development trends as well as of the key functions and data structures of human resource management systems in organizations. The introduced general-purpose model meets the requirements of human resource management systems and fits organizations of any size and structure. The system mentioned above was designed using the ICONIX, Martin-Odell, RUP, XP and UMM methodologies, tested using black-box and white-box techniques, and implemented by means of the Lotus Notes/Domino SDK. The created software was installed for a trial period at the joint-stock company 'PTI Technologijos'. The conceptual part of the thesis presents a component interface specification technique capturing control flow rules. It describes a clear process for moving from business requirements to a system specification and for identifying the system behavior rules that condition the interfaces of the system. The proposed model facilitates dealing with change and substitutability of business rules, which can be achieved only if the system is properly specified. The interface specification technique was used in practice to design the interface between the human resource management system and an accountancy system.
