521

Integration of Lantmäteriet's system for property valuation

Högberg, Joakim, Sandh, Pontus January 2014 (has links)
Lantmäteriet currently uses a tool for property valuation; this tool consists of a system with a distributed architecture for storing property data and geographic information. The study aims to investigate which alternative techniques and solutions exist for integrating spatial data and property data in the same database. To answer this, literature is analyzed together with responses to questionnaires answered by end users and results from in-depth interviews with system experts. The results show that technologies such as ArcSDE and Oracle Spatial can solve the stated problem. The interviews and questionnaires show that the majority of end users are satisfied with the current system, but there are recurring comments that the map support does not work flawlessly. Finally, a simple model for a proposed solution is recommended in which Oracle Spatial is used with the current system, along with the null alternative in which no further development takes place.
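As a rough, hypothetical sketch of what co-locating property data and spatial data in one database can look like, the following Python snippet stores made-up property records with coordinates in a single SQLite table and filters them by a bounding box. The table layout, designations and values are invented for illustration; a real deployment, e.g. with Oracle Spatial or ArcSDE as discussed in the thesis, would use native geometry types and spatial indexes rather than plain coordinate columns.

```python
import sqlite3

# Toy schema: property attributes and coordinates side by side in one table.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE property (
        id INTEGER PRIMARY KEY,
        designation TEXT,        -- property designation (made up)
        assessed_value INTEGER,  -- assessed value in SEK (made up)
        x REAL, y REAL           -- coordinates; a real system would use a geometry type
    )
""")
conn.executemany(
    "INSERT INTO property (designation, assessed_value, x, y) VALUES (?, ?, ?, ?)",
    [("Gävle Sörby 12:3", 2_450_000, 616500.0, 6728400.0),
     ("Gävle Brynäs 45:1", 1_980_000, 617200.0, 6729100.0)],
)

def properties_in_bbox(conn, xmin, ymin, xmax, ymax):
    """Return property rows whose coordinates fall inside the bounding box."""
    return conn.execute(
        "SELECT designation, assessed_value FROM property "
        "WHERE x BETWEEN ? AND ? AND y BETWEEN ? AND ?",
        (xmin, xmax, ymin, ymax),
    ).fetchall()

print(properties_in_bbox(conn, 616000, 6728000, 617000, 6729000))
```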
522

Implementation and evaluation of Mimer SQL Real-Time on the INtime RTOS.

Watson, Daniel January 2014 (has links)
Virtualization is a growing trend in which, instead of having a separate hardware platform for each system, several virtual systems execute on one and the same machine. These systems can have different properties and requirements; for example, a subsystem with hard real-time requirements may interact with another subsystem without real-time requirements. One example is an industrial control system in which one subsystem collects time-critical system data in real time while the other subsystem performs non-time-critical actions based on this information, such as graphically visualizing the system status. Splitting a system across different virtual subsystems places high demands on data sharing. One way to share data in a safe and structured manner is to use a real-time database. In this thesis project we have investigated the possibility of adapting the commercial real-time database Mimer SQL Real-Time Edition for use in the virtualizable real-time system INtime. The work covers investigations of different models for shared memory and for synchronization between the systems. The results of these investigations have been incorporated into a version of Mimer SQL Real-Time Edition adapted for virtualization on INtime. This implementation has been followed up with functional and performance tests, which show that the implementation performs well both functionally and in terms of performance, with predictability preserved compared to earlier versions of Mimer SQL Real-Time Edition.
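The Python sketch below only illustrates the general idea of exchanging data through shared memory between two processes, using the standard multiprocessing.shared_memory module; it is not the shared-memory or synchronization mechanism actually used between INtime partitions and Mimer SQL Real-Time Edition, and the value written is a made-up sensor reading.

```python
from multiprocessing import Process, shared_memory
import struct

def producer(name):
    # Attach to the shared block by name and publish one double (a fake sensor reading).
    shm = shared_memory.SharedMemory(name=name)
    shm.buf[:8] = struct.pack("d", 42.5)
    shm.close()

def consumer(name):
    # Attach to the same block and read the value back.
    shm = shared_memory.SharedMemory(name=name)
    value, = struct.unpack("d", bytes(shm.buf[:8]))
    print("read from shared memory:", value)
    shm.close()

if __name__ == "__main__":
    shm = shared_memory.SharedMemory(create=True, size=8)
    p = Process(target=producer, args=(shm.name,))
    c = Process(target=consumer, args=(shm.name,))
    p.start(); p.join()   # write first, then read, to keep the example deterministic
    c.start(); c.join()
    shm.close(); shm.unlink()
```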
523

Quality of Service Analysis for CloudMAC

Vestin, Jonathan January 2014 (has links)
Software Defined Networking (SDN) is a relatively recent technology which has seen a rise in popularity. OpenFlow is a very popular SDN technology, but it has no support for Quality of Service options. Open vSwitch, a common software switch that supports OpenFlow, is used in CloudMAC, another recent technology in wireless networking. Testing of CloudMAC has shown that it has a low connection success rate. This thesis investigated the cause of the connection failures, and a solution using traffic control options in Open vSwitch was developed. Furthermore, additional traffic control options were added to Open vSwitch in order to evaluate the effectiveness of different traffic control algorithms. While the connection success rate in a standard network without cross-traffic was 99%, with cross-traffic the success rate dropped to 43%. With the introduction of traffic control, the success rate reached 99% again.
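As a generic illustration of the kind of rate limiting that traffic control provides (the thesis configures this inside Open vSwitch rather than in application code), here is a minimal token-bucket sketch in Python; the rate and burst parameters are arbitrary and the class is not part of any Open vSwitch API.

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: the general mechanism behind many
    traffic-control disciplines. Illustrative only."""

    def __init__(self, rate_bps, burst_bytes):
        self.rate = rate_bps          # refill rate in bytes per second
        self.capacity = burst_bytes   # maximum burst size in bytes
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def allow(self, packet_bytes):
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bytes:
            self.tokens -= packet_bytes
            return True        # forward the packet
        return False           # drop or queue the packet

bucket = TokenBucket(rate_bps=125_000, burst_bytes=10_000)  # roughly 1 Mbit/s
print([bucket.allow(1500) for _ in range(10)])  # later packets are throttled
```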
524

Methods for Detecting Unsolvable Planning Instances using Variable Projection

Ståhlberg, Simon January 2017 (has links)
In this thesis we study automated planning, a branch of artificial intelligence, which deals with the construction of plans. A plan is typically an action sequence that achieves some specific goal. In particular, we study unsolvable planning instances, i.e. instances for which no plan exists. Historically, this topic has been neglected by the planning community, and until recently the International Planning Competition only evaluated planners on solvable planning instances. For many applications we can know, e.g. by design, that there is a solution, but this cannot be a general assumption. One example is penetration testing in computer security, where a system is considered safe if there is no plan for intrusion. Other examples are resource-bound planning instances that have insufficient resources to achieve the goal. The main theme of this thesis is to use variable projection to prove unsolvability of planning instances. We implement and evaluate two planners: the first checks variable projections with the goal of finding an unsolvable projection, and the second builds a pattern collection to provide dead-end detection. In addition to comparing the planners to existing planners, we also utilise a large computer cluster to statistically assess whether they can be optimised further. On the benchmarks of planning instances that we used, it turns out that further improvement is likely to come from supplementary techniques rather than optimisation. We pursued this and enhanced variable projections with mutexes, which yielded a very competitive planner. We also inspect whether unsolvable variable projections tend to be composed of variables that play different roles, i.e. variables that are not 'similar'. We devise a variable similarity measure to rate how similar two variables are on a scale, and statistically analyse it. The measure can differentiate between unsolvable and solvable planning instances quite well, and is integrated into our planners. We also define a binary version of the measure, namely, that two variables are isomorphic if they behave exactly the same in some optimal solution (extremely similar). With the help of isomorphic variables we identified a computationally tractable class of planning instances that meet certain restrictions. There are several special cases of this class that are of practical interest, and this result encompasses them.
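A minimal sketch of the core idea behind projection-based unsolvability detection follows, assuming a toy SAS+-style task encoding invented for illustration: project the task onto a subset of its variables and do a breadth-first reachability check in the projection. Because the projection is an over-approximation, an unreachable goal in the projection proves that the original instance has no plan. The planners described in the thesis are, of course, far more sophisticated than this.

```python
from collections import deque

def project(task, subset):
    """Restrict a toy SAS+-style task to the variables in `subset`."""
    def restrict(partial_state):
        return frozenset((v, d) for v, d in partial_state if v in subset)
    ops = [(restrict(pre), restrict(eff)) for pre, eff in task["operators"]]
    return restrict(task["init"]), restrict(task["goal"]), ops

def projection_solvable(init, goal, ops):
    """BFS over projected states. If the goal is unreachable here, the original
    planning instance has no plan either (the soundness argument behind using
    projections for unsolvability detection)."""
    seen, queue = {init}, deque([init])
    while queue:
        state = queue.popleft()
        if goal <= state:
            return True
        for pre, eff in ops:
            if pre <= state:
                changed = {x for x, _ in eff}
                nxt = frozenset({(v, d) for v, d in state if v not in changed} | eff)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return False

# Hypothetical task: the robot can never reach the room because no operator opens the door.
task = {
    "init": frozenset({("door", "locked"), ("robot", "hall"), ("battery", "full")}),
    "goal": frozenset({("robot", "room")}),
    "operators": [
        (frozenset({("door", "open"), ("robot", "hall"), ("battery", "full")}),
         frozenset({("robot", "room"), ("battery", "low")})),
    ],
}
init, goal, ops = project(task, {"robot", "door"})   # project the battery variable away
print(projection_solvable(init, goal, ops))  # False -> the original instance has no plan
```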
525

A Constraint-Based Approach for Hybrid Reasoning in Robotics

Mansouri, Masoumeh January 2016 (has links)
The quest of AI and Robotics researchers to realize fully AI-driven integrated robotic systems has not yet led to such realizations, in spite of great attainments in both research areas. This thesis claims that one of the major hindrances to these realizations is the lack of attention to what we call “the hybrid reasoning problem”. This is the problem of jointly reasoning about heterogeneous and inter-dependent aspects of the world, expressed in different forms and at different levels of abstraction. In this thesis, we propose an approach to hybrid reasoning (or integrated reasoning) for robot applications. Our approach constitutes a systematic way of achieving a domain-specific integration of reasoning capabilities. Its underpinning is to jointly reason about the sub-problems of an overall hybrid problem in the combined search space of mutual decisions. Each sub-problem represents one viewpoint, or type of requirement, that is meaningful in the particular application. We propose a Constraint Satisfaction Problem (CSP) formulation of the hybrid reasoning problem. This CSP, called meta-CSP, captures the dependencies between sub-problems. It constitutes a high-level representation of the (hybrid) requirements that define a particular application. We formalize the meta-CSP in a way that is independent of the viewpoints that are relevant in the application, as is the algorithm used for solving the meta-CSP. In order to verify the applicability of the meta-CSP approach in real-world robot applications, we instantiate it in several different domains, namely, a waiter robot, an automated industrial fleet management application, and a drill pattern planning problem in open-pit mining. These realizations highlight the important features of the approach, namely, modularity, generality, online reasoning and solution adjustment, and the ability to account for domain-specific metric and symbolic knowledge.
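A very small, hypothetical illustration of the meta-CSP idea is sketched below in Python: decisions from two viewpoints (a temporal ordering and a resource assignment) are searched jointly under a coupling constraint. The waiter-robot decisions and the constraint are invented, and exhaustive enumeration stands in for the constraint-based solver described in the thesis.

```python
from itertools import product

# Each meta-variable is a decision from one viewpoint of the overall hybrid problem.
meta_variables = {
    "order(serve_A, serve_B)": ["A_before_B", "B_before_A"],   # temporal viewpoint
    "robot_for_serve_A": ["robot1", "robot2"],                  # resource viewpoint
    "robot_for_serve_B": ["robot1", "robot2"],
}

def consistent(a):
    # Hypothetical coupling constraint between viewpoints: if the same robot
    # serves both tables, serve_A must be scheduled before serve_B.
    if a["robot_for_serve_A"] == a["robot_for_serve_B"]:
        return a["order(serve_A, serve_B)"] == "A_before_B"
    return True

def solve(meta_vars):
    """Brute-force search of the combined decision space of all viewpoints."""
    names = list(meta_vars)
    for values in product(*(meta_vars[n] for n in names)):
        assignment = dict(zip(names, values))
        if consistent(assignment):
            return assignment
    return None

print(solve(meta_variables))
```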
526

Algorithms for aggregate information extraction from sequences

Bengtsson, Fredrik January 2007 (has links)
In this thesis, we propose efficient algorithms for aggregate information extraction from sequences and multidimensional arrays. The algorithms proposed are applicable in several important areas, including large databases and DNA sequence segmentation. We first study the problem of efficiently computing, for a given range, the range-sum in a multidimensional array as well as computing the k maximum values, called the top-k values. We design two efficient data structures for these problems. For the range-sum problem, our structure supports fast updates while preserving low complexity for range-sum queries. The proposed top-k structure provides fast query computation in time linear in the sum of the side sizes of the two-dimensional query region. We also study the k maximum sum subsequences problem and develop several efficient algorithms. In this problem, the k subsegments of consecutive elements with the largest sums are to be found. The segments can potentially overlap, which allows for a large number of possible candidate segments. Moreover, we design an optimal algorithm for ranking the k maximum sum subsequences. Our solution does not require the value of k to be known a priori. Furthermore, an optimal linear-time algorithm is developed for the maximum cover problem of finding k subsequences of consecutive elements of maximum total element sum.
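For the one-dimensional case, a standard Fenwick tree (binary indexed tree) illustrates the update-versus-query trade-off mentioned above: O(log n) point updates and O(log n) range-sum queries. The sketch below is only a textbook baseline, not the multidimensional structures developed in the thesis.

```python
class FenwickTree:
    """Standard binary indexed tree supporting point updates and range sums."""

    def __init__(self, n):
        self.n = n
        self.tree = [0] * (n + 1)

    def update(self, i, delta):
        """Add `delta` to element i (0-indexed)."""
        i += 1
        while i <= self.n:
            self.tree[i] += delta
            i += i & -i

    def prefix_sum(self, i):
        """Sum of elements [0, i]."""
        i += 1
        s = 0
        while i > 0:
            s += self.tree[i]
            i -= i & -i
        return s

    def range_sum(self, lo, hi):
        """Sum of elements [lo, hi]."""
        return self.prefix_sum(hi) - (self.prefix_sum(lo - 1) if lo > 0 else 0)

ft = FenwickTree(8)
for idx, val in enumerate([3, -1, 4, 1, -5, 9, 2, 6]):
    ft.update(idx, val)
print(ft.range_sum(2, 5))  # 4 + 1 - 5 + 9 = 9
```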
527

On large-scale neural simulations and applications in neuroinformatics

Benjaminsson, Simon January 2013 (has links)
This thesis consists of three parts related to the in silico study of the brain: technologies for large-scale neural simulations, neural algorithms and models and applications in large-scale data analysis in neuroinformatics. All parts rely on the use of supercomputers. A large-scale neural simulator is developed where techniques are explored for the simulation, analysis and visualization of neural systems on a high biological abstraction level. The performance of the simulator is investigated on some of the largest supercomputers available. Neural algorithms and models on a high biological abstraction level are presented and simulated. Firstly, an algorithm for structural plasticity is suggested which can set up connectivity and response properties of neural units from the statistics of the incoming sensory data. This can be used to construct biologically inspired hierarchical sensory pathways. Secondly, a model of the mammalian olfactory system is presented where we suggest a mechanism for mixture segmentation based on adaptation in the olfactory cortex. Thirdly, a hierarchical model is presented which uses top-down activity to shape sensory representations and which can encode temporal history in the spatial representations of populations. Brain-inspired algorithms and methods are applied to two neuroinformatics applications involving large-scale data analysis. In the first application, we present a way to extract resting-state networks from functional magnetic resonance imaging (fMRI) resting-state data where the final extraction step is computationally inexpensive, allowing for rapid exploration of the statistics in large datasets and their visualization on different spatial scales. In the second application, a method to estimate the radioactivity level in arterial plasma from segmented blood vessels from positron emission tomography (PET) images is presented. The method outperforms previously reported methods to a degree where it can partly remove the need for invasive arterial cannulation and continuous sampling of arterial blood during PET imaging. In conclusion, this thesis provides insights into technologies for the simulation of large-scale neural models on supercomputers, their use to study mechanisms for the formation of neural representations and functions in hierarchical sensory pathways using models on a high biological abstraction level and the use of large-scale, fine-grained data analysis in neuroinformatics applications.
528

System-Level Analysis and Design under Uncertainty

Ukhov, Ivan January 2017 (has links)
One major problem for the designer of electronic systems is the presence of uncertainty, which is due to phenomena such as process and workload variation. Very often, uncertainty is inherent and inevitable. If ignored, it can lead to degradation of the quality of service in the best case and to severe faults or burnt silicon in the worst case. Thus, it is crucial to analyze uncertainty and to mitigate its damaging consequences by designing electronic systems in such a way that uncertainty is effectively and efficiently taken into account. We begin by considering techniques for deterministic system-level analysis and design of certain aspects of electronic systems. These techniques do not take uncertainty into account, but they serve as a solid foundation for those that do. Our attention revolves primarily around power and temperature, as they are of central importance for attaining robustness and energy efficiency. We develop a novel approach to dynamic steady-state temperature analysis of electronic systems and apply it in the context of reliability optimization. We then proceed to develop techniques that address uncertainty. The first technique is designed to quantify the variability in process parameters, which is induced by process variation, across silicon wafers based on indirect and potentially incomplete and noisy measurements. The second technique is designed to study diverse system-level characteristics with respect to the variability originating from process variation. In particular, it allows for analyzing transient temperature profiles as well as dynamic steady-state temperature profiles of electronic systems. This is illustrated by considering a problem of design-space exploration with probabilistic constraints related to reliability. The third technique that we develop is designed to efficiently tackle the case of sources of uncertainty that are less regular than process variation, such as workload variation. This technique is exemplified by analyzing the effect that workload units with uncertain processing times have on the timing-, power-, and temperature-related characteristics of the system under consideration. We also address the issue of runtime management of electronic systems that are subject to uncertainty. In this context, we perform an early investigation into the utility of advanced prediction techniques for the purpose of fine-grained long-range forecasting of resource usage in large computer systems. All the proposed techniques are assessed by extensive experimental evaluations, which demonstrate the superior performance of our approaches to analysis and design of electronic systems compared to existing techniques.
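As a toy illustration of how process-induced parameter variability propagates to a temperature metric, the Python sketch below runs a plain Monte Carlo simulation over a made-up leakage and thermal model; all numbers are invented, and the thesis develops considerably more efficient analysis techniques than sampling of this kind.

```python
import math
import random
import statistics

T_AMB = 45.0      # ambient temperature, deg C (made up)
R_TH = 2.0        # thermal resistance, K/W (made up)
P_DYNAMIC = 10.0  # nominal dynamic power, W (made up)

def steady_state_temperature(delta_vth):
    """Toy model: leakage grows as the process parameter deviates downward."""
    leakage = 2.0 * math.exp(-8.0 * delta_vth)  # W, invented leakage model
    return T_AMB + R_TH * (P_DYNAMIC + leakage)

# Sample the process parameter deviation and propagate it to temperature.
samples = [steady_state_temperature(random.gauss(0.0, 0.03)) for _ in range(100_000)]
print("mean temperature: %.1f C" % statistics.mean(samples))
print("99th percentile:  %.1f C" % statistics.quantiles(samples, n=100)[98])
```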
529

The Euclidean traveling salesman problem with neighborhoods and a connecting fence

Jonsson, Håkan January 2000 (has links)
An important class of problems in robotics deals with the planning of paths. In this thesis, we study this class of problems from an algorithmic point of view by considering cases where we have complete knowledge of the environment and each solution must ensure that a point-sized robot capable of moving continuously and turning arbitrarily accomplishes the following: (1) visits a given set of objects attached to an impenetrable simple polygon in the plane, and (2) travels along a path of minimum length over all the possible paths that visit the objects without crossing the polygon. In its general form, this is The (Euclidean) Traveling Salesman Problem with Neighborhoods and a Connecting Fence. We make several contributions. One is an algorithm that computes a shortest watchman path in a rectilinear polygon in time polynomial in the size of the polygon. Each point in the polygon is visible from some point along the computed path, which is a shortest visiting path for a set of convex polygons, each of which is bounded by a chord in the interior of the polygon. For the special case of computing a shortest watchman route, where the end points of the resulting path must coincide, we give a polynomial-time algorithm for general simple polygons. We also give substantially faster and more practical algorithms for computing provably short approximations, that is, watchman paths/routes with lengths guaranteed to be at most a constant times longer than the length of a shortest watchman path/route. To achieve one of these approximations, we develop a linear-time algorithm for computing a constant factor approximation in the case where the convex polygons are impenetrable. For this problem, which is called the Zookeeper's Problem, we show how an exact solution can be computed in linear time when the number of convex polygons is constant. We also present an application of our results to the computation of both exact and approximate solutions to the problem of computing a shortest visiting route for a set of lines in the plane.
530

Time- and size-efficient supercompilation

Jonsson, Peter A. January 2011 (has links)
Intermediate structures such as lists and higher-order functions are very common in most styles of functional programming. While allowing the programmer to write clear and concise programs, the creation and destruction of these structures impose a run-time overhead which is not negligible. Supercompilation algorithms are a family of program transformations that remove these intermediate structures in an automated fashion, thereby improving program performance. While there has been plenty of work on supercompilation algorithms that remove intermediate structures for languages with call-by-name semantics, no investigations have been performed for call-by-value languages. It has been suggested that existing call-by-name algorithms could be applied to call-by-value programs, possibly introducing termination in the program. This hides looping bugs from the programmer, and changes the behaviour of a program depending on whether it is optimized or not. We present positive supercompilation algorithms for higher-order call-by-value and call-by-name languages that preserve the termination properties of the programs they optimize. We prove the call-by-value algorithm correct and compare it to existing call-by-name transformations. Our results show that deforestation-like transformations are both possible and useful for call-by-value languages, with speedups up to an order of magnitude for certain benchmarks. We also suggest speculatively supercompiling expressions and discarding the result if it turns out badly. To test this approach we implemented the call-by-name algorithm in GHC and performed measurements on the standard nofib benchmark suite. We manage to supercompile large parts of the imaginary and spectral parts of nofib in a matter of seconds while keeping the binary size increase below 5%. Our algorithms are particularly important in the context of embedded systems where resources are scarce. By both removing intermediate structures and performing program specialization, the footprint of programs can shrink considerably without any manual intervention by the programmer.
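To make the notion of intermediate structures concrete, the Python sketch below shows by hand the before-and-after shape of the transformation that deforestation-style supercompilation performs automatically for functional languages: the first version builds an intermediate list of squares, while the fused version never materializes it. This is only an illustration in a different language, not the thesis's algorithm.

```python
def squares_then_sum(xs):
    # Builds an intermediate list of squares before summing it.
    squares = [x * x for x in xs]
    return sum(squares)

def squares_then_sum_fused(xs):
    # Fused version: the intermediate list never exists, only a running total.
    total = 0
    for x in xs:
        total += x * x
    return total

data = list(range(100_000))
assert squares_then_sum(data) == squares_then_sum_fused(data)
print(squares_then_sum_fused(data))
```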
