41

Site to Cloud lösning i Microsoft Azure / Site to Cloud solution in Microsoft Azure

Birgersson, Christoffer January 2017
In today's society we are becoming more and more dependent on the technology we develop, so it is more important than ever that we always have access to the information we need for our daily tasks. This puts greater pressure on the systems being developed: they should work flawlessly, response times should be fast enough for everything to flow as smoothly as possible, and if a critical system crash does occur, the information should not be completely lost.
It has therefore become popular to move from personal servers to running more and more applications and information in the so-called "cloud", and it is increasingly important to keep track of some of the major cloud service providers and how they can be used to improve a business. This report describes work carried out at TeamNorr's request to compare Microsoft Azure, Google Cloud Platform and Amazon Web Services. It also reviews the response times to some of the world's data centers, measured from TeamNorr's headquarters in Umeå. The report further describes how to set up your own environment to test Microsoft Azure, Google Cloud Platform and Amazon Web Services, and how to build a so-called "Site to Cloud" solution in Microsoft Azure with a Virtual Private Network tunnel in order to create an MSSQL backup in the cloud. Finally, the pros and cons of the work are discussed, along with improvements that could be relevant in the future.
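The response-time measurements described above can be approximated with a simple TCP handshake timer. A minimal sketch, assuming HTTPS endpoints; the hostnames listed are illustrative stand-ins, since the thesis measured specific Azure, GCP and AWS region endpoints from Umeå:

```python
import socket
import time

def tcp_connect_latency(host: str, port: int = 443, timeout: float = 5.0) -> float:
    """Measure one TCP handshake to host:port and return the time in milliseconds."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection established; close immediately
    return (time.perf_counter() - start) * 1000.0

# Hypothetical probe list; real region endpoints would be substituted here.
ENDPOINTS = ["azure.microsoft.com", "cloud.google.com", "aws.amazon.com"]
```

A TCP connect time is only a rough proxy for application-level response time, but it is enough to rank regions relative to a fixed vantage point.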
42

Využití nástroje Informatica v datovém skladu a optimalizace ETL procesů / Usage of Informatica tools in a datawarehouse and optimalization of ETL processes

Adámek, Karel January 2009
This thesis deals with ETL development within a Business Intelligence solution. It analyses a broad portfolio of applications that support ETL development and surveys their basic characteristics. Two selected tools are analysed in detail, and their usefulness is demonstrated by developing real transformations. The advantages and possible risks of migrating to a newer edition of the PowerCenter product from Informatica are also analysed. Part of the thesis is devoted to optimization procedures that were put into the live operation of a data warehouse, demonstrating their benefit in terms of reduced computing time.
The aim of the thesis is to give a detailed view of ETL tools, the creation of ETL transformations and, not least, ways of optimizing ETL transformations. It addresses the selection of a suitable ETL tool not only from the perspective of the ETL developer, through a detailed comparison of the two tools in use, but also from the perspective of management, through a general survey of the basic characteristics of a wider portfolio of ETL tools. The detailed analysis of Informatica PowerCenter answers questions such as: "What advantages does upgrading Informatica PowerCenter from version 7 to version 8 bring?" and "What problems may the upgrade cause, and how can they be solved?" The section devoted to optimization presents the basic optimization procedures inside a data warehouse and compares their contributions. In parallel with the analytical part, the practical part of the thesis was carried out, consisting of the development of ETL transformations in real use and the optimization of selected ETL transformations of a data warehouse.
The thesis first concentrates on analytical work: researching the leading ETL products and gathering detailed knowledge of the two tools, which is then applied in creating ETL transformations. The second analytical phase concentrates on optimization procedures; the knowledge acquired there is subsequently used in practice and followed by a final analysis of the procedures and their benefits. The thesis also briefly outlines future developments in the field of ETL. Its main contribution is a comprehensive survey of ETL tools together with a detailed analysis of the pros and cons of the two selected products. The thesis answers the questions posed at the outset concerning the migration to a newer version of PowerCenter, and the analysis of optimization procedures enabled the optimization of 29 ETL transformations, whose high effectiveness was also demonstrated.
43

Analýza možností a omezení BI nástroje Report Studio / Analysis of possibilities and limitations of BI tool Report Studio

Čapková, Lenka January 2013
This master thesis focuses on the possibilities and limitations of Report Studio, a tool for creating reports that is part of IBM Cognos BI. The aim of the thesis is to analyze the functions of this tool and make it easier for other users to work with. The theoretical section introduces Business Intelligence as well as the IBM Cognos BI product itself, divided into its individual components, and concludes by introducing and comparing similar competing products available on the market. The practical part consists of an analysis of Report Studio's functions against applied criteria. The strengths and weaknesses of Report Studio are identified, along with proposed procedures for working efficiently with the tool. This section also lists non-standard examples collected from client experience, followed by exact solution procedures using Report Studio's default functions and the scripting language JavaScript.
44

Optimalizace strukturovaných dotazů nad rozsáhlými databázemi / Optimization of Structured Queries on Large Databases

Janeček, Jiří January 2012
This master's thesis deals with the optimization of structured queries on large databases. The principles of these optimizations are applied in an application that searches one specific large database. The thesis also compares the efficiency of the newly designed SQL constructions with the original, unoptimized SQL constructions.
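The kind of rewrite such a comparison typically covers can be illustrated with one classic optimization: replacing an IN subquery with a correlated EXISTS. A minimal sqlite3 sketch; the table names and data are invented for illustration and are not from the thesis:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, active INTEGER);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER);
    INSERT INTO customers VALUES (1, 1), (2, 0);
    INSERT INTO orders VALUES (10, 1), (11, 2), (12, 1);
""")

# Original construction: membership test with an IN subquery.
naive_sql = """
    SELECT id FROM orders
    WHERE customer_id IN (SELECT id FROM customers WHERE active = 1)
"""

# Rewritten construction: correlated EXISTS, which lets the planner
# stop probing as soon as one matching customer row is found.
rewritten_sql = """
    SELECT o.id FROM orders o
    WHERE EXISTS (SELECT 1 FROM customers c
                  WHERE c.id = o.customer_id AND c.active = 1)
"""

naive_rows = conn.execute(naive_sql).fetchall()
rewritten_rows = conn.execute(rewritten_sql).fetchall()
```

Both constructions return the same rows; whether the rewrite actually wins depends on the planner and the data distribution, which is exactly what an efficiency comparison has to measure.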
45

Verzování databází / Databases Versioning

Jindra, Petr January 2013
This diploma thesis is about the versioning of databases. The first chapters briefly describe the biggest problems that need to be solved. The second part covers the chosen database systems and gives an introduction to the SQL language. The last part describes the development of a program that solves the outlined problems.
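A common baseline for database versioning, which a program like the one described would have to implement in some form, is a schema-version table plus an ordered list of migrations. A minimal sqlite3 sketch under that assumption; the table names and migration contents are illustrative:

```python
import sqlite3

# Ordered list of (version, SQL) migrations; contents are illustrative.
MIGRATIONS = [
    (1, "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)"),
    (2, "ALTER TABLE users ADD COLUMN email TEXT"),
]

def migrate(conn: sqlite3.Connection) -> None:
    """Apply any migrations newer than the recorded schema version."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
    row = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()
    current = row[0] or 0
    for version, sql in MIGRATIONS:
        if version > current:
            conn.execute(sql)
            conn.execute("INSERT INTO schema_version VALUES (?)", (version,))
    conn.commit()
```

Because the applied version is recorded, running `migrate` twice is safe: the second run finds nothing newer to apply.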
46

Vývoj SQL/XML funkcionality v databázi PostgreSQL / Development of SQL/XML Functionality in PostgreSQL Database

Pospíšil, Tomáš January 2011
The aim of this thesis is to propose a way to implement missing XML functionality in the PostgreSQL database system. The second chapter discusses the theoretical background of XML-related technologies such as XPath and XQuery. The third chapter discusses the ISO SQL standards and describes the current level of XML support in native XML databases versus traditional relational databases. The last part examines different approaches and proposes a solution for implementing an XML API in PostgreSQL that validates XML documents against XSD, DTD and RelaxNG schemas. A further point focuses on XML indexing techniques and a proposal for a new index based on GiST.
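As a rough sketch of the validation entry point such an API might expose: Python's standard library can check well-formedness, though full XSD/DTD/RelaxNG validation as proposed in the thesis would require an external validator (e.g. lxml). This is an illustration of the concept, not the thesis's C implementation:

```python
import xml.etree.ElementTree as ET

def is_well_formed(doc: str) -> bool:
    """Return True if doc parses as well-formed XML.

    Schema-level validation (XSD, DTD, RelaxNG), as in the proposed
    PostgreSQL API, needs an external library such as lxml; the
    standard library only checks well-formedness.
    """
    try:
        ET.fromstring(doc)
        return True
    except ET.ParseError:
        return False
```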
47

Automatiserad dokumentation vid systemutveckling / Automated documentation in systems development

Andersson, Magnus January 2012
A well-known problem in the software development industry is the lack of high-quality system documentation. This problem exists at the company Multisoft Consulting: developers must spend an unnecessary amount of time familiarizing themselves with existing systems. As part of the solution, the company wants to introduce automatic generation of documentation. The generation should be performed by the Softadmin® platform, which is used to build all customer systems. The platform is based on C# and Microsoft SQL Server and contains a range of ready-made components, each with its own functionality.
To determine which documentation should be generated automatically, literature studies and interviews with Multisoft Consulting employees were conducted; to determine which documentation can be generated, Softadmin® itself was analyzed. The studies show that documentation which provides an overview of a system and demonstrates how the system is used is both desirable and possible to generate with Softadmin®. A form of system overview, a tree structure, was already implemented in Softadmin®, but it lacked some desirable details, so the focus of the study became implementing a prototype that complements this overview. The result is that information about the system's menu items, which are pages with different functionality, is now displayed in the overview.
48

Detecting SQL Injection Attacks in VoIP using Real-time Deep Packet Inspection : Can a Deep Packet Inspection Firewall Detect SQL Injection Attacks on SIP Traffic with Reasonable Performance?

Sjöström, Linus January 2019
The use of the Internet has increased over the years, and it is now an integral part of our daily activities: we use it for everything from interacting on social media to watching videos online. Phone calls nowadays tend to use Voice over IP (VoIP) rather than the traditional phone networks, and as with any other Internet service, these calls are vulnerable to attacks. This thesis focuses on one particular attack: SQL injection in the Session Initiation Protocol (SIP), a popular protocol used within VoIP. To find different types of SQL injection, two classifiers are implemented that classify SIP packets as either "valid data" or "SQL injection". The first classifier uses regular expressions to find SQL meta-characters in headers of interest; the second uses naive Bayes trained on a labeled data set. The two classifiers are compared in terms of classification throughput, speed, and accuracy. To evaluate the performance impact of packet sizes, and to better understand the classifiers' resilience against an attacker introducing large packets, a test with increasing packet sizes is also presented. The regex classifier is then implemented in nDPI, an open-source Deep Packet Inspection (DPI) implementation, and evaluated with regard to both throughput and accuracy. The results favor the regex classifier, which had better accuracy and higher classification throughput, yet the naive Bayes classifier works better for new, previously unseen types of SQL injection. The thesis therefore argues that the best choice depends on the scenario: both classifiers have their strengths and weaknesses.
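A regex classifier of this kind can be sketched in a few lines. Both the pattern and the list of inspected headers below are illustrative guesses, not the thesis's actual rules:

```python
import re

# SQL meta-characters and keywords to flag; an illustrative pattern,
# not the one used in the thesis.
SQLI_PATTERN = re.compile(
    r"('|--|\b(union|select|insert|update|delete|drop)\b)",
    re.IGNORECASE,
)

# SIP headers worth inspecting (assumed; the thesis's exact list may differ).
HEADERS_OF_INTEREST = ("From", "To", "Contact", "User-Agent")

def classify_sip_packet(headers: dict) -> str:
    """Return 'SQL injection' if any inspected header matches, else 'valid data'."""
    for name in HEADERS_OF_INTEREST:
        if SQLI_PATTERN.search(headers.get(name, "")):
            return "SQL injection"
    return "valid data"
```

A pattern like this is fast and easy to embed in a DPI engine, but it only catches injections it was written for, which is the trade-off against the naive Bayes approach noted in the abstract.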
49

Query AutoAwesome

Suryavanshi, Chetna 01 August 2019 (has links)
This research investigates how to improve legacy queries: queries that programmers have coded and that are used in applications. A database application typically has tens to hundreds of such queries. One way to improve them is to add new, interesting queries that are similar to or based on the existing set. We propose Query AutoAwesome, a tool to generate new queries from legacy queries. The Query AutoAwesome philosophy is taken from Google's AutoAwesomizer tool for photos, which automatically improves a photo uploaded to Google by animating it or adding special effects. In a similar vein, Query AutoAwesome automatically enhances a query by ingesting a database and the query, producing a set of enhanced queries that a user can then choose to use or discard. A key problem that we solve is that the space of potential enhancements is large, so we introduce objective functions to narrow the search to a tractable space. We describe our plans for implementing Query AutoAwesome and discuss our ideas for future work.
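The enumerate-then-rank idea can be sketched as generating candidate variants of a legacy query and keeping the best under an objective function. The variant generator and scoring below are illustrative inventions, not the tool's actual design:

```python
import itertools

def enhance_query(table, candidate_group_cols, objective, top_k=2):
    """Enumerate GROUP BY variants over a table and keep the top_k
    variants according to a caller-supplied objective function."""
    variants = []
    for n in range(1, len(candidate_group_cols) + 1):
        for combo in itertools.combinations(candidate_group_cols, n):
            cols = ", ".join(combo)
            variants.append(
                f"SELECT {cols}, COUNT(*) FROM {table} GROUP BY {cols}"
            )
    # The objective function narrows the large enhancement space
    # down to a tractable, ranked shortlist.
    return sorted(variants, key=objective)[:top_k]
```

A real objective would score variants by estimated interestingness or cost against the ingested database; here any key function (even string length) stands in for it.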
50

Reportingové služby software pro telemarketingová centra / Reporting Services for Telemarketing Software

Sušil, Martin January 2007
This work covers the extension of existing telemarketing software, to which functions for creating user-defined reports were added. The server side is based on the PostgreSQL 8.2 database server; the client side is implemented in C# using Visual Studio 2005 as the development environment.
