31

Administratívny portál pre skladový software / Administrative portal for warehouse software

Karabin, Štefan January 2019 (has links)
This diploma thesis focuses on designing and implementing a web solution to support the operation of a warehouse management system, as requested by the system provider. The theoretical part analyzes existing approaches to the problematic aspects of web application development, such as architecture design and the way settings and permissions are recorded. The final portal is built mainly with Microsoft technologies. The thesis concludes with an evaluation of the solution's applicability from both technical and economic standpoints.
32

Modernisering av webbaserat användargränssnitt för Skärblacka Bruk / Modernisation of a web-based user interface for Skärblacka Bruk

Johansson, Tony January 2021 (has links)
My independent project was to modernize the user interface of a 10-year-old MVC Framework application used by the Skärblacka mill within the BillerudKorsnäs group. Modernization here also means raising quality: replacing older knowledge and technology with modern equivalents, such as being able to click the label of a checkbox, which is considered more in line with current expectations. The framework chosen for the implementation was ASP.NET Core MVC, which is open source and runs on the most common platforms. During the port, well-functioning server code was retained to a large extent, and the JavaScript is also largely unchanged except in a few places. To get a good and durable structure, the application is layered into three tiers: Controller, BusinessLayer and Repository. The SQL Server database is essentially the same, except that Identity has been introduced. For database access, EF Core, a slimmed-down version of Entity Framework, was chosen as the ORM. A great deal of what exists in the old MVC Framework had to be reworked because it is not supported in Core MVC. The application consists of one assembly with a logical development tree made up of the files included in the application. Ajax is used to get smooth, desktop-like updates. The application is complex, with many complicated parts, which made the work take longer than planned.
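
As an illustration of the three-layer structure described above (not the thesis's actual code), a minimal ASP.NET Core MVC sketch might look like this; the Order entity, class names and the "Open" status value are assumptions:

```csharp
// Minimal sketch of the Controller -> BusinessLayer -> Repository layering with
// EF Core, as described above. All names (Order, MillDbContext, ...) are
// illustrative assumptions, not taken from the Skärblacka application.
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;

public class Order
{
    public int Id { get; set; }
    public string Status { get; set; } = "";
}

public class MillDbContext : DbContext
{
    public MillDbContext(DbContextOptions<MillDbContext> options) : base(options) { }
    public DbSet<Order> Orders => Set<Order>();
}

// Repository layer: the only place that talks to the database.
public class OrderRepository
{
    private readonly MillDbContext _db;
    public OrderRepository(MillDbContext db) => _db = db;

    public Task<List<Order>> GetOpenAsync() =>
        _db.Orders.Where(o => o.Status == "Open").ToListAsync();
}

// Business layer: application rules sit between controller and repository.
public class OrderBusinessLayer
{
    private readonly OrderRepository _repo;
    public OrderBusinessLayer(OrderRepository repo) => _repo = repo;

    public Task<List<Order>> GetOpenOrdersAsync() => _repo.GetOpenAsync();
}

// Controller layer: returns a partial view so an Ajax call can refresh just this fragment.
public class OrdersController : Controller
{
    private readonly OrderBusinessLayer _business;
    public OrdersController(OrderBusinessLayer business) => _business = business;

    public async Task<IActionResult> Open() =>
        PartialView(await _business.GetOpenOrdersAsync());
}
```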
33

Realizace datového skladu pro hodnocení kvality překladů / Implementation of a data warehouse for translation quality assessment

Kaplanová, Lucie January 2020 (has links)
This master's thesis deals with the realization of a data warehouse for a selected company. The data warehouse is oriented toward the process of translation quality assessment and is created on Microsoft SQL Server. The thesis includes the design of a data model, the implementation of the physical structures and the whole ETL process, which is used for the automatic loading of transformed data. Formula clause: The Faculty of Business and Economics of Mendel University in Brno decided to postpone publication of the final thesis for 3 years. After this period the final thesis will be published in the UIS.
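
As a hedged sketch of the load step of such an ETL process (the table and column names below are assumptions, not from the thesis), transformed rows could be bulk-inserted into a SQL Server fact table like this:

```csharp
// Illustrative ETL "load" step only: bulk-insert transformed rows into a fact
// table on SQL Server. FactTranslationQuality and its columns are hypothetical.
using System.Data;
using Microsoft.Data.SqlClient;

class EtlLoad
{
    static void Main()
    {
        var facts = new DataTable();
        facts.Columns.Add("TranslationId", typeof(int));
        facts.Columns.Add("QualityScore", typeof(decimal));
        facts.Rows.Add(1, 4.5m);          // one transformed row as sample data

        const string connectionString =
            "Server=.;Database=TranslationDW;Integrated Security=true;TrustServerCertificate=true";
        using var connection = new SqlConnection(connectionString);
        connection.Open();

        using var bulk = new SqlBulkCopy(connection)
        {
            DestinationTableName = "dbo.FactTranslationQuality"
        };
        bulk.WriteToServer(facts);
    }
}
```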
34

Využití PDA pro distribuci informací v rámci uzavřených sítí / PDA for Information Distribution in Closed Networks

Maslaňák, Martin Unknown Date (has links)
This thesis deals with creating a client–server application. The client part is built with the .NET Compact Framework and runs on a mobile device (PDA); the server part is written on the .NET Framework and runs on a desktop computer. In the first part I characterize the PDA device and its uses, and discuss communication between SQL Server and PDA devices. The next part describes the .NET platform and the advantages it provides. I also outline the differences between client–server architectures to put my work in context. The last part of the work deals with the implementation of the client–server application.
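
A minimal sketch of such a client–server exchange, assuming a plain TCP connection on a made-up port (this is not the thesis's protocol), could look as follows; the same TcpListener/TcpClient types were also available in the Compact Framework on the PDA side:

```csharp
// Desktop-side server sketch: accept one connection from the PDA client, read a
// request line and answer it. The port number and message format are assumptions.
using System;
using System.IO;
using System.Net;
using System.Net.Sockets;

class InfoServer
{
    static void Main()
    {
        var listener = new TcpListener(IPAddress.Any, 9000);
        listener.Start();
        Console.WriteLine("Waiting for a PDA client...");

        using TcpClient client = listener.AcceptTcpClient();
        using NetworkStream stream = client.GetStream();
        var reader = new StreamReader(stream);
        var writer = new StreamWriter(stream) { AutoFlush = true };

        string? request = reader.ReadLine();          // e.g. an item code sent by the PDA
        writer.WriteLine($"Echo from server: {request}");
    }
}
```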
35

Master Data Management a jeho využití v praxi / Master Data Management and its usage in practice

Kukačka, Pavel January 2011 (has links)
This thesis deals with Master Data Management (MDM), specifically its implementation. The main objectives are to analyze and capture general approaches to MDM implementation, including best practices; to describe and evaluate an MDM project implemented in the Czech environment using Microsoft SQL Server 2008 R2 Master Data Services (MDS); and, building on this theoretical background, the experience from the implemented project and the available technical literature, to create a general procedure for implementing the MDS tool. To achieve these objectives the following procedures are used: a survey of information resources (printed, electronic and personal consultations with Clever Decision consultants), cooperation on a project realized by Clever Decision, and analysis of Microsoft SQL Server 2008 R2 Master Data Services. The contributions of this work largely mirror its goals; the main one is the general procedure for implementing the MDS tool. The thesis is divided into two parts. The first, theoretically oriented part covers basic concepts (including how MDM differs from other systems), architecture, implementation styles, market trends and best practices. The second, practically oriented part first describes the implemented MDS project and then presents the general procedure for implementing the MDS tool.
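
By way of illustration (not part of the thesis), once master data is managed in MDS it is typically read back through a subscription view; the database and view names below, including mdm.Customer_Leaf, are assumptions, since subscription views are named by whoever creates them:

```csharp
// Hedged sketch: reading consolidated master data out of MDS via a subscription
// view. The database name and view name are illustrative assumptions.
using System;
using Microsoft.Data.SqlClient;

class ReadMasterData
{
    static void Main()
    {
        const string connectionString =
            "Server=.;Database=MDS;Integrated Security=true;TrustServerCertificate=true";
        using var connection = new SqlConnection(connectionString);
        connection.Open();

        using var command = new SqlCommand("SELECT Code, Name FROM mdm.Customer_Leaf", connection);
        using var reader = command.ExecuteReader();
        while (reader.Read())
            Console.WriteLine($"{reader.GetString(0)}: {reader.GetString(1)}");
    }
}
```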
36

Měření výkonnosti obchodníků v softwarové společnosti / Performance Measurement of Salespeople in Software Company

Uhlíř, Radek January 2011 (has links)
This thesis researches available resources on methods and approaches for implementing Business Intelligence in the sales department of a mid-size local information technology company. The solution is aimed at controlling and performance measurement; local specifics and the company's culture of innovation and creativity are considered throughout the work. The next part analyses the company environment and proposes an implementation process matching the company's current maturity level. The goal is to define a suitable set of indicators for measuring the performance of sales representatives, so that the indicators reflect reality and allow relative comparison of individuals. The solution is applied in a specific company; the issues of the proposal are identified and recommendations for future modifications are suggested. The work is based on a survey of available resources and identification of best-practice methods for designing and implementing such a system. The following part contains a detailed analysis of the company, application of the conclusions of the theoretical part and a proposal for an optimal adoption process. As a result, a structure of metrics has been built, and it was verified that a detailed analysis is required to define the project scope, identify risks and prepare a realistic schedule. It was also verified that implementing a performance measurement system requires complex changes in company culture and close coordination with other triggered changes in workflow and in the quality of data recorded in information systems.
37

Produktkonfigurator / Product configurator

Björkman, Lucas, Erixon, Jimmy January 2010 (has links)
The report presents the creation of a product configurator and several suggestions for how it can be improved. The assignment was given by System Andersson to two students at Jönköpings Tekniska Högskola, who carried it out as their degree project. The report discusses possible solutions for a product configurator and presents a result based on these solutions. The product configurator in the report consists of two parts. The configuration part helps the user create new products from selected articles; the articles are sorted automatically by the product configurator so that only compatible articles can be combined, using a drag-and-drop interface. In addition, the product configurator contains a compatibility part whose task is to make it easier to define compatibility between articles. This can be done continuously while a product is being configured, if desired. The user can select two or more articles that are to be compatible with each other, or entire categories. The report ends with a discussion of the choices made during the work and possible improvements that can be made to the product configurator.
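
A hedged sketch of the compatibility filtering the report describes (the article names and the data structure are assumptions, not System Andersson's implementation) could look like this:

```csharp
// Sketch of pairwise compatibility rules: article pairs are registered as
// compatible, and the configurator only offers articles compatible with
// everything already picked. All names and values are made up.
using System;
using System.Collections.Generic;
using System.Linq;

class CompatibilityIndex
{
    private readonly HashSet<(string, string)> _pairs = new();

    public void MarkCompatible(string a, string b)
    {
        _pairs.Add((a, b));
        _pairs.Add((b, a));
    }

    public bool AreCompatible(string a, string b) => _pairs.Contains((a, b));

    // Only articles compatible with every already selected article are offered.
    public IEnumerable<string> SelectableFrom(IEnumerable<string> candidates,
                                              IReadOnlyCollection<string> selected) =>
        candidates.Where(c => selected.All(s => AreCompatible(c, s)));
}

class Demo
{
    static void Main()
    {
        var index = new CompatibilityIndex();
        index.MarkCompatible("Frame-A", "Motor-X");
        index.MarkCompatible("Frame-A", "Motor-Y");

        var offered = index.SelectableFrom(
            new[] { "Motor-X", "Motor-Y", "Motor-Z" },
            new[] { "Frame-A" });
        Console.WriteLine(string.Join(", ", offered));   // Motor-X, Motor-Y
    }
}
```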
38

KPI management - auto alerting na platformě MS SQL / KPI management - auto alerting in MS SQL

Vedral, Jakub January 2009 (has links)
This diploma thesis deals with automatic alerting on the Microsoft SQL Server 2008 platform. It was elaborated with the company Clever Decision s.r.o., which contributed significantly to the thesis assignment. The main goal is to provide a solution concept for the absence of alerting over multidimensional data in Microsoft's product. Alerting is aimed at critical business data, Key Performance Indicators (KPIs), which are used to deliver enhanced business performance through the Corporate Performance Management concept and need to be watched frequently; an alerting solution is therefore a convenient way of watching them. The solution concept is based on market research of the Business Intelligence (BI) platforms that dominate the market, examined for their alerting capabilities. A further goal is to give an overall insight into KPI management: creating, managing, analyzing and monitoring KPIs. The thesis is divided into three sections. The first part builds the theoretical background for the solution and describes the field of KPI management. The second part consists of the market research using selected criteria. The third part presents the solution concept: a web application addressing the absence of alerting on the Microsoft SQL Server 2008 platform. The thesis is primarily intended for the Clever Decision company, but also for BI experts dealing with alerting problems on BI platforms. It also serves as a theoretical summary of KPI management, a topic that available technical literature commonly overlooks.
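
As a sketch of the kind of check such an alerting component might run against Analysis Services (the cube name, KPI name, connection string and threshold below are assumptions, not the thesis's design):

```csharp
// Hedged sketch: read a KPI's value and goal from an SSAS cube over MDX and
// flag a breach. "Sales", "Revenue" and the 10 % rule are illustrative only.
using System;
using Microsoft.AnalysisServices.AdomdClient;

class KpiAlertCheck
{
    static void Main()
    {
        using var connection = new AdomdConnection("Data Source=localhost;Catalog=SalesCube");
        connection.Open();

        const string mdx = @"SELECT { KPIValue(""Revenue""), KPIGoal(""Revenue"") } ON COLUMNS
                             FROM [Sales]";
        using var command = new AdomdCommand(mdx, connection);
        CellSet result = command.ExecuteCellSet();

        double value = Convert.ToDouble(result.Cells[0].Value);
        double goal  = Convert.ToDouble(result.Cells[1].Value);

        if (value < 0.9 * goal)                      // simple breach rule: 10 % under target
            Console.WriteLine($"ALERT: Revenue KPI at {value:N0}, goal {goal:N0}");
    }
}
```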
39

Využití principů business intelligence v dotazníkových šetřeních / Business Intelligence principles and their use in questionnaire investigation

Hanuš, Václav January 2010 (has links)
This thesis is oriented toward the practical use of tools for data mining and business intelligence. The main goals are to process the source data into a suitable form and to try out the chosen tool on test cases. As input data I used a database created by processing questionnaires from a survey verifying the level of IT and economics knowledge at Czech universities. These data were transformed into a form that allows them to be processed with the data mining tools included in Microsoft SQL Server 2008. I chose two cases to assess the capabilities of these tools. The first case was focused on clustering using the Microsoft Clustering algorithm. The task was to sort universities into clusters by comparing their attributes, namely the number of credits in each knowledge group. I had to deal with two problems. It was necessary to reduce the number of subject groups, otherwise there was a danger of producing too many clusters that could not be meaningfully named. The other problem was the unequal credit value of each group, which distorted the groups' weights. The solution was in the end quite simple: similar groups were merged into larger, more general categories, and the credit totals of each new group were rescaled to a 0-5 range. The second case was a prediction task using the Microsoft Logistic Regression and Microsoft Neural Network algorithms. The goal was to predict the number of currently studying students. I had historical data from the years 2001-2009; a predictive model was built on them and the prediction could be compared with real data. In this case it was also necessary to transform the source data, otherwise the tested tool could not process it: the original data were stored in a view instead of a table and contained not only the desired objects but several variants of them, for example split by sex. The solution was to create a new table in the database containing only the objects relevant to the test case. The last problem came up when I tried to use the prediction model for the year 2010, for which there were no real data in the table; the software reported an error and could not make the prediction. While searching Microsoft technical support I found threads describing a similar problem, so it may be a product error to be fixed in a forthcoming update. Working through these cases gave me enough evidence to judge the capabilities of these Microsoft tools. After my earlier school experience with data mining tools from IBM (formerly SPSS) and SAS, I can assess whether the tested tools match the software of the major data mining vendors on the market and whether they can be used in serious deployments.
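
The credit-rescaling step mentioned above can be illustrated with a small min-max normalisation sketch; the sample values are made up:

```csharp
// Illustrative re-implementation of the normalisation step described above:
// credit totals per subject group are rescaled to a common 0-5 range before
// clustering, so groups with different credit volumes get comparable weight.
using System;
using System.Linq;

class CreditScaling
{
    static void Main()
    {
        double[] credits = { 4, 12, 30, 18, 6 };         // credits of one subject group across schools
        double min = credits.Min(), max = credits.Max();

        double[] scaled = credits
            .Select(c => (c - min) / (max - min) * 5.0)  // min-max scaling onto 0-5
            .ToArray();

        Console.WriteLine(string.Join(", ", scaled.Select(s => s.ToString("0.00"))));
    }
}
```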
40

Analýza veřejně dostupných dat Českého statistického úřadu / Analysis of Public Data of the Czech Statistical Office

Pohl, Ondřej January 2017 (has links)
The aim of this thesis is the analysis of Czech Statistical Office data on foreign trade. First, the reader is introduced to Business Intelligence and data warehousing; then the basics of OLAP analysis and data mining are explained. The subsequent parts describe and analyse the foreign trade data with the help of OLAP technology and data mining in MS SQL Server, including the implementation of selected analytical tasks.
