111

Contribuição do uso do controle gerencial para o desenvolvimento da resiliência organizacional / Contribution of the use of management control to the development of organizational resilience.

Martins, Daiana Bragueto 29 March 2019 (has links)
This Ph.D. dissertation investigated how the use of the Management Control System influences the management of organizational resilience in a Brazilian company. The choice of Robert Simons's (1995, 2000) "Levers of Control" as the theoretical model is tied to the study's purpose: to analyze which management control systems the organization uses to promote the renewal of its strategy, and thereby to support the management of its strategic resilience. A single-entity survey was conducted in a Brazilian oil-derivatives trading company whose structure, size, and management control system met the requirements of the investigation. The company was a convenient choice because, at the time of the research, it was experiencing tension regarding its structure and mode of operation. Data were collected through a questionnaire sent to executives from different departments, covering a sample of 64 organizational managers. The results support the hypothesis that the use of the management control system has a positive impact on strategic resilience because it provides a structured view for organizational action and reaction. High levels of use of the management control system, in the forms of beliefs systems, boundary systems, diagnostic use, and interactive use, were observed to increase the organization's capacity for resilience by enabling it to act proactively, with strategic vision in the face of adversity in the business environment, fostering the strategic renewal proposed by Simons (1995). Among the contributions of the research are: (i) the discussion of the management control system through the prism of Simons's (1995, 2000) levers of control in the Brazilian context, and its association with organizational resilience theory; (ii) the development and validation of an instrument to measure strategic resilience at the firm level; and (iii) from a practical standpoint, guidance that helps managers decide which pattern of control best fits the circumstances in which they operate and their strategic challenges, contributing to the understanding of how the use of the management control system affects resilience in the organizational environment.
112

Automatic Extraction and Assessment of Entities from the Web

Urbansky, David 23 October 2012 (has links) (PDF)
The search for information about entities, such as people or movies, plays an increasingly important role on the Web. This information is still scattered across many Web pages, making it time-consuming for a user to find all relevant information about an entity. This thesis describes techniques to extract entities and information about them from the Web, such as facts, opinions, questions and answers, interactive multimedia objects, and events. The findings of this thesis are that it is possible to create a large knowledge base automatically using a manually crafted ontology. After applying assessment algorithms, the precision of the extracted information was between 75 % and 90 % (for facts and entities, respectively). The algorithms from this thesis can be used to create such a knowledge base, which can then be used in various research fields, such as question answering, named entity recognition, and information retrieval.
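The abstract reports precision figures without detailing how they are computed. As a rough illustration only (the triples below are made up, not from the thesis), the following sketch applies the usual definition of extraction precision: the fraction of extracted items that a manual assessment confirms as correct.

```python
# Minimal sketch (hypothetical data, not from the thesis): estimating
# extraction precision by checking extracted (entity, attribute, value)
# triples against a manually verified sample.

def precision(extracted, verified_correct):
    """Fraction of extracted items judged correct by an assessor."""
    if not extracted:
        return 0.0
    return len(extracted & verified_correct) / len(extracted)

entities = {("Jim Carrey", "type", "actor"), ("The Matrix", "type", "movie")}
facts = {("The Matrix", "release year", "1999"),
         ("The Matrix", "director", "Tim Burton")}

# Suppose assessment confirms both entities but only one of the two facts:
print(precision(entities, entities))                                # 1.0
print(precision(facts, {("The Matrix", "release year", "1999")}))   # 0.5
```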
113

Effiziente MapReduce-Parallelisierung von Entity Resolution-Workflows / Efficient MapReduce-based Parallelization of Entity Resolution Workflows

Kolb, Lars 11 December 2014 (has links) (PDF)
In recent years, the newly emerged paradigm of Infrastructure as a Service has massively changed the IT world. The provision of computing infrastructure by external providers makes it possible to acquire, on demand and at short notice, large amounts of computing power, storage, and bandwidth without upfront investment. At the same time, both the amount of freely available data and the amount of data to be managed within companies are growing dramatically. The need to manage and analyze these data volumes efficiently required the further development of existing IT technologies and led to the emergence of new research fields and a multitude of innovative systems. A typical characteristic of these systems is distributed storage and data processing in large clusters of commodity hardware. The MapReduce programming model in particular has gained importance over the past ten years. It enables the distributed processing of large data volumes and abstracts from the details of distributed computing and the handling of hardware failures. This dissertation focuses on using the MapReduce concept to automatically parallelize computationally intensive entity resolution tasks. Entity resolution is an important subfield of information integration whose goal is to discover records, in one or more data sources, that describe the same real-world object. The dissertation incrementally presents methods that solve the various subproblems of MapReduce-based execution of entity resolution workflows. To detect duplicates, entity resolution methods typically compare pairs of records using several similarity measures. Evaluating the Cartesian product of n records leads to a quadratic complexity of O(n²) and is therefore practical only for small to medium-sized data sources. For data sources with more than 100,000 records, runtimes of several hours arise even with distributed execution. So-called blocking techniques are therefore used to reduce the search space. The underlying assumption is that records falling below a certain minimum similarity need not be compared with each other. The thesis presents a MapReduce-based implementation of the evaluation of the Cartesian product as well as of several well-known blocking techniques. After the records have been compared, the candidate pairs are finally classified as match or non-match. With a growing number of attribute values and similarity measures, manually specifying a high-quality strategy for combining the resulting similarity values becomes barely manageable; the thesis therefore investigates the integration of machine learning methods into MapReduce-based entity resolution workflows. Implementing blocking techniques with MapReduce requires partitioning the set of pairs to be compared and assigning the partitions to the available processes. The assignment is based on a semantic key derived from the records' attribute values according to the concrete blocking strategy. For example, when deduplicating product records, one might compare only products of the same manufacturer.

Processing all records with the same key in a single process leads, in the presence of data skew, to severe load-balancing problems, which are aggravated by the inherent quadratic complexity. This drastically reduces the runtime efficiency and scalability of the corresponding MapReduce programs, since a large share of a cluster's resources sits idle while a few processes have to do most of the work. Providing various methods for evenly utilizing the available resources is therefore another focus of this work. Blocking strategies must always trade off efficiency against data quality. A large reduction of the search space promises a significant speedup, but causes similar records, e.g. those with erroneous attribute values, not to be compared with each other. It is therefore helpful to generate, for each record, several semantic keys derived from different attributes. This, however, causes similar records to be unnecessarily compared multiple times with respect to different keys; the thesis therefore presents algorithms for avoiding such redundant similarity computations. As the result of this work, the entity resolution framework Dedoop is presented, which abstracts from the developed MapReduce algorithms and enables a high-level specification of complex entity resolution workflows. Dedoop combines all techniques and optimizations presented in this thesis in a user-friendly system. The prototype automatically translates user-defined workflows into a set of MapReduce jobs and manages their parallel execution in MapReduce clusters. Through the full integration of the cloud services Amazon EC2 and Amazon S3 into Dedoop, and its public availability, end users without MapReduce knowledge can execute complex entity resolution workflows in private or dynamically provisioned external MapReduce clusters.
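To make the blocking idea concrete, here is a minimal sketch, not Dedoop's actual API, of key-based blocking expressed as map/reduce functions. The blocking key (manufacturer) mirrors the product-deduplication example above; the similarity function and threshold are illustrative placeholders.

```python
# Minimal sketch (not Dedoop's actual API): key-based blocking for entity
# resolution as map/reduce functions over product records.

from collections import defaultdict
from itertools import combinations

def map_phase(records):
    """Emit (blocking_key, record) pairs; the framework groups by key."""
    for rec in records:
        yield rec["manufacturer"].lower(), rec

def similarity(a, b):
    """Toy similarity: Jaccard overlap of product-title tokens."""
    ta, tb = set(a["title"].lower().split()), set(b["title"].lower().split())
    return len(ta & tb) / len(ta | tb)

def reduce_phase(key, block):
    """Compare all record pairs within one block (quadratic in block size)."""
    for a, b in combinations(block, 2):
        if similarity(a, b) >= 0.4:        # illustrative match threshold
            yield a["id"], b["id"]

records = [
    {"id": 1, "manufacturer": "Acme", "title": "Acme Phone X 64GB"},
    {"id": 2, "manufacturer": "Acme", "title": "Acme Phone X 64 GB black"},
    {"id": 3, "manufacturer": "Other", "title": "Other Tablet"},
]

groups = defaultdict(list)
for key, rec in map_phase(records):
    groups[key].append(rec)                # simulates the shuffle/group step
for key, block in groups.items():
    print(key, list(reduce_phase(key, block)))  # only same-key pairs compared
```

Grouping strictly by key also makes the skew problem visible: one very common manufacturer would concentrate a quadratic amount of comparison work in a single reduce task, which is exactly the load-balancing issue the dissertation addresses.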
114

Die Durchgriffshaftung im deutschen und russischen Recht der Kapitalgesellschaften : eine rechtsvergleichende Untersuchung / Piercing the corporate veil in German and Russian company law: a comparative legal study

Rabensdorf, Renate. January 1900 (has links)
Also a dissertation: Berlin, 2008. / Includes bibliography.
115

Oceňování podniku / Business valuation

DOKUČAJEVOVÁ RYNEŠOVÁ, Denisa January 2012 (has links)
The aim of my diploma thesis is the valuation of the FARMTEC joint-stock company as a means of determining its real value as of December 31, 2010. The work is divided into two parts. The first, theoretical part deals with the basic valuation approaches; it describes the methods and analyses used in determining the value and prospects of an enterprise. The second, practical part applies this theoretical knowledge. The valuation itself was carried out using the DCF "entity" method.
116

Webbapplikation för administrering av dokument / Web Application for Document Management

Kasselia, Paul January 2015 (has links)
This report describes the investigation and development of a web application that serves as a support application for Combitech when registering their systems in a database. When a component needs to be replaced or upgraded, the user should be able to retrieve all documents that describe the component and all systems that contain it. In this way, the impact of replacing or upgrading a component can be estimated at an early stage of the work. The web application was developed following an extensive investigation of the tools to be used during the project and of the functions the application should contain. It was then implemented in the ASP.NET MVC framework.
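The report does not publish its data model; as a hedged illustration of the lookup it describes, the sketch below (hypothetical schema and names, and in Python rather than the report's ASP.NET stack) maps a component to the documents that describe it and the systems that contain it.

```python
# Minimal sketch (hypothetical schema, not Combitech's actual data model):
# given a component, find every document describing it and every system
# containing it, to gauge the impact of a replacement early on.

components = {
    "PSU-400": {
        "documents": ["PSU-400-datasheet.pdf", "PSU-400-test-report.pdf"],
        "systems": ["Radar Unit A", "Ground Station B"],
    },
    "CPU-9": {
        "documents": ["CPU-9-manual.pdf"],
        "systems": ["Ground Station B"],
    },
}

def impact_of_replacing(component_id):
    """Return the documents and systems affected by swapping a component."""
    entry = components[component_id]
    return entry["documents"], entry["systems"]

docs, systems = impact_of_replacing("PSU-400")
print(f"Documents to review: {docs}")
print(f"Systems affected: {systems}")
```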
117

Bayesian Nonparametric Modeling of Temporal Coherence for Entity-Driven Video Analytics

Mitra, Adway January 2015 (has links) (PDF)
In recent times there has been an explosion of online user-generated video content, which has generated significant research interest in video analytics. Human users understand videos in terms of high-level semantic concepts, yet most current research in video analytics is driven by low-level features and descriptors, which often lack semantic interpretation. Existing attempts at semantic video analytics are specialized and require additional resources like movie scripts, which are not available for most user-generated videos; there are no general-purpose approaches to understanding videos through semantic concepts. This thesis attempts to bridge that gap. We view videos as collections of entities, which are semantic visual concepts such as the persons in a movie or the cars in an F1 race video. We focus on two fundamental tasks in video understanding, namely summarization and scene discovery. Entity-driven video summarization and entity-driven scene discovery are important open problems. They are challenging due to the spatio-temporal nature of videos and the lack of a priori information about entities. We use Bayesian nonparametric methods to solve these problems. In the absence of external resources like scripts, we utilize fundamental structural properties such as temporal coherence (TC) in videos, which means that adjacent frames should contain the same set of entities and have similar visual features. There have been no focused attempts to model this important property. This thesis makes several contributions to computer vision and Bayesian nonparametrics by addressing entity-driven video understanding through temporal coherence modeling. Temporal coherence in videos is observed across frames at the level of features/descriptors, as well as at the semantic level. We start with an attempt to model TC at the level of features/descriptors. A tracklet is a spatio-temporal fragment of a video: a set of spatial regions in a short sequence (5-20) of consecutive frames, each of which encloses a particular entity. We attempt to find a representation of tracklets to aid the tracking of entities, exploring region descriptors like covariance matrices of spatial features in individual frames. Due to temporal coherence, such matrices from corresponding spatial regions in successive frames have nearly identical eigenvectors. We utilize this property to model a tracklet with a covariance matrix and use it for region-based entity tracking, proposing a new method to estimate such a matrix. Our method is found to be much more efficient and effective than alternative covariance-based methods for entity tracking. Next, we move to modeling temporal coherence at the semantic level, with special emphasis on videos of movies and TV-series episodes. Each tracklet is associated with an entity (say, a particular person). Spatio-temporally close but non-overlapping tracklets are likely to belong to the same entity, while tracklets that overlap in time can never belong to the same entity. Our aim is to cluster the tracklets based on the entities associated with them, with the goal of discovering the entities in a video along with all their occurrences. We argue that Bayesian nonparametrics is the most convenient framework for this task, and propose a temporally coherent version of the Chinese Restaurant Process (TC-CRP) that encodes such constraints easily, discovers pure clusters of tracklets, and filters out tracklets resulting from false detections.
TC-CRP shows excellent performance on person discovery from TV-series videos. We also discuss semantic video summarization based on entity discovery. Next, we consider entity-driven temporal segmentation of a video into scenes, where each scene is characterized by the entities present in it. This is a novel application, as existing work on temporal segmentation has focused on low-level features of frames rather than entities. We propose EntScene, a generative model for videos based on entities and scenes, together with an inference algorithm based on blocked Gibbs sampling for simultaneous entity discovery and scene discovery. We compare it to alternative inference algorithms and show significant improvements in terms of segmentation and scene discovery. Video representation by low-rank matrices has gained popularity recently and has been used for various tasks in computer vision. In such a representation, each column corresponds to a frame or a single detection. Such matrices are likely to have contiguous sets of identical columns due to temporal coherence, and hence should be low-rank. However, we discover that none of the existing low-rank matrix recovery algorithms preserve such structures. We study regularizers that encourage these structures in low-rank matrix recovery through convex optimization, but note that TC-CRP-like Bayesian modeling is better at enforcing them. We then focus on modeling temporal coherence in hierarchically grouped sequential data, such as word tokens grouped into sentences, paragraphs, and documents in a text corpus, with application to multi-layer segmentation. We first make a detailed study of existing models for such data and present a taxonomy called Degree-of-Sharing (DoS), based on how the mixture components are shared by the groups of data in these models. We propose the Layered Dirichlet Process, which generalizes the Hierarchical Dirichlet Process to multiple layers and can also handle sequential information easily through a Markovian approach. This is applied to the hierarchical co-segmentation of a set of news transcripts into broad categories (like politics or sports) and individual stories. We also propose an explicit-duration (semi-Markov) approach for this purpose and provide an efficient inference algorithm for it. Finally, we discuss generative processes for distribution matrices, where each column is a probability distribution, with an application to inferring the correct answers to questions on online answering forums from the opinions provided by different users.
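As a rough, simplified illustration of the kind of constraint TC-CRP encodes (this is not the thesis's actual algorithm, and the concentration parameter and likelihood are placeholders), the sketch below runs a Chinese Restaurant Process-style assignment over tracklets while forbidding temporally overlapping tracklets from sharing a cluster.

```python
# Rough illustration (not the thesis's actual TC-CRP): CRP-style clustering
# of tracklets where tracklets that overlap in time may never share a
# cluster, since two simultaneous detections cannot be the same entity.

import random

ALPHA = 1.0  # concentration parameter: propensity to open new clusters

def overlaps(a, b):
    """True if two tracklets overlap in time (so: different entities)."""
    return a["start"] < b["end"] and b["start"] < a["end"]

def assign(tracklets):
    clusters = []  # each cluster is a list of tracklets (one entity)
    for t in tracklets:
        weights = []
        for c in clusters:
            if any(overlaps(t, m) for m in c):
                weights.append(0.0)        # hard constraint: forbidden
            else:
                weights.append(len(c))     # CRP prior: rich get richer
        weights.append(ALPHA)              # option to open a new cluster
        choice = random.choices(range(len(weights)), weights=weights)[0]
        if choice == len(clusters):
            clusters.append([t])
        else:
            clusters[choice].append(t)
    return clusters

tracklets = [
    {"id": "t1", "start": 0, "end": 20},
    {"id": "t2", "start": 10, "end": 30},  # overlaps t1: cannot join its cluster
    {"id": "t3", "start": 40, "end": 60},  # may join either cluster, or a new one
]
random.seed(0)
for i, c in enumerate(assign(tracklets)):
    print(f"entity {i}: {[m['id'] for m in c]}")
```

A full TC-CRP would also weight each feasible cluster by the likelihood of the tracklet's visual features under that cluster's model, not just the CRP prior as here.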
118

Ocenění společnosti BONAVITA, spol. s.r.o. / Valuation of the company BONAVITA, spol. s.r.o.

Miksová, Šárka January 2015 (has links)
The main goal of this diploma thesis is the valuation of the company BONAVITA, spol. s.r.o. as of December 31, 2014. The valuation is based on publicly available data such as the company's annual reports. The thesis consists of two main parts. The theoretical part explains and describes the basic terminology, methods, and approaches used in this valuation. The practical part begins with an introduction of the company, followed by strategic and financial analyses of its environment and financial health. The value drivers are then forecast, and on this basis a financial plan, consisting of a balance sheet and income statement, is formulated for the period 2015 to 2019. The financial plan is the key input for valuing the company using the DCF entity method, whose result is compared with the book value of the company's net assets.
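As a minimal sketch of the DCF entity method these valuation theses apply (the figures below are illustrative, not taken from any of the companies above): discount the planned free cash flow to the firm at the weighted average cost of capital, add a terminal value from a constant-growth perpetuity, and subtract net debt to reach the value for the owners.

```python
# Minimal sketch of the DCF entity method (illustrative figures only):
# discount planned FCFF at the WACC, add a Gordon-growth terminal value,
# then subtract net debt to get the equity value.

fcff = [120.0, 130.0, 138.0, 145.0, 150.0]  # planned FCFF for five years
wacc = 0.09                                  # weighted average cost of capital
g = 0.02                                     # perpetual growth after the plan
net_debt = 400.0

# Phase 1: present value of the explicitly planned cash flows.
pv_plan = sum(cf / (1 + wacc) ** t for t, cf in enumerate(fcff, start=1))

# Phase 2: terminal (continuing) value via the Gordon growth formula,
# discounted from the end of the planning horizon.
terminal = fcff[-1] * (1 + g) / (wacc - g)
pv_terminal = terminal / (1 + wacc) ** len(fcff)

enterprise_value = pv_plan + pv_terminal
equity_value = enterprise_value - net_debt   # value for the owners

print(f"Enterprise value: {enterprise_value:,.1f}")
print(f"Equity value: {equity_value:,.1f}")
```

The same two-phase structure (explicit plan plus continuing value) underlies the DCF entity valuations in the neighboring entries as well.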
119

Ocenění společnosti Jan Becher – Karlovarská Becherovka, a. s. / Valuation of the company Jan Becher – Karlovarská Becherovka, a. s.

Rosenbreyer, Jan January 2015 (has links)
The aim of this thesis is the valuation of the company Jan Becher - Karlovarská Becherovka, a. s. as of December 31, 2014. The thesis is divided into two main parts. The theoretical part defines the approach to valuing the company. In the practical part, this approach is applied: a strategic and financial analysis of the company is followed by the compilation of a complete financial plan and the determination of the discount rate. The final valuation of the company is then carried out using the discounted cash flow method in the entity approach, and the resulting value is compared with the book value of the company.
120

Ocenění společnosti Rodinný pivovar BERNARD a.s. / Valuation of the company Rodinný pivovar BERNARD a.s.

Langr, Jan January 2012 (has links)
The aim of the thesis is to determine the value of equity of the company Rodinný pivovar BERNARD a.s. as of July 31, 2012, using the DCF entity approach. The thesis is divided into two parts. The first contains the theoretical basis for company valuation and a general description of the methods and procedures used. The second part introduces the business, analyses its relevant market and macro environment, and sets out a prognosis for its future development. This prognosis is essential for setting up the financial plan, which is then used to determine the value of the company for its owners. A final supplementary chapter describes the possibilities of using Monte Carlo simulation as an extension of the classical methods of company valuation.
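The abstract mentions Monte Carlo simulation as an extension of classical valuation; as a hedged illustration (the distributions and figures below are made up, not from the thesis), the sketch draws the key value drivers from distributions instead of point estimates and reports the resulting distribution of DCF values.

```python
# Illustrative sketch (made-up distributions, not from the thesis): a Monte
# Carlo extension of a DCF valuation, turning point estimates of the value
# drivers into a distribution of company values.

import random
import statistics

def dcf_value(base_fcff, growth, wacc, years=5, terminal_growth=0.02):
    """Enterprise value: discounted growing FCFF plus a Gordon terminal value."""
    pv, cf = 0.0, base_fcff
    for t in range(1, years + 1):
        cf *= 1 + growth
        pv += cf / (1 + wacc) ** t
    terminal = cf * (1 + terminal_growth) / (wacc - terminal_growth)
    return pv + terminal / (1 + wacc) ** years

random.seed(42)
values = [
    dcf_value(
        base_fcff=100.0,
        growth=random.gauss(0.04, 0.02),   # uncertain FCFF growth
        wacc=random.uniform(0.08, 0.11),   # uncertain discount rate
    )
    for _ in range(10_000)
]

values.sort()
print(f"median value: {statistics.median(values):,.0f}")
print(f"90% interval: {values[500]:,.0f} - {values[9500]:,.0f}")
```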
