81

A system architecture centred on knowledge management processes / Sistemos architektūra, grindžiama žinių valdymo procesais

Belevičiūtė, Inga 21 November 2008 (has links)
The thesis “A system architecture centred on knowledge management processes” consists of the following chapters: 1. Introduction. 2. Knowledge management and knowledge management tools. 3. Analysis of knowledge management systems, their architectures and solutions. 4. A system architecture centred on knowledge management processes. 5. Knowledge management implementations in organizations. 6. General conclusions. The introduction covers the relevance, tasks, object, scientific novelty and practical value of the research, as well as its approbation at international conferences and seminars. The first chapter presents the definitions of knowledge management discussed by many authors in academia and industry. To enter into the subject, the notions and interpretations of knowledge widely discussed in the literature are analysed; the tasks that knowledge management solves in organizations and the tools that help put it into practice are then discussed. The second chapter investigates the knowledge management system architectures suggested by other researchers in the literature, classifies them, and examines examples of such systems and solutions, dividing them into commercial and open-source offerings. After this analysis of the knowledge management discipline and of existing knowledge management system architectures, the third chapter proposes an architecture for knowledge management systems centred on knowledge management processes. Then, specifications of the information and communication technologies that could be used... [to full text]
82

Reducing Size and Complexity of the Security-Critical Code Base of File Systems

Weinhold, Carsten 09 July 2014 (has links) (PDF)
Desktop and mobile computing devices increasingly store critical data, both personal and professional in nature. Yet the enormous code bases of their monolithic operating systems (hundreds of thousands to millions of lines of code) are likely to contain exploitable weaknesses that jeopardize the security of this data in the file system. Using a highly componentized system architecture based on a microkernel (or a very small hypervisor) can significantly improve security. The individual operating system components have smaller code bases and run in isolated address spaces, providing better fault containment. Their isolation also allows for smaller trusted computing bases (TCBs) of applications, comprising only a subset of all components. In my thesis, I built VPFS, a virtual private file system designed for such a componentized system architecture. It aims at reducing the amount of code and complexity that a file system implementation adds to the TCB of an application. The basic idea behind VPFS is similar to that of a VPN, which securely reuses an untrusted network: the core component of VPFS implements all functionality and cryptographic algorithms that an application needs to rely upon for confidentiality and integrity of file system contents. This security-critical core reuses a much more complex, and therefore untrusted, file system stack for non-critical functionality and access to the storage device. Additional trusted components ensure recoverability.
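A minimal sketch, in Python, of the VPN-like split described above: a small trusted core seals file blocks with authenticated encryption (AES-GCM via the cryptography package) before handing them to an untrusted store. This illustrates the idea, not code from VPFS; the class names and block layout are assumptions.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

class UntrustedStore:
    """Stands in for the complex, untrusted file system stack."""
    def __init__(self):
        self._blocks = {}
    def put(self, block_id: int, blob: bytes) -> None:
        self._blocks[block_id] = blob
    def get(self, block_id: int) -> bytes:
        return self._blocks[block_id]

class TrustedCore:
    """Small trusted component: all crypto the application relies on."""
    def __init__(self, store: UntrustedStore):
        self._store = store
        self._key = AESGCM.generate_key(bit_length=256)  # stays inside the TCB

    def write_block(self, block_id: int, data: bytes) -> None:
        nonce = os.urandom(12)
        # Binding the block id as associated data defeats block swapping.
        ct = AESGCM(self._key).encrypt(nonce, data, str(block_id).encode())
        self._store.put(block_id, nonce + ct)

    def read_block(self, block_id: int) -> bytes:
        blob = self._store.get(block_id)
        nonce, ct = blob[:12], blob[12:]
        # Raises InvalidTag if the untrusted store tampered with the block.
        return AESGCM(self._key).decrypt(nonce, ct, str(block_id).encode())

core = TrustedCore(UntrustedStore())
core.write_block(0, b"secret file contents")
assert core.read_block(0) == b"secret file contents"
```

Only the trusted core (and the AES-GCM implementation) must be relied upon for confidentiality and integrity; the store can be arbitrarily complex without entering the TCB.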
83

Assessing the maturity of information architectures for complex dynamic enterprise systems

Mykityshyn, Mark 14 November 2007 (has links)
This dissertation investigates the dynamics that underlie enterprise performance and takes a significant step toward showing how it might be predicted. In a novel approach, a comprehensive Enterprise System Architecture (ESA) is developed that introduces separate layers for strategic and operational processes, respectively. We identify four broad dimensions that contribute to and influence enterprise performance: (1) enterprise processes, (2) technology-based support of enterprise processes [denoted information systems], (3) technology structure and deployment [denoted information technology], and (4) Enterprise Architecture (EA). Detailed interviews were conducted with ten executives, mostly from the aerospace and defense industry, along with a web-based survey of aerospace and defense industry executives. We empirically determine a value for each maturity dimension and individually assess it as a predictor of enterprise performance. ESA maturity is calculated as the weighted summation of the dimensional maturities and is also evaluated as a predictor of enterprise performance. Results indicate that ESA maturity, the weighted summation of process maturity, information systems maturity, information technology maturity, and enterprise architecture maturity, is a good predictor of enterprise performance. To give our empirical results practical utility, we outline an ESA maturity assessment framework that enables decision-makers to assess the overall maturity of an enterprise system. Two other extensions of our research results, the development of a strategic-layer analysis and portrayal tool and enterprise system simulation, are also briefly described.
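As a concrete reading of the scoring scheme, the following sketch computes ESA maturity as the weighted summation of the four dimensional maturities. The weights and scores are invented for illustration; they are not the values estimated in the dissertation.

```python
def esa_maturity(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted summation of the dimensional maturities."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(weights[d] * scores[d] for d in scores)

scores = {"process": 3.2, "information_systems": 2.8,
          "information_technology": 3.5, "enterprise_architecture": 2.4}
weights = {"process": 0.30, "information_systems": 0.25,
           "information_technology": 0.25, "enterprise_architecture": 0.20}
print(esa_maturity(scores, weights))  # overall ESA maturity on the same scale
```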
84

Efficiency of hospitals : Evaluation of Cambio COSMIC system

Li, Haorui January 2007 (has links)
Healthcare has become a central concern in modern life. People pay ever more attention to protecting and treating their health, but at the same time they must bear the high cost of their care. Government revenue often cannot cover the large expense of the healthcare industry; in some developing countries in particular, healthcare has become a problem for national development. This thesis approaches the problem directly, by examining a channel for improving the efficiency of the healthcare system: Cambio COSMIC. The aim of this case study of COSMIC is to establish how the architect designed the system from the stakeholders' requirements so as to improve the efficiency of the healthcare system, and to identify how that success should be measured.
85

A Systematic Approach for Tool-Supported Performance Management of Engineering Education

Traikova, Aneta 26 November 2019 (has links)
Performance management of engineering education emerges from the need to assure proper training of future engineers in order to meet the constantly evolving expectations and challenges of the engineering profession. The process of accreditation ensures that engineering graduates are adequately prepared for their professional careers and responsibilities by verifying that they possess an expected set of mandatory graduate attributes. Accreditation bodies require engineering programs to have systematic performance management that informs a continuous improvement process. Unfortunately, the vast diversity of engineering disciplines, the variety of information systems, and the large number of actors involved make this task challenging and complex. We performed a systematic literature review of jurisdictions around the world that perform accreditation and examined how universities across Canada, the US and other countries have addressed tool support for performance management of engineering education. Our initial systematic approach for tool-supported performance management evolved from this review, and we then refined it through an iterative process combining action research and design science research. We developed a prototype, Graduate Attribute Information Analysis (GAIA), in collaboration with the School of Electrical Engineering and Computer Science at the University of Ottawa, to support a systematic approach for accreditation of three engineering programs. This thesis contributes a systematic approach, a tool that supports it, a set of related data transformations, and a tool-assessment checklist. Our systematic approach for tool-supported performance management addresses system architecture, a common continuous improvement process, and a common set of key performance indicators, and identifies the performance management forms and reports needed to analyze graduate attribute data. The data transformation and analysis techniques we demonstrate ensure the accurate analysis of statistical and historical trends.
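To make the data-transformation step concrete, the sketch below rolls rubric-level assessment records up into per-attribute achievement distributions, a typical transformation in graduate-attribute analysis. The field names and four-level scale are hypothetical, not GAIA's actual schema.

```python
from collections import Counter

LEVELS = ["below", "marginal", "meets", "exceeds"]

def attribute_distribution(records):
    """records: iterable of (attribute, level) pairs from course assessments."""
    by_attr = {}
    for attribute, level in records:
        by_attr.setdefault(attribute, Counter())[level] += 1
    # Normalize counts to percentages for the continuous-improvement report.
    return {a: {lvl: 100.0 * c[lvl] / sum(c.values()) for lvl in LEVELS}
            for a, c in by_attr.items()}

sample = [("problem-analysis", "meets"), ("problem-analysis", "exceeds"),
          ("communication", "marginal"), ("communication", "meets")]
print(attribute_distribution(sample))
```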
86

Data Processing and Collection in Distributed Systems

Andersson, Sara January 2021 (has links)
Distributed systems can be seen in a variety of applications in use today. Tritech provides several systems that to some extent consist of distributed systems of nodes. These nodes collect data, and that data has to be processed. A problem that often appears when designing such systems is deciding where the data should be processed, i.e., which architecture is the most suitable for the system. Deciding on the architecture is not simple, especially since the answer changes rather quickly with developments in these areas. The thesis studies which factors affect the choice of architecture in a distributed system and how these factors relate to each other. To analyze which factors affect the choice of architecture, and to what extent, a simulator was implemented. The simulator receives information about the factors as input and returns one or several architecture configurations as output. The input factors were chosen through qualitative interviews. The factors analyzed in the thesis were: security, storage, working memory, size of data, number of nodes, data processing per data set, robust communication, battery consumption, and cost. From the qualitative interviews and the prestudy, five architecture configurations were chosen: thin-client server, thick-client server, three-tier client-server, peer-to-peer, and cloud computing. The simulator was validated against the three given use cases: agriculture, the train industry, and the industrial Internet of Things, using five existing projects from Tritech. In the validation, the simulator produced correct results for three of the five projects. The simulator results show which factors affect the choice of architecture more than others and which are hard to provide in the same architecture because they conflict: security conflicts with working memory and with robust communication, and working memory conflicts with battery consumption. According to the simulator, the factors that most affect the choice of architecture are therefore working memory, battery consumption, security, and robust communication. Using the simulator results, a decision matrix was designed to facilitate the choice of architecture (a sketch of the idea follows below). The decision matrix was evaluated on four Tritech projects covering the three given use cases; the evaluation showed that, of the two architectures that received the most points, one was the architecture used in the validated project.
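The sketch below illustrates the decision-matrix idea: each candidate architecture is scored as a weighted sum of how well it provides each factor, and the highest-scoring architectures are recommended. All weights and suitability scores are invented for illustration and do not reproduce the thesis's matrix.

```python
ARCHITECTURES = ["thin-client server", "thick-client server",
                 "three-tier client-server", "peer-to-peer", "cloud computing"]

# How well each architecture provides each factor, on a 1-5 scale (assumed).
SUITABILITY = {
    "security":             {"thin-client server": 4, "thick-client server": 3,
                             "three-tier client-server": 4, "peer-to-peer": 2,
                             "cloud computing": 3},
    "working memory":       {"thin-client server": 5, "thick-client server": 2,
                             "three-tier client-server": 4, "peer-to-peer": 2,
                             "cloud computing": 5},
    "battery consumption":  {"thin-client server": 4, "thick-client server": 2,
                             "three-tier client-server": 3, "peer-to-peer": 2,
                             "cloud computing": 4},
    "robust communication": {"thin-client server": 2, "thick-client server": 4,
                             "three-tier client-server": 3, "peer-to-peer": 4,
                             "cloud computing": 2},
}

def rank(weights):
    """weights: importance of each factor for the project (e.g. 0-5)."""
    totals = {a: sum(w * SUITABILITY[f][a] for f, w in weights.items())
              for a in ARCHITECTURES}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# A battery-constrained IoT project that still needs robust communication:
print(rank({"security": 2, "working memory": 3,
            "battery consumption": 5, "robust communication": 4}))
```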
87

Analysis of Verification and Validation Techniques for Educational CubeSat Programs

Weitz, Noah 01 May 2018 (has links) (PDF)
Since their creation, CubeSats have become a valuable educational tool for university science and engineering programs. Unfortunately, while aerospace companies invest resources to develop verification and validation methodologies based on larger-scale aerospace projects, university programs tend to focus their resources on spacecraft development. This paper looks at two different types of methodologies in an attempt to improve CubeSat reliability: generating software requirements and utilizing system and software architecture modeling. Both the Consortium Requirements Engineering (CoRE) method for software requirements and the Monterey Phoenix modeling language for architecture modeling were tested for usability in the context of PolySat, Cal Poly's CubeSat research program. In the end, neither CoRE nor Monterey Phoenix provided the desired results for improving PolySat's current development procedures. While the modified version of CoRE discussed in this paper does allow basic software requirements to be generated, the resulting specification provides no more granularity than PolySat's current institutional knowledge. Furthermore, while Monterey Phoenix is a good tool for introducing students to model-based systems engineering (MBSE) concepts, the graphs generated for a PolySat-specific project were high-level and did not reveal any of the issues previously discovered through trial-and-error methodologies. While neither method works for PolySat, these results should still benefit university programs looking to begin developing CubeSats.
88

The Architecture of Blockchain System across the Manufacturing Supply Chain

Lu, Zheyi January 2018 (has links)
With the increasing popularity of blockchain, the technology underlying cryptocurrencies, the decentralizing potential of the blockchain technique is driving a new wave across the manufacturing industry. This paper introduces how to use the blockchain technique as a tool for solving supply-chain-related tasks in the manufacturing industry and for driving quantum leaps in efficiency, agility and innovation compared with traditional centralized management systems. The paper introduces the blockchain technique, its value properties, and the manufacturing industry's requirements for it. It also presents a blockchain architecture based on the manufacturing supply chain management mechanism, describing its characteristics, consensus algorithms, smart contracts, network, scalability and databases, and concludes with a practical supply-chain Dapp built upon this architecture (a minimal sketch of the underlying data structure follows below).
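As a minimal illustration of the data structure such an architecture rests on, the sketch below hash-links blocks that carry supply-chain events, so that tampering with any earlier block breaks the chain. The event fields are assumptions, not the schema proposed in the paper.

```python
import hashlib, json, time

def block_hash(block: dict) -> str:
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def new_block(prev, events):
    return {
        "index": 0 if prev is None else prev["index"] + 1,
        "timestamp": time.time(),
        "events": events,  # e.g. shipments, inspections, handovers
        "prev_hash": "0" * 64 if prev is None else block_hash(prev),
    }

genesis = new_block(None, [])
b1 = new_block(genesis, [{"part": "A-113", "from": "supplier", "to": "plant"}])
# Any tampering with the genesis block now invalidates the link:
assert b1["prev_hash"] == block_hash(genesis)
```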
89

Разработка торговой стратегии криптовалют для определения точек входа и выхода из торговых позиций на основе алгоритмов машинного обучения : магистерская диссертация / Development of a cryptocurrency trading strategy to determine entry and exit points for trading positions based on machine learning algorithms

Першин, А. Д., Pershin, A. D. January 2023 (has links)
The objects of this study are machine learning algorithms and methods and their application to time-series forecasting and text analysis. The study proposes a modified recurrent neural network (LSTM) architecture to predict the closing price of cryptocurrency quotes one day ahead, and applies classification algorithms (logistic regression, Linear SVC, Gradient Boosting) to determine the emotional label of a news entry, in order to develop a strategy for predicting entry and exit points for trading positions in the cryptocurrency market. The study focuses on showing that using machine learning methods and algorithms to create such a trading strategy increases the efficiency of the trading process and speeds up the collection and processing of analytical data for technical market analysis. To train the models, software tools (parsers) were developed that extract data from the Binance cryptocurrency trading exchange and the CryptoPanic cryptocurrency social network. Experimental results show that, on average, the automated process of determining entry and exit points is twice as fast as manual determination, and that the number of transactions increases by about 17.5%. We conclude that advanced technologies make it possible to develop a tool that improves the efficiency of cryptocurrency trading. (A minimal sketch of the forecasting setup follows below.)
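A minimal sketch of the next-day forecasting setup, using a plain Keras LSTM on a sliding window of closing prices. The window size, layer sizes and synthetic data are assumptions; the thesis's modified LSTM architecture and the news-sentiment features are not reproduced here.

```python
import numpy as np
from tensorflow import keras

WINDOW = 30  # days of closing prices used to predict the next day

def make_windows(prices):
    X = np.stack([prices[i:i + WINDOW] for i in range(len(prices) - WINDOW)])
    y = prices[WINDOW:]
    return X[..., None], y  # LSTM expects (samples, timesteps, features)

prices = (np.cumsum(np.random.randn(500)) + 100.0).astype("float32")
X, y = make_windows(prices)  # synthetic stand-in for Binance closing prices

model = keras.Sequential([
    keras.layers.Input(shape=(WINDOW, 1)),
    keras.layers.LSTM(64),
    keras.layers.Dense(1),  # next-day closing price
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[-1:], verbose=0))  # predicted close for the next day
```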
90

Towards a comprehensive knowledge management system architecture

Smuts, Johanna Louisa 11 1900 (has links)
Knowledge management has roots in a variety of disciplines, such as philosophy, psychology, the social sciences, management sciences and computing. As a result, a wide variety of theories and definitions of knowledge and knowledge management are used in the literature. Irrespective of the theory or definition used, it is recognised that expert knowledge and insight are gained through experience and practice and that knowledge is a key differentiator as an organisational asset. This shift to knowledge as the primary source of value means that the new economy is led by those who manage knowledge effectively. Today’s organisations are creating and leveraging knowledge, data and information at an unprecedented pace – a phenomenon that makes the use of technology not an option but a necessity. Technology enables employees to deal with multifaceted environments and problems and makes it possible for organisations to expand their knowledge creation capacity. Software tools in knowledge management are a collection of technologies and are not necessarily acquired as a single software solution. Moreover, these knowledge management software tools have the advantage of using the organisation’s existing information technology infrastructure. Organisations and business decision makers spend a great deal of resources and make significant investments in the latest technology, systems and infrastructure to support knowledge management. It is imperative that these investments are validated properly and made wisely, and that the most appropriate technologies and software tools are selected or combined to facilitate knowledge management. The purpose of this interpretive case study is to consider these issues and to develop an understanding of the key characteristics of a knowledge management system architecture by exploring and describing the nature of knowledge management. Based on the findings of this study, a list of key characteristics with which a knowledge management solution must comply was collated, expanding the existing knowledge management model towards describing a knowledge management system architecture. / Computing / M.Sc. (Information Systems)
