331
Vizualizace rozsáhlých grafových dat na webu / Large Graph Data Visualisation on the Web
Jarůšek, Tomáš. January 2020
Graph databases provide a form of data storage that is fundamentally different from the relational model. The goal of this thesis is to visualize such data and to determine the maximum volume that current web browsers are able to process at once. For this purpose, an interactive web application was implemented. Data are stored using the RDF (Resource Description Framework) model, which represents them as triples of the form subject - predicate - object. Communication between this database, which runs on the server, and the client is realized via a REST API. The client itself is implemented in JavaScript. Visualization is performed using the HTML canvas element and can be carried out in different ways by applying three specially designed methods: greedy, greedy-swap and force-directed. The resulting limits were determined primarily by measuring the time complexity of the individual parts and depend heavily on the user's goals. If it is necessary to visualize as much data as possible, 150,000 triples proved to be the limiting volume. If, on the other hand, the goal is maximum quality and application smoothness, the limit does not exceed a few thousand triples.
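For readers unfamiliar with the force-directed approach mentioned above, the following sketch shows one layout iteration over an RDF-style edge list using a simple repulsion-plus-spring model. It is a hedged illustration of the general technique only; the interfaces, constants and node names are invented here and are not taken from the thesis's application.

```typescript
// One force-directed layout step over an RDF-style edge list, using a basic
// repulsion-plus-spring model; interfaces and constants are illustrative only.
interface Triple { subject: string; predicate: string; object: string; }
interface NodePos { x: number; y: number; }

function layoutStep(triples: Triple[], pos: Map<string, NodePos>, repulsion = 1000, spring = 0.01): void {
  const force = new Map<string, { fx: number; fy: number }>();
  for (const id of pos.keys()) force.set(id, { fx: 0, fy: 0 });

  // Repulsive force between every pair of nodes.
  const ids = [...pos.keys()];
  for (let i = 0; i < ids.length; i++) {
    for (let j = i + 1; j < ids.length; j++) {
      const a = pos.get(ids[i])!, b = pos.get(ids[j])!;
      const dx = a.x - b.x, dy = a.y - b.y;
      const d2 = Math.max(dx * dx + dy * dy, 0.01);
      const f = repulsion / d2;
      force.get(ids[i])!.fx += f * dx; force.get(ids[i])!.fy += f * dy;
      force.get(ids[j])!.fx -= f * dx; force.get(ids[j])!.fy -= f * dy;
    }
  }

  // Attractive (spring) force along every subject-object edge.
  for (const t of triples) {
    const s = pos.get(t.subject), o = pos.get(t.object);
    if (!s || !o) continue;
    const dx = o.x - s.x, dy = o.y - s.y;
    force.get(t.subject)!.fx += spring * dx; force.get(t.subject)!.fy += spring * dy;
    force.get(t.object)!.fx -= spring * dx; force.get(t.object)!.fy -= spring * dy;
  }

  // Move every node a small step in the direction of its net force.
  for (const [id, p] of pos) {
    const f = force.get(id)!;
    p.x += 0.05 * f.fx;
    p.y += 0.05 * f.fy;
  }
}

// Example: iterate the layout for a tiny graph; in a browser the resulting
// positions would then be drawn onto a canvas element on every animation frame.
const triples: Triple[] = [
  { subject: "alice", predicate: "knows", object: "bob" },
  { subject: "bob", predicate: "worksAt", object: "acme" },
];
const positions = new Map<string, NodePos>([
  ["alice", { x: 0, y: 0 }], ["bob", { x: 10, y: 0 }], ["acme", { x: 5, y: 8 }],
]);
for (let i = 0; i < 100; i++) layoutStep(triples, positions);
console.log(positions);
```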
332
OPTIMIZATION OF NON-VIRAL GENE DELIVERY SYSTEM FOR IMAGE-GUIDED THERAPY FOR TRIPLE NEGATIVE BREAST CANCER
Schilb, Andrew L. 30 August 2021
No description available.
333
Simulace a měření služeb Triple Play v sítích FTTx / Simulation and measurement of triple play services in FTTx networks
Horváth, Tomáš. January 2013
The aim of this thesis was to become acquainted with measurement methods for passive optical networks, to create a simulation model corresponding to the measured network, and to compare the results. The first chapter deals with Triple Play services, i.e. the transmission of television signals, voice and data. Each service has its own subchapter with a detailed description of the parameters associated with its transmission. The second chapter of the theoretical part analyses passive optical networks according to the termination point of the optical fiber (FTTx). Part of this chapter describes the standard passive optical networks APON, BPON, GPON and EPON; the EPON standard is explained in detail. The theoretical part concludes with a description of the active elements that form the laboratory network under test. Testing of the designed passive optical network is performed in the practical part of the thesis. The measurements are divided into two chapters. The first covers measurement of the network without active elements, using a direct measurement method and a reflectometric method. The second chapter of the practical part covers measurement of quality of service (QoS) parameters and is divided into subchapters on measurement according to RFC 2544, measurement according to ITU-T Y.1564 (EtherSAM) and bit error rate testing (BERT). The practical part concludes with a simulation model of the network, designed in OptSim 5.2, whose aim was to resemble the real network as closely as possible so that the results could be compared.
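As a rough illustration of what an RFC 2544 throughput test does (not the behaviour of the tester used in the thesis), the sketch below binary-searches the highest zero-loss load for each standard frame size; the device under test is simulated, so the numbers are placeholders.

```typescript
// Sketch of the RFC 2544 throughput procedure: for each standard frame size,
// binary-search the highest offered load (as % of line rate) with zero frame loss.
// The device under test below is simulated, so the results are placeholders.
const FRAME_SIZES = [64, 128, 256, 512, 1024, 1280, 1518]; // bytes, per RFC 2544

// Simulated trial: pretend the device starts dropping frames above a load threshold.
function trialHasLoss(frameSize: number, loadPercent: number): boolean {
  const threshold = frameSize < 128 ? 90 : 95; // small frames stress the device more
  return loadPercent > threshold;
}

function throughputPercent(frameSize: number, resolution = 0.5): number {
  let low = 0;
  let high = 100;
  while (high - low > resolution) {
    const mid = (low + high) / 2;
    if (trialHasLoss(frameSize, mid)) high = mid; else low = mid;
  }
  return low; // highest zero-loss load found
}

for (const size of FRAME_SIZES) {
  console.log(`frame ${size} B: throughput ~ ${throughputPercent(size).toFixed(1)} % of line rate`);
}
```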
334
Simulace Triple play služeb v pasivních optických sítích v prostředí OMNeT++ / Simulation of Triple Play services in passive optical networks with OMNeT++ application
Puchrík, Matej. January 2015
The thesis deals with dynamic bandwidth allocation in passive optical networks of the NG-PON2 standard. It also describes so-called triple play services, and the practical part simulates these services in an NG-PON2 passive optical network in the OMNeT++ simulation environment. As part of this work, modules for NG-PON2 passive optical networks were created as an extension of the INET project, namely ONU, OLT and splitter modules. The first four chapters are theoretical: they describe older PON standards, NG-PON2 networks and the DBA algorithm, and then cover triple play services and their current status. A further part describes the OMNeT++ program and the structure of the simulation models. The practical part contains a description of the modules, the implementation of the DBA algorithm and its modification, the design of the simulated topology and a detailed description of the simulation configuration. The end of the practical part presents the simulation results with the corresponding explanations.
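To make the dynamic bandwidth allocation (DBA) idea concrete, here is a minimal report/grant sketch assuming a simple proportional scheme; the interfaces are invented for illustration and do not reproduce the thesis's OMNeT++/INET modules or the particular DBA variant it implements.

```typescript
// A report/grant cycle of a simple proportional DBA scheme; the interfaces are
// invented for illustration and do not mirror the thesis's OMNeT++/INET modules.
interface OnuReport {
  onuId: number;
  queuedBytes: number;  // upstream bytes the ONU reports as waiting
}

interface Grant {
  onuId: number;
  grantedBytes: number; // bytes the OLT allows the ONU to send next cycle
}

function allocateGrants(reports: OnuReport[], cycleCapacityBytes: number): Grant[] {
  const totalDemand = reports.reduce((sum, r) => sum + r.queuedBytes, 0);
  if (totalDemand <= cycleCapacityBytes) {
    // Enough capacity: every ONU gets exactly what it requested.
    return reports.map((r) => ({ onuId: r.onuId, grantedBytes: r.queuedBytes }));
  }
  // Congestion: share the cycle proportionally to the reported demand.
  return reports.map((r) => ({
    onuId: r.onuId,
    grantedBytes: Math.floor((r.queuedBytes / totalDemand) * cycleCapacityBytes),
  }));
}

// Example cycle: three ONUs competing for a 10 kB upstream window.
console.log(allocateGrants(
  [{ onuId: 1, queuedBytes: 8000 }, { onuId: 2, queuedBytes: 4000 }, { onuId: 3, queuedBytes: 2000 }],
  10000,
));
```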
335
Implementace triple-play služeb v heterogenní síti / Implementation of triple-play in heterogeneous network
Obršlík, Lukáš. January 2014
This master's thesis deals with implementing triple-play services and ensuring their quality of service in heterogeneous communication networks. The aim of the thesis is to apply theoretical methods to a real case and an existing network infrastructure. The practical part creates a technical solution that prioritizes network traffic based on the classification of the required services. The solution is designed so that further functions can be added and scalability is preserved.
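As a hedged illustration of service-based traffic classification (the abstract does not specify the platform or marking scheme used), the sketch below maps triple-play service classes to commonly used DiffServ code points; the port numbers and DSCP values are typical defaults, not the thesis's configuration.

```typescript
// Illustrative service classification with commonly used DiffServ code points;
// the port ranges and DSCP values are typical defaults, not the thesis's setup.
type Service = "voice" | "video" | "data";

const DSCP: Record<Service, number> = {
  voice: 46, // EF: lowest latency and jitter for VoIP
  video: 34, // AF41: IPTV / streaming video
  data: 0,   // best effort: ordinary internet traffic
};

function classify(dstPort: number): Service {
  if (dstPort === 5060 || (dstPort >= 16384 && dstPort <= 32767)) return "voice"; // SIP and a typical RTP range
  if (dstPort === 554) return "video"; // RTSP
  return "data";
}

// Example: mark a packet destined for an RTP port.
console.log(`DSCP for port 20000: ${DSCP[classify(20000)]}`);
```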
336
Využití optických zesilovačů v sítích NG-PON2 / Using Optical Amplifiers in NG-PON2 Networks
Hrmel, Martin. January 2014
The aim of the thesis was to introduce the ITU-T G.989 standard. The introduction deals with passive optical network architecture and clarifies the basic functional elements that occur in optical networks; the active components and the principle of data transfer in the distribution network are also described. The next section is devoted to the new technologies defined for the latest generation of passive networks, NG-PON2, and describes the principle of data transmission in such networks together with their advantages and disadvantages. The practical part deals with creating a functional NG-PON2 network that also uses optical amplifiers. A further proposal was digital CATV transmission with QAM modulation over the optical network on a dedicated wavelength of 1550 nm. The coexistence of this new standard with previous PON technologies was also simulated. Finally, the work implemented the Miller code in the OptSim v5.2 environment and compared the transmission characteristics of the other line codes with the Miller code. All practical simulations were designed and run in OptSim v5.2.
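For readers unfamiliar with the Miller (delay modulation) line code mentioned above: a '1' produces a transition in the middle of the bit period, while a '0' produces no transition unless it is followed by another '0', in which case the level toggles at the bit boundary. The sketch below illustrates only this coding rule and is unrelated to the thesis's OptSim v5.2 model.

```typescript
// Miller (delay modulation) encoder sketch with two logic levels per bit;
// an illustration of the coding rule, not the thesis's OptSim v5.2 model.
function millerEncode(bits: number[]): number[] {
  const halves: number[] = []; // two half-bit levels (0/1) per input bit
  let level = 0;               // current line level

  for (let i = 0; i < bits.length; i++) {
    if (bits[i] === 1) {
      // '1': transition in the middle of the bit period.
      halves.push(level, level ^ 1);
      level ^= 1;
    } else {
      // '0': no mid-bit transition ...
      halves.push(level, level);
      // ... but toggle at the bit boundary if another '0' follows.
      if (bits[i + 1] === 0) {
        level ^= 1;
      }
    }
  }
  return halves;
}

// Example: encode the sequence 1 0 0 1 1 0.
console.log(millerEncode([1, 0, 0, 1, 1, 0]));
```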
337
Návrh výroby součásti polohovacího mechanismu / Design of manufacturing technology for trackpoint part
Garguláková, Lucie. January 2009
The thesis focuses on fineblanking technology. It deals with the production of a part that will be used in a mechanism for adjusting the position of car seats. The expected production volume is 800,000 pieces per year. With regard to the quantity and the required quality and accuracy of the blank, the fineblanking process with a pressure edge was chosen. The part will be produced on a triple-action press made by Feintool AG Lyss, the GKP – F 160 with a mechanical-hydraulic drive. This press, together with a specially designed blanking tool, produces two blanks per stroke.
338
Corporate Social Responsibility: A brilliant term: but what is the point?
Cederholm, Christoffer; Drijovski, Oliver. January 2011
Syfte: Denna uppsats syftar till att kartlägga begreppet CSR och redogöra för dess innebörd. Uppsatsen syftar även till att förklara varför företag arbetar med CSR samt hur kunder förhåller sig till detta.
Metod: Vi har gjort en multipel fallstudie på företagen Electrolux AB, Swedbank AB och Swedish Match AB. Vi har genomfört telefonintervjuer med respektive företag samt gjort en kundundersökning för att se vad kunderna anser om företagens samhällsansvar. Uppsatsen bygger på en abduktiv forskningsansats och en kvalitativ undersökningsdesign.
Teoretisk referensram: I den teoretiska referensramen har vi kartlagt vad begreppet CSR betyder och vilka ansvarsområden som omfattas av begreppet. Vidare har vi lyft fram olika åsikter om CSR och beskrivit redovisningen av CSR samt redogjort för kopplingen mellan CSR och ekonomisk lönsamhet. Vi har även beskrivit två relevanta teorier som påverkar företagens CSR-arbete, nämligen intressentteorin och legitimitetsteorin.
Resultat och slutsats: CSR är ett begrepp som används för att beskriva företagens samhällsansvar. Begreppet omfattar ekonomiska, sociala, miljömässiga, etiska och legala aspekter, vilka samtliga måste samverka med varandra för att uppnå hållbarhet. CSR uppfattas positivt av kunderna och kan därmed ge företagen konkurrensfördelar samt bidra till en god ekonomisk lönsamhet.
/
Purpose: The purpose of this paper is to examine the term CSR and analyze the meaning of the term. The paper also aims to explain why companies work with CSR and how customers in general are affected by CSR.
Methodology: We have done a multiple-case study at Electrolux AB, Swedbank AB, and Swedish Match AB. We performed a telephone interview with the person responsible for CSR at each company. We also did a customer survey in order to find out what the customers think about the companies' social responsibility. The paper is based on abductive reasoning and a qualitative research design.
Theoretical perspectives: The theoretical framework describes the term CSR, different aspects and opinions about CSR, sustainability reporting, and the correlation between CSR and economic viability. We have also described two relevant theories, the stakeholder theory and the legitimacy theory, which both affect the companies' work with CSR.
Conclusions: CSR is a term which describes companies' responsibility towards society. The term CSR includes economic, social, environmental, ethical, and legal aspects. All of these aspects must be considered and interact with each other to reach the ultimate goal, sustainability. CSR is essential to customers and may therefore give responsible companies a competitive advantage and contribute to long-term economic viability.
339
Environnement d'assistance au développement de transformations de graphes correctes / Assistance framework for writing correct graph transformations
Makhlouf, Amani. 08 February 2019
Les travaux de cette thèse ont pour cadre la vérification formelle, et plus spécifiquement le projet ANR Blanc CLIMT (Categorical and Logical Methods in Model Transformation) dédié aux grammaires de graphes. Ce projet, qui a démarré en février 2012 pour une durée de 48 mois, a donné lieu à la définition du langage Small-tALC, bâti sur la logique de description ALCQI. Ce langage prend la forme d’un DSL (Domain Specific Language) impératif à base de règles, chacune dérivant structurellement un graphe. Le langage s’accompagne d’un composant de preuve basé sur la logique de Hoare chargé d’automatiser le processus de vérification d’une règle. Cependant, force est de constater que tous les praticiens ne sont pas nécessairement familiers avec les méthodes formelles du génie logiciel et que les transformations sont complexes à écrire. En particulier, ne disposant que du seul prouveur, il s’agit pour le développeur Small-tALC d’écrire un triplet de Hoare {P} S {Q} et d’attendre le verdict de sa correction sous la forme d’un graphe contre-exemple en cas d’échec. Ce contre-exemple est parfois difficile à décrypter, et ne permet pas de localiser aisément l’erreur au sein du triplet. De plus, le prouveur ne valide qu’une seule règle à la fois, sans considérer l’ensemble des règles de transformation et leur ordonnancement d’exécution. Ce constat nous a conduits à proposer un environnement d’assistance au développeur Small-tALC. Cette assistance vise à l’aider à rédiger ses triplets et à prouver ses transformations, en lui offrant plus de rétroaction que le prouveur. Pour ce faire, les instructions du langage ont été revisitées selon l’angle ABox et TBox de la logique ALCQI. Ainsi, conformément aux logiques de description, la mise à jour du graphe par la règle s’assimile à la mise à jour ABox des individus (les nœuds) et de leurs relations (les arcs) d’un domaine terminologique TBox (le type des nœuds et les étiquettes des arcs) susceptible d’évoluer. Les contributions de cette thèse concernent : (1) un extracteur de préconditions ABox à partir d’un code de transformation S et de sa postcondition Q pour l’écriture d’une règle {P} S {Q} correcte par construction, (2) un raisonneur TBox capable d’inférer des propriétés sur des ensembles de nœuds transformés par un enchaînement de règles {Pi} Si {Qi}, et (3) d’autres diagnostics ABox et TBox sous la forme de tests afin d’identifier et de localiser des problèmes dans les programmes. L’analyse statique du code de transformation d’une règle, combinée à un calcul d’alias des variables désignant les nœuds du graphe, permet d’extraire un ensemble de préconditions ABox validant la règle. Les inférences TBox pour un enchaînement de règles résultent d’une analyse statique par interprétation abstraite des règles ABox afin de vérifier formellement des états du graphe avant et après les appels des règles. A ces deux outils formels s’ajoutent des analyseurs dynamiques produisant une batterie de tests pour une règle ABox, ou un diagnostic TBox pour une séquence de règles / The overall context of this thesis is formal verification, and more specifically the ANR Blanc CLIMT project (Categorical and Logical Methods in Model Transformation) dedicated to graph grammars. This project, which started in February 2012 for 48 months, gave rise to the development of the Small-tALC language, a graph transformation language based on the ALCQI description logic.
This language takes the form of an imperative, rule-based DSL (Domain Specific Language) in which each rule structurally rewrites a graph. It comes with a proof component based on Hoare logic designed to automate the verification of a rule. However, not all developers are familiar with formal methods of software engineering, and graph transformations are complex to write. In particular, with only the prover available, the Small-tALC developer must write a Hoare triple {P} S {Q} and wait for the verdict, which in case of failure takes the form of a counterexample graph. This counterexample is sometimes difficult to interpret and does not make it easy to locate the error within the triple. Moreover, the prover validates only one rule at a time, without considering the whole set of transformation rules and their execution order. This observation led us to propose an assistance framework for Small-tALC that helps developers write their triples and prove their transformations, providing them with more feedback than the prover does. To this end, the Small-tALC instructions have been revisited from the ABox and TBox perspectives of the ALCQI logic. Thus, in accordance with description logics, updating the graph by a rule corresponds to the ABox update of individuals (nodes) and their relationships (edges) within a TBox terminological domain (node concepts and edge labels) that may itself evolve. The contributions of this thesis are: (1) an extractor of ABox preconditions from a transformation code S and its postcondition Q, in order to produce a rule {P} S {Q} that is correct by construction, (2) a TBox reasoner able to infer properties of the sets of nodes transformed by a rule sequence {Pi} Si {Qi}, and (3) further ABox and TBox diagnostics based on tests, to identify and locate errors in programs. The static analysis of a rule's transformation code, combined with an alias calculus of the variables designating the graph nodes, makes it possible to extract a set of ABox preconditions that validate the rule. The TBox inferences for a sequence of rules result from a static analysis, by abstract interpretation, of the ABox rules; these inferences formally check graph states before and after rule calls. Besides these two formal tools, the framework features dynamic analyzers that produce a battery of tests for an ABox rule, or a TBox diagnosis for a sequence of rules.
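To give a flavour of what a triple {P} S {Q} over a graph means in practice, here is a test-style sketch in the spirit of the dynamic diagnostics described above: it runs a rewrite S on a sample graph and reports a counterexample graph if the postcondition fails. It is purely illustrative; it is not Small-tALC syntax, and every name in it is invented for the example.

```typescript
// Test-style check of a Hoare triple {P} S {Q} over a tiny edge-labelled graph.
// Purely illustrative: this is not Small-tALC syntax, and all names are invented.
type Graph = { nodes: Set<string>; edges: Array<[string, string, string]> }; // [from, label, to]
type Predicate = (g: Graph) => boolean;
type Rule = (g: Graph) => Graph;

// S: a transformation that links every "person*" node to the node "team1".
const addMembership: Rule = (g) => ({
  nodes: new Set(g.nodes),
  edges: [
    ...g.edges,
    ...[...g.nodes]
      .filter((n) => n.startsWith("person"))
      .map((n): [string, string, string] => [n, "memberOf", "team1"]),
  ],
});

// P: the target node must already exist.  Q: every person has a memberOf edge.
const pre: Predicate = (g) => g.nodes.has("team1");
const post: Predicate = (g) =>
  [...g.nodes]
    .filter((n) => n.startsWith("person"))
    .every((n) => g.edges.some(([from, label]) => from === n && label === "memberOf"));

// Returns null if the triple holds on the sample graph, or the failing graph otherwise.
function checkTriple(sample: Graph, P: Predicate, S: Rule, Q: Predicate): Graph | null {
  if (!P(sample)) return null; // samples violating P say nothing about the triple
  const result = S(sample);
  return Q(result) ? null : result; // a counterexample graph, as the prover would report
}

const sample: Graph = { nodes: new Set(["person1", "person2", "team1"]), edges: [] };
console.log(checkTriple(sample, pre, addMembership, post)); // null: no counterexample found
```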
340
Entwurf eines Frameworks für komplexe Events auf verteilten, reaktiven Datenbanken und Triple-Stores / Design of a Framework for Complex Events on Distributed, Reactive Databases and Triple Stores
Freudenberg, Markus. 02 February 2018
This thesis presents a framework for complex events based on event-condition-action (ECA) rules over distributed data sources. Atomic and complex events can be defined in a decentralized way via an administrative application. To execute actions of any kind in response to an event that has occurred, the functionality of SOAP endpoints can be used. By presenting a simple active wrapper around the passive triple stores of a Virtuoso database, the thesis demonstrates one way of building reactive triple stores.
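As a rough illustration of the event-condition-action pattern the framework is built on (without its SOAP or Virtuoso interfaces, which are not reproduced here), a minimal in-memory rule engine might look as follows; all names are invented for the example.

```typescript
// Minimal in-memory event-condition-action engine; the rule, event and action
// shapes are invented for this example and are not the thesis's API.
interface BusEvent {
  type: string;
  payload: Record<string, string>;
}

interface EcaRule {
  name: string;
  on: string;                           // event type the rule subscribes to
  condition: (e: BusEvent) => boolean;  // C: guard evaluated on the incoming event
  action: (e: BusEvent) => void;        // A: e.g. a call to a SOAP endpoint
}

class RuleEngine {
  private rules: EcaRule[] = [];

  register(rule: EcaRule): void {
    this.rules.push(rule);
  }

  // Dispatch an incoming (atomic) event to every matching rule.
  dispatch(event: BusEvent): void {
    for (const rule of this.rules) {
      if (rule.on === event.type && rule.condition(event)) {
        rule.action(event);
      }
    }
  }
}

// Example: react to an inserted triple whose predicate is "temperature".
const engine = new RuleEngine();
engine.register({
  name: "highTemperatureAlert",
  on: "tripleInserted",
  condition: (e) => e.payload.predicate === "temperature" && Number(e.payload.object) > 30,
  action: (e) => console.log("would notify a SOAP endpoint about", e.payload),
});
engine.dispatch({
  type: "tripleInserted",
  payload: { subject: "sensor1", predicate: "temperature", object: "35" },
});
```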