651 |
Design počítačového tomografu / Design of Computer Tomography Scanner
Ronzová, Gabriela January 2013
The presented master's thesis concerns an original design concept of a CT scanner that meets basic technical, ergonomic and social requirements while also bringing a new look and shape as a solution to the main topic.
|
652 |
Skidová dopravníková technika / Skid conveyor technology
Fajkus, Pavel January 2014
This diploma thesis deals with the planning of conveyor technology in the welding factory in Kvasiny and with a concept proposal of stacking equipment for skids. The report contains a summary of the conveyor technology used in Škoda Auto a.s., a description of the planning process for the connection line from the new hall to the existing hall, a proposal of constructional variants of the stacking equipment, a load and stress analysis of the mechanism, the resulting internal effects, and a verification of the selected components by the finite element method (FEM).
|
653 |
Matematické metody zabezpečení přenosu digitálních dat / Mathematical security methods in digital data transfer
Bartušek, Petr January 2014
This master's thesis deals with the analysis of digital data security using CRC codes. It describes the principles of coding theory, in particular data protection with CRC, for which it explains the mathematical principle of encoding and decoding, the software implementation, and frequently used generator polynomials. The main aim of the thesis is to test for undetected errors and to determine their number, which is then used to compute the probability with which undetected errors can occur. The thesis is supplemented with several programs written in Matlab.
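The CRC encoding, decoding and undetected-error behaviour described above can be sketched in a few lines (an illustrative example, not the thesis's Matlab programs; the short generator polynomial x^3 + x + 1 is chosen for brevity):

```python
def crc_remainder(bits, poly):
    """Polynomial long division over GF(2); returns the remainder bits."""
    bits = bits + [0] * (len(poly) - 1)   # append zero check bits
    for i in range(len(bits) - len(poly) + 1):
        if bits[i]:
            for j, p in enumerate(poly):
                bits[i + j] ^= p
    return bits[-(len(poly) - 1):]

poly = [1, 0, 1, 1]            # x^3 + x + 1 (illustrative CRC-3)
msg  = [1, 1, 0, 1, 0, 1]
crc  = crc_remainder(msg, poly)
codeword = msg + crc
assert crc_remainder(codeword, poly) == [0, 0, 0]   # valid codeword

# An error pattern that is itself a multiple of the generator
# polynomial goes undetected: XORing the codeword with a shifted
# copy of the generator leaves the remainder at zero.
err = [1, 0, 1, 1, 0, 0, 0, 0, 0]
corrupted = [b ^ e for b, e in zip(codeword, err)]
assert crc_remainder(corrupted, poly) == [0, 0, 0]  # undetected error
```

Exactly this class of error patterns (multiples of the generator polynomial) is what an exhaustive count of undetected errors enumerates.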
|
654 |
Přesuvna / Transfer table
Bartel, Jindřich January 2016
This diploma thesis deals with the design of a transfer table for steel sheets, with lifting provided by screw jacks. The thesis contains a technical survey of electromechanical and hydraulic transfer table designs. The main part is a description of the conceptual solutions and of the selected components. The thesis includes calculations of the travel drive and the lifting mechanism, and the design and verification of the wheel blocks and the traversing chain. It concludes with a finite element strength analysis determining the maximum stress and strain, and is accompanied by drawings.
|
655 |
Mapování vyhledávacích tabulek z jazyka P4 do technologie FPGA / Mapping of Match Tables from P4 Language to FPGA Technology
Kekely, Michal January 2016
This thesis deals with the design and implementation of a mapping of match action tables from the P4 language to FPGA technology. The goal was to describe the key principles and algorithms that such a mapping requires, to apply them in an implementation, and to analyze the speed and memory requirements of that implementation. The outcome is a configurable hardware unit capable of classifying packets, together with a connection between this unit and the match action tables of the P4 language. The implementation is based on the DCFL algorithm and requires less memory than the HiCuts and HyperCuts algorithms while being comparably fast in worst-case scenarios.
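The decomposed-search idea behind DCFL can be sketched in software (a toy illustration, not the thesis's FPGA implementation; the rule set and field names are invented): each packet field is matched independently, and the per-field result sets are then combined, a step DCFL performs lazily in hardware rather than precomputing the full cross-product.

```python
import ipaddress

# Hypothetical classifier: (source prefix, destination port) -> action.
# None denotes a wildcard field.
rules = [
    {"id": 0, "src": "10.0.0.0/8",  "port": 80,   "action": "allow"},
    {"id": 1, "src": "10.1.0.0/16", "port": None, "action": "deny"},
    {"id": 2, "src": None,          "port": 22,   "action": "allow"},
]

def matching_ids_src(addr):
    """Per-field search on the source address field."""
    return {r["id"] for r in rules
            if r["src"] is None
            or ipaddress.ip_address(addr) in ipaddress.ip_network(r["src"])}

def matching_ids_port(port):
    """Per-field search on the destination port field."""
    return {r["id"] for r in rules if r["port"] in (None, port)}

def classify(addr, port):
    # Combine the per-field label sets by intersection; first match
    # (lowest rule id) wins, as in a priority-ordered classifier.
    ids = matching_ids_src(addr) & matching_ids_port(port)
    if not ids:
        return None
    best = min(ids)
    return next(r["action"] for r in rules if r["id"] == best)

print(classify("10.1.2.3", 80))     # rules 0 and 1 match; rule 0 wins
```

Keeping per-field structures small and intersecting their results is what lets decomposition-based schemes avoid the memory blow-up of decision-tree approaches such as HiCuts and HyperCuts.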
|
656 |
Développement d'un modèle hydrologique de colonne représentant l'interaction nappe - végétation - atmosphère et applications à l'échelle du bassin versant / Development of a soil column model for simulating the water table - vegetation - atmosphere interaction and applications to the catchment scale
Maquin, Mathilde 30 September 2016
The representation of the water cycle on land surfaces is essential for climate modeling. Nowadays, Land Surface Models (LSMs) represent soil columns a few meters deep and simulate the temporal evolution of vertical water flows and their interaction with the atmosphere. However, the interaction with a near-surface water table is not taken into account, although it strongly influences evapotranspiration fluxes at the local scale, and therefore the climate at the regional scale. This interaction, which occurs at a smaller scale than the grid scale of the LSMs, is difficult to model. The objective of this PhD is to propose a model that incorporates the impact of a near-surface water table on evapotranspiration fluxes for global climate models. The computation time of the model must be small enough to enable simulations at large spatial and temporal scales. In this context, a new soil column model is proposed with a drainage function imposed at the bottom of the column. This function aims at reproducing the temporal evolution of the water table level in interaction with both the infiltration and evapotranspiration fluxes. The model is first tested and validated on simple academic test cases, and then on a real catchment (Strengbach, France). Finally, a methodology based on this column model is introduced to estimate evapotranspiration fluxes while taking their subgrid spatial variability into account. This methodology is applied to a catchment whose area is similar to that of a typical LSM grid cell (Little Washita, USA).
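The role of an imposed bottom-drainage function can be illustrated with a deliberately simplified bucket model (an assumption-laden sketch, not the thesis's model; the linear drainage law and every parameter value below are invented for illustration):

```python
def step(storage, infiltration, et, dt, k_drain=0.05):
    """One time step of a single-bucket soil column: storage (m of
    water) is updated by infiltration, evapotranspiration (et) and an
    imposed linear drainage law acting at the bottom of the column."""
    drainage = k_drain * storage
    storage += dt * (infiltration - et - drainage)
    return max(storage, 0.0), drainage

porosity = 0.3     # drainable porosity (assumed)
depth = 5.0        # column depth in meters (assumed)
storage = 1.0      # initial water stored, in meters of water

for day in range(10):
    storage, _ = step(storage, infiltration=0.004, et=0.003, dt=1.0)

# The water table depth is diagnosed from storage, assuming a
# constant drainable porosity over the column.
water_table_depth = depth - storage / porosity
```

The point of the construction is that the water table level emerges from the balance between infiltration, evapotranspiration and the drainage law, rather than being prescribed.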
|
657 |
Recovering the Semantics of Tabular Web Data
Braunschweig, Katrin 09 October 2015
The Web provides a platform for people to share their data, leading to an abundance of accessible information. In recent years, significant research effort has been directed especially at tables on the Web, which form a rich resource for factual and relational data. Applications such as fact search and knowledge base construction benefit from this data, as it is often less ambiguous than unstructured text. However, many traditional information extraction and retrieval techniques are not well suited for Web tables, as they generally do not consider the role of the table structure in reflecting the semantics of the content. Tables provide a compact representation of similarly structured data. Yet, on the Web, tables are very heterogeneous, often with ambiguous semantics and inconsistencies in the quality of the data. Consequently, recognizing the structure and inferring the semantics of these tables is a challenging task that requires a designated table recovery and understanding process.
In the literature, many important contributions have been made to implement such a table understanding process that specifically targets Web tables, addressing tasks such as table detection or header recovery. However, the precision and coverage of the data extracted from Web tables is often still quite limited. Due to the complexity of Web table understanding, many techniques developed so far make simplifying assumptions about the table layout or content to limit the amount of contributing factors that must be considered. Thanks to these assumptions, many sub-tasks become manageable. However, the resulting algorithms and techniques often have a limited scope, leading to imprecise or inaccurate results when applied to tables that do not conform to these assumptions.
In this thesis, our objective is to extend the Web table understanding process with techniques that enable some of these assumptions to be relaxed, thus improving the scope and accuracy. We have conducted a comprehensive analysis of tables available on the Web to examine the characteristic features of these tables, but also identify unique challenges that arise from these characteristics in the table understanding process. To extend the scope of the table understanding process, we introduce extensions to the sub-tasks of table classification and conceptualization. First, we review various table layouts and evaluate alternative approaches to incorporate layout classification into the process. Instead of assuming a single, uniform layout across all tables, recognizing different table layouts enables a wide range of tables to be analyzed in a more accurate and systematic fashion.
In addition to the layout, we also consider the conceptual level. To relax the single concept assumption, which expects all attributes in a table to describe the same semantic concept, we propose a semantic normalization approach. By decomposing multi-concept tables into several single-concept tables, we further extend the range of Web tables that can be processed correctly, enabling existing techniques to be applied without significant changes.
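The decomposition of a multi-concept table into single-concept tables can be sketched as follows (a simplified illustration; the attribute-to-concept assignment, which the thesis derives automatically, is hard-coded here, and the column names are invented):

```python
# A multi-concept Web table: each row mixes two semantic concepts,
# "Country" (name, capital) and "Currency" (code, name).
rows = [
    {"country": "France", "capital": "Paris", "cur_code": "EUR", "cur_name": "Euro"},
    {"country": "Japan",  "capital": "Tokyo", "cur_code": "JPY", "cur_name": "Yen"},
]

# Hypothetical attribute-to-concept assignment.
concepts = {
    "Country":  ["country", "capital"],
    "Currency": ["cur_code", "cur_name"],
}
key = "country"   # subject column linking the decomposed tables

def decompose(rows, concepts, key):
    """Split a multi-concept table into one table per concept,
    carrying the subject column into every result as a foreign key."""
    out = {}
    for concept, attrs in concepts.items():
        cols = ([key] if key not in attrs else []) + attrs
        out[concept] = [{c: r[c] for c in cols} for r in rows]
    return out

tables = decompose(rows, concepts, key)
# tables["Currency"][0] -> {"country": "France", "cur_code": "EUR", "cur_name": "Euro"}
```

After such a normalization, each resulting table satisfies the single-concept assumption, so existing table-understanding techniques can be applied to it unchanged.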
Furthermore, we address the quality of data extracted from Web tables by studying the role of context information. Supplementary information from the context is often required to correctly understand the table content; however, the verbosity of the surrounding text can also mislead table relevance decisions. We first propose a selection algorithm that evaluates the relevance of context information with respect to the table content in order to reduce the noise. Then, we introduce a set of extraction techniques to recover attribute-specific information from the relevant context in order to provide a richer description of the table content.
With the extensions proposed in this thesis, we increase the scope and accuracy of Web table understanding, leading to a better utilization of the information contained in tables on the Web.
|
658 |
Improvement of Stiffness and Strength of Backfill Soils Through Optimization of Compaction Procedures and Specifications
Shahedur Rahman (8066420) 04 December 2019
Vibration compaction is the most effective way of compacting coarse-grained materials. The effects of vibration frequency and amplitude on the compaction density of different backfill materials (No. 4 natural sand, No. 24 stone sand, and No. 5, No. 8 and No. 43 aggregates) were studied in this research. The test materials were characterized based on particle size and morphology parameters using a digital image analysis technique. Small-scale laboratory compaction tests were carried out with variable frequency and amplitude of vibration using a vibratory hammer and a vibratory table. The results show an increase in density with increasing amplitude and frequency of vibration; the increase with amplitude is more pronounced for the coarse aggregates than for the sands. A comparison of the maximum dry densities of the test materials shows that the dry densities obtained with the vibratory hammer are greater than those obtained with the vibratory table at the highest amplitude and frequency available on both pieces of equipment. Large-scale vibratory roller compaction tests were performed in the field on No. 30 backfill soil to observe the effect of vibration frequency and number of passes on the compaction density. Accelerometer sensors were attached to the roller drum (Caterpillar model CS56B) to measure the vibration frequency for the two vibration settings available on the roller. For this roller and the soil tested, the results show that the higher vibration setting is more effective. Direct shear tests and direct interface shear tests were performed to study the impact of particle characteristics of the coarse-grained backfill materials on interface shear resistance.
A unique relationship was found between the normalized surface roughness and the ratio of the critical-state interface friction angle between the sand-gravel mixture and steel to the internal critical-state friction angle of the sand-gravel mixture.
|
659 |
Zhodnocení nejčastějších úrazů a zdravotních obtíží u závodních hráčů stolního tenisu / Evaluation of the most common injuries and health problems of professional table tennis players
Tenglová, Vendula January 2019
Title: Evaluation of the most common injuries and health problems of professional table tennis players Objectives: The main aim of this work is to evaluate the most common injuries and health problems of professional table tennis players in the Czech Republic. Another goal is to determine the extent to which regenerative procedures and compensatory exercises are used within the athletes' training plans. Methods: The main method used in this thesis was quantitative research based on data analysis of a non-standardized questionnaire. The questionnaire was distributed among professional table tennis players who actively participated in district, regional, league or extra-league competitions in the Czech Republic or abroad in the 2018/2019 season. A total of 374 questionnaires were processed and evaluated, which corresponds to 62.33 % of the total number of questionnaires sent. Results: Of the 374 table tennis players, 240 had been injured (197 men, 43 women), which corresponds to 64.17 % of the interviewed players. A total of 361 injuries were recorded. The research confirmed four out of six hypotheses. A statistically significant relationship was found between injury and sex, and between injury and playing style. Most injuries were in the area of the...
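The kind of significance test behind such a result can be illustrated with a hand-rolled chi-square statistic on a 2x2 contingency table (the counts below are hypothetical, not the thesis data):

```python
# Illustrative 2x2 contingency table: rows = sex (men, women),
# columns = injured / not injured. Counts are invented.
table = [[197, 80],
         [43,  54]]

def chi_square(table):
    """Pearson chi-square statistic for a contingency table."""
    row_totals = [sum(r) for r in table]
    col_totals = [sum(c) for c in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

stat = chi_square(table)
# For a 2x2 table there is 1 degree of freedom; the critical value
# at alpha = 0.05 is 3.841, so a larger statistic indicates a
# significant association between the two variables.
print(stat > 3.841)   # prints True for these counts
```

In practice one would use a library routine such as `scipy.stats.chi2_contingency`, which also returns the p-value directly.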
|
660 |
Persistence and Node Failure Recovery in Strongly Consistent Key-Value Datastore
Ehsan ul Haque, Muhammad January 2012
Consistency preservation of replicated data is a critical aspect for distributed databases which are strongly consistent. Further, in the fail-recovery model each process also needs to deal with the management of stable storage and amnesia [1]. CATS is a key/value datastore which combines Distributed Hash Table (DHT)-like scalability and self-organization with atomic consistency of the replicated items. However, being an in-memory datastore with consistency and partition tolerance (CP), it suffers from permanent unavailability in the event of a majority failure. The goals of this thesis were twofold: (i) to implement disk-persistent storage in CATS, which would allow the records and the state of the nodes to be persisted on disk, and (ii) to design a node failure-recovery algorithm for CATS which enables the system to run under the assumption of a fail-recovery model without violating consistency. For disk-persistent storage, two existing key/value databases, LevelDB [2] and BerkeleyDB [3], are used. LevelDB is an implementation of log-structured merge trees [4], whereas BerkeleyDB is an implementation of log-structured B+ trees [5]. Both have been used as underlying local storage for the nodes, and the throughput and latency of the system with each are discussed. A technique to improve performance by allowing concurrent operations on the nodes is also discussed. The node failure-recovery algorithm is designed with the goal of allowing nodes to crash and then recover without violating consistency, and of reinstating availability once a majority of the nodes recover. The recovery algorithm is based on persisting the state variables of the Paxos [6] acceptor and proposer and on consistent group memberships. For fault tolerance and recovery, processes also need to copy records from the replication group. This becomes problematic when the number of records and the amount of data is huge. For this problem, a technique for transferring key/value records in bulk is also described, and its effect on the latency and throughput of the system is discussed.
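The core of such a recovery scheme, making the Paxos acceptor state durable before replying, can be sketched as follows (a minimal illustration, not the CATS implementation; the JSON file layout and all method names are assumptions):

```python
import json
import os

class DurableAcceptor:
    """Sketch of a Paxos acceptor that fsyncs its promise/accept
    state to disk before replying, so a crashed node can rejoin
    under the fail-recovery model without violating consistency."""

    def __init__(self, path):
        self.path = path
        self.state = {"promised": -1, "accepted_n": -1, "accepted_v": None}
        if os.path.exists(path):
            with open(path) as f:
                self.state = json.load(f)   # recover state after a crash

    def _persist(self):
        # Write-then-rename: a crash mid-write never corrupts the
        # previously persisted state.
        tmp = self.path + ".tmp"
        with open(tmp, "w") as f:
            json.dump(self.state, f)
            f.flush()
            os.fsync(f.fileno())
        os.replace(tmp, self.path)

    def prepare(self, n):
        if n > self.state["promised"]:
            self.state["promised"] = n
            self._persist()                 # durable BEFORE the reply
            return ("promise", self.state["accepted_n"], self.state["accepted_v"])
        return ("nack", self.state["promised"], None)

    def accept(self, n, v):
        if n >= self.state["promised"]:
            self.state.update(promised=n, accepted_n=n, accepted_v=v)
            self._persist()
            return ("accepted", n, v)
        return ("nack", self.state["promised"], None)
```

The essential invariant is that the state is on stable storage before any reply leaves the node; a recovered acceptor then resumes from its last persisted promise rather than from amnesia.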
|