381

Implementation and optimization of LDPC decoding algorithms tailored for Nvidia GPUs in 5G / Implementering och optimering av LDPC avkodningsalgoritmer anpassat för Nvidia GPU:er i 5G

Salomonsson, Benjamin January 2022 (has links)
Low-Density Parity-Check (LDPC) codes are linear error-correcting codes used to establish reliable communication between units on a noisy transmission channel in mobile telecommunications. LDPC algorithms detect and recover altered or corrupted message bits using sparse parity-check matrices in order to decipher messages correctly. LDPC codes have been shown to be fitting coding schemes for the fifth generation (5G) New Radio (NR), according to the Third Generation Partnership Project (3GPP). TietoEvry, a consultancy in telecom, has found that LDPC decoding algorithms can be optimized using a parallel computing platform called Compute Unified Device Architecture (CUDA), developed by NVIDIA. The platform exploits a graphics processing unit (GPU) rather than a central processing unit (CPU), which enables parallel computation. An optimized version of an LDPC decoding algorithm, the Min-Sum Algorithm (MSA), is implemented in CUDA and in C++, and the two are compared in terms of execution time to explore the capabilities that CUDA offers. The testing is done with a set of 12 sparse parity-check matrices and input-channel messages of different sizes. As a result, the CUDA implementation executes approximately 55% faster than a standard, unoptimized C++ implementation.
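
The Min-Sum Algorithm named above is a standard message-passing LDPC decoder. As a rough, serial illustration of what one decoding iteration does (this is not the thesis's CUDA kernel; the matrix H, the channel LLRs and the message array are placeholder names), here is a minimal sketch in Python/NumPy:

```python
# Illustrative sketch of one Min-Sum message-passing iteration over a small
# parity-check matrix H, operating on log-likelihood ratios (LLRs).
import numpy as np

def min_sum_iteration(H, llr, c2v):
    """One check-node/variable-node update of the Min-Sum Algorithm.

    H   : (m, n) binary parity-check matrix (0/1 entries)
    llr : (n,) channel log-likelihood ratios
    c2v : (m, n) check-to-variable messages from the previous iteration
    """
    # Variable-to-check messages: channel LLR plus all *other* incoming messages.
    total = llr + c2v.sum(axis=0)
    v2c = np.where(H == 1, total - c2v, 0.0)

    new_c2v = np.zeros_like(c2v)
    for i in range(H.shape[0]):
        cols = np.flatnonzero(H[i])
        msgs = v2c[i, cols]
        sgn = np.where(msgs >= 0, 1.0, -1.0)
        mags = np.abs(msgs)
        for k, j in enumerate(cols):
            # Min-sum rule: sign is the product of the other signs,
            # magnitude is the minimum of the other magnitudes.
            new_c2v[i, j] = np.prod(sgn) * sgn[k] * np.delete(mags, k).min()

    posterior = llr + new_c2v.sum(axis=0)
    hard_bits = (posterior < 0).astype(int)   # bit decisions after this iteration
    return new_c2v, hard_bits
```
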
382

Evaluation of a content download service based on FLUTE and LDPC for improving the Quality of Experience over multicast wireless networks

De Fez Lava, Ismael 17 April 2014 (has links)
This thesis studies file delivery over wireless networks, analysing different mechanisms that optimize the transmission in terms of bandwidth and quality of experience. Specifically, the thesis focuses on file transmission over multicast channels, which is well suited to certain environments and has multiple applications, some of which are presented in this work. The thesis analyses in depth FLUTE (File Delivery over Unidirectional Transport), a protocol for the reliable delivery of files over unidirectional channels, and presents some proposals for improving transmission with this protocol. In this regard, one of the cornerstones of the protocol is a mechanism called the File Delivery Table (FDT), which describes the transmitted content. This work analyses how the transmission of the FDT affects the behaviour of the FLUTE protocol and provides a methodology for optimizing content delivery with FLUTE. Furthermore, offering a reliable service is essential when files are delivered over multicast. Among the mechanisms FLUTE uses to provide reliability, this work mainly analyses AL-FEC (Application Layer Forward Error Correction) codes, which add redundancy to the transmission to minimize the effect of channel losses. In this respect, the thesis evaluates LDPC Staircase and LDPC Triangle codes, comparing their behaviour under different transmission conditions. In addition, when a return channel is available, one of the main contributions of this thesis is the proposal of adaptive LDPC codes for file download services: the content server dynamically changes the amount of FEC protection provided according to the losses detected by the users. The evaluation demonstrates the good performance of these codes in different environments. / De Fez Lava, I. (2014). Evaluation of a content download service based on FLUTE and LDPC for improving the Quality of Experience over multicast wireless networks [Tesis doctoral]. Editorial Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/37051 / Premios Extraordinarios de tesis doctorales
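
The adaptive AL-FEC scheme summarized above lets the content server re-tune its FEC redundancy from the loss rates receivers report over the return channel. A small, hypothetical sizing rule illustrates the idea, assuming an ideal erasure code; the margin, cap and function name are invented here, not taken from the thesis:

```python
# Hypothetical sketch: size the repair symbols for the next source block from
# the worst loss rate reported over the return channel (assumes an ideal
# erasure code; the margin and cap are invented values).
import math

def repair_symbols_needed(source_symbols, reported_loss_rates, margin=0.05, max_overhead=0.5):
    """Number of repair (redundancy) symbols to send with the next block."""
    worst = max(reported_loss_rates, default=0.0)
    target_loss = min(worst + margin, max_overhead)
    # If a fraction p of the k + r sent symbols is lost, we still need k of
    # them to arrive, so r >= k * p / (1 - p).
    return math.ceil(source_symbols * target_loss / (1.0 - target_loss))

# Example: 1000 source symbols, receivers report 2% and 8% loss
# -> protect against 13% loss, i.e. about 150 repair symbols.
print(repair_symbols_needed(1000, [0.02, 0.08]))
```
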
383

Integrating top-down and bottom-up approaches to design a cost-effective and equitable programme of measures for adaptation of a river basin to global change

Girard, Corentin Denis Pierre 07 January 2016 (has links)
[EN] Adaptation to the multiple facets of global change challenges the conventional means of sustainably planning and managing water resources at the river basin scale. Numerous demand or supply management options are available, from which adaptation measures need to be selected in a context of high uncertainty about future conditions. Given the interdependency of water users, agreements need to be found at the local level to implement the most effective adaptation measures. Therefore, this thesis develops an approach combining economics and water resources engineering to select a cost-effective programme of adaptation measures in the context of climate change uncertainty, and to define an equitable allocation of the cost of the adaptation plan between the stakeholders involved. A framework is developed to integrate inputs from the two main approaches commonly used to plan for adaptation. The first, referred to as "top-down", consists of a modelling chain going from global greenhouse gas emission scenarios to local hydrological models used to assess the impact of climate change on water resources. Conversely, the second approach, called "bottom-up", starts by assessing vulnerability at the local level and then identifies adaptation measures to face an uncertain future. Outcomes from these two approaches are integrated to select a cost-effective combination of adaptation measures through a least-cost optimization model developed at the river basin scale. The model is then used to investigate the trade-offs between different planning objectives defined in terms of environmental flow requirements, irrigated agriculture development, and the cost of the programme of measures. The performance of the programme of measures is finally assessed under different climate projections to identify robust and least-regret adaptation measures. The issue of allocating the cost of the adaptation plan is considered through two complementary perspectives. The outcome of a negotiation process between the stakeholders is modelled through cooperative game theory to define cost allocation scenarios. These results are compared with cost allocation rules based on social justice principles to provide contrasting insights into a negotiation process. This innovative framework has been applied in a Mediterranean case study in the Orb River basin (France). Mid-term climate projections, downscaled from 9 general circulation models, are used to assess the uncertainty associated with climate projections. Demand evolution scenarios have been developed to project agricultural and urban water demands on the 2030 time horizon. The least-cost river basin optimization model, developed in GAMS, allows the cost-effective selection of a programme of measures from a catalogue of 462 supply and demand management measures. Nine cost allocation scenarios based on different social justice principles have been discussed through face-to-face semi-structured interviews with 15 key informants and compared with solution concepts from cooperative game theory for a 3-player game defined at the river basin scale. The interdisciplinary framework developed in this thesis combines economics and water resources engineering methods, establishing a promising means of bridging the gap between bottom-up and top-down approaches and supporting the creation of cost-effective and equitable adaptation plans at the local level.
/ Girard, CDP. (2015). Integrating top-down and bottom-up approaches to design a cost-effective and equitable programme of measures for adaptation of a river basin to global change [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/59461 / Premios Extraordinarios de tesis doctorales
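
As a pointer to the cooperative-game side of this work, the sketch below computes the Shapley value, one standard solution concept, for a hypothetical 3-player cost-sharing game. The player names and coalition costs are invented for illustration and are not the Orb basin figures:

```python
# Illustrative Shapley-value computation for a 3-player cost-sharing game.
# Coalition costs below are made-up numbers, not results from the thesis.
from itertools import permutations

def shapley_value(players, cost):
    """Average marginal cost of each player over all join orders."""
    shares = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            before = cost[coalition]
            coalition = coalition | {p}
            shares[p] += cost[coalition] - before   # marginal cost of joining
    return {p: s / len(orders) for p, s in shares.items()}

players = ["agriculture", "urban", "environment"]
cost = {
    frozenset(): 0.0,
    frozenset({"agriculture"}): 60.0,
    frozenset({"urban"}): 50.0,
    frozenset({"environment"}): 40.0,
    frozenset({"agriculture", "urban"}): 90.0,
    frozenset({"agriculture", "environment"}): 80.0,
    frozenset({"urban", "environment"}): 70.0,
    frozenset({"agriculture", "urban", "environment"}): 110.0,
}
print(shapley_value(players, cost))  # the shares sum to the grand-coalition cost, 110.0
```
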
384

Confidence bands in quantile regression and generalized dynamic semiparametric factor models

Song, Song 01 November 2010 (has links)
In many applications it is necessary to know the stochastic fluctuation of the maximal deviations of nonparametric quantile estimates, e.g. for checking various parametric models. Uniform confidence bands are therefore constructed for nonparametric quantile estimates of regression functions. The first method is based on strong approximations of the empirical process and extreme value theory; the strong uniform consistency rate is also established under general conditions. The second method is based on the bootstrap resampling method, and it is proved that the bootstrap approximation provides a substantial improvement. The case of multidimensional and discrete regressor variables is dealt with using a partial linear model, and a labor market analysis is provided to illustrate the method. High-dimensional time series which reveal nonstationary and possibly periodic behavior occur frequently in many fields of science, e.g. macroeconomics, meteorology, medicine and financial engineering. A common approach is to separate the modeling of a high-dimensional time series into the time propagation of low-dimensional time series and high-dimensional, time-invariant functions via dynamic factor analysis. We propose a two-step estimation procedure. In the first step, we detrend the time series by incorporating a time basis selected by a group Lasso-type technique and choose the space basis based on smoothed functional principal component analysis; we show the properties of these estimators under the dependence scenario. In the second step, we obtain the detrended low-dimensional (stationary) stochastic process.
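
To make the notion of a uniform (rather than pointwise) band concrete, the following sketch bootstraps the maximal deviation of a toy kernel-window quantile estimator over a grid. It is a simplified stand-in, not the estimator or the bootstrap scheme analysed in the thesis:

```python
# Toy bootstrap uniform confidence band for a nonparametric quantile curve.
import numpy as np

def local_quantile(x, y, grid, tau=0.5, h=0.1):
    """tau-quantile of y within a window of half-width h around each grid point."""
    est = np.empty(len(grid))
    for i, g in enumerate(grid):
        inside = np.abs(x - g) <= h
        est[i] = np.quantile(y[inside], tau) if inside.any() else np.nan
    return est

def uniform_band(x, y, grid, tau=0.5, h=0.1, B=500, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    fit = local_quantile(x, y, grid, tau, h)
    max_dev = np.empty(B)
    n = len(x)
    for b in range(B):
        idx = rng.integers(0, n, n)                 # resample (x, y) pairs with replacement
        boot = local_quantile(x[idx], y[idx], grid, tau, h)
        max_dev[b] = np.nanmax(np.abs(boot - fit))
    width = np.quantile(max_dev, 1 - alpha)         # uniform, not pointwise, half-width
    return fit, fit - width, fit + width

# Toy usage: median regression band for a noisy sine curve.
x = np.sort(np.random.default_rng(1).uniform(0, 1, 400))
y = np.sin(2 * np.pi * x) + np.random.default_rng(2).normal(0, 0.3, 400)
grid = np.linspace(0.05, 0.95, 50)
fit, lower, upper = uniform_band(x, y, grid)
```
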
385

The non transferable cheque and the liability of the collecting and drawee banks

Papadopoulos, John 12 1900 (has links)
The paper attempts to deal with the non-transferable cheque. Three questions are addressed: (a) whether sections 58, 79 and 83 apply to non-transferable cheques; (b) whether the non-transferability of a cheque implies only that a cambial transfer is excluded, while transfer by means of an ordinary cession remains possible; (c) whether the collecting and drawee banks can be held liable for damages to the owner of a non-transferable cheque. (a) It is clear that section 58 does not apply to non-transferable cheques. After the decision in Eskom, it is also clear that section 79 does apply to such cheques. The applicability of section 83 to non-transferable cheques remains uncertain. (b) It is not yet clear whether the rights arising from a non-transferable cheque can be transferred by means of an ordinary cession. (c) That a collecting bank can be held delictually liable under the extended lex Aquilia was decided in Indac Electronics. By way of analogy, the same applies to a drawee bank acting negligently. / Mercantile Law / LL.M.
386

Beiträge zur Richtighaltung von Kreisformmessgeräten

Miethling, Klaus-Dietmar 04 May 2016 (has links) (PDF)
This thesis sets out the normative-technical and procedural prerequisites for keeping roundness measuring instruments metrologically correct (Richtighaltung). To this end, a comprehensive system of terms is proposed for the general description of motion deviations of moving components of machine tools or form measuring instruments, e.g. roundness measuring instruments, as a basis for their tolerancing and measurement. Known measurement methods for determining the rotation deviations of the spindle of roundness measuring instruments are examined theoretically and practically. A new measurement method, the continuous relative-position measurement method, is developed and likewise examined. The examined measurement methods for determining rotation deviations achieve measurement uncertainties down to less than 0.02 µm. Proposals for the design of the test scheme for keeping roundness measuring instruments metrologically correct are put forward. Also available at: Zentralbibliothek/Magazin/MPF1443
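
One classical error-separation technique in this field is the Donaldson reversal, which separates an artefact's form error from the spindle's error motion using two traces taken 180° apart. The abstract does not say whether it is among the methods examined, so the sketch below is background only, not the thesis's continuous relative-position method:

```python
# Sketch of the classical Donaldson reversal for spindle error separation.
# Not the thesis's method; shown only as well-known background.
import numpy as np

def donaldson_reversal(t1, t2):
    """Separate part form error and spindle error from two roundness traces.

    t1 : trace with artefact and probe in the normal position
    t2 : trace after rotating both artefact and probe by 180 degrees
    Both traces are sampled at the same spindle angles.
    Model: t1 = part + spindle,  t2 = part - spindle.
    """
    t1 = np.asarray(t1, dtype=float)
    t2 = np.asarray(t2, dtype=float)
    part = 0.5 * (t1 + t2)       # artefact out-of-roundness
    spindle = 0.5 * (t1 - t2)    # spindle radial error motion
    return part, spindle
```
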
387

運用使用者輸入欄位屬性偵測防禦資料隱碼攻擊 / Preventing SQL Injection Attacks Using the Field Attributes of User Input

賴淑美, Lai, Shu Mei Unknown Date (has links)
With the rapid growth of network applications and the ever-increasing number of internet users, providing customer services and doing business over the network has become a prevalent trend, and the accompanying risks have emerged with it. In a borderless networked world, threats come from all directions, and as information technology advances, attack techniques evolve quickly and spread widely, so that defence methods seem condemned to keep chasing them. The most fundamental countermeasure is to return to the original program design: validating the data entered in web page fields. Thorough checking of field content, adherence to secure web design principles, and strict authorization of database access are what keep constantly changing attacks out. Because existing systems often do not check input fields against the expected field length, data type, or format expression, attacks such as injection flaws [1] and some cross-site scripting (XSS) [2] become possible. Faced with ever-changing website attacks, organisations mostly respond by repeatedly modifying system source code, inspecting vulnerabilities through penetration-testing services, and buying intrusion detection and prevention equipment. Repeatedly revising source code is a heavy workload, penetration tests cannot be carried out frequently, and detection and prevention equipment is expensive. The fundamental method of this research is to return to validating web page input: data are checked against the length, data type, or special format expression defined for each field, and the hypothesis is that adhering to this original design principle should withstand most attacks. Since legacy systems are large and numerous, reviewing and modifying every input field would be prohibitively time-consuming. This research therefore combines traffic capture and analysis, integration with the database schema, and convenient definition of field attributes into an automated process that rapidly produces the validation criteria for input fields. Using dynamic field checking, the web site intercepts the submitted input after the user request is received but before the application processes it, and validates the data against the predefined field attributes and lengths. Existing systems thus need no modification, and effective protection against web attacks is achieved at minimal cost.
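
The field-attribute validation described above can be pictured with a small sketch: every submitted field is checked against a predefined maximum length and format pattern before any application code or SQL statement sees it. The rule table and field names below are invented for illustration:

```python
# Illustrative field-attribute validation: check each submitted value against a
# predefined length and pattern before the application (or any SQL) touches it.
import re

FIELD_RULES = {
    # field name : (max length, regular expression the whole value must match)
    "username": (20, r"[A-Za-z0-9_]+"),
    "email":    (64, r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "order_id": (10, r"\d+"),
}

def validate_request(form):
    """Return a list of violations; an empty list means the request may proceed."""
    errors = []
    for name, value in form.items():
        if name not in FIELD_RULES:
            errors.append(f"unexpected field: {name}")
            continue
        max_len, pattern = FIELD_RULES[name]
        if len(value) > max_len:
            errors.append(f"{name}: longer than {max_len} characters")
        elif not re.fullmatch(pattern, value):
            errors.append(f"{name}: does not match the expected format")
    return errors

# A classic injection attempt fails the format check instead of reaching the database.
print(validate_request({"order_id": "1 OR 1=1", "username": "alice"}))
```
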
388

Прилог пројектовању, консолидацији и трансформацијама ограничења торке шеме базе података, заснован на платформски независним моделима / Prilog projektovanju, konsolidaciji i transformacijama ograničenja torke šeme baze podataka, zasnovan na platformski nezavisnim modelima / An Approach to Design, Consolidation and Transformations of Database Schema Check Constraints Based on Platform Independent Models

Obrenović Nikola 10 October 2015 (has links)
The use of platform-independent modelling and prototype generation in information systems development shortens development time and improves the quality of the process. The goal is for all aspects of an information system to be supported by this approach, and this dissertation is intended to contribute towards that goal. The author presents algorithms for transforming check constraint models into executable code, and algorithms for testing the consolidation of subschemas with the unified database schema with respect to check constraints.
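
To picture the kind of model-to-code transformation the dissertation addresses, the sketch below renders a small, invented platform-independent predicate tree as a SQL CHECK constraint. The model format and names are hypothetical, not the dissertation's:

```python
# Hypothetical sketch: render a tiny platform-independent check-constraint
# model (a nested predicate tree) as an executable SQL statement.
def constraint_to_sql(table, name, predicate):
    """predicate is ("and", p1, p2, ...), ("or", p1, p2, ...),
    or a leaf ("cmp", column, operator, literal)."""
    def render(p):
        kind = p[0]
        if kind in ("and", "or"):
            return "(" + f" {kind.upper()} ".join(render(q) for q in p[1:]) + ")"
        if kind == "cmp":
            _, column, op, literal = p
            value = f"'{literal}'" if isinstance(literal, str) else str(literal)
            return f"{column} {op} {value}"
        raise ValueError(f"unknown node: {kind}")
    return f"ALTER TABLE {table} ADD CONSTRAINT {name} CHECK {render(predicate)};"

# Example: salary must be positive and below an assumed grade ceiling.
print(constraint_to_sql(
    "employee", "chk_salary",
    ("and", ("cmp", "salary", ">", 0), ("cmp", "salary", "<=", 20000)),
))
```
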
389

Surface-based Synthesis of 3D Maps for Outdoor Unstructured Environments

Melkumyan, Narek January 2009 (has links)
Doctor of Philosophy (PhD) / This thesis is concerned with the theoretical and practical development of a surface-based mapping algorithm for reliable and robust localization and mapping in previously unknown and unstructured environments. A surface-based map consists of a set of compressed surfaces, processed and represented without geometrical modelling. Each surface in the surface-based map represents an object in the environment. The ability to represent the exact shapes of objects via individual surfaces during the mapping process makes the surface-based mapping algorithm valuable in a number of navigation applications, such as mapping of previously unknown indoor and outdoor unstructured environments, target tracking, path planning and collision avoidance. The ability to unify representations of the same object taken from different viewpoints into a single surface makes the algorithm capable of working in multi-robot mapping applications. A surface-based map of the environment is built incrementally by acquiring the 3D range image of the scene, extracting the objects' surfaces from the 3D range image, aligning the set of extracted surfaces relative to the map and unifying the aligned set of surfaces with surfaces in the map. In the surface unification process the surfaces representing the same object are unified to make a single surface. The thesis introduces the following new methods which are used in the surface-based mapping algorithm: the extraction of surfaces from 3D range images based on a scanned surface continuity check; homogenization of the representation of non-homogeneously sampled surfaces; the alignment of the surface set relative to a large set of surfaces based on a surface-based alignment algorithm; evaluating the correspondence between two surfaces based on the overlap area between surfaces; unification of two surfaces belonging to the same object; and surface unification for a large set of surfaces. The theoretical contributions of this thesis are demonstrated with a series of practical implementations in different outdoor environments.
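
The surface-continuity idea can be illustrated with a minimal region-growing sketch over a depth image: neighbouring samples are assigned to the same surface as long as their range values differ by less than a threshold. This is a simplified stand-in for the extraction step described above, not the thesis's algorithm:

```python
# Simplified surface extraction from a range (depth) image via a continuity
# check: neighbouring pixels join the same surface while the range jump stays
# below a threshold. Breadth-first region growing over a 4-neighbourhood.
import numpy as np
from collections import deque

def extract_surfaces(depth, max_jump=0.05):
    """Label connected regions of a 2D depth image using a continuity check."""
    rows, cols = depth.shape
    labels = np.full((rows, cols), -1, dtype=int)
    current = 0
    for r in range(rows):
        for c in range(cols):
            if labels[r, c] != -1:
                continue
            labels[r, c] = current
            queue = deque([(r, c)])
            while queue:                      # grow one candidate surface
                i, j = queue.popleft()
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if (0 <= ni < rows and 0 <= nj < cols
                            and labels[ni, nj] == -1
                            and abs(depth[ni, nj] - depth[i, j]) < max_jump):
                        labels[ni, nj] = current
                        queue.append((ni, nj))
            current += 1
    return labels  # pixels sharing a label belong to the same candidate surface
```
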
390

Experimental Studies On A New Class Of Combinatorial LDPC Codes

Dang, Rajdeep Singh 05 1900 (has links)
We implement a package for constructing a new class of Low-Density Parity-Check (LDPC) codes based on a new random high-girth graph construction technique, and study the performance of the codes so constructed on both the Additive White Gaussian Noise (AWGN) channel and the Binary Erasure Channel (BEC). Our codes are “near regular”, meaning that the left degree of any node in the constructed Tanner graph varies by at most 1 from the average left degree, and likewise for the right degree. Simulations for rate-1/2 codes indicate that the codes perform better than both the regular Progressive Edge Growth (PEG) codes, which are constructed using a similar random technique, and the MacKay random codes. At high rates the ARG (Almost Regular high Girth) codes perform better than the PEG codes at low to medium SNRs, but the PEG codes seem to do better at high SNRs. We have tracked both near-codewords and small-weight codewords for these codes to examine the performance at high rates. On the binary erasure channel the performance of the ARG codes is better than that of the PEG codes. We also propose a modification of the sum-product decoding algorithm in which a quantity called the “node credibility” is used to appropriately process messages to check nodes. This technique substantially reduces the error rates at signal-to-noise ratios of 2.5 dB and beyond for the codes experimented on, while the average number of iterations needed to achieve this improved performance is practically the same as for the traditional sum-product algorithm.
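
The “near regular” property can be made concrete with a small sketch: the edges of the Tanner graph are spread over the nodes on one side so that every degree is within one of the average. The high-girth edge placement itself, which is the heart of the construction, is not reproduced here:

```python
# Sketch of a near-regular degree assignment: every node gets either the floor
# or the ceiling of the average degree, so no degree differs from the average
# by more than one. The actual high-girth edge placement is not shown.
def near_regular_degrees(num_edges, num_nodes):
    """Degree list where every entry is the floor or ceiling of the average degree."""
    base, extra = divmod(num_edges, num_nodes)
    return [base + 1] * extra + [base] * (num_nodes - extra)

# Rate-1/2 example: 12 variable nodes, 6 check nodes, 36 edges in total.
left = near_regular_degrees(36, 12)    # every variable node has degree 3
right = near_regular_degrees(36, 6)    # every check node has degree 6
print(left, right)

# With 38 edges the degrees still differ by at most one from the average.
print(near_regular_degrees(38, 12))    # [4, 4, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3]
```
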
