191

A separação dos poderes e os freios e contrapeso na Constituição de 1988: a atuação do Poder Judiciário / The separation of powers and checks and balances in the 1988 Constitution: the role of the Judiciary

Camargo, Beatriz Meneghel Chagas 11 December 2017 (has links)
This study analyzes whether the checks and balances in the Federal Constitution of 1988 ensure the balance among the Executive, Legislative and Judicial Powers. The separation of powers and the mechanisms of checks and balances were conceived to guarantee the fundamental rights of individuals by containing abuses by the holders of power. The question raised in this study is whether the way these instruments of mutual control are distributed among the three Powers in the Federal Constitution of 1988 responds to the current Brazilian reality, in which the Judiciary has been gaining a position of preeminence.
192

Stanovení pracovního rozsahu a účinnosti vodního trkače / Determination of the working range and efficiency of the water hammer pump

Růžička, Jakub January 2019 (has links)
This thesis focuses on determining the working range and efficiency of a water ram pump through experimental measurements in a laboratory. The measurements were carried out on three different constructions of the water ram pump, each designed and assembled by the author, to find out whether the weight on the pulsating valve influences the pump's efficiency. The thesis further compares the water transport arrangements used so far at a summer camp with the use of a water ram pump and its advantages. The principle of the water ram pump originates in water hammer (hydraulic impact), which is also described in detail. One of the chapters includes a placement manual for the water ram pump and its overall assembly and connection. Because the presence of floating debris is inevitable, the thesis also addresses incoming water impurities by adding an inflow grating to prevent this problem; the grating has only been designed in the Flow-3D programme to introduce the idea and its function.
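A common way to express ram-pump efficiency from such measurements is the D'Aubuisson ratio of delivered to supplied hydraulic power. The sketch below is illustrative only; the variable names and sample readings are hypothetical and are not taken from the thesis, and it assumes the convention where both heads are measured from the pump body.

```python
def daubuisson_efficiency(q_delivered, h_delivery, q_supply, h_supply):
    """D'Aubuisson efficiency of a hydraulic ram pump.

    q_delivered : flow delivered to the high tank [L/s]
    h_delivery  : delivery head above the pump [m]
    q_supply    : total drive flow entering the pump [L/s]
    h_supply    : supply (drive) head above the pump [m]

    Assumes both heads are measured from the pump body
    (one common convention; Rankine's definition differs).
    """
    return (q_delivered * h_delivery) / (q_supply * h_supply)


if __name__ == "__main__":
    # Hypothetical laboratory readings, not data from the thesis.
    eta = daubuisson_efficiency(q_delivered=0.05, h_delivery=12.0,
                                q_supply=0.60, h_supply=1.5)
    print(f"D'Aubuisson efficiency: {eta:.1%}")  # ~66.7%
```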
193

A Program Evaluation of Check and Connect for Successful School Completion

Riggans-Curtis, Nicole 01 January 2017 (has links)
School leaders at an urban public high school implemented the Check and Connect (C&C) program in 2010-2011 to improve engagement outcomes for at-risk students. No formal program evaluation of C&C had been conducted in the 2012-2013, 2013-2014, and 2014-2015 school years to show whether the program was effective. The purpose of this study was to investigate the relationship between successful school completion and participation in the C&C program. A quantitative, quasi-experimental program evaluation was conducted to determine whether C&C participation and student-related variables including cohort, gender, ethnicity, socioeconomic status, and truancy predicted students' successful school completion. Archival data of students eligible for graduation (N = 668) were analyzed using chi-square tests and logistic regression. Results showed that the model, including C&C participation and all student-related variables, was significant in explaining the variance in successful school completion. Follow-up analyses revealed that only C&C participants in the 2013 graduation cohort, female students, and students with low truancy were significantly more likely to complete school, suggesting a need for further investigation of the program's implementation strategy. An evaluation report was developed with recommendations to evaluate C&C for implementation fidelity and to consider the use of observable indicators to recruit students for C&C participation who may require targeted or intensive interventions for successful school completion. This endeavor may contribute to positive social change by informing stakeholders of C&C's effectiveness, helping leaders make future decisions about how to approach program implementation and evaluation, and increasing successful school completion.
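A binary logistic regression of this kind can be specified in a few lines. The sketch below is a generic illustration with hypothetical column names and synthetic data; it is not the study's archival dataset or its actual model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic example data (hypothetical), mirroring the kinds of predictors
# described in the abstract: C&C participation, cohort, gender, SES, truancy.
rng = np.random.default_rng(0)
n = 668
df = pd.DataFrame({
    "cc_participation": rng.integers(0, 2, n),
    "cohort": rng.choice([2013, 2014, 2015], n),
    "female": rng.integers(0, 2, n),
    "low_ses": rng.integers(0, 2, n),
    "low_truancy": rng.integers(0, 2, n),
})
# Fabricated outcome for illustration only: completion made more likely
# by participation and low truancy.
logit_p = -0.5 + 0.8 * df.cc_participation + 1.0 * df.low_truancy
df["completed"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# Binary logistic regression of school completion on the predictors.
model = smf.logit(
    "completed ~ cc_participation + C(cohort) + female + low_ses + low_truancy",
    data=df,
).fit()
print(model.summary())
```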
194

Efficient Decoding Algorithms for Low-Density Parity-Check Codes / Effektiva avkodningsalgoritmer för low density parity check-koder

Blad, Anton January 2005 (has links)
Low-density parity-check codes have recently received much attention because of their excellent performance and the availability of a simple iterative decoder. The decoder, however, requires large amounts of memory, which causes problems with memory consumption. We investigate a new decoding scheme for low-density parity-check codes to address this problem. The basic idea is to define a reliability measure and a threshold, and to stop updating the messages for a bit whenever its reliability is higher than the threshold. We also consider some modifications to this scheme, including a dynamic threshold more suitable for codes with cycles, and a scheme with soft thresholds which allows the possibility of removing a decision that has proved wrong. By exploiting the bits' different rates of convergence we are able to achieve an efficiency of up to 50% at a bit error rate of less than 10^-5. The efficiency should roughly correspond to the power consumption of a hardware implementation of the algorithm.
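The early-stopping idea can be sketched on top of an ordinary min-sum message-passing loop: a bit whose posterior reliability (absolute LLR) exceeds a threshold is frozen, and its outgoing messages are no longer updated. The sketch below is a simplified illustration under assumed interfaces and an arbitrary threshold value; it is not the decoder evaluated in the thesis.

```python
import numpy as np

def minsum_decode(H, llr_in, max_iter=50, threshold=15.0):
    """Min-sum LDPC decoding with per-bit early stopping (a sketch).

    H        : (m, n) parity-check matrix of 0/1 ints
    llr_in   : length-n channel LLRs (positive favours bit = 0)
    threshold: a bit is frozen once |posterior LLR| exceeds this value,
               so its messages stop being updated.
    """
    m, n = H.shape
    rows, cols = np.nonzero(H)
    v2c = llr_in[cols].astype(float)          # variable-to-check messages
    c2v = np.zeros_like(v2c)                  # check-to-variable messages
    frozen = np.zeros(n, dtype=bool)
    posterior = llr_in.astype(float)

    for _ in range(max_iter):
        # Check-node update (min-sum): sign product and minimum magnitude
        # over the other incoming messages.
        for r in range(m):
            e = np.where(rows == r)[0]
            msgs = v2c[e]
            sign_all = np.prod(np.sign(msgs + 1e-12))
            for k, edge in enumerate(e):
                others = np.delete(msgs, k)
                c2v[edge] = sign_all * np.sign(msgs[k] + 1e-12) * np.min(np.abs(others))
        # Variable-node update; frozen bits keep their last outgoing messages.
        for v in range(n):
            e = np.where(cols == v)[0]
            posterior[v] = llr_in[v] + np.sum(c2v[e])
            if frozen[v]:
                continue
            for edge in e:
                v2c[edge] = posterior[v] - c2v[edge]
            if abs(posterior[v]) > threshold:
                frozen[v] = True              # stop updating this bit
        hard = (posterior < 0).astype(int)
        if not np.any((H @ hard) % 2):        # all parity checks satisfied
            break
    return hard


if __name__ == "__main__":
    # Tiny hypothetical parity-check matrix and channel LLRs; the weak,
    # erroneous LLR at bit 4 is corrected by the decoder.
    H = np.array([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])
    llr = np.array([-2.0, 1.8, 1.5, -2.3, -0.3, 0.4, -2.2])
    print("decoded bits:", minsum_decode(H, llr))
```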
195

Integer programming-based decomposition approaches for solving machine scheduling problems

Sadykov, Ruslan 26 June 2006 (has links)
The aim in this thesis is to develop efficient enumeration algorithms to solve certain strongly NP-hard scheduling problems. These algorithms were developed using a combination of ideas from Integer Programming, Constraint Programming and Scheduling Theory. In order to combine different techniques in one algorithm, decomposition methods are applied. The main idea on which the first part of our results is based is to separate the optimality and feasibility components of the problem and to let different methods tackle these components. IP is then responsible for optimization, whereas specific combinatorial algorithms tackle the feasibility aspect. Branch-and-cut and branch-and-price algorithms based on this idea are proposed to solve the single-machine and multi-machine variants of the scheduling problem to minimize the sum of the weights of late jobs. Experimental research shows that the algorithms proposed outperform other algorithms available in the literature. It is also shown that these algorithms can be used, after some modification, to solve the problem of minimizing the maximum tardiness on unrelated machines. The second part of the thesis deals with the one-machine scheduling problem to minimize the total weighted tardiness. To tackle this problem, the idea of a partition of the time horizon into intervals is used. A particularity of this approach is that we exploit the structure of the problem to partition the time horizon. This particularity allowed us to propose two new Mixed Integer Programming formulations for the problem. The first one is a compact formulation and can be used to solve the problem with a standard MIP solver. The second formulation can be used to derive lower bounds on the value of the optimal solution of the problem. These lower bounds are of good quality and can be obtained relatively fast.
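As an illustration of what a compact MIP for the single-machine total weighted tardiness problem (1 || sum w_j T_j) can look like, the sketch below uses a generic textbook formulation with disjunctive ordering variables and big-M constraints, solved with PuLP. It is not the thesis's interval-based formulation, and the job data are hypothetical.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum

# Hypothetical job data: processing times, due dates, weights.
p = [4, 3, 7, 2]
d = [5, 6, 10, 4]
w = [3, 1, 2, 5]
n = len(p)
M = sum(p)                      # big-M: no completion time exceeds sum of p

prob = LpProblem("single_machine_weighted_tardiness", LpMinimize)
C = [LpVariable(f"C_{j}", lowBound=p[j]) for j in range(n)]   # completion times
T = [LpVariable(f"T_{j}", lowBound=0) for j in range(n)]      # tardiness
y = {(i, j): LpVariable(f"y_{i}_{j}", cat="Binary")
     for i in range(n) for j in range(n) if i < j}            # 1 if i precedes j

prob += lpSum(w[j] * T[j] for j in range(n))                  # total weighted tardiness

for j in range(n):
    prob += T[j] >= C[j] - d[j]                               # tardiness definition
for i in range(n):
    for j in range(i + 1, n):
        # Disjunctive (big-M) constraints: either i precedes j or j precedes i.
        prob += C[j] >= C[i] + p[j] - M * (1 - y[i, j])
        prob += C[i] >= C[j] + p[i] - M * y[i, j]

prob.solve()
print("optimal weighted tardiness:", prob.objective.value())
print("completion times:", [v.value() for v in C])
```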
196

Joint Equalization and Decoding via Convex Optimization

Kim, Byung Hak May 2012 (has links)
The unifying theme of this dissertation is the development of new solutions for decoding and inference problems based on convex optimization methods. The first part considers the joint detection and decoding problem for low-density parity-check (LDPC) codes on finite-state channels (FSCs). Hard-disk drives (or magnetic recording systems), where the required error rate (after decoding) is too low to be verifiable by simulation, are the most important applications of this research. Recently, LDPC codes have attracted a lot of attention in the magnetic storage industry and some hard-disk drives have started using iterative decoding. Despite progress in the area of reduced-complexity detection and decoding algorithms, there has been some resistance to the deployment of turbo-equalization (TE) structures (with iterative detectors/decoders) in magnetic-recording systems because of error floors and the difficulty of accurately predicting performance at very low error rates. To address this problem for channels with memory, such as FSCs, we propose a new decoding algorithm based on a well-defined convex optimization problem. In particular, it is based on the linear-programming (LP) formulation of the joint decoding problem for LDPC codes over FSCs. It exhibits two favorable properties: provable convergence and predictable error floors (via pseudo-codeword analysis). Since general-purpose LP solvers are too complex to make the joint LP decoder feasible for practical purposes, we develop an efficient iterative solver for the joint LP decoder by taking advantage of its dual-domain structure. The main advantage of this approach is that it combines the predictability and superior performance of joint LP decoding with the computational complexity of TE. The second part of this dissertation considers the matrix completion problem: the recovery of a data matrix from incomplete, or even corrupted, entries of an unknown matrix. Recommender systems are good representatives of this problem, and this research is important for the design of information retrieval systems that require very high scalability. We show that our IMP algorithm reduces the well-known cold-start problem associated with collaborative filtering systems in practice.
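Matrix completion for recommender systems is often illustrated with a simple alternating-least-squares factorization over the observed entries. The sketch below is such a generic baseline under hypothetical dimensions and synthetic data; it is not the dissertation's IMP algorithm.

```python
import numpy as np

def als_complete(M, mask, rank=5, reg=0.1, iters=50):
    """Alternating least squares for low-rank matrix completion (a sketch).

    M    : (m, n) matrix, arbitrary values where mask is False
    mask : (m, n) boolean array, True where an entry is observed
    Returns a rank-`rank` estimate U @ V.T of the full matrix.
    """
    m, n = M.shape
    rng = np.random.default_rng(0)
    U = rng.normal(scale=0.1, size=(m, rank))
    V = rng.normal(scale=0.1, size=(n, rank))
    I = reg * np.eye(rank)
    for _ in range(iters):
        for i in range(m):                      # update row (user) factors
            idx = mask[i]
            if idx.any():
                Vi = V[idx]
                U[i] = np.linalg.solve(Vi.T @ Vi + I, Vi.T @ M[i, idx])
        for j in range(n):                      # update column (item) factors
            idx = mask[:, j]
            if idx.any():
                Uj = U[idx]
                V[j] = np.linalg.solve(Uj.T @ Uj + I, Uj.T @ M[idx, j])
    return U @ V.T

# Synthetic demo: a rank-3 matrix with 70% of its entries hidden.
rng = np.random.default_rng(1)
truth = rng.normal(size=(30, 3)) @ rng.normal(size=(3, 20))
mask = rng.random(truth.shape) < 0.3
est = als_complete(np.where(mask, truth, 0.0), mask, rank=3)
print("RMSE on missing entries:",
      np.sqrt(np.mean((est[~mask] - truth[~mask]) ** 2)))
```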
197

Parallel VLSI Architectures for Multi-Gbps MIMO Communication Systems

January 2011 (has links)
In wireless communications, the use of multiple antennas at both the transmitter and the receiver is a key technology for enabling high data-rate transmission without additional bandwidth or transmit power. Multiple-input multiple-output (MIMO) schemes are widely used in many wireless standards, allowing higher throughput via spatial multiplexing. MIMO soft detection poses significant challenges to MIMO receiver design because the detection complexity increases exponentially with the number of antennas. As next-generation wireless systems push for multi-Gbps data rates, there is a great need for high-throughput, low-complexity soft-output MIMO detectors. A brute-force implementation of the optimal MIMO detection algorithm would consume enormous power and is not feasible with current technology. We propose a reduced-complexity soft-output MIMO detector architecture based on a trellis-search method. We convert the MIMO detection problem into a shortest-path problem, and introduce a path-reduction and a path-extension algorithm to reduce the search complexity while still maintaining sufficient soft-information values for the detection. We avoid the missing counter-hypothesis problem by keeping multiple paths during the trellis search. The proposed trellis-search algorithm is data-parallel and well suited for high-speed VLSI implementation. Compared with conventional tree-search based detectors, the proposed trellis-based detector offers a significant improvement in detection throughput and area efficiency, and it has great potential for next-generation Gbps wireless systems by achieving very high throughput and good error performance. The soft information generated by the MIMO detector is processed by a channel decoder, e.g. a low-density parity-check (LDPC) decoder or a Turbo decoder, to recover the original information bits. The channel decoder is another computationally intensive block in a MIMO receiver SoC (system-on-chip). We present high-performance LDPC decoder and Turbo decoder architectures that achieve 1+ Gbps data rates. Further, a configurable decoder architecture that can be dynamically reconfigured to support both LDPC codes and Turbo codes is developed to support multiple 3G/4G wireless standards. We present ASIC and FPGA implementation results for the various MIMO detectors, LDPC decoders, and Turbo decoders, and discuss in detail their computational complexity and throughput performance.
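The conversion of MIMO detection into a shortest-path search can be illustrated with a toy depth-first tree search over a QR-decomposed channel: after QR decomposition, the detection metric decomposes into per-antenna branch costs, so the maximum-likelihood solution is the cheapest root-to-leaf path. The sketch below is a hard-output, sphere-decoder-style search with hypothetical dimensions, constellation and channel; it is not the thesis's soft-output path-reduction/path-extension architecture.

```python
import numpy as np

def ml_detect_tree(H, y, constellation):
    """Toy ML MIMO detection as a shortest-path search over a tree.

    With y = Hx + n and H = QR, the metric ||Q^H y - R x||^2 splits into
    per-antenna branch costs, so the ML solution is the cheapest
    root-to-leaf path. Depth-first search with simple cost pruning.
    """
    n_tx = H.shape[1]
    Q, R = np.linalg.qr(H)
    z = Q.conj().T @ y
    best_cost, best_x = np.inf, None
    stack = [(n_tx - 1, [], 0.0)]           # search from the last antenna up
    while stack:
        level, partial, cost = stack.pop()
        if level < 0:
            if cost < best_cost:
                best_cost, best_x = cost, partial
            continue
        for s in constellation:
            x_tail = [s] + partial          # symbols for antennas level..n_tx-1
            interf = sum(R[level, level + 1 + k] * x_tail[1 + k]
                         for k in range(len(partial)))
            branch = abs(z[level] - R[level, level] * s - interf) ** 2
            if cost + branch < best_cost:   # prune paths that cannot win
                stack.append((level - 1, x_tail, cost + branch))
    return np.array(best_x), best_cost

# Hypothetical 2x2 system with a QPSK constellation.
rng = np.random.default_rng(0)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
H = (rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))) / np.sqrt(2)
x = qpsk[rng.integers(0, 4, 2)]
y = H @ x + 0.05 * (rng.normal(size=2) + 1j * rng.normal(size=2))
x_hat, cost = ml_detect_tree(H, y, qpsk)
print("sent:", x, "\ndetected:", x_hat)
```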
199

Automatisk kvalitetskontroll av terminologi i översättningar / Automatic quality checking of terminology in translations

Edholm, Lars January 2007 (has links)
Quality in translations depends on the correct use of specialized terms, which can make the translation easier to understand as well as reduce the time and costs required for the translation (Lommel, 2007). Consistent use of terminology is important, and should be taken into account during quality checks of, for example, translated documentation (Esselink, 2000). Today, several commercial programs have functions for automatic quality checking of terminology. The aim of this study is to evaluate such functions, since no earlier major study of this has been found.

To get some insight into quality checking in practice, two qualitative interviews were initially carried out with individuals involved in this work at a translation agency. The results were compared to current theories in the field and revealed a general agreement with, for example, the recommendations of Bass (2006).

The evaluations started with an examination of the recall of a genuine terminology database compared to subjectively marked terms in a test corpus based on an authentic translation memory. The examination revealed a relatively low recall. To increase the recall, the terminology database was modified; for example, it was extended with longer terms from the test corpus.

After that, the terminology-checking function of four different commercial programs was run on the test corpus using the modified terminology database. Finally, the test corpus was also modified by planting a number of errors to produce a more idealized evaluation. The results from the programs, in the form of alarms for potential errors, were categorized and judged as true or false alarms. This constitutes the basis for measures of the precision of the checks and, in the last evaluation, also of their recall.

The evaluations showed that, for terminology in translations from English to Swedish, it was advantageous to match terms from the terminology database as parts of words in the source and target segments of the translation. In that way, terms with different inflected forms could be matched without support for language-specific morphology. A cause of many problems in the matching process was the form of the entries in the terminology database, which were more suited to being read by human translators than by a machine.

Recommendations regarding the introduction of tools for automatic checking of terminology were formulated, based on the results of the interviews and evaluations. Because of factors of uncertainty in the automatic checking, a manual review of its results is motivated. By running the check on a sample that has already been manually checked in other respects, a reasonable number of results to review manually can be obtained. The quality of the terminology database is crucial for its recall on translations, and in the long run also for the value of using it for automatic checking.
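The kind of check evaluated here can be sketched simply: for each bilingual segment, any source-language term from the term base that appears as part of a word in the source is expected to have its target-language equivalent somewhere in the target segment, and an alarm is raised otherwise. The term base and segments below are hypothetical; this is not any of the four evaluated commercial tools.

```python
# A minimal terminology-consistency check over bilingual segments (a sketch).
# Matching is done on parts of words ("partial matching"), so inflected forms
# such as "databasen"/"databaser" are caught without morphological analysis.

term_base = {            # hypothetical English -> Swedish term pairs
    "parity check": "paritetskontroll",
    "database": "databas",
    "threshold": "tröskelvärde",
}

segments = [             # hypothetical (source, target) translation segments
    ("The database stores every parity check result.",
     "Databasen lagrar varje paritetskontrollresultat."),
    ("Set the threshold before decoding.",
     "Ställ in gränsvärdet före avkodningen."),   # inconsistent term use
]

def check_terminology(segments, term_base):
    """Return alarms as (segment index, source term, expected target term)."""
    alarms = []
    for i, (src, tgt) in enumerate(segments):
        src_l, tgt_l = src.lower(), tgt.lower()
        for src_term, tgt_term in term_base.items():
            if src_term in src_l and tgt_term not in tgt_l:
                alarms.append((i, src_term, tgt_term))
    return alarms

for seg_idx, src_term, tgt_term in check_terminology(segments, term_base):
    print(f"segment {seg_idx}: '{src_term}' translated without '{tgt_term}'")
```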
200

Performance Evaluation of Low Density Parity Check Forward Error Correction in an Aeronautical Flight Environment

Temple, Kip October 2014 (has links)
ITC/USA 2014 Conference Proceedings / The Fiftieth Annual International Telemetering Conference and Technical Exhibition / October 20-23, 2014 / Town and Country Resort & Convention Center, San Diego, CA / In some flight test scenarios the telemetry link is noise limited at long slant ranges or during signal fade events caused by antenna pattern nulls. In these situations, a mitigation technique such as forward error correction (FEC) can add several decibels to the link margin. The particular FEC code discussed in this paper is a variant of a low-density parity-check (LDPC) code and is coupled with SOQPSK modulation in the hardware tested. This paper briefly covers lab testing of the flight-ready hardware and then presents flight test results comparing a baseline uncoded telemetry link with an LDPC-coded telemetry link. This is the first known test dedicated to this specific FEC code in a real-world environment, with a flight profile tailored to assess the viability of an LDPC-coded telemetry link.
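How a few decibels of coding gain translate into link margin can be illustrated with a simplified link-budget calculation. All values below (frequency, slant range, powers, required Eb/N0 figures, assumed coding gain) are hypothetical and not taken from the paper.

```python
import math

def link_margin_db(slant_range_m, freq_hz, eirp_dbm, rx_gain_dbi,
                   data_rate_bps, noise_figure_db, required_ebn0_db):
    """Simplified telemetry link margin (free-space loss only, a sketch)."""
    fspl_db = 20 * math.log10(4 * math.pi * slant_range_m * freq_hz / 3e8)
    prx_dbm = eirp_dbm + rx_gain_dbi - fspl_db
    # Noise power density: kT = -174 dBm/Hz at 290 K, plus receiver noise figure.
    ebn0_db = prx_dbm - (-174 + noise_figure_db) - 10 * math.log10(data_rate_bps)
    return ebn0_db - required_ebn0_db

# Hypothetical L-band link at 150 km slant range, 5 Mbps.
common = dict(slant_range_m=150e3, freq_hz=1.45e9, eirp_dbm=40.0,
              rx_gain_dbi=30.0, data_rate_bps=5e6, noise_figure_db=3.0)
uncoded = link_margin_db(required_ebn0_db=13.0, **common)  # assumed uncoded requirement
coded = link_margin_db(required_ebn0_db=4.0, **common)     # assumed coded requirement
print(f"uncoded margin: {uncoded:.1f} dB, coded margin: {coded:.1f} dB")
print(f"coding gain applied: {coded - uncoded:.1f} dB")
```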
