1

REDUCED COMPLEXITY TRELLIS DETECTION OF SOQPSK-TG

Nelson, Tom 10 1900 (has links)
ITC/USA 2006 Conference Proceedings / The Forty-Second Annual International Telemetering Conference and Technical Exhibition / October 23-26, 2006 / Town and Country Resort & Convention Center, San Diego, California / The optimum detector for shaped offset QPSK (SOQPSK) is a trellis detector which has high complexity (as measured by the number of detection filters and trellis states) due to the memory inherent in this modulation. In this paper we exploit the cross-correlated, trellis-coded, quadrature modulation (XTCQM) representation of SOQPSK-TG to formulate a reduced complexity detector. We show that a factor of 128 reduction in the number of trellis states of the detector can be achieved with a loss of only 0.2 dB in bit error rate performance as compared to optimum at P(b) = 10^(-5).
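The complexity being reduced here is the per-symbol work of a trellis (Viterbi-style) detector, which scales with the number of trellis states and detection filters. The sketch below is a generic add-compare-select recursion, not the paper's SOQPSK-TG detector; the trellis arrays and the state counts in the example are placeholder assumptions, included only to show why a factor-of-128 reduction in states translates directly into a factor-of-128 reduction in detector work.

```python
import numpy as np

def viterbi_step(path_metrics, branch_metrics, prev_state):
    """One add-compare-select (ACS) step over an arbitrary trellis.

    path_metrics   : (S,) survivor-path metrics entering this symbol interval
    branch_metrics : (S, B) correlation metric of each of the B branches into a state
    prev_state     : (S, B) predecessor-state index for each of those branches
    Per-step work is O(S * B), so shrinking S shrinks the detector proportionally.
    """
    S, B = branch_metrics.shape
    candidates = path_metrics[prev_state] + branch_metrics   # add
    winners = candidates.argmax(axis=1)                      # compare (keep largest)
    new_metrics = candidates[np.arange(S), winners]          # select survivors
    return new_metrics, winners

# Tiny demo trellis (4 states, 2 branches per state) with random metrics.
rng = np.random.default_rng(0)
pm, _ = viterbi_step(np.zeros(4), rng.normal(size=(4, 2)),
                     rng.integers(0, 4, size=(4, 2)))

# Hypothetical illustration of the reported reduction: if the full-complexity
# trellis had S_full states and the reduced detector S_full / 128 states
# (numbers below are placeholders, not taken from the paper), the ACS work
# per trellis step drops by the same factor for a fixed branch fan-in.
S_full, fan_in = 512, 2
print(S_full * fan_in, "vs", (S_full // 128) * fan_in, "add-compare operations per step")
```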
2

Saltwater spill site assessment and remediation in Northern Alberta

White, D'Arcy 07 December 2012 (has links)
This study focuses on the Alberta environmental site assessment process for salt contamination resulting from pipeline failures in the boreal forest of Alberta. A complex saltwater spill site is used as a case study to determine the effects of various parameters of interest, including electrical conductivity and sodium adsorption ratio. The study reviews the practical efficacy of the Alberta environmental site assessment process in ensuring that sites meet the legislated requirements for remediation closure in a timely and environmentally sound manner. It compares the parameters of interest using data collected at the case study site over a three-year period and reviews available remediation alternatives. Finally, it provides a summary interpretation of how the existing regulatory process affects decisions intended to ensure site decontamination and the sustainability of the boreal forest ecosystem in which the upstream oil and gas industry operates, and includes recommendations for policy improvement.
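One of the parameters of interest named above, the sodium adsorption ratio (SAR), is computed from soluble sodium, calcium, and magnesium concentrations expressed in milliequivalents per litre. The sketch below shows that standard calculation together with a simple exceedance check; the guideline limits in the example are illustrative placeholders, not the Alberta Tier 1/Tier 2 values applied at the case-study site.

```python
from math import sqrt

def sodium_adsorption_ratio(na_meq_l: float, ca_meq_l: float, mg_meq_l: float) -> float:
    """Standard SAR formula: SAR = Na / sqrt((Ca + Mg) / 2), concentrations in meq/L."""
    return na_meq_l / sqrt((ca_meq_l + mg_meq_l) / 2.0)

def exceeds_salinity_guideline(ec_ds_m: float, sar: float,
                               ec_limit: float = 2.0, sar_limit: float = 4.0) -> bool:
    """Flag a sample whose electrical conductivity (dS/m) or SAR exceeds a guideline
    limit. The default limits are placeholders for illustration, not regulatory values."""
    return ec_ds_m > ec_limit or sar > sar_limit

# Hypothetical soil-salinity sample from a spill site.
sar = sodium_adsorption_ratio(na_meq_l=12.0, ca_meq_l=4.0, mg_meq_l=2.0)
print(round(sar, 2), exceeds_salinity_guideline(ec_ds_m=3.1, sar=sar))
```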
3

Basel III a jeho dopady na bankovní sektor / Basel III and its impact on the banking sector

Hercíková, Alena January 2012 (has links)
The following pages of my master thesis aim to acquaint the reader with the major changes brought about by the Basel III banking regulation. This new regulatory framework was created in response to the financial crisis (beginning in 2007), which revealed weaknesses in the original Basel II regulation, and its purpose is to prevent similar situations in the financial market by increasing the stability and resilience of the banking sector. The impacts of Basel III are reflected primarily in an increased demand for high-quality capital held by banks and in the maintenance of sufficient liquidity. As the results of the analysis show, these factors have a further effect on banks' interest spreads and on the real economy.
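The two Basel III levers mentioned in the abstract, higher-quality capital and sufficient liquidity, reduce to simple ratios: Common Equity Tier 1 (CET1) capital over risk-weighted assets, and high-quality liquid assets over 30-day net cash outflows (the Liquidity Coverage Ratio). The sketch below evaluates those ratios against the headline fully phased-in Basel III minimums (4.5% CET1 plus a 2.5% conservation buffer, and an LCR of at least 100%); the bank figures themselves are invented for illustration.

```python
def cet1_ratio(cet1_capital: float, risk_weighted_assets: float) -> float:
    """Common Equity Tier 1 ratio = CET1 capital / risk-weighted assets."""
    return cet1_capital / risk_weighted_assets

def liquidity_coverage_ratio(hqla: float, net_outflows_30d: float) -> float:
    """LCR = stock of high-quality liquid assets / net cash outflows over 30 days."""
    return hqla / net_outflows_30d

# Headline fully phased-in Basel III minimums.
CET1_MIN_WITH_BUFFER = 0.045 + 0.025   # 4.5% minimum + 2.5% conservation buffer
LCR_MIN = 1.00                         # at least 100%

# Hypothetical bank balance-sheet figures (in billions), purely illustrative.
ratio = cet1_ratio(cet1_capital=18.0, risk_weighted_assets=240.0)
lcr = liquidity_coverage_ratio(hqla=55.0, net_outflows_30d=48.0)
print(f"CET1 {ratio:.1%} (min {CET1_MIN_WITH_BUFFER:.1%}), LCR {lcr:.0%} (min {LCR_MIN:.0%})")
```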
4

The Market Valuation of the Deferred Tax Assets and Liabilities of Commercial Banks

Collum, Nina S 11 December 2015 (has links)
Bank regulators limit the amount of deferred tax assets includable in the capital ratio calculations which measure a bank's financial health. The increased balances in bank deferred tax assets after the beginning of the financial crisis raised concerns that applying this deferred tax asset regulatory capital limitation may contribute to the need for banks to raise more capital. Value relevance is the ability of information disclosed in the financial statements to capture and summarize firm value. The deferred tax value relevance literature generally omits the regulated industries. Because fair value accounting plays a much larger role in the banking industry, the market value of a bank has a different relationship to its book value than that of its unregulated counterparts. Using annual data from 2004 through 2012 for publicly held banks in the United States, this study empirically examines the value relevance of the banks' net deferred tax assets and liabilities over time (pre-crisis versus crisis periods). Findings indicate that although the deferred tax liabilities are value relevant in both the pre-crisis and crisis periods, the value relevance of the net deferred tax assets is limited to the crisis period (increased net deferred tax asset balances). An additional test shows that investors view increased levels of net deferred tax assets relative to total assets as a signal of concern about a bank's financial health. This study also examines whether investors value the net deferred tax assets of less financially healthy banks (low Tier I capital ratios) differently from those of healthier banks. Findings indicate that the coefficient on net deferred tax assets for the less financially healthy banks is negative and significant. Using another measure of financial health (a high probability of failure) yields similar results. This study extends the value relevance literature to the deferred tax accounts of commercial banks. It also shows that the deferred tax asset accounts are valued differently than other assets, supporting the deferred tax asset limitation for capital ratio calculations. Finally, this study provides information useful to analysts' valuations of banks' deferred tax accounts.
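Value-relevance tests of this kind are typically run as a levels regression of market value on book-value components, with the deferred tax items broken out and interacted with a crisis-period indicator. The sketch below is one plausible specification on synthetic data, not the author's exact model or sample; the variable names and the crisis indicator are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic bank-year panel standing in for publicly held U.S. banks, 2004-2012.
rng = np.random.default_rng(0)
n = 200
banks = pd.DataFrame({
    "book_value_ex_dt": rng.uniform(5, 50, n),   # book value excluding deferred taxes
    "net_dta": rng.uniform(0, 3, n),             # net deferred tax assets
    "net_dtl": rng.uniform(0, 3, n),             # net deferred tax liabilities
    "crisis": rng.integers(0, 2, n),             # 1 = crisis-period observation (assumed)
})
banks["market_value"] = (
    1.1 * banks.book_value_ex_dt
    + 0.6 * banks.net_dta * banks.crisis         # DTAs priced only in the crisis period
    - 0.8 * banks.net_dtl
    + rng.normal(0, 2, n)
)

# Interact the deferred tax components with the crisis indicator and check whether
# the net-DTA coefficient is significant only for crisis-period observations.
model = smf.ols(
    "market_value ~ book_value_ex_dt + net_dta * crisis + net_dtl * crisis",
    data=banks,
).fit()
print(model.summary().tables[1])
```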
5

COMMON DETECTORS FOR TIER 1 MODULATIONS

Nelson, Tom, Perrins, Erik, Rice, Michael 10 1900 (has links)
ITC/USA 2005 Conference Proceedings / The Forty-First Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2005 / Riviera Hotel & Convention Center, Las Vegas, Nevada / The ARTM Tier 1 waveforms include two versions of Feher-patented QPSK (FQPSK-B and FQPSK-JR) and a version of shaped offset QPSK (SOQPSK-TG). In this paper we examine three common detector architectures for the ARTM Tier 1 modulations: a symbol-by-symbol detector, a cross-correlated trellis-coded quadrature modulation (XTCQM) detector, and a continuous phase modulation (CPM) detector. We show that when used to detect Tier 1 modulations, these detectors perform well even without knowledge of the modulation used by the transmitter. The common symbol-by-symbol detector suffers a loss of 1.5 dB for SOQPSK-TG and 1.6 dB for FQPSK-JR in bit error rate performance relative to the theoretical optimum for these modulations. The common XTCQM detector provides a bit error rate performance that is 0.1 dB worse than optimum for SOQPSK-TG and that matches optimum performance for FQPSK-JR. The common CPM detector achieves a bit error rate performance that is 0.25 dB worse than optimum for SOQPSK-TG and that approximately matches optimum for FQPSK-JR. The common XTCQM detector provides the best bit error rate performance, but it also has the highest complexity.
6

EXPERIMENTAL RESULTS FOR PCM/FM, TIER 1 SOQPSK, AND TIER II MULTI-H CPM WITH TURBO PRODUCT CODES

Geoghegan, Mark 10 1900 (has links)
International Telemetering Conference Proceedings / October 20-23, 2003 / Riviera Hotel and Convention Center, Las Vegas, Nevada / Improving the spectral efficiency of aeronautical telemetry has been a principal area of research over the last several years due to the increasing demand for more data and the limited available spectrum. These efforts have led to the development of the ARTM Tier 1 SOQPSK and Tier II Multi-h CPM waveforms, which improve spectral efficiency by factors of two and three, respectively, compared to legacy PCM/FM, while maintaining similar detection efficiency. Now that more spectrally efficient waveform options are becoming available, another challenge is to further increase the detection performance. Better detection efficiency translates into additional link margin that can be used to extend the operating range, support higher data throughput, or significantly improve the quality of the received data. It is well known that Forward Error Correction (FEC) is one means of achieving this objective at the cost of additional overhead and increased receiver complexity. However, as mentioned above, spectral efficiency is also vitally important, meaning that the FEC must also carry a low amount of overhead. Unfortunately, low overhead and high coding gain are generally conflicting requirements, although recent work has shown that Turbo Product Codes (TPC) are a particularly attractive candidate. Computer simulations predict that very impressive gains in detection performance are possible for a relatively small increase in bandwidth. The main drawbacks are the additional complexity of the decoding circuitry and an increase in receive-side latency. This paper presents the latest simulation and hardware performance results for PCM/FM, SOQPSK, and Multi-h CPM with TPC.
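The overhead side of the trade-off described above follows directly from how a turbo product code is constructed: the rate of the two-dimensional product of an (n1, k1) and an (n2, k2) block code is k1·k2 / (n1·n2), and the occupied bandwidth grows by roughly the inverse of that rate. The sketch below works the arithmetic for a product of two extended Hamming (64, 57) codes as an example; the constituent codes used in the paper's hardware are not stated here, so the numbers are purely illustrative.

```python
def product_code_rate(n1: int, k1: int, n2: int, k2: int) -> float:
    """Rate of a two-dimensional product code built from (n1, k1) and (n2, k2)
    constituent block codes: each k1 x k2 block of data bits becomes an
    n1 x n2 block of coded bits."""
    return (k1 * k2) / (n1 * n2)

# Illustrative example: a TPC built from two extended Hamming (64, 57) codes.
rate = product_code_rate(64, 57, 64, 57)
bandwidth_expansion = 1.0 / rate
print(f"code rate = {rate:.3f}, bandwidth grows by about {bandwidth_expansion - 1:.1%}")
```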
7

EXPERIMENTAL RESULTS FOR PCM/FM, TIER 1 SOQPSK, AND TIER II MULTI-H CPM WITH CMA EQUALIZATION

Geoghegan, Mark 10 1900 (has links)
International Telemetering Conference Proceedings / October 20-23, 2003 / Riviera Hotel and Convention Center, Las Vegas, Nevada / It is widely recognized that telemetry channels, particularly airborne channels, are afflicted by multipath propagation effects. It has also been shown that adaptive equalization can be highly effective in mitigating these effects. However, numerous other factors influence the behavior of adaptive equalization, and the type of modulation employed is certainly one of them. This is particularly true for modulations that exhibit different operating bandwidths. Computer simulations using the Constant Modulus Algorithm (CMA) have recently been reported for PCM/FM, ARTM Tier 1 SOQPSK, and Tier II Multi-h CPM. These encouraging results have led to a hardware implementation of a CMA equalizer. This paper presents the latest results from this work.
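The Constant Modulus Algorithm referenced above adapts an FIR equalizer by penalizing deviations of the equalizer output's magnitude from a fixed modulus, which is why it suits constant-envelope and near-constant-envelope telemetry waveforms and needs no training sequence. Below is a minimal baseband NumPy sketch of the classic CMA stochastic-gradient update; the step size, filter length, and two-ray test channel are arbitrary choices for illustration, not parameters of the hardware described in the paper.

```python
import numpy as np

def cma_equalize(received: np.ndarray, num_taps: int = 11,
                 step: float = 1e-3, modulus: float = 1.0) -> np.ndarray:
    """Blind FIR equalization with the Constant Modulus Algorithm (CMA).

    Minimizes E[(|y|^2 - R2)^2] by stochastic gradient descent, where y is the
    equalizer output and R2 = modulus**2; no training symbols are required.
    """
    w = np.zeros(num_taps, dtype=complex)
    w[num_taps // 2] = 1.0                      # center-spike initialization
    r2 = modulus ** 2
    out = np.zeros(len(received) - num_taps, dtype=complex)
    for n in range(len(out)):
        x = received[n:n + num_taps][::-1]      # regressor, most recent sample first
        y = np.dot(w, x)                        # equalizer output
        err = y * (np.abs(y) ** 2 - r2)         # CMA error term
        w = w - step * err * np.conj(x)         # gradient update of the taps
        out[n] = y
    return out

# Quick self-test: a unit-modulus phase-modulated signal through a simple
# two-ray multipath channel (circular delay for brevity; values are arbitrary).
rng = np.random.default_rng(1)
tx = np.exp(1j * 2 * np.pi * rng.random(5000))
rx = tx + 0.4 * np.roll(tx, 3)                  # direct path plus a delayed echo
eq = cma_equalize(rx)
print("modulus spread before/after:", np.std(np.abs(rx)), np.std(np.abs(eq)))
```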
8

Essays on Other Comprehensive Income

Black, Dirk January 2014 (has links)
In Chapter 1, I review the existing literature on the investor and contracting usefulness of other comprehensive income (OCI) components. In Chapter 2, I perform empirical tests focused on one aspect of investor usefulness of accounting information: risk-relevance. I examine whether OCI component volatilities are associated with investors' returns volatility using a sample of bank holding companies from 1998 to 2012. The results indicate that the volatilities of unrealized gains and losses on available-for-sale (AFS) securities and cash-flow hedges, typically deemed beyond managers' control, are negatively associated with risk, while volatilities of OTTI losses, over which managers have relatively more control, are positively associated with risk. The results are consistent with investors perceiving the volatility of non-OTTI AFS unrealized gains and losses as relatively less important, less risky, or less risk-relevant, than the volatility of OTTI losses, and perceiving the volatility of OTTI losses as an informative signal about risk. In Chapter 3, I find that Tier 1 Capital including more components of accumulated other comprehensive income (AOCI), as stipulated by Basel III, is no more volatile than pre-Basel-III Tier 1 Capital, and that the volatilities of the AOCI components new to Tier 1 Capital are not positively associated with risk. In Chapter 4, I discuss future research. / Dissertation
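The Chapter 2 design described above amounts to regressing a bank's returns volatility on the volatilities of its OCI components and checking the signs on the AFS and cash-flow-hedge terms versus the OTTI term. The sketch below is an assumed reduced-form specification on synthetic data, not the dissertation's actual model or sample; the signs used to generate the data simply mirror the results reported in the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic panel standing in for bank holding companies, 1998-2012.
rng = np.random.default_rng(2)
n = 300
df = pd.DataFrame({
    "vol_afs": rng.uniform(0.1, 2.0, n),    # volatility of AFS unrealized gains/losses
    "vol_hedge": rng.uniform(0.1, 1.0, n),  # volatility of cash-flow hedge adjustments
    "vol_otti": rng.uniform(0.0, 1.5, n),   # volatility of OTTI losses
})
# Generate returns volatility with the signs reported above: negative loadings on
# AFS and hedge volatility, positive loading on OTTI volatility.
df["ret_vol"] = (
    0.5 - 0.1 * df.vol_afs - 0.05 * df.vol_hedge + 0.3 * df.vol_otti
    + rng.normal(0, 0.05, n)
)

model = smf.ols("ret_vol ~ vol_afs + vol_hedge + vol_otti", data=df).fit()
print(model.params.round(3))
```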
9

Finansinspektionens krav på högre kärnprimärkapital : En studie av de svenska storbankerna

Hoffmann, Frida, Ljungqvist Jansson, Kajsa January 2013 (has links)
Summary – "Finansinspektionens krav på högre kärnprimärkapital: En studie av de svenska storbankerna" Date: 2013-05-31 Level: Bachelor thesis in business administration, 15 ECTS Institution: School of Business, Society and Engineering (EST), Mälardalen University Authors: Frida Hoffmann & Kajsa Ljungqvist Jansson Title: Finansinspektionens krav på högre kärnprimärkapital: En studie av de svenska storbankerna Supervisor: Staffan Boström Keywords: Core Tier 1 capital, the major Swedish banks, Finansinspektionen Research question: What strategies have the major Swedish banks used to meet the requirement for higher Core Tier 1 capital and to allocate the costs that come with it? What consequences has this had for the banks so far, given that they have already begun adapting to the requirement? Purpose: The purpose of the study is to describe the strategies the four major Swedish banks have relied on to cope with the higher Core Tier 1 capital requirement imposed by Finansinspektionen. The study also intends to evaluate the consequences the changes have had so far for the banks and who has been charged with the cost of the increased Core Tier 1 capital. Method: The method was qualitative in nature; the secondary data were collected from books, reports, and articles, while the primary data were collected through personal interviews and then analyzed with the help of the secondary data. Conclusion: The major Swedish banks have implemented clear strategies to meet Finansinspektionen's requirement. The banks that reviewed their pricing turned out to also be the banks at the top in terms of Core Tier 1 capital ratio. The consequence of the new requirement has been a greater focus on having the right customers, combined with a cost focus, in order to cover the costs that the requirement entails. / Abstract – "The higher demand on Core Tier 1 capital from Finansinspektionen: a study of the major Swedish banks" Date: 2013-05-31 Level: Bachelor thesis in business administration, 15 ECTS Institution: School of Business, Society and Engineering Authors: Frida Hoffmann & Kajsa Ljungqvist Jansson Title: The higher demand on Core Tier 1 capital from Finansinspektionen: a study of the major Swedish banks Tutor: Staffan Boström Keywords: Core Tier 1 capital, the four major Swedish banks, Finansinspektionen Research question: What strategies have the four major Swedish banks used to fulfill the recommendations from Finansinspektionen regarding a higher level of Core Tier 1 capital, and how have they allocated the costs involved? What consequences have the banks experienced so far as they have started to adjust to the new requirements? Purpose: The purpose of this study is to describe the strategies that have formed the basis for the four major Swedish banks to cope with the increased demands on Core Tier 1 capital from Finansinspektionen. The study also intends to evaluate what consequences the changes have had so far and who has been charged with the costs that the demand brings. Method: The method used was of a qualitative nature, where the secondary data collected came from books and reports. The primary data were collected from personal interviews and were then analyzed using the secondary data. Conclusion: All of the major Swedish banks now meet the requirements that Finansinspektionen imposes on Core Tier 1 capital. The numbers that differ between banks depend on the strategy used. It was found during the study that the banks that revised their pricing were also the banks at the top in terms of Core Tier 1 capital ratio. The consequence of the new requirement appears to have been a greater focus on having the right customers and the right cost management to cover the costs that the new requirement brings.
10

Measuring The Adoption and The Effects of RPKI Route Validation and Filtering : Through active control plane and data plane measurements

Ricardo Hernández Torres, Sergio January 2022 (has links)
The BGP (Border Gateway Protocol) is responsible for establishing routing at the core of the Internet, yet it was not designed with security in mind. The Internet routing protocol is currently not secure, but its security can be enhanced. Initially conceived as a small community of trusted peers, the Internet has grown over time into a robust network of complex processes, and securing these has become a priority. Thanks to the research community, the RPKI (Resource Public Key Infrastructure) protocol was designed to provide a layer of security to routing by securing the origin, i.e., attesting that the source of the routing announcements is authorized to make them. As RPKI route validation has recently been widely adopted by multiple large carrier networks, many research projects have sought to measure the adoption of RPKI. This work aims to measure the adoption and the effects of RPKI route validation and filtering through the use of active experiments. A peering session was first established with one of the largest Tier-1 ISPs, Arelion (formerly known as Telia Carrier), to announce and propagate a prefix with RPKI Valid, Invalid, and Unknown records. Then, the visibility of the prefix (in the control plane) and the reachability of the prefix (in the data plane) were measured using visibility feeds from public BGP Route Collectors and reachability feeds from RIPE Atlas probes. The results confirmed that some, but not all, of the major networks previously believed to do so drop RPKI Invalid prefixes, affecting the destination network's visibility. For networks that could still reach the destination, the data plane probes demonstrated that parameters such as the RTT and the hop count were generally not affected. A small increase in the destination network's visibility was observed when comparing RPKI Valid with Unknown routes. The effects and behavior of RPKI Valid, Invalid, and Unknown announcements are analyzed in depth. The data sets have been made publicly available for other researchers to analyze, helping to ensure the future of a more secure Internet. / BGP (Border Gateway Protocol) is used to propagate routing information between the routers of the thousands of networks that together form the Internet, but it was not designed with security in mind. The protocol is not inherently secure, but it can be made so. What was originally a small group of interconnected university networks grew over time into the Internet, a robust global network with complex processes for exchanging routing information. In a modern society that has come to rely on its existence and function, securing these processes has become a priority. Thanks to initiatives taken within the IETF (Internet Engineering Task Force), RPKI (Resource Public Key Infrastructure) was designed to provide a security layer for routing by securing the origin of routing information. As RPKI validation has recently been adopted by several large carrier networks, many research projects have sought to measure the use of RPKI. This work aims to measure the use and the effects of RPKI validation and filtering through active experiments. A BGP peering session was first established with one of the larger Tier-1 ISPs, Arelion (formerly known as Telia Carrier), to originate and propagate an IP prefix with RPKI Valid, Invalid, and Unknown records. The visibility of the prefix (in the control plane) and its reachability (in the data plane) were then measured using visibility feeds from public BGP Route Collectors and reachability feeds from RIPE Atlas probes. The results confirmed that some, but not all, large networks block RPKI Invalid prefixes, which affects the prefix's visibility and reachability. For networks that could still reach the destination, the data plane probes showed that parameters such as RTT and hop count were generally unaffected. A small increase in the destination network's visibility was observed when comparing RPKI Valid with Unknown routes. The effects and behavior of RPKI Valid, Invalid, and Unknown announcements are analyzed in depth. The data sets have been made publicly available so that other researchers can analyze them and help ensure the future of a more secure Internet.
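Route origin validation, whose adoption the thesis measures, follows the standard RPKI procedure: a BGP route is compared against the set of validated ROAs (prefix, maximum length, origin ASN); it is Valid if some covering ROA matches the origin AS and the announced prefix is no longer than the ROA's maxLength, Invalid if covering ROAs exist but none match, and NotFound (Unknown) if no ROA covers the prefix. Below is a minimal sketch of that decision logic for IPv4 prefixes; the ROAs and announcements are made up for illustration, and a production validator would consume real RPKI data rather than a hard-coded list.

```python
from dataclasses import dataclass
from ipaddress import ip_network

@dataclass(frozen=True)
class Roa:
    prefix: str      # e.g. "192.0.2.0/24"
    max_length: int  # longest announced prefix length the ROA authorizes
    origin_asn: int  # AS number authorized to originate the prefix

def origin_validation(route_prefix: str, origin_asn: int, roas: list[Roa]) -> str:
    """RFC 6811-style route origin validation: Valid / Invalid / NotFound."""
    announced = ip_network(route_prefix)
    covering = [r for r in roas if announced.subnet_of(ip_network(r.prefix))]
    if not covering:
        return "NotFound"   # commonly reported as "Unknown" in measurement studies
    for roa in covering:
        if roa.origin_asn == origin_asn and announced.prefixlen <= roa.max_length:
            return "Valid"
    return "Invalid"

# Made-up ROA set and announcements, for illustration only.
roas = [Roa("192.0.2.0/24", 24, 64500)]
print(origin_validation("192.0.2.0/24", 64500, roas))     # Valid
print(origin_validation("192.0.2.0/24", 64501, roas))     # Invalid (wrong origin AS)
print(origin_validation("198.51.100.0/24", 64500, roas))  # NotFound

# Routes classified Invalid are the ones some large networks drop, which is what
# the control-plane (visibility) and data-plane (reachability) measurements quantify.
```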
