61 |
Efficient Lattice Decoders for the Linear Gaussian Vector Channel: Performance & Complexity Analysis. Abediseid, Walid, 15 September 2011
The theory of lattices, a mathematical framework for representing infinite discrete point sets in Euclidean space, has become a powerful tool for analyzing many point-to-point digital and wireless communication systems, particularly communication systems that are well described by the linear Gaussian vector channel model. This is mainly due to three facts about channel codes constructed from lattices: they have a simple structure, they can achieve the fundamental limits (the capacity) of the channel, and, most importantly, they can be decoded using efficient decoders called lattice decoders.
Since their introduction to multiple-input multiple-output (MIMO) wireless communication systems, sphere decoders have become an attractive, efficient implementation of lattice decoders, especially for small signal dimensions and/or moderate-to-large signal-to-noise ratios (SNRs). In the first part of this dissertation, we consider sphere decoding algorithms that implement lattice decoding. The exact complexity analysis of the basic sphere decoder for general space-time codes applied to the MIMO wireless channel is known to be difficult. Characterizing and understanding the complexity distribution is important, especially when the sphere decoder is used under practically relevant runtime constraints. In this work, we shed light on the (average) computational complexity of sphere decoding for the quasi-static, LAttice Space-Time (LAST) coded MIMO channel.
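As a concrete illustration of the basic sphere decoder discussed here, the sketch below performs the standard depth-first enumeration of lattice points inside a sphere around the received vector, after a QR decomposition of the lattice generator. It is a generic, hedged example: the function name, the square real-valued generator, and the radius-shrinking rule are our assumptions, not the specific LAST-coded MIMO decoder analyzed in the thesis.

```python
import numpy as np

def sphere_decode(G, y, radius):
    """Depth-first sphere decoder: return the integer vector x minimizing
    ||y - G x|| among lattice points inside the initial search radius,
    or (None, None) if the sphere contains no lattice point."""
    n = G.shape[1]
    Q, R = np.linalg.qr(G)              # G = Q R with R upper triangular
    z = Q.T @ y                         # equivalent problem: minimize ||z - R x||
    best = {"x": None, "d2": radius ** 2}

    def search(k, x, d2_acc):
        # interference from the already-fixed components x_{k+1}, ..., x_{n-1}
        interf = sum(R[k, j] * x[j] for j in range(k + 1, n))
        center = (z[k] - interf) / R[k, k]
        half = np.sqrt(max(best["d2"] - d2_acc, 0.0)) / abs(R[k, k])
        for xk in range(int(np.ceil(center - half)), int(np.floor(center + half)) + 1):
            d2 = d2_acc + (z[k] - R[k, k] * xk - interf) ** 2
            if d2 > best["d2"]:
                continue                # outside the (possibly shrunken) sphere
            x[k] = xk
            if k == 0:
                best["x"], best["d2"] = x.copy(), d2   # leaf: shrink the radius
            else:
                search(k - 1, x, d2)

    search(n - 1, np.zeros(n, dtype=int), 0.0)
    return best["x"], None if best["x"] is None else float(np.sqrt(best["d2"]))

# toy usage: decode a noisy observation of a 2-D lattice point
G = np.array([[2.0, 0.3], [0.1, 1.5]])
y = G @ np.array([3, -2]) + 0.05
x_hat, dist = sphere_decode(G, y, radius=1.0)
```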
Sphere decoders are only efficient in the high-SNR regime and at low signal dimensions, and they exhibit exponential (average) complexity for low-to-moderate SNR and large signal dimensions. At the other extreme, linear and non-linear receivers such as the minimum mean-square error (MMSE) and MMSE decision-feedback equalization (DFE) receivers are considered attractive alternatives to sphere decoders in MIMO channels. Unfortunately, the very low decoding complexity that these receivers provide comes at the expense of poor performance, especially for large signal dimensions. Designing low-complexity receivers for the MIMO channel that achieve near-optimal performance is a challenging problem that has driven much research in recent years. The problem can be solved through the use of lattice sequential decoding, which is capable of bridging the gap between sphere decoders and low-complexity linear decoders (e.g., the MMSE-DFE decoder).
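For reference, the linear MMSE receiver mentioned above forms its estimate with the textbook filter (assuming unit-energy symbols, channel matrix $\mathbf{H}$, and noise variance $\sigma^2$; the notation is ours, not the thesis'):

$$
\hat{\mathbf{x}}_{\mathrm{MMSE}} = \left(\mathbf{H}^{H}\mathbf{H} + \sigma^{2}\mathbf{I}\right)^{-1}\mathbf{H}^{H}\mathbf{y},
$$

followed by componentwise quantization to the signal set; the MMSE-DFE variant additionally cancels already-detected symbols before detecting the next component.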
In the second part of this thesis, the asymptotic performance of the lattice sequential decoder for the LAST coded MIMO channel is analyzed. We determine the rates achievable by lattice coding and sequential decoding applied to such a channel. The diversity-multiplexing tradeoff under such a decoder is derived as a function of its parameter, the bias term. In this work, we analyze both the computational complexity distribution and the average complexity of such a decoder in the high-SNR regime. We show that there exists a cut-off multiplexing gain for which the average computational complexity of the decoder remains bounded. Our analysis reveals that there is a finite probability that the number of computations performed by the decoder becomes excessive, even at high SNR, when the channel noise is severe. This probability is usually referred to as the probability of decoding failure. It limits the performance of the lattice sequential decoder, especially in a one-way communication system. In a two-way communication system, such as a MIMO Automatic Repeat reQuest (ARQ) system, the feedback channel can be used to eliminate the decoding failure probability.
In this work, we modify the lattice sequential decoder for the MIMO ARQ channel to predict the occurrence of decoding failure in advance and avoid wasting time trying to decode the message. This results in a large saving in decoding complexity. In particular, we study the throughput-performance-complexity tradeoffs in sequential decoding algorithms and the effect of preprocessing and termination strategies. We show, analytically and via simulation, that using a lattice sequential decoder that implements a simple yet efficient time-out algorithm for joint error detection and correction, the optimal tradeoff of the MIMO ARQ channel can be achieved with a significant reduction in decoding complexity.
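To make the bias term and the time-out idea concrete, here is a minimal best-first (stack) sequential decoder sketch: each path is scored by a Fano-like metric that rewards depth through the bias and penalizes squared error, and a node-count threshold declares decoding failure, which in an ARQ setting would trigger a retransmission request. The symbol alphabet, metric normalization, and threshold are illustrative assumptions rather than the thesis' exact algorithm.

```python
import heapq
import numpy as np

def lattice_sequential_decode(R, z, bias, max_nodes=10_000, symbols=range(-3, 4)):
    """Best-first (stack) lattice sequential decoder sketch.
    R is upper triangular (e.g. after QR of the preprocessed channel), z is the
    rotated received vector, `bias` is the bias term controlling the
    performance/complexity tradeoff, and `max_nodes` is a time-out threshold:
    exceeding it is treated as a decoding failure (an ARQ system would then
    request a retransmission).  The finite symbol set is an assumption."""
    n = R.shape[0]
    stack = [(0.0, 0, ())]        # (negated path metric, depth, chosen symbols)
    popped = 0
    while stack:
        neg_metric, depth, chosen = heapq.heappop(stack)
        popped += 1
        if popped > max_nodes:
            return None, popped                    # time-out: decoding failure
        if depth == n:
            # chosen[i] holds x_{n-1-i}; reverse to natural ordering
            return np.array(chosen[::-1]), popped
        k = n - 1 - depth                          # next component to decode
        interference = sum(R[k, j] * chosen[n - 1 - j] for j in range(k + 1, n))
        for s in symbols:
            err = z[k] - R[k, k] * s - interference
            # Fano-like branch metric: bias reward minus squared-error penalty
            heapq.heappush(stack, (neg_metric + err ** 2 - bias, depth + 1, chosen + (s,)))
    return None, popped
```

A larger bias makes the decoder push deeper into the tree sooner (lower complexity, weaker performance), while a smaller bias makes it behave more like an exhaustive closest-point search.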
|
62 |
Determinantes da estrutura de capital das empresas do setor de construção civil, incorporação e exploração imobiliária dos Estados Unidos e da América Latina [Determinants of the capital structure of companies in the civil construction, real estate development and real estate management sector in the United States and Latin America]. Savelli, André Mário Lucchesi, 16 August 2018
Debt financing policies within business conglomerates have been one of the most debated
issues in corporate finance over the years. The various theories that deal with capital structure point to multiple variables that may influence a firm's choice of financing. Drawing on the trade-off and pecking order theories, this study seeks to identify the determinants of the capital structure of companies in the civil construction, real estate development and real estate management sectors in Argentina, Brazil, Chile, Colombia, the United States, Mexico and Peru. To that end, the research hypotheses were selected from variables considered in the work of Morri and Cristanziani (2009) and Morri and Artegiani (2015). The study employs panel data methodology, with company data extracted from the Compustat, Economática and Capital IQ databases for 2007 to 2017. The results show that, for U.S. companies, the level of debt is negatively related to profitability, to growth, and to the volatility of operating results, and positively related to firm size. Latin American companies, in turn, show a negative relationship between the level of debt and profitability and between the level of debt and the cost of debt. Furthermore, the results show that in the post-crisis period (2011-2017) Latin American companies were more highly leveraged. When the sample is analyzed by sector, the statistical tests confirm that real estate developers and property companies are more leveraged than heavy construction companies.
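The abstract does not reproduce the estimated equation; a generic panel specification consistent with the determinants listed above might look as follows (the variable names, firm effect $\mu_i$, and error term are our assumptions, not the dissertation's exact model):

$$
\mathit{Lev}_{i,t} = \beta_0 + \beta_1\,\mathit{Prof}_{i,t} + \beta_2\,\mathit{Size}_{i,t} + \beta_3\,\mathit{Growth}_{i,t} + \beta_4\,\mathit{Vol}_{i,t} + \beta_5\,\mathit{CostDebt}_{i,t} + \mu_i + \varepsilon_{i,t},
$$

where $\mathit{Lev}_{i,t}$ is the leverage of firm $i$ in year $t$ and the right-hand-side variables are the candidate determinants.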
|
63 |
Towards Sustainable Cloud Computing: Reducing Electricity Cost and Carbon Footprint for Cloud Data Centers through Geographical and Temporal Shifting of Workloads. Le, Trung, January 2012
Cloud Computing presents a novel way for businesses to procure their IT needs. Its elasticity and on-demand provisioning enables a shift from capital expenditures to operating expenses, giving businesses the technological agility they need to respond to an ever-changing marketplace. The rapid adoption of Cloud Computing, however, poses a unique challenge to Cloud providers—their already very large electricity bill and carbon footprint will get larger as they expand; managing both costs is therefore essential to their growth.
This thesis squarely addresses the above challenge. Recognizing the presence of Cloud data centers in multiple locations and the differences in electricity price and emission intensity among these locations and over time, we develop an optimization framework that couples workload distribution with time-varying signals on electricity price and emission intensity for financial and environmental benefits. The framework comprises an optimization model, an aggregate cost function, and six scheduling heuristics.
To evaluate cost savings, we run simulations with five data centers located across North America over a period of 81 days. We use historical data on electricity price, emission intensity, and workload collected from market operators and research data archives. We find that our framework can produce substantial cost savings, especially when workloads are distributed both geographically and temporally—up to 53.35% on electricity cost, or 29.13% on carbon cost, or 51.44% on electricity cost and 13.14% on carbon cost simultaneously.
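The six heuristics themselves are not detailed in the abstract; as a hedged illustration of how geographical and temporal shifting against price and carbon signals can work, here is a toy greedy scheduler. The data layout, weights, and function name are our assumptions, not the thesis' optimization model or its heuristics.

```python
def greedy_geo_temporal_schedule(jobs, price, carbon, capacity, alpha=1.0, beta=1.0):
    """Greedy sketch of geographical + temporal workload shifting.
    jobs: list of (load, deadline_slot) tuples; price[dc][t] and carbon[dc][t]
    are electricity price and emission intensity per unit load; capacity[dc][t]
    is the remaining capacity of data center dc in time slot t.  alpha and beta
    weight the two costs in an aggregate cost function."""
    schedule = []
    for load, deadline in jobs:
        # consider every (data center, slot) pair that meets the deadline
        options = [(alpha * price[dc][t] + beta * carbon[dc][t], dc, t)
                   for dc in range(len(price))
                   for t in range(deadline + 1)
                   if capacity[dc][t] >= load]
        if not options:
            schedule.append(None)          # cannot place this job in time
            continue
        cost, dc, t = min(options)         # cheapest feasible placement
        capacity[dc][t] -= load
        schedule.append((dc, t, cost * load))
    return schedule

# toy usage: 2 data centers, 3 time slots, 3 unit-load jobs
price    = [[30, 25, 40], [35, 20, 20]]
carbon   = [[0.5, 0.6, 0.4], [0.3, 0.3, 0.7]]
capacity = [[2, 2, 2], [2, 2, 2]]
jobs     = [(1, 1), (1, 2), (1, 0)]
print(greedy_geo_temporal_schedule(jobs, price, carbon, capacity))
```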
|
64 |
Kapitalstruktur i svenska SMF inom hotellindustrin: En kvantitativ studie om faktorer som belyser kapitalstruktur utifrån pecking order teorin [Capital structure in Swedish SMEs in the hotel industry: a quantitative study of factors that shed light on capital structure from the perspective of the pecking order theory]. Abu Baker, Nessrin, January 2021
This study analyzes factors bearing on the capital structure of small and medium-sized enterprises in the hotel industry, based on the central factors of the trade-off and pecking order theories. The hypotheses were derived from the pecking order theory and tested through regression analysis. The study is quantitative: data were retrieved from the Retriever Research database for 97 Swedish hotels between 2011 and 2019. In analyzing the capital structure of the hotel industry, three dependent variables were used (total debt, long-term debt and short-term debt) together with five independent variables (growth, age, size, profitability and asset structure). The regression analysis shows that 6 of the 15 hypotheses support the pecking order theory, while 3 of the 15 support the trade-off theory. The remaining 6 hypotheses were not significant, indicating no relationship between the dependent and independent variables concerned. In summary, asset structure had the greatest explanatory power for total debt and long-term debt. The study also found significant relationships for the other factors, including growth opportunities with total and long-term debt, age with long-term debt, and size with long-term debt, all of which support the pecking order theory. In addition, the hypothesized relationships between age and short-term debt, and between size and total and short-term debt, were rejected because the results instead supported the trade-off theory. Finally, no relationship was found between growth opportunities and short-term debt, between age and total debt, or between asset structure and short-term debt. Overall, no relationship was found between profitability and any of the three dependent variables.
|
65 |
Evaluating Economic Impacts of Different Silvicultural Approaches in Bottomland Hardwood Forests of the Lower Mississippi Alluvial Valley (LMAV). Nepal, Sunil, 09 December 2016
The purpose of this research was to model the growth and yield of bottomland hardwood forests of the Lower Mississippi Alluvial Valley and to explain the economic tradeoffs of even- and uneven-aged management. The US Forest Service (USFS) Forest Vegetation Simulator was used to model growth and yield for four different bottomland hardwood forest types using USFS inventory data. Even- and uneven-aged management scenarios were optimized for timber revenue maximization using the Land Expectation Value formula. Analyses suggested that the growth and yield of even-aged and uneven-aged management approaches differ in terms of end products and harvesting time. The even-aged management scenarios performed better than the uneven-aged management scenarios, with few exceptions; however, the magnitude of the economic tradeoff depended on initial stand conditions and the required rate of return. These analyses will allow landowners to understand how much economic gain or loss they may realize by adopting an alternative management approach.
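For context, land expectation value is conventionally computed with the Faustmann formula; a generic form (our notation, with periodic net revenues $R_t$, rotation length $T$, and discount rate $i$; the thesis' exact specification may differ) is:

$$
\mathrm{LEV} = \frac{\sum_{t=0}^{T} R_t\,(1+i)^{T-t}}{(1+i)^{T}-1},
$$

i.e., the net revenues of one rotation compounded to the end of the rotation and capitalized over an infinite series of identical rotations.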
|
66 |
Institutional investor sentiment, beta, and stock returns. Wang, Wenzhao, 09 March 2020
This paper examines the role of institutional investor sentiment in the determination of the beta-return relation. Empirical evidence documents a positive (negative) beta-return relation over bearish (bullish) periods, implying that institutional investors can also be sentiment traders.
|
67 |
On the Tradeoff of Average Delay, Average Service Cost, and Average Utility for Single Server Queues with Monotone Policies. Sukumaran, Vineeth Bala, January 2013
In this thesis, we study the tradeoff of average delay with average service cost and average utility for both continuous time and discrete time single server queueing models, with and without admission control. The continuous time and discrete time queueing models that we consider are motivated by cross-layer models for point-to-point links with random packet arrivals and fading at slow and fast time scales. Our studies are motivated by the need to optimally trade off the average delay of the packets (a network layer performance measure) with the average service cost of transmitting the packets, e.g. the average power required for transmission (a physical layer performance measure), under a lower bound constraint on the average throughput, in various point-to-point communication scenarios.
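In rough symbols (our notation, not the thesis'), the tradeoff studied here takes the form of a constrained control problem:

$$
\min_{\gamma \in \Gamma} \; \overline{D}(\gamma)
\quad \text{subject to} \quad
\overline{C}(\gamma) \le c, \qquad \overline{U}(\gamma) \ge u,
$$

where $\overline{D}$, $\overline{C}$, and $\overline{U}$ denote the stationary average delay, average service cost, and average utility under a monotone stationary policy $\gamma$, and the asymptotic regime of interest is $c \downarrow c_{\min}(u)$, the minimum average service cost needed for stability at utility level $u$.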
The tradeoff problems are studied for a class of monotone and stationary scheduling policies and under the assumption that the service cost rate and utility rate are respectively convex and concave functions of the service rate and arrival rate. We also consider the problem of optimally trading off the average delay and average error rate of randomly arriving message symbols which are transmitted over a noisy point-to-point link, in which case the service cost function is non-convex.
The solutions to the tradeoff problems that we address in the thesis are asymptotic in nature, and are similar in spirit to the Berry-Gallager asymptotic bounds. It is intuitive that to keep a queue stable under a lower bound constraint on the average utility, a minimum number of customers must be served per unit time. This in turn implies that queue stability requires a minimum average service cost expenditure. In the thesis, we obtain an asymptotic characterization of the minimum average delay for monotone stationary policies subject to an upper bound constraint on the average service cost and a lower bound constraint on the average utility, in the asymptotic regime where the average service cost constraint is made arbitrarily close to the above minimum average service cost.
In the thesis, we obtain asymptotic lower bounds on the minimum average delay for the cases for which lower bounds were previously not known. The asymptotic characterization of the minimum average delay for monotone stationary policies, for both continuous time and discrete time models, is obtained via geometric bounds on the stationary probability of the queue length, in the above asymptotic regime. The restriction to monotone stationary policies enables us to obtain an intuitive explanation for the behaviour of the asymptotic lower bounds using the above geometric bounds on the stationary probability distribution of the queue length. The geometric bounds on the stationary probability of the queue length also lead to a partial asymptotic characterization of the structure of any optimal monotone stationary policy, in the above asymptotic regime, which was not available in previous work. Furthermore, the geometric bounds on the stationary probability can be extended to analyse the tradeoff problem in other scenarios, such as for other continuous time queueing models, multiple user communication models, queueing models with service time control, and queueing models with general holding costs.
Usually, queueing models with integer-valued queue evolution are approximated by queueing models with real-valued queue evolution and strictly convex service cost functions for analytical tractability. Using the asymptotic bounds, we show that in some cases the average delay does not grow to infinity in the asymptotic regime, although the approximate model suggests that the average delay does grow to infinity. In other cases where the average delay does grow to infinity in the asymptotic regime, our results illustrate that the tradeoff behaviour of the approximate model is different from that of the original integer-valued queueing model unless the service cost function is modelled as the piecewise linear lower convex envelope of the service cost function for the original model.
|
68 |
Importance des compromis liés à l'eau chez une espèce caractéristique des milieux bordiers, la vipère aspic (Vipera aspis) / Importance of water-based tradeoffs in a reptile using ecotone habitats, the Aspic viper (Vipera aspis). Dupoué, Andréaz, 06 November 2014
A major goal in ecology is to understand and predict species' responses to environmental variation. Clarifying the proximate factors involved is a crucial step toward unraveling general ecological patterns such as habitat use or species distribution. In this context, an ecophysiological approach is particularly relevant. Trophic resources and energy tradeoffs have attracted considerable interest, but water-based tradeoffs remain relatively overlooked to date. Yet water is a critical, often limiting resource that must be considered, and the regulation of water balance may have a key influence on physiological and behavioral tradeoffs. The main objective of this thesis is to evaluate the importance of water-based tradeoffs in a species characteristic of ecotone habitats (hedgerows, edges), the Aspic viper (Vipera aspis). This species is viviparous and highly dependent on thermal conditions during reproduction. Our general hypothesis is that, like thermal conditions, hydric conditions expose individuals to important physiological and behavioral tradeoffs, especially during reproduction (i.e., pregnancy). We combined descriptive studies (thermoregulation, water loss) and experimental studies (manipulation of water availability) whose results suggest a significant tradeoff between thermoregulation and the regulation of water balance. Water-based tradeoffs induce physiological and behavioral adjustments that are relevant to understanding reproductive strategies and identifying possible transgenerational (mother-embryo) conflicts. Like energy or thermal resources, water must therefore also be considered when addressing general ecological and evolutionary questions.
|
69 |
The BUMP model of response planning: a neuroengineering account of speed-accuracy tradeoffs, velocity profiles, and physiological tremor in movement. Bye, Robin Trulssen, Electrical Engineering & Telecommunications, Faculty of Engineering, UNSW, January 2009
Speed-accuracy tradeoffs, velocity profiles, and physiological tremor are fundamental characteristics of human movement. The principles underlying these phenomena have long attracted major interest and controversy. Each is well established experimentally but as yet they have no common theoretical basis. It is proposed that these three phenomena occur as the direct consequence of a movement response planning system that acts as an intermittent optimal controller operating at discrete intervals of ~100 ms. The BUMP model of response planning describes such a system. It forms the kernel of adaptive model theory which defines, in computational terms, a basic unit of motor production or BUMP. Each BUMP consists of three processes: (i) analysing sensory information, (ii) planning a desired optimal response, and (iii) executing that response. These processes operate in parallel across successive sequential BUMPs. The response planning process requires a discrete time interval in which to generate a minimum acceleration trajectory of variable duration, or horizon, to connect the actual response with the predicted future state of the target and compensate for executional error. BUMP model simulation studies show that intermittent adaptive optimal control employing two extremes of variable horizon predictive control reproduces almost exactly findings from several authoritative human experiments. On the one extreme, simulating spatially-constrained movements, a receding horizon strategy results in a logarithmic speed-accuracy tradeoff and accompanying asymmetrical velocity profiles. On the other extreme, simulating temporally-constrained movements, a fixed horizon strategy results in a linear speed-accuracy tradeoff and accompanying symmetrical velocity profiles. Furthermore, simulating ramp movements, a receding horizon strategy closely reproduces experimental observations of 10 Hz physiological tremor. A 100 ms planning interval yields waveforms and power spectra equivalent to those of joint-angle, angular velocity and electromyogram signals recorded for several speeds, directions, and skill levels of finger movement. While other models of response planning account for one or other set of experimentally observed features of speed-accuracy tradeoffs, velocity profiles, and physiological tremor, none accounts for all three. The BUMP model succeeds in explaining these disparate movement phenomena within a single framework, strengthening this approach as the foundation for a unified theory of motor control and planning.
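As a hedged illustration of the intermittent, receding-horizon planning described above (not the BUMP model's actual equations), the sketch below replans a minimum-acceleration trajectory every ~100 ms and executes only its first interval; with position and velocity constrained at both ends, the minimizer of integrated squared acceleration is a cubic polynomial. The horizon, step size, and target handling are illustrative assumptions.

```python
import numpy as np

def min_accel_cubic(x0, v0, x1, v1, H):
    """Coefficients of x(t) = c0 + c1 t + c2 t^2 + c3 t^3 meeting the boundary
    conditions; a cubic minimizes the integral of squared acceleration."""
    A = np.array([[1, 0, 0,      0],
                  [0, 1, 0,      0],
                  [1, H, H**2,   H**3],
                  [0, 1, 2 * H,  3 * H**2]], dtype=float)
    return np.linalg.solve(A, np.array([x0, v0, x1, v1], dtype=float))

def receding_horizon_reach(target, horizon=0.5, dt=0.1, steps=20):
    """Receding-horizon sketch of BUMP-style planning: every dt (~100 ms) replan
    a minimum-acceleration trajectory toward the target over a constant horizon
    and execute only its first dt.  Parameter values are illustrative."""
    x, v, path = 0.0, 0.0, [0.0]
    for _ in range(steps):
        c = min_accel_cubic(x, v, target, 0.0, horizon)
        # state after executing the first dt of the freshly planned trajectory
        x = c[0] + c[1] * dt + c[2] * dt**2 + c[3] * dt**3
        v = c[1] + 2 * c[2] * dt + 3 * c[3] * dt**2
        path.append(x)
    return path

print(receding_horizon_reach(target=1.0))
```

A fixed-horizon variant would instead shrink the planning horizon toward a movement deadline, corresponding to the temporally-constrained case described in the abstract.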
|
70 |
Reinforcement in Biology: Stochastic models of group formation and network construction. Ma, Qi, January 2012
Empirical studies show that similar patterns emerge from a large number of different biological systems. For example, the group size distributions of several fish species and of house sparrows all follow power law distributions with an exponential truncation. Networks built by ant colonies and slime molds and those designed by engineers resemble each other in terms of structure and transportation efficiency. Based on the investigation of experimental data, we propose a variety of simple stochastic models to unravel the underlying mechanisms that lead to the collective phenomena in different systems. All the mechanisms employed in these models are rooted in the concept of selective reinforcement. In some systems the reinforcement can build optimal solutions for biological problem solving. This thesis consists of five papers. In the first three papers, I collaborate with biologists to look into group formation in house sparrows and the movement decisions of damselfish. In the last two articles, I look at how shortest paths and networks are constructed by slime molds and pheromone-laying ants, and I study speed-accuracy tradeoffs in slime molds' decision making. The general goal of the study is to better understand how macro-level patterns and behaviors emerge from micro-level interactions in both spatial and non-spatial biological systems. By combining mathematical modeling and experimentation, we are able to reproduce the macro-level patterns in the studied biological systems and predict the systems' behavior using a minimum number of parameters.
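As a hedged illustration of the kind of simple stochastic model referred to above (not the papers' actual models), the toy merge/split simulation below tends to produce heavy-tailed group-size distributions of the truncated power-law type when merging dominates splitting; all parameter values and the site-based setup are assumptions.

```python
import random

def merge_split_groups(n_fish=1000, n_sites=100, p_split=0.05, steps=10_000, seed=1):
    """Toy merge/split group-formation model.  At each step a randomly chosen
    group either loses one individual to another site or moves to a random
    site, merging with any group already there."""
    random.seed(seed)
    sites = [0] * n_sites
    for _ in range(n_fish):                  # scatter individuals over sites
        sites[random.randrange(n_sites)] += 1
    for _ in range(steps):
        i = random.randrange(n_sites)
        if sites[i] == 0:
            continue
        j = random.randrange(n_sites)
        if random.random() < p_split and sites[i] > 1:
            sites[i] -= 1                    # one individual leaves...
            sites[j] += 1                    # ...and joins (or founds) another group
        elif j != i:
            sites[j] += sites[i]             # whole group relocates and merges
            sites[i] = 0
    return sorted(s for s in sites if s > 0) # group sizes, e.g. for a log-log histogram

print(merge_split_groups()[-10:])            # the ten largest groups
```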
|