151 |
Avaliação Técnica e Operacional da Estação de Tratamento de Esgotos por Lodos Ativados do Campus do Pici da Universidade Federal do Ceará / Technical and Operational Evaluation of the Activated Sludge Wastewater Treatment Plant of the "Campus do Pici" of the Federal University of CearáMarcelo Silva Peixoto 29 August 2008 (has links)
Coordenação de Aperfeiçoamento de Nível Superior / O tratamento de esgoto é fundamental para a manutenção da qualidade dos corpos d'água. O presente trabalho investigou os aspectos técnicos e operacionais da estação de tratamento de esgoto (ETE) por lodos ativados com aeração prolongada do Campus do Pici, situada na Universidade Federal do Ceará, e propôs medidas corretivas para compatibilizar as eficiências reais com as de projeto. Por meio da análise dos parâmetros físicos, químicos e biológicos de qualidade de água no afluente e efluente avaliou-se a eficiência real da ETE, sendo que os principais parâmetros operacionais foram avaliados no tratamento preliminar, tanque de aeração e decantador secundário. De posse dos dados coletados, compararam-se os dados de eficiência com os valores citados na literatura para a tecnologia de lodos ativados por aeração prolongada, assim como os valores obtidos na avaliação operacional da ETE. Adicionalmente, foram verificados quais parâmetros atendiam aos limites da Portaria nº 154 da Superintendência Estadual do Meio Ambiente do Ceará (Semace). Constatou-se que a ETE está submetida à sobrecarga hidráulica, comprometendo desta forma a eficiência de remoção de matéria orgânica e patogênicos, além da baixa remoção de nutrientes, que faz com que o efluente não atenda aos limites da supracitada portaria para amônia e fósforo. Como principal intervenção na ETE, recomendou-se a construção de módulos complementares para distribuição da vazão afluente de forma a sanar o problema de sobrecarga hidráulica, além da possibilidade de inclusão de tratamento em nível terciário para o problema dos nutrientes. Só após a realização desta intervenção, poderão ser propostas medidas para a otimização da ETE e adequação do efluente a todos os padrões de lançamento estabelecidos e avaliados na presente investigação. / Wastewater treatment is fundamental to maintaining the quality of water bodies. The present work evaluated the technical and operational aspects of an extended aeration activated sludge Wastewater Treatment Plant (WTP) located at the "Campus do Pici" of the Federal University of Ceará, and proposed corrective measures to bring the actual efficiencies in line with the design values. Physical, chemical and biological analyses of influent and effluent water quality made it possible to assess the actual WTP efficiency, and the main operational parameters were evaluated in the preliminary treatment, the aeration tank and the secondary settler. After the monitoring period, the actual organic matter, nutrient and pathogen removal efficiencies were compared to values reported in the literature for extended aeration activated sludge systems, as were the operational parameters. Additionally, it was verified which parameters met the limits of the applicable regulation (Portaria nº 154 of the Superintendência Estadual do Meio Ambiente do Ceará, Semace). It was concluded that the WTP is hydraulically overloaded, which compromises organic matter and pathogen removal, in addition to the low nutrient removal efficiency; as a result, the WTP effluent does not comply with the regulatory limits for ammonia and phosphorus. As the main intervention, the construction of complementary modules to distribute the inflow was recommended in order to resolve the hydraulic overload, along with the possible inclusion of tertiary treatment to reduce nutrient concentrations. Only after these improvements can measures be proposed to optimize the WTP and bring the effluent into compliance with all the discharge standards assessed in the present investigation.
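For reference, the removal efficiencies compared above are conventionally computed from influent and effluent concentrations; the abstract assumes rather than states the standard formula:

$$E = \frac{C_{\text{in}} - C_{\text{out}}}{C_{\text{in}}} \times 100\%$$

where $C_{\text{in}}$ and $C_{\text{out}}$ are the concentrations of a given constituent (e.g., BOD, ammonia or phosphorus) in the influent and effluent, respectively.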
|
152 |
Liberdade de informação jornalística e seus limites frente à democracia brasileira atual / Freedom of journalistic information and its limits in the face of contemporary Brazilian democracyMorte, Luciana Tudisco Oliveira 27 May 2013 (has links)
Both freedom of expression and freedom of the press are in evidence in the world today. Freedom of the press plays a key role in monitoring and denouncing crimes and/or conduct detrimental to society, especially with regard to acts of public power. Broadly exercising the freedoms of communication means safeguarding, in modern society, the formation of free public opinion and the consolidation of the democratic rule of law. Given this important role taken on by the freedom of communication, some defend a full and unlimited freedom, ruling out any form of regulation, oversight or limitation of such activity by the state. However, as with the other constitutional freedoms, the freedoms of expression and of the press are subject to limits on their exercise. This work aims to address freedom of expression and of the press in light of those limits. / Tanto a liberdade de expressão quanto a de imprensa estão em evidência no mundo atual. A última possui um papel fundamental na fiscalização e denúncias de crimes e/ou condutas lesivas à sociedade, especialmente, no que diz respeito aos atos do poder público. Exercer amplamente as liberdades de comunicação, significa resguardar, na sociedade moderna, a formação da opinião pública livre e a solidificação do Estado democrático de direito. Em face dessa relevante função assumida pela liberdade de comunicação, há quem defenda uma plena e ilimitada liberdade, afastando qualquer forma de regulação, fiscalização ou limitação de tal atividade por parte do Estado. No entanto, assim como ocorre com as demais liberdades constitucionais, as liberdades de expressão e de imprensa encontram limites aos seus exercícios. O presente trabalho tem por objetivo abordar a liberdade de expressão e de imprensa frente aos seus limites.
|
153 |
Proposição de um método de harmonização da velocidade baseado em modelo de previsão de conflitos veiculares / Proposal of a speed harmonization method based on a vehicle conflict prediction modelCaleffi, Felipe January 2018 (has links)
Técnicas como a harmonização da velocidade procuram gerir e controlar o tráfego com base nas condições de tráfego das rodovias em tempo real. A harmonização da velocidade utiliza limites de velocidade variáveis (L.V.V.) para fornecer aos condutores uma velocidade de operação mais apropriada, normalmente inferior ao limite de velocidade estático indicado, em resposta às condições dinâmicas das vias. O L.V.V. tem demonstrado capacidade de melhorar a mobilidade e a segurança nas rodovias. Com isso, modelos de avaliação de risco de colisão em tempo real são frequentemente adotados para quantificar os riscos de ocorrência de colisões em estudos de implantação do L.V.V. Na maioria dos estudos sobre L.V.V., modelos de probabilidade de colisão são adotados apenas para mensurar o desempenho do sistema. Estes algoritmos de controle de L.V.V. não levam em conta o risco de colisões em períodos futuros, e assim não usam impactos do L.V.V. para escolher o plano de controle com relação à segurança. No Brasil, estratégias de harmonização da velocidade não são empregadas. Como as condições de tráfego nas rodovias brasileiras não são homogêneas, e cada faixa de tráfego normalmente possui médias de velocidades, intensidades de fluxo e composições de tráfego diferentes, técnicas como o L.V.V. podem oferecer benefícios ao harmonizar as velocidades entre as faixas e assim retardar o aparecimento de congestionamentos, reduzir o número de ultrapassagens e o risco de colisões. Dessa forma, este trabalho busca avaliar a relação entre as características do tráfego e a probabilidade de ocorrer conflitos entre veículos, para assim desenvolver um modelo matemático capaz de expressar tal relação – usando como estudo de caso um trecho da rodovia BR-290/RS, situada na região metropolitana da cidade de Porto Alegre. Este modelo matemático alimenta um algoritmo L.V.V., empregado em um micro simulador de tráfego, para controlar o tráfego com o objetivo de aumentar a segurança. Resultados indicam que o modelo proposto classificou corretamente 87% dos conflitos efetivamente ocorridos em campo. Os resultados de simulação indicam que o emprego do sistema L.V.V. contribuiu significativamente para a redução da probabilidade de conflitos. Ainda, o L.V.V. aumentou as velocidades médias nos períodos de fluxo elevado, e também reduziu o desvio padrão das velocidades – oferecendo um tráfego mais homogêneo – que contribui para a redução do número de trocas de faixa e, consequentemente, para um aumento da segurança. / Techniques such as speed harmonization seek to manage and control traffic based on real-time road traffic conditions. Speed harmonization uses variable speed limits (VSL) to provide drivers with a more appropriate operating speed, usually below the posted static speed limit, in response to dynamic road conditions. VSL has demonstrated its ability to improve mobility and road safety. Thus, real-time collision risk assessment models are often adopted to quantify the risk of collisions in VSL implementation studies. In most VSL studies, collision probability models are used only to measure system performance. These VSL control algorithms do not take into account the risk of collisions in future periods, and thus do not use the VSL impacts to choose a control plan with respect to safety. In Brazil, speed harmonization strategies are not yet employed. As traffic conditions on Brazilian highways are not homogeneous, and each traffic lane usually has different average speeds, flow intensities and traffic compositions, VSL techniques can offer benefits by harmonizing speeds between lanes, delaying the onset of congestion and reducing the number of overtaking manoeuvres and the risk of collisions. Thus, this work seeks to evaluate the relationship between traffic characteristics and the probability of conflicts between vehicles, in order to develop a mathematical model capable of expressing that relationship, using as a case study a stretch of the BR-290/RS freeway located in the Porto Alegre metropolitan area. This mathematical model feeds a VSL algorithm, employed in a traffic microsimulator, to control traffic with the goal of increasing safety. Results indicate that the proposed model correctly classified 87% of the conflicts that actually occurred in the field. The simulation results indicate that the VSL system contributed significantly to reducing the likelihood of conflicts. Moreover, the VSL increased average speeds during high-flow periods and reduced the standard deviation of speeds, yielding more homogeneous traffic, which contributes to a reduction in the number of lane changes and, consequently, to increased safety.
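To make the control idea concrete, the sketch below shows one plausible shape for a conflict-probability-driven VSL decision rule. It is a hypothetical illustration, not the algorithm developed in the thesis: the logistic coefficients, thresholds and speed steps are all assumed values.

```python
# Hypothetical sketch of a variable-speed-limit (VSL) decision rule driven by a
# real-time conflict-probability model; all coefficients and thresholds are
# assumed for illustration, not taken from the thesis.
import math

def conflict_probability(mean_speed_kmh: float, speed_std_kmh: float,
                         flow_veh_h: float) -> float:
    """Toy logistic model: predicted risk grows with flow and speed dispersion."""
    z = -6.0 + 0.002 * flow_veh_h + 0.15 * speed_std_kmh - 0.01 * mean_speed_kmh
    return 1.0 / (1.0 + math.exp(-z))

def next_speed_limit(current_limit: int, p_conflict: float,
                     static_limit: int = 100, step: int = 10) -> int:
    """Lower the posted limit when risk is high; relax it back when risk subsides."""
    if p_conflict > 0.7:
        return max(40, current_limit - step)      # never below 40 km/h
    if p_conflict < 0.3:
        return min(static_limit, current_limit + step)
    return current_limit                          # hold in the uncertain band

# One control cycle with dense, heterogeneous traffic:
p = conflict_probability(mean_speed_kmh=70, speed_std_kmh=25, flow_veh_h=2000)
print(round(p, 2), next_speed_limit(100, p))      # ~0.74 -> limit drops to 90
```

In a real deployment the logistic model would be replaced by a conflict-prediction model calibrated on field data, as the thesis does for the BR-290/RS case study.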
|
154 |
Limits of Liberal Peace in West Africa: Civil War in Mali and French Military InterventionFrancis, David J. January 2017 (has links)
The civil war in Mali and the perceived threat posed by Islamist jihadists and Al-Qaeda-linked terrorists to international peace and security led to the French military intervention of January 2013. The intervention aimed to end the terrorist take-over of Mali, to prevent the collapse of the state and the spread of insecurity and instability across the conflict-prone and fragile regions of West Africa and the Sahel, and to protect France's strategic national interests. But what were the real reasons for France's pre-emptive military intervention in Mali, and what does the French and allied UN, ECOWAS and African Union conflict stabilisation intervention say about donor-driven peacebuilding in Africa, often framed as liberal peacebuilding intervention?
|
155 |
A Computational Model to Predict Safety Limits for Aided Music ListeningBoley, J., Johnson, Earl E. 01 June 2018 (has links)
No description available.
|
156 |
Bounded Point Derivations on Certain Function SpacesDeterding, Stephen 01 January 2018 (has links)
Let 𝑋 be a compact subset of the complex plane and denote by 𝑅𝑝(𝑋) the closure of rational functions with poles off 𝑋 in the 𝐿𝑝(𝑋) norm. We show that if a point 𝑥0 admits a bounded point derivation on 𝑅𝑝(𝑋) for 𝑝 > 2, then there is an approximate derivative at 𝑥0. We also prove a similar result for higher order bounded point derivations. This extends a result of Wang, which was proven for 𝑅(𝑋), the uniform closure of rational functions with poles off 𝑋. In addition, we show that if a point 𝑥0 admits a bounded point derivation on 𝑅(𝑋) and if 𝑋 contains an interior cone, then the bounded point derivation can be represented by the difference quotient if the limit is taken over a non-tangential ray to 𝑥0. We also extend this result to the case of higher order bounded point derivations. These results were first shown by O'Farrell; however, we prove them constructively by explicitly using the Cauchy integral formula.
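In the notation of the abstract, the two objects at issue can be written out explicitly; the following is a sketch of the standard definitions, stated here for orientation rather than quoted from the thesis. A bounded point derivation on $R^p(X)$ at $x_0$ is a bounded linear functional $D$ satisfying $D(f) = f'(x_0)$ for every rational function $f$ with poles off $X$, and the non-tangential representation result says

$$D(f) \;=\; \lim_{\substack{x \to x_0 \\ x \in C}} \frac{f(x) - f(x_0)}{x - x_0},$$

where $C$ is a cone (non-tangential ray) in $X$ with vertex at $x_0$.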
|
157 |
The Limits of Arbitrage and Stock Mispricing: Evidence from Decomposing the Market to Book RatioAlShammasi, Naji Mohammad 12 1900 (has links)
The purpose of this paper is to investigate the effect of the "limits of arbitrage" on securities mispricing. Specifically, I investigate the effect of the availability of substitutes and of financial constraints on stock mispricing. In addition, this study investigates whether S&P 500 membership reduces the limits of arbitrage, in the sense that it leads to lower mispricing for these stocks relative to non-S&P 500 stocks, and examines whether any lower mispricing can be attributed to their lower limits of arbitrage.
Modern finance theory and the efficient market hypothesis suggest that security prices, at equilibrium, should reflect their fundamental value. If the market price deviates from the intrinsic value, a risk-free profit opportunity emerges; arbitrageurs exploit it, eliminating the mispricing and restoring equilibrium. This idealized arbitrage process assumes a large number of arbitrageurs with unlimited access to capital. A better description of reality, however, is that arbitrageurs are few and highly specialized, and that they have limited access to capital. Under these conditions, arbitrage is no longer a risk-free activity and can be limited by several factors, such as arbitrage risk and transaction costs.
Other factors discussed in the literature are the availability of substitutes and financial constraints. The former arises from the specialization of arbitrageurs in the market in which they operate, while the latter arises from the separation between arbitrageurs and capital. In this dissertation, I develop a measure of the availability of substitutes based on the propensity scores obtained from the propensity score matching technique. In addition, I use the absolute value of the skewness of returns as a proxy for financial constraints.
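A minimal sketch of how such a substitute-availability measure could be constructed, assuming a logistic-regression propensity model over generic firm characteristics; the features, sample and gap statistic here are illustrative assumptions, not the dissertation's actual specification:

```python
# Illustrative sketch: propensity scores as a basis for gauging the
# availability of close substitutes (hypothetical features and data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 3))           # firm characteristics, e.g. size, B/M, beta
group = rng.integers(0, 2, size=n)    # e.g. an index-membership indicator

# Propensity score: P(group = 1 | characteristics)
pscore = LogisticRegression().fit(X, group).predict_proba(X)[:, 1]

# For each stock in group 1, the distance to the nearest group-0 propensity
# score; a large gap suggests no close substitute, i.e. higher limits of
# arbitrage for that stock.
gap = np.array([np.abs(pscore[group == 0] - p).min() for p in pscore[group == 1]])
print("median nearest-substitute gap:", np.median(gap))
```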
Previous studies used the limits-of-arbitrage framework to explain pricing puzzles such as closed-end fund discounts. However, closed-end fund discounts are strongly affected by uncertainty about managerial ability and by agency problems. This study avoids that problem by examining the effect of the limits of arbitrage on publicly traded securities. The results show a significant relationship between proxies for the limits of arbitrage and firm-specific mispricing. More importantly, the empirical results indicate that stocks that have no close substitutes exhibit higher mispricing, and that stocks with high skewness show higher mispricing.
Subsequent tests show that S&P 500 stocks have different levels of liquidity, analyst coverage and volatility, characteristics that affect the ability of arbitrageurs to eliminate mispricing. Preliminary univariate tests show that S&P 500 stocks have, on average, lower mispricing and lower limits of arbitrage than non-S&P 500 stocks, and multivariate tests confirm that S&P 500 members have, on average, lower mispricing than non-S&P 500 stocks.
|
158 |
Novel techniques for estimation and tracking of radioactive sourcesBaidoo-Williams, Henry Ernest 01 December 2014 (has links)
Radioactive source signal measurements are Poisson distributed due to the underlying radiation process. This fact, coupled with ubiquitous naturally occurring radioactive material (NORM), makes it challenging to localize or track a radioactive source or target accurately. It is therefore necessary either to use highly accurate sensors to minimize measurement noise or to use many less accurate sensors whose measurements are averaged to reduce it. The cost of highly accurate sensors places a bound on the number that can realistically be deployed. Similarly, the inaccuracy of cheap sensors places a lower bound on the number of sensors needed to estimate the location or trajectory of a radioactive source within reasonable error margins.
We first consider the use of the smallest number of highly accurate sensors to localize radioactive sources. The novel ideas and algorithms we develop use no more than the minimum number of sensors required by triangulation-based algorithms, but avoid the pitfalls manifest in such algorithms, such as multiple local minima and the slow convergence caused by algorithm reinitialization. Under the general assumption that we have a priori knowledge of the statistics of the source intensity, we show that if the source or target is known to be in one open half plane, then N sensors are enough to guarantee a unique solution, N being the dimension of the search space. If the assumptions are tightened so that the source or target lies in the open convex hull of the sensors, then N+1 sensors are required. If we do not have knowledge of the statistics of the source intensity, we show that N+1 sensors are still the minimum number required to guarantee a unique solution when the source is in the open convex hull of the sensors.
Second, we present tracking of a radioactive source using cheap, low-sensitivity binary proximity sensors under some general assumptions. If a source or target moves in a straight line and we have a priori knowledge of its radiation intensity, we show that three binary sensors, with measurements indicating only the presence or absence of the source within their nominal sensing range, suffice to localize the linear trajectory. If we do not have knowledge of the intensity of the source or target, then a minimum of four sensors suffices to localize the trajectory.
Finally, we present some fundamental limits on the estimation accuracy of a stationary radioactive source using ideal mobile measurement sensors, and provide a robust algorithm that achieves the estimation accuracy bounds asymptotically as the expected radiation count increases.
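To make the measurement model concrete: under the standard inverse-square assumption, the count at sensor i is Poisson with mean A/||s_i − x||², and the source location can be recovered by maximizing the Poisson likelihood. The toy grid search below illustrates this model with three sensors and a source inside their convex hull; the intensity, geometry and grid are assumptions for illustration, not the thesis's own algorithm:

```python
# Toy sketch: maximum-likelihood localization of a radioactive source from
# Poisson-distributed counts, assuming a known intensity A and an
# inverse-square attenuation model (illustrative values throughout).
import numpy as np

rng = np.random.default_rng(1)
A = 5000.0                                            # assumed known intensity
sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # N+1 = 3 sensors, N = 2
true_src = np.array([3.0, 4.0])                       # inside the sensors' convex hull

def mean_counts(src):
    d2 = np.sum((sensors - src) ** 2, axis=1)         # squared sensor-source distances
    return A / np.maximum(d2, 1e-9)

counts = rng.poisson(mean_counts(true_src))           # one noisy measurement per sensor

# Grid search maximizing the Poisson log-likelihood over the hull region
grid = np.linspace(0.1, 9.9, 200)
best, best_ll = None, -np.inf
for x in grid:
    for y in grid:
        lam = mean_counts(np.array([x, y]))
        ll = np.sum(counts * np.log(lam) - lam)       # log-likelihood up to a constant
        if ll > best_ll:
            best, best_ll = (x, y), ll
print("estimate:", best, "truth:", tuple(true_src))
```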
|
159 |
Fundamental Estimation and Detection Limits in Linear Non-Gaussian SystemsHendeby, Gustaf January 2005 (has links)
Many methods used for estimation and detection consider only the mean and variance of the involved noise instead of the full noise description. One reason for this is that the mathematics is often considerably simplified this way. However, the implications of the simplifications are seldom studied, and this thesis shows that if no approximations are made, performance is gained. Furthermore, the gain is quantified in terms of the useful information in the noise distributions involved. The useful information is given by the intrinsic accuracy, and a method to compute the intrinsic accuracy for a given distribution, using Monte Carlo methods, is outlined.

A lower bound for the covariance of the estimation error of any unbiased estimator is given by the Cramér-Rao lower bound (CRLB). At the same time, the Kalman filter is the best linear unbiased estimator (BLUE) for linear systems. It is shown in this thesis that the CRLB and the BLUE performance are given by the same expression, which is parameterized in the intrinsic accuracy of the noise. How the performance depends on the noise is then used to indicate when nonlinear filters, e.g., a particle filter, should be used instead of a Kalman filter. The CRLB results are shown, in simulations, to be a useful indication of when to use more powerful estimation methods. The simulations also show that other techniques should be used as a complement to the CRLB analysis to get conclusive performance results.

For fault detection, the statistics of the asymptotic generalized likelihood ratio (GLR) test provide an upper bound on the obtainable detection performance. The performance is shown in this thesis to depend on the intrinsic accuracy of the involved noise. The asymptotic GLR performance can then be calculated for a test using the actual noise and for a test using the approximative Gaussian noise. Based on the difference in performance, it is possible to draw conclusions about the quality of the Gaussian approximation. Simulations show that when the difference in performance is large, an exact noise representation improves the detection. Simulations also show that it is difficult to predict the exact influence on detection performance caused by substituting the system noise with Gaussian noise approximations.

/ Många metoder som används i estimerings- och detekteringssammanhang tar endast hänsyn till medelvärde och varians hos ingående brus istället för att använda en fullständig brusbeskrivning. En av anledningarna till detta är att den förenklade brusmodellen underlättar många beräkningar. Dock studeras sällan de effekter förenklingarna leder till. Denna avhandling visar att om inga förenklingar görs kan prestandan förbättras och det visas också hur förbättringen kan relateras till den intressanta informationen i det involverade bruset. Den intressanta informationen är den inneboende noggrannheten (eng. intrinsic accuracy) och ett sätt för att bestämma den inneboende noggrannheten hos en given fördelning med hjälp av Monte-Carlo-metoder presenteras.

Ett mått på hur bra en estimator utan bias kan göras ges av Cramér-Raos undre gräns (CRLB). Samtidigt är det känt att kalmanfiltret är den bästa lineära biasfria estimatorn (BLUE) för lineära system. Det visas här att CRLB och BLUE-prestanda ges av samma matematiska uttryck där den inneboende noggrannheten ingår som en parameter. Kunskap om hur informationen påverkar prestandan kan sedan användas för att indikera när ett olineärt filter, t.ex. ett partikelfilter, bör användas istället för ett kalmanfilter. Med hjälp av simuleringar visas att CRLB är ett användbart mått för att indikera när mer avancerade metoder kan vara lönsamma. Simuleringarna visar dock också att CRLB-analysen bör kompletteras med andra tekniker för att det ska vara möjligt att dra definitiva slutsatser.

I fallet feldetektion ger de asymptotiska egenskaperna hos den generaliserade sannolikhetskvoten (eng. generalized likelihood ratio, GLR) en övre gräns för hur bra detektorer som kan konstrueras. Det visas här hur den övre gränsen beror på den inneboende noggrannheten hos det aktuella bruset. Genom att beräkna asymptotisk GLR-testprestanda för det sanna bruset och för en gaussisk brusapproximation går det att dra slutsatser om huruvida den gaussiska approximationen är tillräckligt bra för att kunna användas. I simuleringar visas att det är lönsamt att använda sig av en exakt brusbeskrivning om skillnaden i prestanda är stor mellan de båda fallen. Simuleringarna indikerar också att det kan vara svårt att förutsäga den exakta effekten av en gaussisk brusapproximation.

/ Report code: LiU-Tek-Lic-2005:54
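The Monte Carlo computation of intrinsic accuracy mentioned in the abstract amounts to estimating the Fisher information of the noise distribution with respect to a location parameter, i.e. E[(d/dx log p(x))²]. The sketch below illustrates the idea for an assumed Gaussian-mixture noise (my construction following that outline, not the thesis's code); for Gaussian noise the intrinsic accuracy equals 1/variance, and it is larger for any non-Gaussian distribution:

```python
# Monte Carlo estimate of intrinsic accuracy (Fisher information w.r.t. a
# location parameter) for an assumed two-component Gaussian mixture noise.
import numpy as np

rng = np.random.default_rng(2)

def phi(x, s):
    """Zero-mean Gaussian density with standard deviation s."""
    return np.exp(-0.5 * (x / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def pdf(x):
    """Mixture: 90% tight noise (std 1), 10% outliers (std 5)."""
    return 0.9 * phi(x, 1.0) + 0.1 * phi(x, 5.0)

# Draw noise samples from the mixture
n = 1_000_000
outlier = rng.random(n) < 0.1
x = np.where(outlier, rng.normal(0.0, 5.0, n), rng.normal(0.0, 1.0, n))

# The score for a location parameter is -(d/dx) log p(x); its square is what
# we average, so the sign is irrelevant. Use a central finite difference.
h = 1e-4
score = (np.log(pdf(x + h)) - np.log(pdf(x - h))) / (2.0 * h)
intrinsic_accuracy = np.mean(score ** 2)

print(f"intrinsic accuracy: {intrinsic_accuracy:.3f}")
print(f"1/variance (Gaussian value): {1.0 / np.var(x):.3f}")  # always the smaller
```

The gap between the two printed numbers is exactly the performance that is lost by treating this noise as if it were Gaussian, which is the quantity the thesis uses to decide when a particle filter is worth its cost.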
|
160 |
Breaking the Customer Code : A model to Translate Customer Expectations into Specification LimitsGregorio, Ruben January 2010 (has links)
Today, firms compete with services rather than goods. Large service organizations are beginning to use Six Sigma as a continuous improvement tool. An important part of the Six Sigma methodology is the calculation of the number of defects in the process, i.e. points outside the specification limits. Unlike the quality of goods, which can be measured objectively by the number of defects, setting up specification limits for services is a complicated issue, because service quality is marked by the use and expectations of different customers. As Six Sigma was originally created for manufacturing, this crucial fact is not contemplated in the Six Sigma roadmap Define-Measure-Analyze-Improve-Control (DMAIC).

The aim of this thesis is to develop a new model to help the Service Division of Siemens Industrial Turbomachinery AB set specification limits according to customer expectations.

A review of the relevant literature is used to develop a new integrated model with ideas from the Kano model, SERVQUAL, the Taguchi loss function, Importance-Performance Analysis (IPA) and a new model, the "Trade-Off Importance". A survey was carried out among 18 external customers and internal stakeholders.

The model has demonstrated its robustness and credibility for setting specification limits. Additionally, it is a very powerful tool for setting strategic directions and for service quality measurement. As far as we know, this thesis is the first attempt to create a roadmap for setting specification limits in services. Researchers will find a proposed model that fills this research gap. From a managerial standpoint, the practical benefits at Siemens Industrial Turbomachinery AB suggest a new way of communicating with customers.
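Among the tools combined in the model, the Taguchi loss function is the one that turns customer expectations into computable limits, by pricing deviation from a target value. In its standard quadratic form (the general textbook formula, not the thesis's specific calibration):

$$L(y) = k\,(y - T)^2, \qquad y = T \pm \sqrt{L_{\max}/k},$$

where $y$ is the delivered service performance, $T$ the customer's target, and $k$ a cost coefficient; the specification limits follow by solving for the $y$ at which the loss reaches the maximum the customer will tolerate, $L_{\max}$.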
|