301 |
Mitigation of Ammonia Emissions from Broiler Houses Using a Biodegradable Litter Amendment
Senyondo, Namanda Sara, 06 May 2013
Broilers are raised indoors on high density farms with bedding/litter to trap their manure. Ammonia gas, which is produced as the manure decomposes, has adverse effects on human health, bird welfare and the environment. Using litter amendments can reduce the amount and, consequently, the effects of ammonia emitted from broiler houses. The objective of this study was to determine the effectiveness of a biodegradable litter amendment (BLA) in reducing ammonia emitted from a broiler house.
A pilot-scale test was set up with six adjacent, individually ventilated rooms and a stocking density of 0.07 m² per bird. The birds were fed a standard commercial, corn and soybean meal based diet, and water was provided ad libitum. The first flock was grown on 10 cm of fresh, kiln-dried pine shavings, while subsequent flocks were grown on top-dressed reused litter. The two treatments (control (CTL) and BLA) were randomly assigned to the six rooms after flock 1, giving three replicates per treatment. The exhaust air from the rooms was sampled for ammonia concentration for two days each week, starting at four days of age, to determine the amount of ammonia emitted.
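As a rough illustration of how exhaust sampling translates into an emission estimate, a minimal back-of-the-envelope sketch follows (Python); the concentration, airflow, and fan run time are invented placeholders, not values from this study.

```python
# Hedged sketch: mass of ammonia emitted = exhaust NH3 concentration
# x ventilation airflow x fan run time. All numbers are illustrative only.
PPM_TO_MG_PER_M3 = 17.03 / 24.45   # NH3 molar mass / molar volume at ~25 C and 1 atm

conc_ppm = 12.0                    # assumed exhaust NH3 concentration
airflow_m3_per_h = 900.0           # assumed exhaust fan airflow
run_hours = 18.0                   # assumed fan run time for the sampling day

emitted_g = conc_ppm * PPM_TO_MG_PER_M3 * airflow_m3_per_h * run_hours / 1000.0
print(f"ammonia emitted ~ {emitted_g:.0f} g per room per day")
```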
Over three subsequent flocks, the total mass of ammonia emitted from rooms treated with BLA was 31% to 47% lower than the control. Ammonia emitted per bird grown on treated litter and per kg of harvested bird weight was 32% to 44% lower, and the exhaust fans ran 7% to 22% less than CTL over the same period. For both BLA and CTL, the amount of ammonia emitted generally increased with bird age and litter reuse. The study showed that BLA effectively reduced ammonia emitted from a broiler house and that there are potential energy savings from using the amendment. However, ammonia emitted from the BLA rooms during the final flock was 57% higher than CTL, which was attributed to insufficient water (less than 18% moisture by weight) to support the reaction between BLA and ammonia. / Ph. D.
|
302 |
Overloaded Array Processing with Spatially Reduced Search Joint Detection
Hicks, James E. Jr., 22 August 2000
An antenna array is overloaded when the number of cochannel signals in its operating environment exceeds the number of elements. Conventional space-time array processing for narrow-band signals fails in overloaded environments. Overloaded array processing (OAP) is most difficult when the signals impinging on the array are of nearly equal power, have tight excess bandwidth, and are of identical signal type. In this thesis, we first demonstrate how OAP is theoretically possible with the joint maximum likelihood (JML) receiver. However, for even a modest number of interfering signals, the JML receiver's computational complexity quickly exceeds the real-time ability of any computer. This thesis proposes an iterative joint detection technique, Spatially Reduced Search Joint Detection (SRSJD), which approximates the JML receiver while reducing its computational complexity by several orders of magnitude. This complexity reduction is achieved by first exploiting spatial separation between interfering signals with a linear pre-processing stage and, second, performing iterative joint detection with a (possibly) tail-biting, time-varying trellis. The algorithm is sub-optimal but is demonstrated to closely approximate the optimum receiver at modest signal-to-interference ratios. SRSJD is shown to demodulate over 2M zero-excess-bandwidth synchronous QPSK signals with an M-element array. This thesis also investigates a related temporal processing technique, Temporally Reduced Search Joint Detection (TRSJD), which separates co-channel, asynchronous, partial-response signals. The technique is demonstrated to separate two nearly equal-power QPSK signals with r = 0.35 root raised-cosine pulse shapes. / Master of Science
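To make the complexity argument concrete, here is a minimal numerical sketch (Python/NumPy, not from the thesis) of an overloaded narrow-band array and a brute-force joint maximum likelihood detector; the array size, signal count, and noise level are arbitrary assumptions, and the exhaustive search over 4^K joint hypotheses per snapshot is exactly the cost a reduced-search technique like SRSJD is meant to avoid.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

M, K, N = 2, 4, 200            # assumed: M-element array, K cochannel QPSK signals (K > M), N snapshots
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

# Narrow-band model: each snapshot is a mixing (steering) matrix times the joint symbol vector.
A = (rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K))) / np.sqrt(2)
S = qpsk[rng.integers(0, 4, size=(K, N))]
X = A @ S + 0.05 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))

# Joint maximum likelihood detection: test all 4**K joint symbol hypotheses per
# snapshot and keep the one whose predicted array output is closest to the observation.
H = np.array(list(itertools.product(qpsk, repeat=K))).T      # K x 4**K hypothesis matrix
P = A @ H                                                     # predicted array outputs
S_hat = np.empty_like(S)
for n in range(N):
    cost = np.sum(np.abs(X[:, [n]] - P) ** 2, axis=0)
    S_hat[:, n] = H[:, np.argmin(cost)]

print("joint hypotheses per snapshot:", 4 ** K)
print("symbol error rate:", np.mean(S_hat != S))
```

With K interferers the search grows as 4^K per symbol period, which is why a linear pre-processing stage and a reduced-search trellis are needed for anything close to real-time operation.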
|
303 |
Cost Effective Rollover Mitigation Strategy
Schneider, Shawn Patrick, 27 April 2010
A cost-effective method of rollover mitigation in vehicles is presented. The method was designed so that some of the system states could be measured by sensors already available on most vehicles and the remaining states could be measured with relatively low-cost sensors. The control algorithm was also designed to be implementable using a series of lookup tables and computationally efficient equations, enabling the use of low-cost controller platforms. These lookup tables and equations can be modified to change the conservativeness of the method and to configure it for use on almost any 4-wheeled vehicle. Lastly, the proposed mitigation technique was designed to be directly implementable with existing vehicle hardware.
To develop this method, a vehicle model was created using several advanced software packages, including SolidWorks 2008™, MATLAB®, Simulink®, and SimMechanics™. Once created, the model was outfitted with virtual sensors representing data from realistic sensor types. A detection algorithm was designed around the hypothesis of a stability boundary, using the sensor data to detect impending rollover. Finally, a mitigation algorithm was designed to limit throttle and braking upon impending rollover. This algorithm was defined using the basic principles of end-stop control, adapted to work appropriately with this scenario. To conclude this research, two simple maneuvers were used to verify the effectiveness of the system in mitigating vehicle rollover.
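The following toy sketch (Python) illustrates the general flavor of a table-driven stability boundary with an end-stop-style intervention; the breakpoints, thresholds, and brake cap are invented placeholders and are not the calibrated tables or control law from this work.

```python
import numpy as np

# Hypothetical stability boundary: allowable lateral acceleration vs. speed.
SPEED_BREAKPOINTS = np.array([10.0, 20.0, 30.0, 40.0])    # m/s (assumed)
LAT_ACC_LIMIT = np.array([6.5, 5.5, 4.8, 4.2])            # m/s^2 (assumed)

def rollover_imminent(speed_mps, lat_acc_mps2):
    # Interpolate the boundary from the lookup table and compare against the measured state.
    limit = np.interp(speed_mps, SPEED_BREAKPOINTS, LAT_ACC_LIMIT)
    return abs(lat_acc_mps2) > limit

def mitigate(throttle_cmd, brake_cmd, speed_mps, lat_acc_mps2):
    # End-stop-style intervention: cut throttle and cap braking once the boundary is crossed.
    if rollover_imminent(speed_mps, lat_acc_mps2):
        return 0.0, min(brake_cmd, 0.3)
    return throttle_cmd, brake_cmd

print(mitigate(0.8, 0.0, 25.0, 6.0))   # throttle cut for this hypothetical state
```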
This research was government sponsored and in some instances utilized secured data. Due to the nature of this material, some data has been omitted from this document. / Master of Science
|
304 |
Hydric soil properties as influenced by land-use in Southeast Virginia wet flats
Burdt, Amanda Corrine, 08 May 2003
The accuracy of the growing-season dates used by regulators in hydric soil and wetland hydrology determinations, and the validity of ignoring land use in these definitions, are questionable. This study compared measured air and soil temperatures with various growing-season dates and indicators, and determined the relationships between hydrology and air and soil temperature. Water table depths, air temperature at 1-m height, soil temperature at 15-, 30-, and 50-cm depths, and CO₂ efflux were measured at 12 plots representing three land-use treatments (forest, field, and bare ground) at two restored wet flats in the thermic Great Dismal Swamp ecosystem. The forest was the driest treatment. The forest air was the warmest in winter and coldest in summer, opposite of the bare ground. The forest soil at 50 cm was the warmest in winter and coolest in summer, opposite of the bare ground. Land use affected hydrology and air and soil temperatures through the presence of surface litter and differences in shading, albedo, and evapotranspiration. The regulatory frost-free period fell between the measured frost-free period and the measured 5°C soil temperature period. Based on CO₂ efflux and soil temperature at 50 cm, the biological growing season of native plants and microbes should be year-round for forested areas, one week shorter for early-successional fields, and two weeks shorter for active cropland, rather than March to November for all land uses. Changing the growing-season definition of forested, thermic wet flats to a year-round designation must be considered and studied carefully to avoid jeopardizing wetland hydrology qualifications. / Master of Science
|
305 |
[pt] AVALIAÇÃO DE RISCO EM ENCOSTA FLORESTADA NO CAMPUS GÁVEA DA PUC-RIO / [en] RISK ASSESSMENT ON A FORESTED SLOPE AT PUC-RIO'S GÁVEA CAMPUS
MARIA BEATRIZ DA C A DOS SANTOS, 16 May 2023
[pt] Riscos geológicos, tais como os escorregamentos, são os maiores causadores dos desastres naturais em ambientes urbanos. Além dos impactos ambientais negativos, este fenômeno implica em consequências socioeconômicas, uma vez que existe a necessidade de reparos imediatos da área degradada. A avaliação de risco, principalmente em áreas urbanizadas, é necessária para se evitar os possíveis desastres e assegurar a qualidade de vida da população no entorno. No presente trabalho, a avaliação de risco foi realizada em três seções da encosta do Morro Dois Irmãos, localizada na Gávea, Rio de Janeiro, que sofreram sucessivas rupturas após intensas precipitações. A avaliação e classificação do risco pela abordagem qualitativa foi determinada utilizando a metodologia proposta pelo Ministério das Cidades e IPT e do GIDES-CPRM através da realização da vistoria em campo. A encosta foi classificada como de risco alto a muito alto ao escorregamento. Os resultados da abordagem quantitativa, obtidos pelas análises determinísticas de estabilidade com base no método de Morgenstern-Price, evidenciaram que as superfícies de ruptura crítica das seções apresentam fator de segurança menor que 1,5. Já nas análises probabilísticas de estabilidade FOSM, foram obtidas probabilidades de ruptura de 0,19, 0,14 e 0,08 sendo as seções classificadas como "Perigoso" no que diz respeito ao nível de desempenho esperado. Com isso, o índice de vulnerabilidade local encontra-se na faixa de 0,7 a 0,8. Para as seções analisadas, são necessárias ações como forma de mitigar e recuperar a área ambiental e socialmente.
/ [en] Geological hazards, such as landslides, are the biggest cause of natural disasters in urban environments. In addition to the negative environmental impacts, this phenomenon implies socioeconomic consequences, since there is a need for immediate repairs to the degraded area. Risk assessment, especially in urbanized areas, is necessary to avoid possible disasters and ensure the quality of life of the surrounding population. In the present work, the risk assessment was carried out in three sections of the slope of Morro Dois Irmãos, located in Gávea, Rio de Janeiro, which suffered successive ruptures after intense rainfall. The assessment and classification of risk by the qualitative approach was determined using the methodology proposed by the Ministry of Cities and IPT and GIDES-CPRM through field inspection. The slope was classified as high to very high slip risk. The results of the quantitative approach, obtained by deterministic stability analysis based on the Morgenstern-Price method, showed that the critical rupture surfaces of the sections have a safety factor lower than 1.5. In the FOSM stability probabilistic analyses, rupture probabilities of 0.19, 0.14 and 0.08 were obtained, with the sections classified as "Dangerous" with regard to the expected performance level. As a result, the local vulnerability index is in the range of 0.7 to 0.8. For the analyzed sections, actions are needed as a way to mitigate and recover the area environmentally and socially.
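As a rough illustration of the FOSM calculation behind probabilities of this kind, a short sketch follows (Python); the mean factor of safety, variances, and sensitivities are hypothetical placeholders rather than the parameters of the analyzed sections.

```python
import numpy as np
from scipy.stats import norm

# First-order second-moment (FOSM) reliability sketch for one slope section.
# All input statistics below are assumed, not taken from the thesis.
fs_mean = 1.25                 # factor of safety at mean parameter values (e.g., a Morgenstern-Price run)
var_c, var_phi = 6.0, 4.0      # assumed variances of cohesion (kPa^2) and friction angle (deg^2)
dFS_dc, dFS_dphi = 0.08, 0.06  # sensitivities estimated from extra deterministic runs

var_fs = (dFS_dc ** 2) * var_c + (dFS_dphi ** 2) * var_phi   # first-order variance propagation
beta = (fs_mean - 1.0) / np.sqrt(var_fs)                     # reliability index (distance to FS = 1)
p_rupture = norm.cdf(-beta)                                  # P(FS < 1), assuming FS is normally distributed

print(f"beta = {beta:.2f}, probability of rupture = {p_rupture:.2f}")
```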
|
306 |
Analysis of In-Lieu Fee Programs in providing Wetland and Stream Compensatory Mitigation
Tutko, Benjamin Thomas, 16 October 2017
The nation's Section 404 permitting program, under the Clean Water Act (CWA), represents one of the longest regulatory histories of designing and implementing credit trading programs to satisfy regulatory requirements. The role and function of in-lieu fee (ILF) programs in supporting this regulatory structure have undergone substantial change. For the first time in the history of the Sec. 404 program, 33 CFR Part 332 and 40 CFR Part 230, Subpart J (the "2008 mitigation rule" or "rule") prioritizes the use of off-site mitigation over on-site mitigation. Additionally, the rule prioritizes advanced, third-party mitigation, especially as achieved through mitigation banks, over any off-site compensatory mitigation provided by ILF programs (33 CFR 332.3(b)(1)). This new regulatory environment favors the use of commercial mitigation bank credits while acknowledging that limited permittee demand for off-site mitigation credits in particular areas justifies the continuing need for ILF programs (Corps and EPA 2008, pp. 19606, 19611). This research examines how regulatory officials use ILF programs under the 2008 mitigation rule, and it determines the extent to which ILF programs are capable of fulfilling the role envisioned for them under the rule. Simulation results indicate that commercial mitigation banks cannot meet risk-adjusted returns under limited credit demand conditions. ILF programs offer some additional financial capacity to fill the void in commercial bank coverage, but this potential is limited in low-demand conditions. Furthermore, empirical case studies of Virginia and Georgia provide evidence that regulatory officials rely on ILF programs to provide off-site compensatory mitigation almost exclusively in the absence of private credit supply, as intended in the 2008 rule. Evidence from Georgia and Virginia also indicates that, in some situations, ILF programs face difficulties in providing mitigation under the constraints of limited demand and more stringent regulatory requirements. / Master of Science / National permitting programs require people who impact wetlands or streams to offset unavoidable, adverse impacts by improving wetlands or streams elsewhere, a process called compensatory mitigation. A new regulatory rule, approved in 2008 (33 CFR Part 332 and 40 CFR Part 230, Subpart J), prioritizes mitigation provided at larger projects off-site of the impact. Key policy questions of “who should provide the mitigation?” and “when the mitigation should be provided” were an important part of the debate during the rule’s development. Wetland and stream mitigation may be provided by commercial (for-profit) businesses, called mitigation banks. Commercial banks build wetland/stream improvement projects before permitted (adverse) impacts occur, in anticipation of selling wetland/stream “credits” (quantified levels of improvement). Off-site mitigation may also be provided by in-lieu fee (ILF) programs operated by the government or nonprofit organizations. ILF programs first accept funds from permittees and then construct mitigation projects once sufficient funds have been collected, thus creating a lag between adverse impact and compensatory mitigation.
The 2008 regulatory rule favors the use of commercial mitigation bank credits over ILF credits, but allows regulatory officials, under certain circumstances, to use ILF credits when commercial bank credits of the appropriate type are unavailable. This research examines how regulatory officials use ILF programs, and it investigates the extent to which ILF programs are financially capable of providing off-site mitigation in situations where the appropriate commercial credits are unavailable. A financial simulation model is developed to examine the feasibility of mitigation projects under different cost and credit demand conditions. Results indicate that commercial mitigation banks cannot meet financial objectives under limited credit demand conditions. ILF programs offer some additional financial capacity to fill the void in commercial bank coverage, but ILF programs also face financial limitations under conditions with low demand for credits. Empirical case studies of Virginia and Georgia provide evidence that regulatory officials rely on ILF programs to provide off-site compensatory mitigation almost exclusively in the absence of a private credit supply, as intended in the 2008 rule. However, evidence from Virginia and Georgia also affirms that ILF programs face difficulties in providing mitigation in some situations of limited demand and stringent regulatory requirements.
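A minimal cash-flow sketch of the kind of feasibility comparison described above is shown below (Python); the construction cost, credit count, credit price, demand rates, and discount rate are all invented for illustration and are not the simulation model or parameters used in the thesis.

```python
# Hedged sketch: net present value of a mitigation bank that pays its
# restoration cost up front and sells credits only as permittee demand arrives.
def npv_of_bank(construction_cost, credits, price_per_credit,
                credits_sold_per_year, discount_rate, horizon_years=20):
    npv = -construction_cost
    sold = 0.0
    for year in range(1, horizon_years + 1):
        sale = min(credits_sold_per_year, credits - sold)   # sell only what demand supports
        sold += sale
        npv += sale * price_per_credit / (1.0 + discount_rate) ** year
    return npv

# Same hypothetical project under strong vs. limited credit demand.
for demand in (8.0, 1.5):   # credits sold per year (assumed)
    print(demand, round(npv_of_bank(1_500_000, 40, 60_000, demand, 0.08)))
```

Under the strong-demand case the bank recovers its up-front cost; under the limited-demand case it does not, which is the qualitative pattern the simulation results describe.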
|
307 |
Determining an Appropriate Organic Matter Loading Rate for a Created Coastal Plain Forested Wetland
Bergschneider, Cara Renee, 14 September 2005
Past research indicates that created non-tidal wetlands in the mid-Atlantic region are considerably lower in soil organic matter than native forested hydric soils. However, optimal loading rates for created wetland soil reconstruction have not been rigorously established. Our objective was to determine appropriate organic amendment loading rates for a Coastal Plain mitigation wetland based on 1) soil properties reflective of hydric soil development, 2) the formation of redoximorphic features, and 3) the growth and vigor of hydrophytic vegetation. The study contained wet (CCW-Wet) and dry (CCW-Dry) experiments, each receiving 6 compost treatments (0 Mg/ha untilled and 0, 56, 112, 224, and 336 Mg/ha tilled). Over the 1.5-year monitoring period, redox potential decreased and redoximorphic feature formation increased with compost loadings up to 112 Mg/ha. Surface bulk density decreased with loadings up to 224 Mg/ha, while no treatment differences were noted in sub-surface bulk density. In the CCW-Dry experiment, soil moisture peaked in the 224 Mg/ha treatment, while soil moisture in CCW-Wet increased consistently across all loadings. Total biomass in CCW-Wet and Betula nigra L. growth in both experiments increased with loading rate. Total biomass in CCW-Dry and Quercus palustris Muench. growth in both experiments peaked at 112 Mg/ha, although differences were not significant. Collectively, these findings indicate that 112 Mg/ha of high quality organic amendment was optimal for inducing hydric soil conditions and positive hydrophytic vegetation response. Incorporating compost at rates exceeding 112 Mg/ha is challenging and leads to higher surface elevations and redox levels in the initial growing season. / Master of Science
|
308 |
A Numerical Investigation of the Seismic Response of the Aggregate Pier Foundation System
Girsang, Christian Hariady, 02 January 2002
The response of an aggregate pier foundation system during seismic loading was investigated. The factors and phenomena governing the performance of the aggregate pier and the improved ground were identified and clarified. The key factors affecting the performance of the aggregate pier include soil density, stiffness modulus, and drainage capacity. The improved ground is influenced by soil stratification, soil properties, pore pressure dissipation, and earthquake time history.
Comprehensive numerical modeling using FLAC was performed. The study focused on three parts: the ground acceleration, the excess pore water pressure ratio, and the shear stress in the soil matrix generated during seismic loading. Two earthquake time histories scaled to different peak accelerations were used in the numerical modeling: the 1989 Loma Prieta earthquake (pga = 0.45g) and the 1988 Saguenay earthquake (pga = 0.05g).
The main results of the simulation showed the following effects of the aggregate pier on liquefiable soil deposits: 1) the aggregate pier amplifies the peak horizontal acceleration at the ground surface (amax); 2) the aggregate pier reduces the liquefaction potential down to the depth at which it is installed; 3) pore pressures are generally lower for soils reinforced with an aggregate pier than for unreinforced soils, except for very strong earthquakes; and 4) the maximum shear stresses in the soil are much smaller for reinforced soils than for unreinforced soils.
The excess pore water pressure ratio and the shear stress in the soil matrix calculated by FLAC were generally lower than those predicted by available procedures. / Master of Science
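For reference, the excess pore water pressure ratio referred to above is simply the computed excess pore pressure normalized by the initial vertical effective stress; the short sketch below (Python) uses made-up placeholder values rather than FLAC output.

```python
import numpy as np

sigma_v0_eff = 85.0                                     # assumed initial vertical effective stress, kPa
excess_u = np.array([5.0, 22.0, 48.0, 70.0, 81.0])      # assumed excess pore pressure history, kPa

r_u = excess_u / sigma_v0_eff                           # r_u approaching 1.0 indicates liquefaction
print("peak excess pore pressure ratio:", round(float(r_u.max()), 2))
```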
|
309 |
Multi-Layered Dual-Band Dual-Polarized Reflectarray Design Toward Rim-Located Reconfigurable Reflectarrays for Interference Mitigation in Reflector Antennas
Bora, Trisha, 14 June 2024
The rise of satellites in Low Earth Orbit (LEO) is causing more terrestrial electromagnetic interference in the important L- and X-band frequencies, which are crucial for astronomical observations. This thesis introduces a reflectarray design that can serve as a basis for an interference mitigation solution for radio telescopes. In the envisioned application, when the reflectarray is placed around the circumference of an existing radio telescope, it can drive a null into the radio telescope's radiation pattern sidelobe distribution. Since the reflectarray only occupies a small portion of the rim of the paraboloidal main reflector, its presence does not significantly affect the main lobe peak gain. Since Iridium and Starlink are the target mega-constellations, the reflectarray must be dual-band. To cover the operational bandwidths of these constellations, the target bandwidth in the L-band (Iridium) is 0.7%, and that in the X-band (Starlink) is 17.1%. This makes the design of the reflectarray challenging, as the frequencies are widely separated and the bandwidth in the X-band is wide. The work of this thesis marks a first step in this effort. It includes a reflectarray design containing a multi-layer stack consisting of: (1) a grounded substrate, (2) an X-band slot-loaded unit cell geometry, (3) a dielectric superstrate, and (4) an L-band layer containing crossed dipoles. The dual-band reflectarray is dual linearly polarized to maintain a symmetric response. The reflectarray is designed and simulated using full-wave solvers. The results show that the reflectarray designs are capable of pattern shaping at both bands and operate across the required bandwidths. This architecture could serve as a basis for future reflectarrays capable of nulling satellite interference from mega-constellations in observatory applications. / Master of Science / The signal clarity issues stemming from the increasing number of satellites in Low Earth Orbit (LEO), particularly in the vital L- and X-band frequencies essential for global communications and radio astronomy, are the motivation of this thesis. The work concentrates on designing a dual-band, dual-polarized reflectarray antenna which may ultimately be used to help mitigate interference in these bands at radio telescopes. The work is focused on the frequency ranges utilized by the major satellite networks Iridium and Starlink, which operate within the L-band (1616-1626.5 MHz) and X-band (10.7-12.7 GHz). Recognizing the significance of these frequencies for global communication and for radio astronomy, the reflectarray is designed to contribute to an interference mitigation system that would ultimately allow for coexistence between radio telescopes and communications satellites. Targeting bandwidths of 0.7% for the L-band and 17.1% for the X-band, the focus is on nulling interference arising across these frequency bands and thereby increasing the sensitivity of radio telescopes operating amongst these mega-constellations. The thesis documents a multi-layered reflectarray antenna containing a wide-band X-band layer of slot antennas and an L-band superstrate layer of crossed dipoles, both of which utilize dual linear polarization for symmetric operation. The completed reflectarray can operate simultaneously in both bands.
It has been shown in prior work (ellingson2021sidelobe; budhu2024design) that reflectarrays placed along the rim of a radio telescope's main reflector can be used to drive nulls in the sidelobe envelope of its radiation pattern, thereby nulling incoming interference. The antenna design of this thesis suggests a possible candidate for these interference mitigation systems where both bands are targeted.
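As background on how element phases are assigned at each band, the sketch below (Python) applies the standard textbook reflectarray phase relation; it is not the synthesis procedure used in this thesis, and the aperture size, feed position, and beam direction are assumptions.

```python
import numpy as np

C0 = 299_792_458.0

def reflectarray_phases(freq_hz, elem_xy, feed_xyz, theta_b_deg, phi_b_deg=0.0):
    """Required reflection phase at each element for a beam toward (theta_b, phi_b):
    cancel the feed path delay, then impose the progressive phase of the outgoing wave."""
    k0 = 2.0 * np.pi * freq_hz / C0
    th, ph = np.radians(theta_b_deg), np.radians(phi_b_deg)
    elem_xyz = np.column_stack([elem_xy, np.zeros(len(elem_xy))])
    d_feed = np.linalg.norm(elem_xyz - feed_xyz, axis=1)          # feed-to-element path length
    u = np.sin(th) * (elem_xy[:, 0] * np.cos(ph) + elem_xy[:, 1] * np.sin(ph))
    return np.mod(k0 * (d_feed - u), 2.0 * np.pi)                 # radians in [0, 2*pi)

# Hypothetical 10 x 10 element aperture on a half-wavelength grid at each band center.
for f in (1.621e9, 11.7e9):                    # L-band (Iridium) and X-band (Starlink) centers
    lam = C0 / f
    grid = (np.arange(10) - 4.5) * lam / 2.0
    xy = np.array([(x, y) for x in grid for y in grid])
    feed = np.array([0.0, 0.0, 10.0 * lam])    # assumed on-axis feed location
    print(f / 1e9, "GHz:", np.degrees(reflectarray_phases(f, xy, feed, 20.0))[:3].round(1))
```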
|
310 |
Investigating Trade-Offs in Mitigating Double-Fetches Introduced by Compile-Time Optimizations: Analysing the Impact of Security Measures on Software Performance
Fransson, William, January 2024
In software security, balancing the need for robust protection with performance considerations is a critical challenge. Mitigation techniques are essential to defend against various types of attacks, but they can also introduce performance overheads. Meanwhile, compilers provide optimisations that aim to enhance performance but can inadvertently introduce security vulnerabilities, such as double-fetches. This thesis explores the trade-offs associated with disabling compiler optimisation options to enhance security against such vulnerabilities. By examining various optimisation levels (-O1, -O2, -O3, -Ofast) in the GNU Compiler Collection (GCC) and LLVM compilers, we quantitatively analyse their impact on execution time, memory usage, and complexity of the binaries. Our study reveals that while opting out of optimisations can significantly improve security by eliminating compiler-introduced double-fetch bugs, it also leads to increased execution time and larger binary sizes. These findings underscore developers' need to make informed choices about optimisations, balancing security concerns with performance requirements. This work contributes to a deeper understanding of the security-performance trade-offs in software development and provides a foundation for further research into compiler optimisations and security.
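A minimal sketch of the kind of measurement harness such a study implies is given below (Python driving GCC); the source file name, workload, and chosen optimisation levels are assumptions, and the thesis's actual benchmarking setup may differ.

```python
# Hedged sketch: compile the same C workload at several optimisation levels and
# record binary size and wall-clock runtime. Assumes gcc is on the PATH and that
# benchmark.c (a hypothetical workload) exists in the working directory.
import os
import subprocess
import time

SOURCE = "benchmark.c"
LEVELS = ["-O0", "-O1", "-O2", "-O3", "-Ofast"]

for level in LEVELS:
    binary = f"bench{level}"
    subprocess.run(["gcc", level, SOURCE, "-o", binary], check=True)
    size = os.path.getsize(binary)                 # binary size in bytes
    start = time.perf_counter()
    subprocess.run([f"./{binary}"], check=True)    # run the compiled workload
    elapsed = time.perf_counter() - start
    print(f"{level}: {size} bytes, {elapsed:.3f} s")
```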
|