• About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world.
561

Accounting for Greenhouse Gas Emissions and Toxic Air Pollutants in Trucking Efficiency and Productivity

Heng, Yen January 2011 (has links)
Air pollution is a threat to the environment and human health. Freight trucking in particular is the main source of freight-transportation emissions. Heavy-duty trucks emit large amounts of toxic air pollutants that cause serious diseases and harm public health. In addition, heavy-duty trucks emit large amounts of greenhouse gases (GHG), a leading cause of global warming. Despite increased environmental restrictions on air pollution and rising trucking GHG emissions over the past decades, no economic study has examined the potential GHG and air pollution reductions in the trucking sector and the associated private abatement costs to the industry. This study accounts for GHG emissions and toxic air pollutants in measuring and evaluating efficiency and productivity for the trucking industry in the 48 contiguous states. The private costs of abatement to the industry were also estimated. When only GHG was incorporated in the production model, the results showed that each state could expand desirable output and reduce GHG by an average of 11 percent per year between 2000 and 2007. The Malmquist-Luenberger productivity indexes showed that omitting GHG from trucking service production yielded biased productivity estimates. On the other hand, due to increased environmental regulation, most of the toxic air pollutants decreased dramatically between 2002 and 2005, and the analytical results showed that inefficiency decreased during this period. The private costs of abatement averaged $73 million per state in 2005. When GHG and six toxic air pollutants were incorporated in the production model, the estimated private abatement cost was $76 million per state, equivalent to 0.7 percent of industry output in 2005.
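The directional distance function at the core of a Malmquist-Luenberger analysis can be sketched as a small linear program: expand the desirable output and contract the undesirable output (GHG) by a common factor beta. Everything below — one input, one good output, one bad output, and four synthetic "states" — is a hypothetical illustration, not the study's actual model or data.

```python
# Minimal directional output distance function (DEA), the building block
# of Malmquist-Luenberger productivity indexes. All data are hypothetical.
import numpy as np
from scipy.optimize import linprog

# One input x, one good output y, one bad output b for 4 states (DMUs).
x = np.array([10.0, 12.0, 9.0, 11.0])   # e.g. fuel use
y = np.array([20.0, 26.0, 18.0, 22.0])  # e.g. ton-miles of freight
b = np.array([5.0, 6.0, 4.0, 5.5])      # e.g. GHG emissions

def ddf(k):
    """max beta s.t. sum(l*y) >= y_k*(1+beta), sum(l*b) == b_k*(1-beta),
    sum(l*x) <= x_k, l >= 0.  Decision vector z = [beta, l_1..l_n]."""
    n = len(x)
    c = np.zeros(n + 1)
    c[0] = -1.0                                   # minimize -beta
    A_ub = np.vstack([
        np.concatenate(([y[k]], -y)),             # y_k*beta - sum(l*y) <= -y_k
        np.concatenate(([0.0], x)),               # sum(l*x) <= x_k
    ])
    b_ub = np.array([-y[k], x[k]])
    # Equality models weak disposability of the bad output:
    # b_k*beta + sum(l*b) = b_k
    A_eq = np.concatenate(([b[k]], b)).reshape(1, -1)
    b_eq = np.array([b[k]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]

for k in range(4):
    print(f"state {k}: beta = {ddf(k):.3f}")  # beta = 0 means on the frontier
```

A state's beta is the fraction by which it could simultaneously expand freight output and cut GHG given best-practice technology; chaining these scores across years yields the Malmquist-Luenberger index.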
562

Santa Barbara Tea Fire Multi-Hazard Mitigation Benefit Cost Analysis

Flamm, David S 01 June 2009 (has links)
This study examines the benefits and costs associated with the outright purchase of properties for hazard mitigation ("property acquisition mitigation") in Santa Barbara, California, which reduced four properties' exposure to multiple hazards. The results indicate that the estimated overall benefit-cost ratio for property acquisition mitigation projects is 1.75:1 when the exposed properties meet a threshold of imminent threat of total loss. This study further suggests that when property acquisitions are performed in an area threatened by multiple hazards, the mitigation becomes two to three times more beneficial than in an area threatened by a single hazard. Possible implications and future benefits associated with this and similar mitigations are also explored. Multi-hazard mitigation is an action taken to reduce or eliminate long-term risks from natural or human-caused hazards. A hazard is any condition or event with the potential to cause fatalities, injuries, property damage, infrastructure damage, economic interruptions, environmental damage, or other loss. The study area for the Tea Fire BCA (Benefit Cost Analysis) is subject to multiple hazards, primarily landslides, wildfires, and earthquakes. In an attempt to reduce the exposure to landslides, a mitigation project was completed in 1998. This project included the purchase of four properties by the City of Santa Barbara using federal and local funds. The undeveloped properties were left empty as open space to eliminate the exposure to risk. The project, originally intended to mitigate landslide risk, mitigated risk exposure to multiple hazards. The mitigation was put to the test during the Santa Barbara Tea Fire, a wildfire which burned approximately 2,000 acres of Santa Barbara County land in November 2008. The following steps were followed to determine the overall loss avoidance:
1. Obtain building values before mitigation
2. Obtain current comparable building values
3. Determine burn recurrence in the study area
4. Obtain fire damage estimates from the FEMA BCA tool based on "before mitigation" building and contents values
5. Calculate "loss avoidance" and adjust for inflation using the FEMA BCA tool
6. Add additional avoided losses not considered in the BCA (e.g., emergency management costs)
7. Subtract new losses resulting from the project
8. Determine multi-hazard recurrence in the study area
Keywords: Hazard Mitigation, Benefit Cost Analysis, Loss Avoidance.
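The benefit-cost arithmetic behind a loss-avoidance analysis of this kind can be sketched in a few lines. All figures below are hypothetical placeholders, not values from the study; FEMA's BCA tool performs a far more detailed computation over hazard-specific recurrence curves.

```python
# Minimal loss-avoidance benefit-cost sketch for a property-acquisition
# mitigation project. All numbers are hypothetical placeholders; the
# actual study used FEMA's BCA tool and real appraisal data.

def present_value(annual_benefit, rate, years):
    """Discount a constant annual expected benefit to present value."""
    return sum(annual_benefit / (1 + rate) ** t for t in range(1, years + 1))

# Hypothetical combined inputs for four acquired properties.
project_cost = 1_200_000            # acquisition cost (federal + local funds)
avoided_loss_per_event = 3_000_000  # building + contents + emergency mgmt
annual_event_probability = 1 / 25   # e.g. a 25-year recurrence interval
discount_rate = 0.07                # FEMA's standard 7% discount rate
project_life_years = 50

expected_annual_benefit = avoided_loss_per_event * annual_event_probability
benefits = present_value(expected_annual_benefit, discount_rate,
                         project_life_years)
bcr = benefits / project_cost
print(f"benefit-cost ratio: {bcr:.2f}:1")  # ratio > 1 means mitigation pays
```

With multiple hazards, the expected annual benefit is summed over each hazard's recurrence probability and avoided loss, which is why multi-hazard exposure multiplies the ratio.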
563

Řízení rizik vybraného podnikatelského subjektu / Risk Management of Selected Business Entity

Zemanová, Kateřina January 2012 (has links)
The thesis "Risk Management of the Selected Business Entity" focuses on risk management, analysis and optimization in the consumer cooperative COOP. The cooperative operates 121 sales units in the district of Zdar nad Sazavou and is devoted mainly to selling food and non-food goods, facing strong competition from hypermarkets and supermarkets. The first part defines consumer cooperatives and deals with the general characteristics of risks, their classification, and possible measures for their reduction, which is nowadays a very important part of risk analysis. The second part analyzes and assesses risks in five categories. Based on this analysis, suggestions are made for eliminating the identified risks, which will help the cooperative build a stronger position in the market.
564

Empirical Evidence on the Effectiveness of Energy Economic Policy Instruments from the Residential and SMEs Sector

Thonipara, Anita 15 April 2020 (has links)
No description available.
565

Congestion Mitigation for Planned Special Events: Parking, Ridesharing and Network Configuration

January 2019 (has links)
This dissertation investigates congestion mitigation during the ingress of a planned special event (PSE). PSEs impact the regular operation of the transportation system within certain time periods due to increased travel demand or reduced capacities on certain road segments. For individual attendees, cruising for parking during a PSE can be a struggle given the severe congestion and scarcity of parking spaces in the network. With the development of smartphone-based ridesharing services such as Uber/Lyft, more and more attendees are turning to ridesharing rather than driving themselves. This study explores congestion mitigation during a planned special event considering parking, ridesharing and network configuration from both the attendees' and the planner's perspectives. Parking availability (occupancy of a parking facility) information is the fundamental building block for both travelers and planners to make parking-related decisions. It is highly valued by travelers and is one of the most important inputs to many parking models. This dissertation proposes a model-based practical framework to predict future occupancy from historical occupancy data alone. The framework consists of two modules: estimation of model parameters, and occupancy prediction. At the core of the predictive framework, a queuing model is employed to describe the stochastic occupancy change of a parking facility. From an attendee's perspective, the probability of finding parking at a particular facility is more valuable than occupancy information for parking search. However, it is hard to estimate parking probabilities even with accurate occupancy data in a dynamic environment. In the second part of this dissertation, taking one step further, the idea of introducing learning algorithms into parking guidance and information systems that employ a central server is investigated, in order to provide estimated optimal parking search strategies to travelers.
With the help of the Markov Decision Process (MDP), the parking searching process on a network with uncertain parking availabilities can be modeled and analyzed. Finally, from a planner’s perspective, a bi-level model is proposed to generate a comprehensive PSE traffic management plan considering parking, ridesharing and route recommendations at the same time. The upper level is an optimization model aiming to minimize total travel time experienced by travelers. In the lower level, a link transmission model incorporating parking and ridesharing is used to evaluate decisions from and provide feedback to the upper level. A congestion relief algorithm is proposed and tested on a real-world network. / Dissertation/Thesis / Doctoral Dissertation Civil, Environmental and Sustainable Engineering 2019
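The MDP view of parking search can be illustrated with a toy backward-induction sketch over a corridor of facilities leading to the venue. The probabilities and costs below are invented for illustration; the dissertation's model covers a general network with uncertain, time-varying availabilities and a central server.

```python
# Toy parking-search MDP: driving toward the venue, at each facility the
# driver may attempt to park (a space is free with probability p) or
# drive on. All numbers are invented; this only illustrates the idea.

def optimal_search(p, walk, drive, overflow):
    """Backward induction over facilities 0..n-1.

    p[i]     -- probability that facility i has a free space
    walk[i]  -- cost (minutes) of walking from facility i to the venue
    drive    -- cost of driving between consecutive facilities
    overflow -- cost of the fallback (e.g. a distant garage) past the end
    Returns (expected cost from the start, per-facility policy).
    """
    n = len(p)
    V = overflow                 # value of having passed the last facility
    policy = []
    for i in range(n - 1, -1, -1):
        attempt = p[i] * walk[i] + (1 - p[i]) * (drive + V)
        skip = drive + V
        policy.append('attempt' if attempt <= skip else 'skip')
        V = min(attempt, skip)
    policy.reverse()
    return V, policy

cost, policy = optimal_search(
    p=[0.2, 0.5, 0.8], walk=[5, 10, 20], drive=2, overflow=40)
print(cost, policy)
```

Here the "state" is the facility the driver is passing; richer versions condition on time and on availability estimates streamed from the central server.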
566

Exploring the Relationship between Design and Outdoor Thermal Comfort in Hot and Dry Climate

January 2019 (has links)
Moderate physical activity, such as walking and biking, positively affects physical and mental health. Outdoor thermal comfort is an important prerequisite for incentivizing an active lifestyle. Thus, extreme heat poses significant challenges for people who are outdoors by choice or necessity. The type and qualities of built infrastructure determine the intensity and duration of individual exposure to heat. As cities globally shift priorities towards non-motorized and public transit travel, more residents are expected to experience the city on their feet. Consequently, the physical conditions as well as the psychological perceptions of the environment that affect thermal comfort will become paramount. Phoenix, Arizona, is used as a case study to examine the effectiveness of current public transit and street infrastructure in reducing heat exposure and affecting the thermal comfort of walkers and public transit users. The City of Phoenix has committed to public transit improvements in the Transportation 2050 plan and has recently adopted a Complete Streets Policy. Proposed changes include mobility improvements and creating a safe and comfortable environment for non-motorized road participants. To understand what kind of improvements would benefit thermal comfort the most, it is necessary to understand heat exposure at finer spatial scales, explore whether current bus shelter designs are adequate in mitigating heat-health effects, and comprehensively assess the impact of design on the physical, psychological and behavioral aspects of thermal comfort. A study conducted at bus stops in one Phoenix neighborhood examined which grey and green infrastructure types were preferred for cooling and found relationships between perception of pleasantness and thermal sensation votes.
Walking interviews conducted during an event in another neighborhood examined the applicability of a framework for walking behavior under the stress of heat, and how differences between the streets affected the walkers' perceptions. The interviews revealed that many of the structural themes from the framework of walking behavior were applicable; however, participants assessed the majority of the elements of their walk from a heat mitigation perspective. Finally, guiding questions for walkability in hot and arid climates were developed based on the literature review and the results of the empirical studies. This dissertation contributes to filling the gap between walkability and outdoor thermal comfort, and presents methodology and findings that can be useful for addressing walkability and outdoor thermal comfort in the world's hot cities, as well as in temperate cities that may face similar climate challenges as the planet warms. / Dissertation/Thesis / Doctoral Dissertation Sustainability 2019
567

The Corona pandemic - a focusing event for insufficient governmental action on climate change mitigation?

Glaser, Sofia January 2020 (has links)
This study seeks to examine whether the Corona pandemic has the potential to serve as a focusing event for the problem of insufficient governmental action on climate change mitigation. The study is built on the Multiple Streams Framework by John W. Kingdon, with a main focus on focusing event theory. According to this theory, focusing events can come in three forms: crises and disasters, personal experiences of policymakers, and symbols. Kingdon's theoretical discussions, alongside my own developments of his work, provide the basis for a set of analytical questions through which the research question is answered. The analysis reveals that while the pandemic can indeed be considered both a crisis or disaster and a personal experience of policymakers, establishing whether these could focus attention on the specific problem of insufficient governmental action on climate change mitigation requires further research, as the perceived cause of the crisis or disaster and of the personal experience must first be established. However, the paper finds that the pandemic does have potential to serve as a symbol for this insufficient governmental action, for instance by stressing that deforestation increases the risk of zoonotic outbreaks such as the Corona pandemic.
568

Risk allocation and mitigation methods for financing cross border projects

Rezvanian, Amirabolfazi 24 February 2013 (has links)
Compared to other areas of Finance, the field of Project Finance is a relatively unexplored area for both empirical and theoretical research. In particular, most of the research to date has focused narrowly on risk management through financial instruments. From another point of view, and looking at different types of projects, Cross Border projects are usually considered 'high risk', mostly due to a lack of adequate overseas environmental information and overseas project experience. Given this setting, this research aims to explore the risks attributed to Cross Border Project Financed projects and to understand why South African companies should or should not use Project Finance for their Cross Border projects. There were two phases to the research. The first phase consisted of an analysis of literature on Project Finance, the Cross Border project context and Risk Management processes, together with an analysis of fourteen case studies in which Cross Border projects used Project Finance, with the aim of extracting risks and the relevant allocation and mitigation methods. The second phase consisted of ten interviews with South African Project Finance experts, based on the findings from phase one. This phase's aim was to explore the practical risk allocation and mitigation methods, compare them to the theory, and make recommendations for further research into Project Finance in South Africa. The first phase resulted in a broad description of the theory of risks associated with Cross Border Project Financed projects, and of the specific risks and allocation or mitigation methods addressed in Cross Border projects that used Project Finance as their financing vehicle. The second phase produced a comparative scheme between the risk allocation and mitigation methods addressed in theory and those exercised in South African Project Financed projects.
This comparison showed that Project Finance is a recommended financing vehicle for Cross Border projects provided that required due diligence and homework are done upfront. It was concluded that there is a gap between theory and practice in terms of risk allocation and mitigation methods developed for Cross Border Project Financed projects. This research provided a framework to introduce similarities and differences between theory and practice and ended up with a set of recommendations for further research into Project Finance. / Dissertation (MBA)--University of Pretoria, 2012. / Gordon Institute of Business Science (GIBS) / unrestricted
569

The feasibility of carbon-subsidized afforestation projects : a case study of China

Hou, Guolong 11 November 2020 (has links)
Afforestation projects in China have substantially contributed to national CO2 sequestration and play an important role in international climate change mitigation. However, these nation-wide afforestation projects are usually funded by the national government, with very large and unsustainable investments. It is important to find alternative sources of funding to finance afforestation and to convince poor farmers to become involved in afforestation projects. Carbon-subsidized afforestation could be the solution. The current study aims to find out i) whether farmers need additional subsidies to reforest their marginal farmland and, if so, ii) whether the value of carbon sequestration from afforestation can offset farmers' net costs. To do this, first I determine the amount of carbon sequestered through afforestation. Second, I assess the value of carbon sequestration, the costs and benefits of afforestation projects, and the costs and benefits of crop production. Third, I investigate the optimal rotation period of the plantations considering a joint production of timber and carbon, for different species. Results show that total carbon sequestration through tree biomass and soil carbon following afforestation differs among tree species and stand ages as well as across regions. Economic trees sequester less carbon than ecological trees and bamboo. Among economic trees, nut trees with an inedible hard shell sequester more carbon than fruit trees. The regional context significantly influences the carbon sequestration potential, with more carbon sequestered in southern and eastern regions than in northern regions. Bamboo also shows a remarkable carbon sequestration potential, which is even greater than that of Chinese fir and Poplar in northern regions. Although afforestation programs have huge potential to store carbon, their voluntary acceptance by landowners crucially depends on the economic outcome.
I found that carbon credits can usually compensate for the opportunity costs of alternative land uses, except i) when highly profitable croplands are afforested, in which case carbon credits are not sufficient, and ii) when croplands that generate low incomes are afforested, in which case carbon credits are not needed. Fruit trees are the most cost-effective option for afforestation. Bamboo afforestation is economically attractive if carbon revenues are included. The minimum price of a carbon credit decreases with increasing project duration, because more carbon is stored as time increases. This does not hold for fast-growing trees like Eucalyptus, for which the minimum price increases with extended project duration. Given the temporal variations in the joint production of timber and carbon sequestration, the carbon accounting regimes (tCER, temporary Certified Emission Reductions, and lCER, long-term Certified Emission Reductions) have a significant impact on the optimal rotation as well as on the revenue. Forest managers have an incentive to use tCER accounting to finance slow-growing plantations, and lCER for fast-growing ones. A sensitivity analysis examines how the rotation period changes with different carbon prices and discount rates. While the optimal decision for slow-growing species (e.g. Chinese fir) is highly sensitive to changes in both variables under tCER accounting, the results for fast-growing species (e.g. Eucalyptus) are most sensitive under the lCER accounting regime. In contrast, carbon revenues have a minimal impact on the optimal rotation of Poplar plantations, no matter which regime is applied. I conclude that carbon-subsidized afforestation is a feasible way to offset the opportunity costs of retired farmland and support the livelihood of farmers.
The findings can contribute to the efficient and sustainable management of forestry projects using carbon sequestration, while the methodology can also be applied to other regions in the world.
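The joint timber-carbon rotation decision can be sketched as a small search over harvest ages. The logistic growth curve, prices, carbon factor and discount rate below are illustrative stand-ins rather than the study's calibrated parameters, and real tCER/lCER accounting is considerably more involved.

```python
# Sketch of choosing a rotation age T that maximizes the present value
# of timber plus carbon revenue for a single rotation. All parameters
# are hypothetical stand-ins, not the study's calibrated values.
import math

def stand_volume(t, v_max=300.0, k=0.15, t_mid=20.0):
    """Logistic merchantable volume (m^3/ha) at stand age t (years)."""
    return v_max / (1.0 + math.exp(-k * (t - t_mid)))

def npv_one_rotation(T, timber_price=50.0, carbon_price=10.0,
                     carbon_per_m3=0.9, rate=0.04, planting_cost=1500.0):
    """NPV per ha of one rotation of length T: carbon credits are paid
    each year on the volume increment; timber is sold at harvest."""
    disc = lambda t: (1.0 + rate) ** -t
    carbon = sum(carbon_price * carbon_per_m3 *
                 (stand_volume(t) - stand_volume(t - 1)) * disc(t)
                 for t in range(1, T + 1))
    timber = timber_price * stand_volume(T) * disc(T)
    return carbon + timber - planting_cost

best_T = max(range(5, 61), key=npv_one_rotation)
print(f"optimal rotation: {best_T} years")
```

Making the carbon payment schedule front-loaded (as under tCER issuance) or back-loaded (as under lCER) shifts the optimum, which is the mechanism behind the tCER-versus-lCER results above.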
570

Data and image domain deep learning for computational imaging

Ghani, Muhammad Usman 22 January 2021 (has links)
Deep learning has overwhelmingly impacted post-acquisition image-processing tasks; however, there is increasing interest in more tightly coupled computational imaging approaches, where models, computation, and physical sensing are intertwined. This dissertation focuses on how to leverage the expressive power of deep learning in image reconstruction. We use deep learning in both the sensor data domain and the image domain to develop new fast and efficient algorithms that achieve superior quality imagery. Metal artifacts are ubiquitous in both security and medical applications. They can greatly limit subsequent object delineation and information extraction from the images, restricting their diagnostic value. This problem is particularly acute in the security domain, where there is great heterogeneity in the objects that can appear in a scene, highly accurate decisions must be made quickly, and the processing time is highly constrained. Motivated primarily by security applications, we present a new deep-learning-based metal artifact reduction (MAR) approach that tackles the problem in the sensor data domain. We treat the observed data corresponding to dense, metal objects as missing data and train an adversarial deep network to complete the missing data directly in the projection domain. The completed projection data is then used with an efficient conventional image reconstruction algorithm to reconstruct an image intended to be free of artifacts. Conventional image reconstruction algorithms assume that high-quality data is present on a dense and regular grid. Using conventional methods when these requirements are not met produces images filled with artifacts that are difficult to interpret. In this context, we develop data-domain deep learning methods that attempt to enhance the observed data to better meet the assumptions underlying fast conventional analytical reconstruction methods.
By focusing learning in the data domain in this way and coupling the result with existing conventional reconstruction methods, high-quality imaging can be achieved in a fast and efficient manner. We demonstrate results on four different problems: i) low-dose CT, ii) sparse-view CT, iii) limited-angle CT, and iv) accelerated MRI. Image domain prior models have been shown to improve the quality of reconstructed images, especially when data are limited. A novel principled approach is presented allowing the unified integration of both data and image domain priors for improved image reconstruction. The consensus equilibrium framework is extended to integrate physical sensor models, data models, and image models. In order to achieve this integration, the conventional image variables used in consensus equilibrium are augmented with variables representing data domain quantities. The overall result produces combined estimates of both the data and the reconstructed image that are consistent with the physical models and prior models being utilized. The prior models used in both the image and data domains in this work are created using deep neural networks. The superior quality allowed by incorporating both data and image domain prior models is demonstrated for two applications: limited-angle CT and accelerated MRI. A major question that arises in the use of neural networks, and of deep networks in particular, is their stability: that is, if the examples seen in the application environment differ from the training environment, will the performance remain robust? We perform an empirical stability analysis of the data and image domain deep learning methods developed for limited-angle CT reconstruction. We consider three types of perturbations to test stability: adversarially optimized, random, and structural perturbations.
Our empirical analysis reveals that the data-domain learning approach proposed in this dissertation is less susceptible to perturbations as compared to the image-domain post-processing approach. This is a very encouraging result and strongly supports the main argument of this dissertation that there is value in using data-domain learning and it should be a part of our computational imaging toolkit.
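The data-domain MAR idea can be shown in miniature: mark the sinogram entries that intersect metal as missing, complete them, then hand the completed projections to a conventional reconstruction. Here simple per-view linear interpolation stands in for the adversarial completion network described above, and the toy sinogram is random data rather than a real scan.

```python
# Data-domain metal artifact reduction in miniature: treat the metal
# trace in the sinogram as missing and complete it before reconstruction.
# Linear interpolation stands in for the learned (adversarial) completion
# used in the dissertation; the sinogram here is just random toy data.
import numpy as np

def complete_sinogram(sino, metal_mask):
    """Fill masked (metal-corrupted) detector bins in each projection
    view by linear interpolation along the detector axis."""
    out = sino.copy()
    bins = np.arange(sino.shape[1])
    for v in range(sino.shape[0]):          # one projection view at a time
        bad = metal_mask[v]
        if bad.any() and (~bad).any():
            out[v, bad] = np.interp(bins[bad], bins[~bad], sino[v, ~bad])
    return out

rng = np.random.default_rng(0)
sino = rng.random((180, 64))                # views x detector bins
mask = np.zeros_like(sino, dtype=bool)
mask[:, 30:34] = True                       # hypothetical metal trace
completed = complete_sinogram(sino, mask)
assert np.allclose(completed[~mask], sino[~mask])  # clean data untouched
```

A learned completion replaces the interpolation step with a network trained to produce projection data consistent with metal-free scenes, which is what makes the subsequent conventional reconstruction artifact-free.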
