91

Zhodnocení nákladů životního cyklu u vybraného produktu / Life cycle costing of selected product

Honzíková, Drahomíra January 2010 (has links)
The goal of this Master's Thesis is to identify the life cycle costs of vacuum cleaners and to analyze those costs using value analysis. The theoretical part covers the definitions of Life Cycle Costing (LCC) and Value Analysis (VA). The practical part, which is the core of the work, identifies the life cycle costs of the selected vacuum cleaners, performs their life cycle costing, and subjects them to value analysis. The thesis concludes with an interpretation of the results and suggested steps for consumers and producers of vacuum cleaners.
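As a rough illustration of the LCC concept mentioned above, the following minimal Python sketch discounts hypothetical purchase, operating, maintenance, and disposal costs of a vacuum cleaner over its service life; all figures, the service life, and the discount rate are invented placeholders rather than values from the thesis.

    # Illustrative life cycle cost (LCC) calculation; all numbers are hypothetical.
    def life_cycle_cost(purchase, annual_energy, annual_maintenance,
                        disposal, years, discount_rate):
        """Net present value of owning a product over its service life."""
        lcc = purchase
        for t in range(1, years + 1):
            lcc += (annual_energy + annual_maintenance) / (1 + discount_rate) ** t
        lcc += disposal / (1 + discount_rate) ** years
        return lcc

    # Example: a vacuum cleaner kept for 8 years (placeholder figures, in EUR).
    print(life_cycle_cost(purchase=150.0, annual_energy=20.0,
                          annual_maintenance=10.0, disposal=5.0,
                          years=8, discount_rate=0.03))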
92

Favoriser l’adoption des mobilités actives : proposition d’une démarche de conception centrée usage pour accompagner un territoire dans l’élaboration de ses politiques de transport / Promote adoption of active mobility : a use-centered design approach to help territory to develop transport policies

Convolte, Aline 20 November 2018 (has links)
Les politiques de transport qui visent la promotion de la marche, du vélo, ou des transports en commun représentent un véritable enjeu stratégique pour un territoire. Cependant, c’est aussi un vrai défi en matière d’ingénierie de conception et d’innovation. Cette thèse a pour principal objectif d’identifier des pistes d’amélioration concernant le processus actuel de conception des politiques de transport. Nous nous sommes concentrés sur la définition d’un modèle de conception centré sur l’usage, ouvert à l’interdisciplinarité, capable de répondre aux enjeux afférents aux politiques de transport. Au travers de ce modèle, nous proposons de poser un regard critique sur le processus actuel de conception des politiques de transport / Transport policies that promote walking, cycling, or public transport represent a genuine strategic issue for a territory. However, they are also a real challenge in terms of design and innovation engineering. The main objective of this thesis is to identify ways to improve the current process of designing transport policies. We focus on defining a user-centered design model, open to interdisciplinary work, that is capable of addressing the issues inherent in transport policies. Through this model, we propose to take a critical look at the current process of designing transport policies.
93

Utilisation des données historiques dans l'analyse régionale des aléas maritimes extrêmes : la méthode FAB / Using historical data in the Regional Analysis of extreme coastal events : the FAB method

Frau, Roberto 13 November 2018 (has links)
La protection des zones littorales contre les agressions naturelles provenant de la mer, et notamment contre le risque de submersion marine, est essentielle pour sécuriser les installations côtières. La prévention de ce risque est assurée par des protections côtières qui sont conçues et régulièrement vérifiées grâce généralement à la définition du concept de niveau de retour d’un événement extrême particulier. Le niveau de retour lié à une période de retour assez grande (de 1000 ans ou plus) est estimé par des méthodes statistiques basées sur la Théorie des Valeurs Extrêmes (TVE). Ces approches statistiques sont appliquées à des séries temporelles d’une variable extrême observée et permettent de connaître la probabilité d’occurrence de telle variable. Dans le passé, les niveaux de retour des aléas maritimes extrêmes étaient estimés le plus souvent à partir de méthodes statistiques appliquées à des séries d’observation locales. En général, les séries locales des niveaux marins sont observées sur une période limitée (pour les niveaux marins environ 50 ans) et on cherche à trouver des bonnes estimations des extrêmes associées à des périodes de retour très grandes. Pour cette raison, de nombreuses méthodologies sont utilisées pour augmenter la taille des échantillons des extrêmes et réduire les incertitudes sur les estimations. En génie côtier, une des approches actuellement assez utilisées est l’analyse régionale. L’analyse régionale est indiquée par Weiss (2014) comme une manière très performante pour réduire les incertitudes sur les estimations des événements extrêmes. Le principe de cette méthodologie est de profiter de la grande disponibilité spatiale des données observées sur différents sites pour créer des régions homogènes. Cela permet d’estimer des lois statistiques sur des échantillons régionaux plus étendus regroupant tous les événements extrêmes qui ont frappé un ou plusieurs sites de la région (...) Cela ainsi que le caractère particulier de chaque événement historique ne permet pas son utilisation dans une analyse régionale classique. Une méthodologie statistique appelée FAB qui permet de réaliser une analyse régionale tenant en compte les données historiques est développée dans ce manuscrit. Élaborée pour des données POT (Peaks Over Threshold), cette méthode est basée sur une nouvelle définition d’une durée d’observation, appelée durée crédible, locale et régionale et elle est capable de tenir en compte dans l’analyse statistique les trois types les plus classiques de données historiques (données ponctuelles, données définies par un intervalle, données au-dessus d’une borne inférieure). En plus, une approche pour déterminer un seuil d’échantillonnage optimal est définie dans cette étude. La méthode FAB est assez polyvalente et permet d’estimer des niveaux de retour soit dans un cadre fréquentiste soit dans un cadre bayésien. Une application de cette méthodologie est réalisée pour une base de données enregistrées des surcotes de pleine mer (données systématiques) et 14 surcotes de pleine mer historiques collectées pour différents sites positionnés le long des côtes françaises, anglaises, belges et espagnoles de l’Atlantique, de la Manche et de la mer du Nord. Enfin, ce manuscrit examine la problématique de la découverte et de la validation des données historiques / The protection of coastal areas against the risk of flooding is necessary to safeguard all types of waterside structures and, in particular, nuclear power plants. 
The prevention of flooding is ensured by coastal protections that are commonly designed and verified using the concept of the return level of a particular extreme event. Return levels linked to very high return periods (1000 years or more) are estimated through statistical methods based on Extreme Value Theory (EVT). These statistical approaches are applied to time series of an observed extreme variable and enable the computation of its occurrence probability. In the past, return levels of extreme coastal events were frequently estimated by applying statistical methods to time series of local observations. Local sea level series are typically observed over too short a period (about 50 years) to yield reliable estimates for high return periods. For this reason, several approaches are used to enlarge the extreme data samples and to reduce the uncertainties of the estimates. Currently, one of the most widely used methods in coastal engineering is Regional Analysis, identified by Weiss (2014) as an effective means of reducing uncertainties in the estimation of extreme events. The main idea of this method is to take advantage of the wide spatial availability of observed data at different locations in order to form homogeneous regions. This enables statistical distributions to be estimated on enlarged regional samples that pool all the extreme events that occurred at one or more sites of the region. Recent investigations have highlighted the importance of using past events when estimating extremes: when historical data are available, they cannot be neglected if reliable estimates are to be computed. Historical data are collected from various sources and are characterized as data that do not come from time series; in most cases, no information is available about other extreme events occurring before or after a historical observation. This, together with the particular nature of each historical record, prevents their use in a classical Regional Analysis. A statistical methodology that enables the use of historical data in a regional context is therefore needed in order to estimate reliable return levels and to reduce their associated uncertainties. In this manuscript, a statistical method called FAB is developed that makes it possible to perform a Regional Analysis using historical data. The method is formulated for POT (Peaks Over Threshold) data. It is based on a new definition of the local and regional observation period (termed the credible duration) and is able to take into account the three typical kinds of historical data (exact values, ranges, and values above a lower limit). In addition, an approach to identify an optimal sampling threshold is defined in this study, which yields better estimates by using the optimal extreme data sample in the FAB method. The FAB method is a flexible approach that enables return levels to be estimated in both frequentist and Bayesian contexts. An application of the method is carried out for a database of recorded skew surges (systematic data) and for 14 historical skew surges recovered from different sites located on the French, British, Belgian and Spanish coasts of the Atlantic Ocean, the English Channel and the North Sea. Frequentist and Bayesian estimates of skew surges are computed for each homogeneous region and for every site. Finally, this manuscript examines the issues surrounding the discovery and validation of historical data.
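To make the return-level machinery mentioned in the abstract concrete, here is a minimal Python sketch of the standard POT approach (not the FAB method itself, which additionally handles historical data and credible durations): exceedances above a threshold are fitted with a Generalised Pareto Distribution and the T-year return level is read off from the fitted parameters. The synthetic surge series, threshold, and record length are hypothetical placeholders.

    import numpy as np
    from scipy.stats import genpareto

    # Hypothetical skew-surge series (metres), one value per high tide.
    rng = np.random.default_rng(0)
    surges = rng.gumbel(loc=0.2, scale=0.15, size=20_000)

    threshold = 0.7                          # sampling threshold u (placeholder)
    excesses = surges[surges > threshold] - threshold
    years_observed = 28.0                    # length of the record (placeholder)
    rate = len(excesses) / years_observed    # mean number of exceedances per year

    # Fit the Generalised Pareto Distribution to the excesses (location fixed at 0).
    shape, _, scale = genpareto.fit(excesses, floc=0)

    def return_level(T):
        """T-year return level under the fitted POT/GPD model."""
        if abs(shape) < 1e-6:                # exponential limit of the GPD
            return threshold + scale * np.log(rate * T)
        return threshold + scale / shape * ((rate * T) ** shape - 1.0)

    print(return_level(1000))                # e.g. the 1000-year skew surge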
94

A framework for computer-based cost evaluation of highway projects in developing countries.

Fay, Jean-Michel January 1976 (has links)
Thesis. 1976. M.S.--Massachusetts Institute of Technology. Alfred P. Sloan School of Management. / Microfiche copy available in Archives and Dewey. / Bibliography: leaves 135-138. / M.S.
95

Revenue and operational impacts of depeaking flights at hub airports

Katz, Donald Samuel 13 November 2012 (has links)
After deregulation, many U.S. airlines created hubs with banked schedules; however, in the past decade these same airlines began to experiment with depeaking their schedules to reduce costs and improve operational performance. To date, little research has investigated the revenue and operational shifts associated with depeaked schedules, yet understanding the trade-offs among revenue, costs, and operational performance at a network level is critical before airlines will consider future depeaking and related congestion-management strategies. This study develops data cleaning and analysis methodologies, based on publicly available data, that are used to quantify airport-level and network-level revenue and operational changes associated with schedule depeaking. These methodologies are applied to six case studies of airline depeaking over the past decade. Results show that depeaking is associated with revenue per available seat mile (RASM) increasing more slowly than in the rest of the network and in the industry as a whole. Depeaking is associated with improved operations for both the depeaking airlines and their competitors. Airports benefit from increases in non-aeronautical sales as connecting passengers spend more time in the terminal. The underlying reasons driving airlines' scheduling decisions during depeaking vary greatly by case. Results from the study provide insights for airlines that are considering depeaking and for the airports affected. The results suggest that losses in RASM combined with no improvement in operations could lead an airline to repeak, and that RASM is prone to fall when a strong competitive threat exists.
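Since the findings are stated in terms of revenue per available seat mile (RASM), the short sketch below shows how that metric is conventionally computed; the flight legs and revenue figure are invented placeholders, not data from the study.

    # RASM = operating revenue / available seat miles (ASM); numbers are hypothetical.
    def available_seat_miles(flights):
        """flights: iterable of (seats, stage_length_miles) tuples."""
        return sum(seats * miles for seats, miles in flights)

    schedule = [(150, 700), (150, 1100), (180, 950)]   # placeholder flight legs
    revenue = 95_000.0                                  # placeholder operating revenue (USD)

    asm = available_seat_miles(schedule)
    rasm_cents = 100 * revenue / asm                    # usually quoted in cents per ASM
    print(f"ASM = {asm:,}  RASM = {rasm_cents:.2f} cents")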
96

Quantifying the benefits of ancillary transportation asset management

Akofio-Sowah, Margaret-Avis 16 November 2011 (has links)
Historically, transportation asset management has focused on roadways and bridges, but more recently many agencies have been looking to extend their programs to ancillary assets such as traffic signs and guardrails. This thesis investigates the state of practice in managing these assets in order to assess the data and system needs for successful program implementation, and further reviews the opportunities for making a business case for formal management procedures based on the quantified benefits of managing ancillary assets. The asset classes, selected from a review of the asset management literature, include culverts, earth retaining structures, guardrails, mitigation features, pavement markings, sidewalks and curbs, street lights, traffic signals, traffic signs, and utilities and manholes, with data treated as an information asset. Findings from the literature review showed that a number of agencies have made substantial efforts to manage their ancillary transportation assets; however, methods and practices vary. Specific state and municipal agencies identified from the literature review were surveyed for further details on their practices. The survey results show significant knowledge gaps in estimates of data collection costs and of the cost savings achievable from implementing a transportation asset management program for ancillary assets. Finally, this work evaluates the opportunities to quantify the benefits of ancillary transportation asset management, identifying several challenges due to a lack of the necessary data. The results highlight the current state of practice, revealing opportunities and challenges for improving the management of ancillary transportation assets.
97

Climate variability and change impacts on coastal environmental variables in British Columbia, Canada

Abeysirigunawardena, Dilumie Saumedaka 29 April 2010 (has links)
The research presented in this dissertation attempted to determine whether climate variability is critical to sea level changes in coastal BC. To that end, a number of statistical models were proposed to clarify the relationships between five climate variability indices representing large-scale atmospheric circulation regimes and sea levels, storm surges, extreme winds and storm track variability in coastal BC. The research findings demonstrate that decadal to interdecadal climatic variability is fundamental to explaining the changing frequency and intensity of extreme atmospheric and oceanic environmental variables in coastal BC. The trends revealed by these analyses suggest that coastal flooding risks will increase in this region during the next few decades, especially if global sea levels continue to rise as predicted. The outcome of this study emphasises the need to look beyond climatic means when completing climate impact assessments, by clearly showing that climate extremes currently cause the majority of weather-related damage along coastal BC. The findings highlight the need to derive knowledge of climate variability and change effects relevant at regional to local scales to enable useful adaptation strategies. The major findings of this research resulted in five independent manuscripts: (i) Sea level responses to climatic variability and change in Northern BC. This manuscript (MC) is published in the Journal of Atmosphere and Oceans (AO 46 (3), 277-296); (ii) Extreme sea-level recurrences in the south coast of BC with climate considerations. This MC is in review with the Asia Pacific Journal of Climate Change (APJCC); (iii) Extreme sea-surge responses to climate variability in coastal BC. This MC is currently in review in the Annals of the AAG (AN-2009-0098); (iv) Extreme wind regime responses to climate variability and change in the inner south coast of BC. This MC is published in the Journal of Atmosphere and Oceans (AO 47 (1), 41-62); (v) Sensitivity of winter storm track characteristics in the North-eastern Pacific to climate variability. This manuscript is in review with the Journal of Atmosphere and Oceans (AO (1113)). The findings of this research program made key contributions to the following regional sea level rise impact assessment studies in BC: (i) An Examination of the Factors Affecting Relative and Absolute Sea Level in Coastal BC (Thomson et al., 2008); (ii) Coastal vulnerability to climate change and sea level rise, Northeast Graham Island, Haida Gwaii (formerly known as the Queen Charlotte Islands), BC (Walker et al., 2007); (iii) Storm Surge: Atmospheric Hazards, Canadian Atmospheric Hazards Network - Pacific and Yukon Region, c/o Bill Taylor.
98

Value-based global optimization

Moore, Roxanne Adele 21 May 2012 (has links)
Computational models and simulations are essential system design tools that allow for improved decision making and cost reductions during all phases of the design process. However, the most accurate models are often computationally expensive and can therefore only be used sporadically. Consequently, designers are often forced to choose between exploring many design alternatives with less accurate, inexpensive models and evaluating fewer alternatives with the most accurate models. To achieve both broad exploration of the alternatives and accurate determination of the best alternative with reasonable costs incurred, surrogate modeling and variable accuracy modeling are used widely. A surrogate model is a mathematically tractable approximation of a more expensive model based on a limited sampling of that model, while variable accuracy modeling involves a collection of different models of the same system with different accuracies and computational costs. As compared to using only very accurate and expensive models, designers can determine the best solutions more efficiently using surrogate and variable accuracy models because obviously poor solutions can be eliminated inexpensively using only the less expensive, less accurate models. The most accurate models are then reserved for discerning the best solution from the set of good solutions. In this thesis, a Value-Based Global Optimization (VGO) algorithm is introduced. The algorithm uses kriging-like surrogate models and a sequential sampling strategy based on Value of Information (VoI) to optimize an objective characterized by multiple analysis models with different accuracies. It builds on two primary research contributions. The first is a novel surrogate modeling method that accommodates data from any number of analysis models with different accuracies and costs. The second contribution is the use of Value of Information (VoI) as a new metric for guiding the sequential sampling process for global optimization. In this manner, the cost of further analysis is explicitly taken into account during the optimization process. Results characterizing the algorithm show that VGO outperforms Efficient Global Optimization (EGO), a similar global optimization algorithm that is considered to be the current state of the art. It is shown that when cost is taken into account in the final utility, VGO achieves a higher utility than EGO with statistical significance. In further experiments, it is shown that VGO can be successfully applied to higher dimensional problems as well as practical engineering design examples.
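As a rough illustration of the kind of surrogate-based sequential sampling described above, the sketch below runs a generic EGO-style loop: a Gaussian process surrogate is fitted to the samples gathered so far, the next design maximises expected improvement, and sampling stops once the expected improvement no longer exceeds an assumed cost per evaluation. This is a simplified stand-in (single analysis model, one design variable, fixed evaluation cost), not the author's VGO algorithm or its Value-of-Information criterion; the objective function and cost are hypothetical.

    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def expensive_model(x):                       # placeholder "accurate" analysis model
        return np.sin(3 * x) + 0.5 * x ** 2

    sample_cost = 0.01                            # assumed cost charged per evaluation
    X = np.array([[0.0], [1.0], [2.0]])           # initial sampled designs
    y = expensive_model(X).ravel()
    grid = np.linspace(0.0, 2.0, 401).reshape(-1, 1)

    for _ in range(15):                           # sequential sampling loop
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
        gp.fit(X, y)
        mu, sigma = gp.predict(grid, return_std=True)
        best = y.min()
        z = (best - mu) / np.maximum(sigma, 1e-12)
        ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
        if ei.max() < sample_cost:                # further analysis is not worth its cost
            break
        x_next = grid[np.argmax(ei)].reshape(1, -1)
        X = np.vstack([X, x_next])
        y = np.append(y, expensive_model(x_next).ravel())

    print("best design:", X[np.argmin(y)].item(), "objective:", y.min())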
99

Evaluating methods for multi-level system design of a series hybrid vehicle

Taylor, Brian Jonathan Hart 05 July 2012 (has links)
In the design and optimization of a complex system, various methods exist for defining the relationship between the system as a whole, the subsystems, and the individual components. Traditional methods provide requirements at the system level, which lead to a set of design targets for each subsystem. Meeting these targets is sometimes a simple task or can be very difficult and expensive, but this is not captured in the design process and is therefore unknown at the system level. This work compares Requirements Allocation (RA) with Distributed Value Driven Design (DVDD). A computational experiment is proposed as a means of evaluating RA and DVDD. A common preliminary design is determined by optimizing the utility of the system, and then a Subsystem of Interest (SOI) is chosen as the focal point of subsystem design. First, the behavior of a designer using Requirements Allocation is modeled with an optimization problem where the distance to the design targets is minimized. Next, two formulations of DVDD objective functions are used to approximate the system-level value function. The first is a linear approximation and the second is a nonlinear approximation with higher fidelity around the preliminary design point. This computational experiment is applied to a series hybrid vehicle where the SOI is the electric motor. In this case study, RA proves to be more effective than DVDD on average. It remains possible that using objectives is superior to using design targets. This work shows that, for this case study, a linear approximation as well as a slightly higher fidelity approximation are not well suited to finding the design alternative with the highest expected utility.
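To make the contrast concrete, a minimal sketch under assumed details follows: with Requirements Allocation the subsystem designer minimises a normalised distance to the allocated targets, while a value-driven formulation maximises a (here linear) approximation of the system-level value function. The motor attributes, target values, and value coefficients are hypothetical placeholders, not numbers from the thesis.

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical electric-motor design variables: x = [mass_kg, efficiency]
    bounds = [(20.0, 80.0), (0.80, 0.97)]
    targets = np.array([35.0, 0.93])             # placeholder allocated design targets

    def ra_objective(x):
        """Requirements Allocation: minimise normalised distance to targets."""
        return np.sum(((x - targets) / targets) ** 2)

    def dvdd_objective(x):
        """Linear stand-in for the system-level value function: lighter and more
        efficient is better (placeholder coefficients); negated because we minimise."""
        mass, eff = x
        return -(-0.5 * mass + 400.0 * eff)

    x0 = np.array([50.0, 0.9])
    ra = minimize(ra_objective, x0, bounds=bounds)
    vd = minimize(dvdd_objective, x0, bounds=bounds)
    print("RA design:  ", ra.x)
    print("DVDD design:", vd.x)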
100

Statická analýza jazyků s dynamickými funkcemi / Towards Static Analysis of Languages with Dynamic Features

Hauzar, David January 2014 (has links)
Dynamic features of programming languages, such as a dynamic type system, dynamic method calls, dynamic code execution, and dynamic data structures, provide flexibility that can accelerate development, but on the other hand they reduce the amount of information that is checked at compile time and thus make programs more error-prone and less efficient. While the lack of compile-time checks can be partially addressed by static analysis techniques, dynamic features pose major challenges for these techniques, sacrificing their precision, soundness, and scalability. To tackle this problem, we propose a framework for static analysis that automatically resolves these features and thus allows sound and precise static analyses to be defined as if the analyzed program did not use them. To build the framework, we propose a novel heap analysis that models associative arrays and dynamic (prototype) objects. Next, we propose a value analysis that provides the additional information necessary to resolve dynamic features. Finally, we propose a technique that automatically and generically combines the value analysis with a heap analysis modeling associative arrays and prototype objects.
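As a toy illustration of how value information can resolve a dynamic feature, the Python sketch below tracks the set of string constants a variable may hold and uses that set to resolve a dynamic field access on an abstract associative array; it is only a sketch of the general idea under invented names, not the framework developed in the thesis.

    # Toy value analysis: each variable maps to the set of constant values it may hold.
    state = {"name": {"width", "height"}}         # e.g. assigned in two different branches

    # Abstract heap: an associative array whose fields map to sets of possible values.
    heap = {"obj": {"width": {10}, "height": {20}}}

    def resolve_dynamic_read(obj, name_var):
        """Resolve obj[$name_var] by enumerating the possible values of name_var."""
        result = set()
        for field in state[name_var]:
            result |= heap[obj].get(field, set())
        return result

    print(resolve_dynamic_read("obj", "name"))    # {10, 20}: both fields are feasible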
