51

Kinematic Analysis, Numerical Modeling, and Design Optimization of Helical External Gear Pumps

Xinran Zhao (5930489) 16 January 2020 (has links)
With their advantages of low cost, high reliability and simplicity, external gear pumps (EGPs) are popular choices in many applications, such as mobile hydraulic control systems, fuel injection, and liquid transport systems, to name a few. Like other positive displacement machines, EGPs are characterized by a flow non-uniformity, which originates in the gear meshing and results in vibration and noise. With the increasing demand for low-noise components in modern fluid-power systems, new designs of external gear machines with lower noise emission and lower pulsation are highly desired by industry.

To satisfy these demands, several new-generation gear pump designs have been realized by industry and commercialized. However, academic research on external gear pumps is still primarily focused on traditional involute gear pumps, while state-of-the-art research on these new-generation external gear pumps is largely lacking. Even for the most novel designs recently released to the market, there remains a large margin for improvement, as some of the physics inside these gear machines is not well understood or formulated. The goal of this research is to fill this gap by gaining an understanding of the relations between design features and the actual flow generated by such novel designs, and by providing general methods of analysis and design for efficient and silent units.

To achieve this goal, this PhD dissertation presents a comprehensive approach to the analysis of external gear pumps, with emphasis on new-generation helical gear pumps. The discussion covers a wide variety of aspects of gear pump design and analysis, including the gear profile design and meshing, the displacement-chamber geometric modeling, and the kinematic-flow analysis. These are followed by a dynamic simulation model covering the dynamics of fluids, forces, and micro-motions, together with simulation results that provide insight into the physics of new-generation gear machines. Multiple experimental results are provided, which show the validity of the simulation models by matching measured pressure ripple and volumetric efficiency. Furthermore, a linearized analysis of the ripple source of gear pumps is described, in order to connect the pump-generated ripple to higher-level system analysis, a link that has also been missing from past academic research. In addition, some of the models are used in optimization studies. The optimization results show the potential of the proposed approach to improve existing designs as well as to develop more efficient and silent units.
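The flow non-uniformity discussed above is conventionally quantified by a pulsation index — a standard textbook definition for positive displacement machines, not a result specific to this dissertation — relating the extremes of the instantaneous delivered flow over one meshing cycle to its mean:

```latex
\delta = \frac{Q_{\max} - Q_{\min}}{Q_{\mathrm{mean}}},
\qquad
Q_{\mathrm{mean}} = V_d \, n
```

where \(Q(\theta)\) is the instantaneous flow as a function of the rotation angle, \(V_d\) the derived displacement, and \(n\) the shaft speed. Helical gears reduce \(\delta\) by staggering the meshing contact axially, which smooths \(Q(\theta)\).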
52

Quantitative Assessment Of Software Development Project Management Issues Using Process Simulation With System Dynamics Elements

Mizell, Carolyn 01 January 2006 (has links)
The complexity of software development projects makes estimation and management very difficult. There is a need for improved cost estimation methods and for new models of lifecycle processes other than the common waterfall process. This work has developed a new simulation model of the spiral development lifecycle as well as an approach for using simulation for cost and schedule estimation. The goal is to provide a tool that can analyze the effects of a spiral development process, as well as a tool that illustrates the difficulties management faces in forecasting budgets at the beginning of a project, which may encourage more realistic approaches to budgetary planning. A new discrete event process model of the incremental spiral development lifecycle was developed in order to analyze the effects this development approach has on the estimation process as well as on cost and schedule for a project. The input data for the key variables of size, productivity, and defect injection rates was based on analysis of Software Engineering Laboratory data and allowed analysis of the effects of uncertainty in early project estimates. The benefits of combining a separate system dynamics model with a discrete event process model were demonstrated, as were the effects of turnover on the cost and schedule for a project. This work includes a major case study of a cancelled NASA software development project that experienced cost and schedule problems throughout its history. Analysis was performed using stochastic simulation with derived probability distributions for key software development factors. A system dynamics model of human resource issues was also combined with the process model to more thoroughly analyze the effects of turnover on a project. This research has demonstrated the benefits of using a simulation model when estimating, allowing for more realistic budget and schedule determination, including an interval estimate that helps focus attention on the uncertainty of early estimates.
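The stochastic-simulation idea described above — sampling key drivers from probability distributions to obtain an interval estimate rather than a single point estimate — can be sketched in a few lines. The distributions and parameter values below are purely illustrative placeholders, not the dissertation's calibrated Software Engineering Laboratory data:

```python
import random
import statistics

def simulate_effort(n_runs=10_000, seed=1):
    """Monte Carlo sketch of effort estimation under uncertain inputs.

    Each run draws project size, productivity, and a defect-rework
    multiplier from triangular distributions (low, high, mode values
    are hypothetical), then computes effort in person-months.
    """
    rng = random.Random(seed)
    efforts = []
    for _ in range(n_runs):
        size = rng.triangular(80, 150, 110)           # KSLOC (illustrative)
        productivity = rng.triangular(2.0, 4.0, 3.0)  # KSLOC per person-month
        rework = rng.triangular(1.05, 1.40, 1.15)     # extra effort from defects
        efforts.append(size / productivity * rework)
    efforts.sort()
    # Point estimate plus an 80% interval, conveying early-stage uncertainty.
    return statistics.mean(efforts), (efforts[n_runs // 10], efforts[9 * n_runs // 10])

mean, (lo, hi) = simulate_effort()
```

The width of the (lo, hi) interval is the quantitative warning the author argues management needs at the start of a project, when point estimates hide most of the risk.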
53

A macroeconometric model for Algeria. A medium term macroeconometric model for Algeria 1963-1984, a policy simulation approach to Algerian development problems.

Laabas, Belkacem January 1989 (has links)
This thesis is concerned with the development and use of a macroeconometric model for the Algerian economy between 1963 and 1984. The model was built because of a systematic lack of applied econometric studies pertaining to Algeria at both the macroeconomic and microeconomic levels. It is hoped that the model will fill a gap in this area and contribute to the much-neglected field of applied econometric research on Algeria. This lack of applied econometric studies meant that the modelling exercise described here has had to rely on an extensive specification search based on evidence relating to Algeria's economic structure and policy, economic theory, and the experience of Less Developed Countries in the area of macroeconomic model-building. The lack of data was a major constraint, and part of this study consisted of collecting and compiling a large database. After the country's independence in 1962, Algerian macroeconomic policy aimed to create a strong industrial system and to satisfy the population's basic needs. It relied on heavy industrialisation to modernise the economy, oil revenues to finance development, and central planning as the major tool of macroeconomic regulation. The accumulation rate was high and the growth record was generally good. However, high unemployment and inflation, considerable disequilibrium, low productivity, a vulnerable balance of payments and unsustainable external debt are the major macroeconomic problems that policy-makers have had to face. The model's equations were first estimated using the OLS method and subjected to stringent statistical tests. The degree of test significance and the correspondence of the parameters to a priori views on the economy were good. When the full model was constructed, it was estimated using a 2SLS principal-component method; the OLS results were found to remain reasonably valid. 
The equations were collected into a system of 63 equations and solved using a dynamic simulation technique. The model solved successfully and its tracking of historical data was reasonably good. Further tests were carried out to study its dynamic features. Having constructed the model, it was then used extensively for simulation analysis. The experiments ranged from those concerning the government's current expenditure to its monetary policy. In all, nine simulation exercises were carried out; these were revealing about the workings of the Algerian economy. The model was further used in scenario analysis. First, it was used to develop an ex ante forecast employing a linear trend model for the exogenous variables. The forecast database was used to generate multipliers. The policy analysis was constructed to coincide with the implementation of the Second Five Year Plan (1985-1989). The feasibility of the plan was examined by varying the price of oil according to three hypotheses. The aim of this test was to develop a realistic framework for applied macroeconomic analysis. / Algerian Ministry of Higher Education
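The estimation issue the author addresses with 2SLS — ordinary least squares is biased when a regressor is correlated with the error term — can be illustrated on synthetic data. This is a generic single-instrument instrumental-variables sketch, not the thesis's 63-equation system or its principal-component implementation:

```python
import random

def cov(a, b):
    """Sample covariance (population normalization is fine for ratios)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)

# Synthetic endogenous regressor: x shares the error term u with y,
# while the instrument z moves x but is independent of u.
rng = random.Random(0)
beta_true = 2.0
z, x, y = [], [], []
for _ in range(20_000):
    zi = rng.gauss(0, 1)
    ui = rng.gauss(0, 1)                 # structural error
    xi = 0.8 * zi + ui                   # endogeneity: corr(x, u) != 0
    yi = beta_true * xi + ui + rng.gauss(0, 0.5)
    z.append(zi); x.append(xi); y.append(yi)

beta_ols = cov(x, y) / cov(x, x)         # biased upward in this setup
beta_iv = cov(z, y) / cov(z, x)          # instrumental-variables estimate
```

With a single instrument the IV estimator reduces to this covariance ratio; 2SLS generalizes it to many instruments by first regressing the endogenous variables on all instruments.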
54

Comparison of 1-D and 2-D modeling approaches for simulating runoff and sediment transport in overland areas

Hong, Seonggu 27 August 2007 (has links)
One-dimensional and two-dimensional modeling approaches were compared for their abilities in predicting overland runoff and sediment transport. Both the 1-D and 2-D models were developed to test the hypothesis that the 2-D modeling approach could improve model predictions over the 1-D approach, based on the same mathematical representations of the physical processes of runoff and sediment transport. Runoff processes were described by the St. Venant equations and sediment transport by the continuity relationship. The finite element method was employed to solve the governing equations. The nonlinear, time-dependent system of equations obtained by the finite element formulation was solved by the substitution method and the implicit method. The models were verified by comparison with the analytical solutions presented by Singh and Regl (1983) and the solution by the Izzard method (Chow, 1959). The comparison showed that both the 1-D and 2-D models provided reasonable estimates of runoff and sediment loadings. Evaluation of the models was based on four different hypothetical case studies and two experimental studies. The hypothetical case studies investigated the effects of the discretization level, cross slopes, and the size of the field area on the model predictions. The two experimental studies provided a comparison of model predictions with observed data. The results of the hypothetical case studies indicated that the maximum differences in the model predictions at the outlet were about 30% between the two modeling approaches. When the discretization level was sufficient to reasonably describe the shape of the surface, the 1-D model predictions were almost the same as the 2-D model predictions. Even though cross slopes existed in the field, the differences in the model predictions at the outlet were not significant between the 1-D and 2-D models. 
The differences in the model predictions of runoff and sediment loading were not affected by changes in the size of the field. Since the 2-D model showed 10 to 20% differences in predictions when different boundary conditions were used, and the 1-D model predictions were also affected by the choice of element length, the differences in model predictions at the outlet shown in the model application results, which were less than 30% in most cases, could not be considered significant. The model applications to the experimental studies also showed that no substantial differences existed between the predictions of the 1-D and 2-D models. Even though the spatial distributions of flow depth and sediment concentration were significantly different, runoff volumes and sediment yields at the outlet differed by less than 10%. Compared with the 1-D model, the 2-D model required much more computational time and effort to simulate the same problems. In addition, convergence problems due to negative flow depths limited the 2-D model applications. The 2-D simulations required more than twice the computational time needed for the 1-D simulations. As far as model predictions at the outlet are concerned, the much greater computational costs and effort could not justify the use of the 2-D approach. Based on the simulation results from the selected hypothetical case and experimental studies, the 2-D model provided better representations of the spatial distribution of flow depths and sediment concentrations than the 1-D model. However, no substantial differences in predictions of total runoff volume and sediment yield at the outlet were found between the 1-D and 2-D models. / Ph. D.
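The governing equations named in the abstract can be written out. In one common 1-D overland-flow form (symbols and source terms as typically presented in the hydrology literature; the dissertation's exact formulation may differ in detail), the St. Venant continuity and momentum equations with a lateral rainfall-excess input \(q_L\) read

```latex
\frac{\partial h}{\partial t} + \frac{\partial (uh)}{\partial x} = q_L,
\qquad
\frac{\partial u}{\partial t} + u\,\frac{\partial u}{\partial x}
  + g\,\frac{\partial h}{\partial x} = g\,(S_0 - S_f),
```

and sediment transport follows a continuity (mass-balance) relationship of the form

```latex
\frac{\partial (c h)}{\partial t} + \frac{\partial (c u h)}{\partial x} = e_s,
```

where \(h\) is flow depth, \(u\) velocity, \(c\) sediment concentration, \(S_0\) and \(S_f\) the bed and friction slopes, and \(e_s\) the net erosion/deposition rate.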
55

Computational Framework for Uncertainty Quantification, Sensitivity Analysis and Experimental Design of Network-based Computer Simulation Models

Wu, Sichao 29 August 2017 (has links)
When capturing a real-world, networked system in a simulation model, features are usually omitted or represented by probability distributions. Verification and validation (V&V) of such models is an inherent and fundamental challenge. Central to V&V, but also to model analysis and prediction, are uncertainty quantification (UQ), sensitivity analysis (SA) and design of experiments (DOE). In addition, network-based computer simulation models, as compared with models based on ordinary and partial differential equations (ODEs and PDEs), typically involve a significantly larger volume of more complex data. Efficient use of such models is challenging, since it requires a broad set of skills ranging from domain expertise to in-depth knowledge of modeling, programming, algorithmics, high-performance computing, statistical analysis, and optimization. On top of this, the need to support reproducible experiments necessitates complete data tracking and management. Finally, the lack of standardization of simulation model configuration formats presents an extra challenge when developing technology intended to work across models. While there are tools and frameworks that address parts of these challenges, to the best of our knowledge, none of them accomplishes all of this in a model-independent and scientifically reproducible manner. In this dissertation, we present a computational framework called GENEUS that addresses these challenges. Specifically, it incorporates (i) a standardized model configuration format, (ii) a data flow management system with digital library functions that help ensure scientific reproducibility, and (iii) a model-independent, expandable plugin-type library for efficiently conducting UQ/SA/DOE for network-based simulation models. 
This framework has been applied to systems ranging from fundamental graph dynamical systems (GDSs) to large-scale socio-technical simulation models, with a broad range of analyses such as UQ and parameter studies for various scenarios. Graph dynamical systems provide a theoretical framework for network-based simulation models and are studied theoretically in this dissertation. This includes a broad range of stability and sensitivity analyses offering insights into how GDSs respond to perturbations of their key components. This stability-focused, structure-to-function theory was a motivator for the design and implementation of GENEUS. GENEUS, rooted in the framework of GDS, provides modelers, experimentalists, and research groups access to a variety of UQ/SA/DOE methods with robust and tested implementations, without requiring them to have detailed expertise in statistics, data management and computing. Even for research teams that have all these skills, GENEUS can significantly increase research productivity. / Ph. D. / Uncertainties are ubiquitous in computer simulation models, especially for network-based models where the underlying mechanisms are difficult to characterize explicitly by mathematical formalization. Quantifying uncertainties is challenging because of either a lack of knowledge or their inherently indeterminate properties. Models cannot include every detail of real systems, so verification and validation of models with uncertainties will remain a fundamental task in modeling. Many tools have been developed to support uncertainty quantification, sensitivity analysis, and experimental design. However, few of them are domain-independent or support the data management and complex simulation workflows of network-based simulation models. 
In this dissertation, we present a computational framework called GENEUS, which incorporates a multitude of functions including uncertain parameter specification, experimental design, model execution management, data access and registration, sensitivity analysis, surrogate modeling, and model calibration. This framework has been applied to systems ranging from fundamental graph dynamical systems (GDSs) to large-scale socio-technical simulation models with a broad range of analyses for various scenarios. GENEUS provides researchers access to uncertainty quantification, sensitivity analysis and experimental design methods with robust and tested implementations, without requiring detailed expertise in modeling, statistics, or computing. Even for groups having all the skills, GENEUS can help save time, guard against mistakes and improve productivity.
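One workhorse DOE/UQ technique a plugin library like the one described would typically expose is Latin hypercube sampling, which stratifies each parameter axis so that n runs cover every marginal range evenly. The sketch below is a generic stdlib implementation under that assumption; it is not GENEUS code:

```python
import random

def latin_hypercube(n, d, seed=42):
    """Return n points in the unit cube [0, 1)^d, with exactly one point
    per stratum of width 1/n along every axis (the Latin hypercube property)."""
    rng = random.Random(seed)
    points = [[0.0] * d for _ in range(n)]
    for j in range(d):
        strata = list(range(n))      # one stratum index per sample
        rng.shuffle(strata)          # decouple the axes
        for i in range(n):
            points[i][j] = (strata[i] + rng.random()) / n
    return points

design = latin_hypercube(10, 3)
```

Each row of the design is then mapped through the inverse CDF of the corresponding uncertain parameter before being handed to the simulation model, so n runs probe the whole input distribution rather than clustering near its center.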
56

Can degrowth deliver social wellbeing within ecological limits? Dynamics and interactions of degrowth policies in Sweden using iSDG simulation modelling

Zwetsloot, Karel January 2024 (has links)
In response to growing critiques of the economic growth-centered model of development, post-growth approaches have been suggested as having potential to address various social-ecological crises. However, key uncertainties remain regarding the inner dynamics of such a society and the transition towards it; degrowth policy proposals often lack precision and depth, and do not sufficiently consider interactions. Yet policies do not exist in isolation, and their synergies, trade-offs, and unintended consequences need to be investigated. This study explores, through modelling, the potential of degrowth policies to achieve rapid reductions in environmental degradation whilst enabling high social wellbeing. Examples of policies that are explored in the model environment are production taxes on returns on capital, capital decommissioning, universal basic income, work time reduction, and maximum incomes. The research is grounded in a two-fold case study: Sweden as a high-income country where degrowth policies are appropriate, and the Integrated Sustainable Development Goals (iSDG) model as a system dynamics-based policy simulation tool designed to assess the impacts of various policy scenarios at a national scale. The results provide insights into the potential causal dynamics of these policies and show that they all cause trade-offs when implemented in isolation; they improve some social or ecological variables at the cost of others. However, when applied as a coherent package they appear effective in achieving social-ecological prosperity. In the model, combining a downscaling of production with reduced working hours and redistribution of wealth leads to rapid reductions in environmental pressures whilst eradicating poverty and reducing inequality and unemployment. 
Although there is a risk of misrepresenting degrowth policy dynamics because they are placed in a model context based on current economic structures, the results of this study highlight that some degrowth policies have the potential to succeed even if deeper structural changes have not yet taken place.
57

Prozessintegrierte Dokumentation und optimierte Wiederverwendung von Simulationsmodellen der automobilen Funktionsabsicherung / Process-integrated documentation and optimized reuse of simulation models of the automotive function safeguarding

Gruber, Thomas 26 September 2016 (has links) (PDF)
Today, the creation, preservation and exploitation of knowledge are key factors in the competitiveness of companies in the global market. In this context, modern function development in the automotive industry faces the challenge of bringing new, highly interconnected vehicle functions to market in ever shorter time and at ever lower cost. To meet this challenge and manage the growing complexity, model-based development has become established. It makes it possible to distribute development tasks across different levels of abstraction and to realize a distributed, interconnected function development. Such development today often involves several hundred people per function, who have to be integrated into a common development process. Especially when it comes to managing the flow of information and knowledge between the process participants, there is a lack of supporting concepts. Against this background, this work presents an approach for the process-integrated documentation of the development artifacts needed in model-based development processes. The approach considers the complete information flow, from the definition of the required information, through its automated acquisition and processing, to its targeted reuse during the process. The work then sketches the architecture of an information system that enables this continuity in arbitrary model-based development processes and, for validation, applies it to a concrete development process of automotive function development. The focus of the approach lies on integration into existing development processes without changing them. On the one hand, this is achieved by a model-based description of the information model, using methods that are already applied in the function development process; process participants can thus understand the information model themselves and adapt it when required, without depending on trained experts. On the other hand, the architectural approach allows direct access to existing development systems and the documentation-relevant information they contain.
58

The impact of cover crops on farm finance and risk: insights from Indiana farm data using econometric and stochastic methods

Andrew Anderson (7038185) 02 August 2019 (has links)
For agricultural soils to be perpetually productive, farmers must maintain and improve the physical, chemical, and biological properties of the soil. The loss of soil to erosion is a major challenge to soil health, contributing to farmland loss and declines in productivity. This is a long-term problem for agriculture because there is a limited amount of topsoil available. Another costly loss occurs when residual nitrogen is lost to leaching or carried away in runoff. This is a particular problem in the fall and winter months, when fields lie fallow and there are no plants to take up excess nitrogen. Losing nitrogen degrades the nutrient content of the soil and is also a serious concern in terms of water contamination. Cover crops provide a way to at least partially address each of these and many other agronomic and soil health issues. Although there has been a steady increase in cover crop use, adoption has been relatively slow. This is likely due to a lack of economic information and understanding of the associated risk. To address this problem, field-level data was gathered from farmers across central and northeastern Indiana. The data included information on cash crop yield, cover crops grown, and fertilizer use, among many other variables. The sample was trimmed based on the estimated propensity to cover crop, in order to reduce selection bias. Using this data, the effect of cover crops on the mean and variation of the subsequent cash crop yield was estimated using regression analysis. This information was combined in a stochastic analysis of a farm enterprise budget, and the effects of cover crops on farm finance and risk were evaluated. These final analyses provide agricultural producers with more information to make informed decisions regarding the adoption of cover crops. The information may also provide insight to policy makers who wish to understand more completely the private economics of cover crops. 
The results indicated that cover crops can provide economic benefits when grown prior to corn in our study region. These include increased yield, reduced need for nitrogen fertilizer, and increased temporal yield stability. These benefits translate into higher revenue from the sale of the grain, lower input costs, and lower risk and uncertainty. However, the results for soybeans showed cover crops had a negative, albeit statistically insignificant, effect on desirable measures. This led to lower projected revenue, higher projected costs, and increased expected risk. Even so, the average corn-soybean contribution margin with cover crops was nearly equal to the baseline scenario. Furthermore, the analysis of risk showed that the corn-soybean two-year average would be preferred by farmers with moderate to high risk aversion. The difference between the effects of cover crops in corn and soybeans may be due to differences in the crops' inherent nitrogen needs and the difficulty of cover crop establishment after corn in the region.
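The risk-aversion comparison in the last paragraph can be made concrete with a certainty-equivalent calculation under CARA (constant absolute risk aversion) utility, a standard device in stochastic farm-budget analysis. The distributions below are invented placeholders, not the Indiana estimates:

```python
import math
import random
import statistics

def certainty_equivalent(payoffs, r):
    """Certainty equivalent under CARA utility u(x) = -exp(-r * x);
    r > 0 is the coefficient of absolute risk aversion, r = 0 is risk neutrality."""
    if r == 0:
        return statistics.mean(payoffs)
    return -math.log(statistics.mean(math.exp(-r * x) for x in payoffs)) / r

rng = random.Random(3)
# Hypothetical contribution margins ($/acre): cover crops trade a small
# drop in mean for a large drop in variance (temporal yield stability).
baseline = [rng.gauss(500, 120) for _ in range(20_000)]
with_cover = [rng.gauss(495, 70) for _ in range(20_000)]
```

At r = 0 the baseline wins on mean alone; as r grows, the lower-variance cover-crop margin overtakes it, mirroring the finding that moderately to highly risk-averse farmers prefer the cover-crop rotation.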
59

Modelos de simulação da cultura do milho - uso na determinação das quebras de produtividade (Yield Gaps) e na previsão de safra da cultura no Brasil / Maize simulation models - use to determine yield gaps and yield forecasting in Brazil

Duarte, Yury Catalani Nepomuceno 18 January 2018 (has links)
Sendo o cereal mais produzido no mundo e em larga expansão, os sistemas de produção de milho são altamente complexos e sua produção é diretamente dependente de fatores ligados tanto ao clima local quanto ao manejo da cultura. Para auxiliar na determinação tanto dos patamares produtivos de milho quanto quantificar o impacto causado por condições adversas tanto de clima quanto de manejo, pode-se lançar mão do uso de modelos de simulação de culturas. Para que os modelos possam ser devidamente aplicados, uma base solida de dados meteorológicos deve ser consistida, a fim de alimentar esses modelos. Nesse sentido, o presente estudo teve como objetivos: i) avaliar dois sistemas de obtenção de dados meteorológicos, o NASA-POWER e o DailyGridded, comparando-os com dados medidos em estações de solo; ii) calibrar, testar e combinar os modelos de simulação MZA-FAO, CSM DSSAT Ceres-Maize e APSIM-Maize, a fim de estimar as produtividades potenciais e atingíveis do milho no Brasil; iii) avaliar o impacto na produtividade causado pelo posicionamento da semeadura em diferentes tipos de solo; iv) desenvolver e avaliar um sistema de previsão de safra baseado em modelos de simulação; v) mapear as produtividades potencial, atingível e real do milho no Brasil, identificando regiões mais aptas ao cultivo e vi) determinar e mapear as quebras de produtividade, ou yield gaps (YG) da cultura do milho no Brasil. Comparando os dados climáticos dos sistemas em ponto de grade com os dados de estações meteorológicas de superfície, na escala diária, encontrou-se boa correlação entre as variáveis meteorológicas, inclusive para a chuva, com R2 da ordem de 0,58 e índice d = 0,85. O desempenho da combinação dos modelos ao final da calibração e ajuste se mostrou superior ao desempenho dos modelos individuais, com erros absolutos médios relativamente baixos (EAM = 627 kg ha-1) e com boa precisão (R2 = 0,62) e ótima acurácia (d = 1,00). 
Durante a avaliação da influência das épocas de semeadura e do tipo de solo no patamar produtivo do milho, observou-se que esse varia de acordo com a região estudada e apresenta seus valores máximos e com menores riscos à produção quando a semeaduras coincidem com o início do período de chuvas do local. O sistema de previsão de safra, baseado em modelos de simulação de cultura teve seu melhor desempenho simulando produtividades de milho semeados no início da safra e no final da safrinha, sendo capaz de prever de forma satisfatória a produtividade com até 25 dias antes da colheita. Para o estudo dos YGs, 152 locais foram avaliados e suas produtividades potenciais e atingíveis foram comparadas às produtividades reais, obtidas junto ao IBGE. Os maiores YGs referentes ao déficit hídrico se deram em solos arenosos e durante os meses de outono e inverno, usualmente mais secos na maioria das regiões brasileiras, atingindo valores de quebra superiores a 12000 kg ha-1. Quanto ao YG causado pelo manejo, esse foi maior nas regiões menos tecnificadas, como na região Norte e na Nordeste, apresentando valores superiores a 6000 kg ha-1. Já as regiões mais tecnificadas e tradicionais na produção de milho, como a região Sul e a Centro-Oeste, os YGs referentes ao manejo foram inferiores a 3500 kg ha-1 na maioria dos casos. / Maize is the most important cereal cultivated in the world, being its production system very complex and its productivity directly affected by climatic and crop management factors. In order to quantify the impacts caused by water and crop management deficits on maize yield, the use of crop simulation models is very useful. For properly apply these models, a solid basis of meteorological data is required. 
In this sense, the present study had as objectives: i) to evaluate two meteorological gridded data, NASA-POWER and DailyGridded, by comparing them with measured data from surface stations; (ii) to calibrate, evaluate and combine the MZA-FAO, CSM DSSAT Ceres-Maize and APSIM-Maize simulation models to estimate the maize potential and attainable yields in Brazil; iii) to evaluate the impact caused by the different sowing dates and soil types on maize yield; iv) to develop and evaluate a crop forecasting system based on crop simulation models and climatological data; v) to map the potential and the attainable maize yields in Brazil, identifying the most suitable regions for cultivation, and vi) to determine and map maize yields and yield gaps (YG) in Brazil. Comparing the gridded climatic data with observed ones, on a daily basis, a good agreement was found for all weather variables, including rainfall, with R2 = 0.58 and d = 0,85. The performances of the combination of the models at the end of the calibration and evaluation phases were better than those obtained with the individual models, with relatively low mean absolute error (EAM = 627 kg ha-1) and with good precision (R2 = 0.62) and accuracy (d = 1.00). During the evaluation of different sowing dates and soil types on maize yield, it was observed that this variable depends on the region and presents the maximum values and, consequently, the minimum risk during the sowings in the beginning of the rainy season of each site. The crop forecasting system, based on crop simulation models, had its best performance for simulating maize yields when the sowings were performed at the beginning of the main season and at the end of the second season, when it was able to predict yield satisfactorily 25 days before harvest. For the YG analysis, 152 sites were assessed and their potential and attainable yields were compared to the actual yields reported by IBGE. 
The highest YGs caused by water deficit occurred in sandy soils and during the autumn and winter months, which are usually dry in most Brazilian regions, reaching values above 12000 kg ha-1. The YGs caused by crop management were higher in the less technified regions, such as the North and Northeast, with values above 6000 kg ha-1. In contrast, the more traditional maize-producing regions, such as the South and Center-West, presented management-related YGs below 3500 kg ha-1 in most cases.
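The agreement statistics reported above (R2, Willmott's index of agreement d, and the mean absolute error) and the yield-gap decomposition into a water-deficit and a management component can be sketched as below. This is an illustrative sketch only: the function names and the sample yield values are assumptions, not figures from the thesis.

```python
import numpy as np

def agreement_stats(obs, sim):
    """R^2, Willmott's index of agreement (d), and mean absolute error."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r2 = np.corrcoef(obs, sim)[0, 1] ** 2
    # Willmott (1981) index of agreement: 1 means perfect agreement
    d = 1 - np.sum((sim - obs) ** 2) / np.sum(
        (np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    mae = np.mean(np.abs(sim - obs))
    return r2, d, mae

def yield_gaps(yp, ya, yact):
    """Split the total gap: potential - attainable (water deficit),
    attainable - actual (crop management). All in kg/ha."""
    return yp - ya, ya - yact

# Illustrative values (kg/ha), not taken from the thesis
yg_water, yg_mgmt = yield_gaps(yp=14000, ya=9000, yact=6000)
```

Combining several crop models, as done in the thesis, typically means averaging their simulated yields before computing these statistics against the observed series.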
60

Arquitetura em espaços de fluxo: modelagem e simulação em estações metroferroviárias e espaços de multidão / Architecture in flow spaces: modeling and simulation in subway and train stations and crowd spaces

Terra, Ulisses Demarchi Silva 25 April 2014 (has links)
The goal of this research is to investigate how pedestrian flow shapes the architectural design of spaces that hold crowds. Taking subway and train stations as the initial object, a comprehensive literature review on the subject was carried out, approaching it in the full complexity involving human behavior, engineering, architecture, and construction. Concepts of pedestrian-flow modeling and simulation are presented and serve as the basis for a case study on the design of distinct architectural spaces: the Arena das Dunas Stadium, in Natal; the Mineirão Stadium, in Belo Horizonte; the Olympic Park, in Rio de Janeiro; and the Paulista-Consolação subway interconnection, in São Paulo. 
This research does not claim to present solutions or design guidelines for spaces that hold crowds; rather, it investigates how the design of such spaces can change under an approach that treats pedestrians as a determining element of the architecture.
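The pedestrian-flow concepts that underpin such simulations rest on the fundamental relation flow = density x speed: as a space fills, walking speed drops, so throughput peaks at an intermediate density and collapses in a jam. The linear speed-density model and the parameter values below are illustrative assumptions, not figures from the dissertation.

```python
def walking_speed(density, v_free=1.34, rho_jam=5.4):
    """Linear speed-density model: free-flow speed (m/s) decays
    to zero at jam density (pedestrians per m^2)."""
    if density >= rho_jam:
        return 0.0
    return v_free * (1.0 - density / rho_jam)

def specific_flow(density, **kw):
    """Flow per metre of corridor width (ped/m/s) = density * speed."""
    return density * walking_speed(density, **kw)

# Sweep densities from empty to jammed: flow rises, peaks, then vanishes
flows = [specific_flow(rho / 10) for rho in range(0, 55)]
```

Simulation tools used for stations and stadiums evaluate exactly this trade-off locally, flagging corridors and gates where predicted density pushes flow past its peak.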
