11 |
Impacts of scaling up water recycling and rainwater harvesting technologies on hydraulic and hydrological flows
Bertrand, Nathalie Marie-Ange, January 2008
In recent years, increasing awareness of the scarcity of water resources, indications of likely climate variability, and growing pressure to use available fresh water more efficiently have together reinforced the need to examine infrastructure solutions with due regard to environmental considerations and social impacts, present and future. There is a vital need for an integrated approach to catchment management in order to implement sustainable solutions to issues such as water supply, sewerage, drainage and river flooding. Many potential solutions are available to control water demand and manage flood problems. Greywater recycling and rainwater harvesting are novel technologies; however, their catchment-scale impacts on hydraulic and hydrological flows are poorly understood. The aim of this research is to identify the hydrological and hydraulic impacts of scaling up such technologies at catchment scale. For this study, a computer simulation model is used to evaluate how increasing urbanisation, climate change and the implementation of greywater recycling and rainwater harvesting may alter the water balance within a representative catchment. To achieve these aims, data from the Carrickmines catchment in Ireland were collected; a simulation model was adapted to carry out the study; the model was calibrated and validated; results were analysed; and finally, a sensitivity analysis was carried out. The results show that rainwater harvesting systems are comparatively more effective than greywater recycling in reducing flood frequency and intensity. Under five-year return-period rainfall events, the implementation of rainwater harvesting at any scale and number of units is a useful technique for controlling river flow and floods. However, the study also shows that under extreme conditions the efficiency of rainwater harvesting systems decreases.
The study concludes that implementing the two technologies within a single catchment is not, by itself, a solution to every form of hydrological problem. It also shows that implementing rainwater harvesting or re-use technologies is a very useful way to protect local freshwater reserves and therefore conserve the environment.
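The kind of per-storm water-balance accounting such a simulation performs can be illustrated in a few lines. This is a minimal sketch with assumed figures (roof area, tank capacity, storm depth), not the catchment model used in the study:

```python
# Minimal per-storm water balance for a rainwater harvesting tank.
# All figures below are illustrative assumptions, not Carrickmines data.

def storm_runoff_reduction(rain_mm, roof_area_m2, tank_free_m3):
    """Roof runoff (m^3) diverted from the drainage network by the tank.
    Returns (volume stored, volume spilled to drains)."""
    runoff_m3 = rain_mm / 1000.0 * roof_area_m2   # rainfall depth * roof area
    captured = min(runoff_m3, tank_free_m3)        # tank takes only what fits
    return captured, runoff_m3 - captured

# 20 mm storm, 100 m^2 roof, 1.5 m^3 of spare tank capacity
stored, spilled = storm_runoff_reduction(20.0, 100.0, 1.5)
print(stored, spilled)  # 1.5 stored, 0.5 spilled
```

Note how the stored term saturates once the tank is full: under extreme rainfall most runoff spills, which mirrors the reduced effectiveness the study reports for extreme conditions.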
|
12 |
Auditory Object Segregation: Investigation Using Computer Modelling and Empirical Event-Related Potential Measures
Morissette, Laurence, 12 July 2018
There are multiple factors that influence auditory streaming. Some, like frequency separation or rate of presentation, have effects that are well understood, while others remain contentious. Human behavioural studies and event-related potential (ERP) studies have shown a dissociation between a pre-attentive sound segregation process and an attention-dependent process in forming perceptual objects and streams. This thesis first presents a model that synthesises the processes involved in auditory object creation. It includes sensory feature extraction based on research by Bregman (1990); sensory feature binding through an oscillatory neural network based on work by Wang (1995; 1996; 1999; 2005; 2008); work by Itti and Koch (2001a) for the saliency map; and finally, work by Wrigley and Brown (2004) for the architecture of single-feature processing streams, the inhibition of return of the activation, and the attentional leaky integrate-and-fire neuron. The model was tested using stimuli and an experimental paradigm from Carlyon, Cusack, Foxton and Robertson (2001). Several modifications were then made to the initial model to bring it closer to psychological and cognitive validity. The second part of the thesis furthers the knowledge available concerning the influence of the time spent attending to a task on streaming. Two deviant-detection experiments using triplet stimuli are presented. The first experiment is a follow-up of Thompson, Carlyon and Cusack (2011) and replicated their behavioural findings, showing that the time spent attending to a task enhances streaming, and that deviant detection is easier when one stream is perceived. The ERP results showed double decision markers, indicating that subjects may have made their deviant detection based on the absence of the time-delayed deviant and confirmed their decision with its later presence.
The second experiment investigated the effect of the time spent attending to the task in presence of a continuity illusion on streaming. It was found that the presence of this illusion prevented streaming in such a way that the pattern of the triplet was strengthened through time instead of separated into two streams, and that the deviant detection was easier the longer the subjects attended to the sound sequence.
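The attentional leaky integrate-and-fire neuron mentioned above follows a standard update rule; a generic Euler-step sketch with assumed parameters (not the exact Wrigley and Brown formulation):

```python
def lif_step(v, i_in, dt=1e-3, tau=0.02, v_rest=0.0, v_thresh=1.0):
    """One Euler step of a leaky integrate-and-fire neuron.
    Membrane voltage leaks toward v_rest and is driven by input i_in.
    Returns (new_voltage, spiked)."""
    dv = (-(v - v_rest) + i_in) / tau
    v = v + dv * dt
    if v >= v_thresh:
        return v_rest, True   # fire and reset
    return v, False

# Constant supra-threshold input produces regular spiking.
v, spikes = 0.0, 0
for _ in range(1000):
    v, fired = lif_step(v, i_in=1.5)
    spikes += fired
print(spikes)  # ~45 spikes over one simulated second with these parameters
```

In the full model, the input current would come from the winning (most salient) oscillator stream, so only one stream at a time drives the attentional neuron.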
|
13 |
Social preconditions of collective action among NGO:s: A social network analysis of the information exchanges between 55 NGO:s in Georgia
Essman, Carl, January 2015
Individual shortcomings and the need for resources stimulate organizations' desire to establish collaborative relations with each other. An organization tends to prefer to collaborate with other familiar organizations. The information available to an organization about its peers is necessary for its ability to judge the suitability of potential partners, as well as their capabilities and ability to contribute to a successful collaborative relation. In a three-stage analytical process, social network analysis and statistical network modelling are applied to investigate the correlation between patterns of communication and the extent to which organizations establish collaborative relationships. Within a theoretical framework of resource dependence theory and social capital, data on information exchanges, resource exchanges and common advocacy among 55 humanitarian organizations are mapped. The first analytical stage explicates the structure of the collected information exchanges and evaluates the prevalence of coordination-facilitating communication structures. The second stage assesses the extent of inter-organizational involvement in collaborative relationships. The third stage combines these results to demonstrate the covariance between the prevalence of coordination-facilitating structures and the extent of collaborative relations. The results indicate that the collected information exchanges exhibit few coordination-facilitating structures and that the organizations are engaged in collaborative relationships with each other only to a very limited extent. While consistent with previous research on the importance of communication for coordination, these observations illustrate the negative consequences of lacking communication. This analysis contributes additional empirical evidence to solidify our understanding of organizational behavior in inter-organizational interaction and of tendencies to establish collaborative relations.
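One classic coordination-facilitating configuration that such a first-stage count can look for is the transitive triad in a directed information-exchange network. A generic sketch on a toy network (not the statistical network model used in the thesis):

```python
from itertools import permutations

def transitive_triads(edges):
    """Count ordered triples (i, j, k) where i->j, j->k and i->k all exist,
    a classic coordination-facilitating configuration in directed networks."""
    nodes = {n for e in edges for n in e}
    edge_set = set(edges)
    return sum(
        1
        for i, j, k in permutations(nodes, 3)
        if (i, j) in edge_set and (j, k) in edge_set and (i, k) in edge_set
    )

# Toy information-exchange network among four organizations
edges = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")]
print(transitive_triads(edges))  # 1  (A->B, B->C, A->C close a triad)
```

Exponential random graph models generalize this idea, testing whether such configurations occur more often than chance would predict.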
|
14 |
Hurricane Evacuation: Origin, Route and Destination
Dixit, Vinayak, 01 January 2008
Recent natural disasters have highlighted the need to evacuate people as quickly as possible. During Hurricane Rita in 2005, people were stuck in queue buildups and large-scale congestion due to improper use of capacity, poor planning, and inadequate response to vehicle breakdowns, flooding and accidents. Every minute is precious in such disaster scenarios. Understanding evacuation demand loading is an essential part of any evacuation planning. One factor often understood to affect evacuation, but not previously modeled, is the effect of a previous hurricane. This has been termed the 'Katrina Effect': because of the devastation caused by Hurricane Katrina, a large number of people decided to evacuate during Hurricane Rita, which hit Texas three weeks after Katrina hit Louisiana. An important aspect influencing the rate of evacuation loading is evacuation preparation time, also referred to as 'mobilization time' in the literature. A methodology to model the effect of a recent past hurricane on the mobilization times of evacuees is presented, utilizing simultaneous estimation techniques. The errors of the two simultaneously estimated models were significantly correlated, confirming that a previous hurricane does significantly affect evacuation during a subsequent hurricane. The results show that home ownership, the number of individuals in the household, income levels, and level/risk of surge were significant in the model explaining household mobilization times. Pet ownership and the number of kids in the household, known to increase mobilization times during isolated hurricanes, were not found to be significant. Evacuation operations are marred by unexpected blockages, breakdowns of vehicles and sudden flooding of transportation infrastructure. A fast and accurate simulation model is required to incorporate flexibility into the evacuation planning procedure and react to such situations.
Presently, evacuation guidelines are prepared by local emergency management by testing various scenarios in micro-simulation, which is extremely time-consuming and does not provide flexibility to evacuation plans. To gain computational speed there is a need to move away from the level of detail of a micro-simulation to more aggregated simulation models. The Cell Transmission Model, a mesoscopic simulation model, is considered and compared with VISSIM, a microscopic simulation model. The Cell Transmission Model was found to be significantly faster than VISSIM while remaining accurate. The Cell Transmission Model has a linear structure, which is exploited to construct linear programming problems for determining optimal strategies. Optimization models were developed to determine strategies for the optimal scheduling of evacuation orders and optimal crossover locations for contraflow operations on freeways. A new strategy, termed the 'Dynamic Crossovers Strategy', is proposed to alleviate congestion due to lane blockages (vehicle breakdowns, incidents, etc.). This research finds that implementing dynamic crossovers in the event of lane blockages does improve evacuation operations. The optimization model provides a framework within which optimal strategies are determined quickly, without the need to test multiple scenarios using simulation. Destination networks are the source of the main bottlenecks for evacuation routes, yet such aspects of transportation networks are rarely studied as part of evacuation operations. This research studies destination networks from a macroscopic perspective. Various relationships between network-level macroscopic variables (average flow, average density and average speed) were studied. Utilizing these relationships, a 'Network Breathing Strategy' was proposed to improve the dissipation of evacuating traffic into the destination networks.
The Network Breathing Strategy is a cyclic process of allowing vehicles to enter the network until it reaches congestion, followed by closing entry until the network returns to an acceptable state, after which entrance into the network is allowed again. The intuitive motivation behind this methodology is to ensure that the network does not remain in congested conditions. The term 'network breathing' was coined for the analogy between this strategy and breathing: the network inhales vehicles (entry allowed) and then dissipates them while entry is paused. It is shown that network breathing improves the dissipation of vehicles into the destination network. Evacuation operations can be divided into three main levels: the origin (region at risk), the routes and the destination. This research encompasses all three aspects and proposes a framework to assess the whole system in its entirety. At the origin, the demand dictates when to schedule evacuation orders; it also dictates the capacity required on different routes. These contributions provide a framework for a real-time decision support system to help emergency management officials make decisions faster and on the fly.
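The Cell Transmission Model central to this work has a compact update rule: flow between adjacent cells is the minimum of what the upstream cell can send and what the downstream cell can receive. A textbook single-link sketch with assumed cell parameters (not the thesis's linear-programming formulation):

```python
def ctm_step(n, inflow, N=20.0, Q=5.0, w_over_v=1.0):
    """One time step of Daganzo's Cell Transmission Model on a chain of cells.
    n: vehicles per cell; N: jam capacity; Q: max flow per step;
    w_over_v: ratio of backward wave speed to free-flow speed."""
    cells = len(n)
    # Flow from cell i to i+1: min(sending, capacity, receiving) terms
    y = [min(n[i], Q, w_over_v * (N - n[i + 1])) for i in range(cells - 1)]
    out = min(n[-1], Q)  # free discharge at the downstream boundary
    new_n = n[:]
    new_n[0] += inflow - y[0]
    for i in range(1, cells - 1):
        new_n[i] += y[i - 1] - y[i]
    new_n[-1] += y[-2] - out
    return new_n

state = [10.0, 0.0, 0.0, 0.0]  # a platoon loaded into the first cell
for _ in range(3):
    state = ctm_step(state, inflow=0.0)
print(state)  # [0.0, 0.0, 5.0, 5.0] -- at most Q vehicles advance per step
```

Because every `min` term is linear in the state, these dynamics can be relaxed into linear constraints, which is what makes the optimization of evacuation orders and crossover locations tractable.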
|
15 |
Neural network modelling for shear strength of concrete members reinforced with FRP bars
Bashir, Rizwan; Ashour, Ashraf, 10 April 2012
This paper investigates the feasibility of using artificial neural networks (NNs) to predict the shear capacity of concrete members reinforced longitudinally with fibre reinforced polymer (FRP) bars and without any shear reinforcement. An experimental database of 138 test specimens that failed in shear was created and used to train and test NNs, as well as to assess the accuracy of three existing shear design methods. The created NN predicted the shear capacity of FRP-reinforced concrete members to a high level of accuracy.
The Garson index was employed to identify the relative importance of the influencing parameters on the shear capacity, based on the trained NN's weights. A parametric analysis was also conducted using the trained NN to establish the trend of the main influencing variables on the shear capacity. Many of the assumptions made by the shear design methods are supported by the developed NN; however, a few are inconsistent with the NN predictions.
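Garson's algorithm apportions each input's importance from the magnitudes of the input-hidden and hidden-output weights. A minimal sketch for a one-hidden-layer network, with illustrative weights (not the thesis NN):

```python
import numpy as np

def garson_importance(W_ih, w_ho):
    """Garson's relative importance of each input for a one-hidden-layer NN.
    W_ih: (n_inputs, n_hidden) weights; w_ho: (n_hidden,) output weights."""
    c = np.abs(W_ih) * np.abs(w_ho)   # |w_ij| * |v_j| contribution terms
    r = c / c.sum(axis=0)             # normalise within each hidden unit
    imp = r.sum(axis=1)               # accumulate over hidden units
    return imp / imp.sum()            # relative importance, sums to 1

# Toy 3-input, 2-hidden network (weights are illustrative only)
W_ih = np.array([[0.8, 0.1],
                 [0.1, 0.1],
                 [0.1, 0.8]])
w_ho = np.array([1.0, 1.0])
print(garson_importance(W_ih, w_ho))  # [0.45, 0.1, 0.45] -- inputs 1 and 3 dominate
```

Because the method uses only absolute weight magnitudes, it ranks influence but does not indicate the direction of each parameter's effect, which is why a separate parametric analysis is still needed to establish trends.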
|
16 |
Neural network based correlation for estimating water permeability constant in RO desalination process under fouling
Barello, M.; Manca, D.; Patel, Rajnikant; Mujtaba, Iqbal M., 14 May 2014
The water permeability constant, K_w, is one of the many important parameters that affect the optimal design and operation of RO processes. In model-based studies, e.g. within the RO process model, estimation of K_w is therefore important. There are only two available literature correlations for calculating dynamic K_w values; however, each is applicable only for a given membrane type and feed salinity over a certain operating pressure range. In this work, we develop a time-dependent neural network (NN) based correlation to predict K_w in RO desalination processes under fouling conditions. It is found that the NN-based correlation predicts K_w values very close to those obtained by the existing correlations for the same membrane type, operating pressure range and feed salinity. However, the novel feature of this correlation is that it is able to predict K_w values for either of the two membrane types and for any operating pressure and feed salinity within a wide range. In addition, the effect of feed salinity on K_w values at low-pressure operation is reported for the first time. While developing the correlation, the effects of the number of hidden layers, the number of neurons in each layer, and the transfer functions were also investigated.
|
17 |
A Bandwidth Market in an IP Network
Lusilao-Zodi, Guy-Alain, 03 1900
Thesis (MSc (Mathematical Sciences. Computer Science))--University of Stellenbosch, 2008.

Consider a path-oriented telecommunications network where calls arrive to each route in a Poisson process. Each call brings on average a fixed number of packets that are offered to the route. The packet inter-arrival times and the packet lengths are exponentially distributed. Each route can queue a finite number of packets while one packet is being transmitted. Each accepted packet/call generates an amount of revenue for the route manager. At specified time instants a route manager can acquire additional capacity (“interface capacity”) in order to carry more calls, and/or acquire additional buffer space in order to carry more packets, in which case the manager earns more revenue; alternatively, a route manager can earn additional revenue by selling surplus interface capacity and/or surplus buffer space to other route managers that (possibly temporarily) value it more highly. We present a method for efficiently computing the buying and selling prices of buffer space.

Moreover, we propose a bandwidth reallocation scheme capable of improving the network's overall rate of earning revenue at both the call level and the packet level. Our reallocation scheme combines the Erlang price [4] and our proposed buffer space price (M/M/1/K prices) to reallocate interface capacity and buffer space among routes. The proposed scheme uses local rules to decide whether or not to adjust the interface capacity and/or the buffer space. Simulation results show that the reallocation scheme achieves good performance when applied to a fictitious network of 30 nodes and 46 links based on the geography of Europe.
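The M/M/1/K queue that underlies such a buffer-space price has a closed-form blocking probability, and the marginal value of one extra buffer slot is (roughly) the revenue recovered from the packets it stops losing. A sketch of that intuition with illustrative loads, not the thesis's actual pricing rule:

```python
def mm1k_blocking(rho, K):
    """Blocking probability of an M/M/1/K queue with offered load rho
    (arrival rate / service rate) and room for K packets in total."""
    if rho == 1.0:
        return 1.0 / (K + 1)
    return (1.0 - rho) * rho**K / (1.0 - rho**(K + 1))

# Marginal revenue of one extra buffer slot: fewer blocked packets.
rho, K, revenue_per_packet, arrival_rate = 0.8, 5, 1.0, 100.0
gain = arrival_rate * revenue_per_packet * (
    mm1k_blocking(rho, K) - mm1k_blocking(rho, K + 1)
)
print(round(gain, 3))  # revenue units per unit time gained from slot K+1
```

A manager whose marginal gain exceeds the market price is a natural buyer of buffer space; one whose marginal loss from giving up a slot is below the price is a natural seller.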
|
18 |
Development of a process modelling methodology and condition monitoring platform for air-cooled condensers
Haffejee, Rashid Ahmed, 05 August 2021
Air-cooled condensers (ACCs) are a dry-cooling technology that has seen increased implementation globally, particularly in the power generation industry, due to its low water consumption. Unfortunately, ACC performance is susceptible to changing ambient conditions, such as dry-bulb temperature, wind direction and wind speed. This can reduce performance under adverse ambient conditions, which leads to increased turbine backpressure and, in turn, a decrease in generated electricity. This creates a demand to monitor and predict ACC performance under changing ambient conditions. This study focuses on modelling a utility-scale ACC system at steady state, applying a 1-D network modelling approach with component-level discretization. This approach allows each cell to be modelled individually, accounts for steam duct supply behaviour, and permits investigation of off-design conditions. The developed methodology is based on existing empirical correlations for condenser cells, adapted to model double-row dephlegmators. A utility-scale 64-cell ACC system based in South Africa was selected for this study. The thermofluid network model was validated against site data, with agreement within 1%; however, due to a lack of site data, the model was not validated for off-design conditions. The thermofluid network model was also compared to the existing lumped approach, and differences were observed due to the steam ducting distribution. The effect of increasing ambient air temperature from 25 °C to 35 °C was investigated: the heat rejection rate decreased by 10.9 MW and the backpressure increased by 7.79 kPa across the temperature range. The condensers' heat rejection rate decreased at higher air temperatures, while the dephlegmators' heat rejection rate increased due to the increased outlet vapour pressure and flow rates from the condensers.
Off-design conditions were simulated, including hot air recirculation and wind effects. For wind effects, the developed model predicted a decrease in heat rejection rate of 1.7 MW at higher wind speeds, while the lumped approach predicted an increase of 4.9 MW. For practicality, a data-driven surrogate model was developed through machine learning techniques using data generated by the thermofluid network model. The surrogate model predicts system-level ACC performance indicators such as turbine backpressure and total heat rejection rate. Multilayer perceptron neural networks were developed in the form of a regression network and a binary classifier network. On the test sets, the regression network had an average relative error of 0.3%, while the binary classifier had a 99.85% classification accuracy. The surrogate model was validated against site data over a three-week operating period, with 93.5% of backpressure predictions within 6% of site backpressures. The surrogate model was deployed through a web-application prototype, which includes a forecasting tool to predict ACC performance from a weather forecast.
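The two validation metrics quoted above (average relative error, share of predictions within a ±6% band) are straightforward to compute once surrogate predictions are paired with site data. A minimal sketch on made-up backpressure values, not the actual site dataset:

```python
def avg_relative_error(predicted, actual):
    """Mean |pred - actual| / |actual|, as a fraction."""
    return sum(abs(p - a) / abs(a) for p, a in zip(predicted, actual)) / len(actual)

def within_band(predicted, actual, band=0.06):
    """Share of predictions within +/-band (relative) of the actual value."""
    hits = sum(abs(p - a) / abs(a) <= band for p, a in zip(predicted, actual))
    return hits / len(actual)

# Illustrative backpressures in kPa (not site data)
actual = [20.0, 22.0, 25.0, 30.0]
pred = [20.2, 21.5, 26.0, 29.0]
print(avg_relative_error(pred, actual), within_band(pred, actual))
```

Tracking both metrics matters: a low average error can hide occasional large misses, which the band metric exposes.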
|
19 |
Monitoring and modelling of water quality characteristics along a reticulation system: a case study of the Modimolle reticulation network
Mehlo, Mahlomola, 01 1900
M. Tech. (Department of Civil Engineering and Building, Faculty of Engineering and Technology), Vaal University of Technology.

Potable water quality can deteriorate immensely from the point of treatment to the point of use. This change in quality along a bulk distribution main may be attributed to numerous factors, such as the ingress of stormwater. Furthermore, water utilities experience challenges with microbiological organisms that are not attributable to operational practices. For example, drinking water bulk distribution mains may shelter microorganisms sustained by organic and inorganic nutrients present within the pipe itself. These microorganisms may be active in the water being transported and can cause a significant drop in water quality. To deal with the problem of deteriorating water quality, sufficient information about conditions within the bulk main is required so that the consumer can be protected from ingesting contaminated or poor-quality water. Hence, the overall objective of this study was to investigate and model water quality characteristics within the Modimolle reticulation network. Water samples were collected from various points throughout the entire system for quality analysis. Sampling points were established along the main pipeline as well as within the Modimolle distribution system. Water quality software, EPANET, was then used to model water quality deterioration for both the bulk line and the reticulation network of Modimolle Extension 11. Residual chlorine was the main parameter monitored. This study presents the results of research on water quality variation along a long distribution main conveying water up to 87 km. Results show that residual chlorine is steadily depleted along the pipeline and cannot be maintained at the required level of 0.2 mg/l stipulated by the Department of Water Affairs. This means that if any harmful contaminants enter the water, the residual chlorine will not be able to protect consumers from them.
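First-order bulk decay, the usual way EPANET models residual chlorine, illustrates why the residual falls with distance along a long main. The decay constant and flow velocity below are assumed for illustration, not the calibrated Modimolle values:

```python
import math

def chlorine_residual(c0, k_per_day, distance_km, velocity_m_s):
    """Residual chlorine after first-order bulk decay, C = C0 * exp(-k * t),
    where t is the travel time to the given distance at the given velocity."""
    travel_time_days = (distance_km * 1000.0 / velocity_m_s) / 86400.0
    return c0 * math.exp(-k_per_day * travel_time_days)

# 1.0 mg/l leaving the works, decay constant 0.5/day, 0.5 m/s flow velocity
for x_km in (0, 20, 50, 87):
    print(x_km, "km:", round(chlorine_residual(1.0, 0.5, x_km, 0.5), 3), "mg/l")
```

With faster decay (warm water, reactive pipe walls) or lower velocities the residual drops below the 0.2 mg/l limit well before the end of the main, which is the motivation for booster chlorination along such systems.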
|
20 |
The Shifting Web of Trust: Exploring the Transformative Journey of Certificate Chains in Prominent Domains / Förtroendets Föränderliga Väv: Att Utforska den Transformativa Resan av Certifikatkedjor av Populära Domäner
Döberl, Marcus; Freiherr von Wangenheim, York, January 2023
The security and integrity of TLS certificates are essential for ensuring secure transmission over the internet and protecting millions of people from man-in-the-middle attacks. Certificate Authorities (CAs) play a crucial role in issuing and managing these certificates. This bachelor thesis presents a longitudinal analysis of certificate chains for popular domains, examining their evolution over time and across different categories. Using publicly available certificate data from sources such as crt.sh and censys.io, we created a longitudinal dataset of certificate chains for domains from the Tranco Top 1M list. We categorized the certificates based on their type and their particular service categories. We analyzed a selected set of domains over time and identified the patterns and trends that emerged in their certificate chains. Our analysis revealed several noteworthy trends, including an increase in the use of new CAs and a shift in which types of certificates are used. We also found a trend towards shorter certificate chains and fewer paths from domain to root certificate, implying a more streamlined and simplified certificate process over time. Our findings have implications for the broader cybersecurity community and demonstrate the importance of ongoing monitoring and analysis of certificate chains for popular domains.
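The chain-length trend reported above reduces to simple aggregation once each observation is boiled down to a (year, chain length) pair. A toy sketch on made-up records, not the crt.sh/censys.io dataset:

```python
from collections import defaultdict

def mean_chain_length_by_year(observations):
    """observations: iterable of (year, chain_length) pairs, where chain_length
    counts leaf + intermediates + root for one observed certificate chain."""
    totals = defaultdict(lambda: [0, 0])  # year -> [sum of lengths, count]
    for year, length in observations:
        totals[year][0] += length
        totals[year][1] += 1
    return {y: s / c for y, (s, c) in sorted(totals.items())}

# Illustrative records only; a shortening trend like the one observed
obs = [(2015, 4), (2015, 3), (2019, 3), (2019, 3), (2023, 3), (2023, 2)]
print(mean_chain_length_by_year(obs))  # {2015: 3.5, 2019: 3.0, 2023: 2.5}
```

The same per-year grouping extends naturally to counting distinct validation paths per domain or distinct issuing CAs, the other quantities tracked in the analysis.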
|