About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Finite Element Modelling of Off-Road Tyres

Conradie, Johan January 2014
Most tyre models developed to date require a fair amount of data before an accurate representation of the tyre can be obtained. This study entails the development of a simplified, yet accurate, non-linear Finite Element (FE) model of an “off-road” tyre to study the behaviour of the tyre under radial loading conditions. The study aims to develop an FE tyre model that solves quickly yet is accurate enough to be used in multibody dynamic vehicle simulations. A model that is less complex than conventional detailed FE models is developed. The work explores the use of superimposed finite elements to model the varying stiffness in the respective orthogonal directions of the sidewall and tread of the tyre. Non-linear elements defined by Neo-Hookean or Ogden models and elements with different linear orthogonal stiffnesses are superimposed onto each other to simulate the global material properties of the tread and the sidewall of the tyre investigated. The geometry of the tyre studied was measured experimentally using laser displacement transducers and digital image correlation techniques. Material properties of segments of the tyre were obtained by performing tensile tests on samples. Since the rubber slipped against the clamps during the experiment, deformation of the segments was also measured using digital image correlation. These geometrical and material properties were used as input to develop a finite element model of an “off-road” tyre. Measurements were conducted using laser displacement transducers, load cells mounted to actuators, etc. to obtain accurate sidewall deformation profiles and global radial load vs. displacement curves for different radial loading conditions. The resulting data were used to validate the tyre model developed.
Numerous analyses are performed with different combinations of moduli of elasticity in the respective orthogonal directions of the sidewall and the tread to investigate their influence on the global behaviour of the tyre model. The main focus of the project was to develop a tyre model, from data obtained from laser and photogrammetry measurements in a laboratory, that accurately represents tyre behaviour under radial forces. A finite element model that can simulate the effect of radial forces and obstacles on a tyre was developed. The use of two subsets of elements, superimposed onto each other to simulate the global material properties of the rubbers, steel wires, polyester and nylon threads, was investigated. The combination of material properties that gave the best fit for all the load cases investigated was determined. The finite element model correlated well with the load vs. displacement graphs and sidewall displacement profiles determined experimentally. The solving time is still fairly high and not yet suitable for real-time dynamic simulation. However, the model solves faster than more complex tyre models in which details of steel wires, etc. are included. For future studies it is recommended that different element types be investigated in the tyre model. The study proves that equivalent material properties can be used to simulate the composite properties of the materials in tyres. Most tyres can be divided into a few regions, each with its own material structure throughout. These regions can be characterized by simple tests, and the results can be used as a first estimate of the tyre’s material properties for the model. Accurate validation criteria should be used to validate the tyre model if time does not allow for exhaustive testing of the material properties of all the rubber, steel wires, polyester threads, etc. Geometric displacement data at various loading conditions can be used for validation of the tyre model.
The model developed can be used to investigate the effect of different stiffnesses and other material changes in the sidewall or tread of a tyre. Useful insight can be obtained from the finite element model developed for dynamic simulation where the force vs. global displacement data is important. / Dissertation (MEng)--University of Pretoria, 2014. / tm2015 / Mechanical and Aeronautical Engineering / MEng / Unrestricted
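The superposition idea above — a non-linear hyperelastic element and a linear element occupying the same space, so their responses add — can be sketched in one dimension. This is a minimal illustration with invented material parameters, not the thesis's actual model:

```python
import numpy as np

# Hypothetical material parameters (illustrative only, not from the thesis).
MU = 0.8e6        # Neo-Hookean shear modulus, Pa
E_RADIAL = 2.0e6  # stiffness of the superimposed linear element, Pa

def neo_hookean_stress(stretch):
    """Uniaxial nominal stress for an incompressible Neo-Hookean solid:
    P = mu * (lambda - lambda^-2)."""
    return MU * (stretch - stretch**-2)

def superimposed_stress(stretch):
    """Total response of two superimposed elements sharing the same nodes:
    a hyperelastic element plus a linear one, so their stiffnesses add."""
    linear = E_RADIAL * (stretch - 1.0)   # small-strain linear element
    return neo_hookean_stress(stretch) + linear

for s in np.linspace(1.0, 1.2, 5):
    print(f"stretch {s:.2f}: stress {superimposed_stress(s)/1e6:.3f} MPa")
```

In a full FE model the same trick is applied per direction, which is how differing sidewall and tread stiffnesses in orthogonal directions can be represented without modelling the cord layup explicitly.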
22

Estudio del comportamiento no lineal de dispositivos activos de microondas [Study of the non-linear behaviour of active microwave devices]

Tazón Puente, Antonio 05 June 1987
Within communication systems, high-frequency applications have acquired great importance, driving substantial development of these technologies: low-noise applications, monolithic power circuits, and so on. Such technologies require a non-linear understanding of the circuits involved. This work has therefore been oriented towards the resolution of three fundamental non-linear problems: physical device modelling, non-linear analysis of autonomous active systems operating at microwave frequencies, and the development of a compact mathematical formulation aimed at the large-signal optimisation of MESFET-based active systems. The work concludes with experimental verification of the theoretically simulated behaviour, which gives the mathematical methods a wide range of validity.
23

A multilevel analysis of learner and school contextual factors associated with educational quality

Winnaar, Lolita January 2013
Magister Philosophiae - MPhil / The South African Schools Act (number 5, 1996) asserts that all learners have a right to access both basic and quality education without discrimination of any sort. Since the implementation of the Millennium Development Goals, there has been a drive by the Department of Education to ensure that all learners have access to basic education by 2015. However, what remains a challenge after almost 20 years of democracy is the poor quality of education, as is clear from the results of international assessment studies. Results from studies like the Trends in International Mathematics and Science Study and the Southern and East Africa Consortium for Monitoring Educational Quality show that South African children perform well below international averages. In this study, learner Mathematics achievement scores taken from the Trends in International Mathematics and Science Study 2011 cycle serve as a proxy for educational quality. Using multilevel analysis, the current study fits a 2-level Hierarchical Linear Model, firstly, to determine the learner and family background factors associated with education quality and, secondly, to identify the school-level factors associated with it. Variables selected for the study were based on Creemers’ theory of school effectiveness, which looks at school- and classroom-level inputs as well as learner background variables to explain student-level achievement. The results show that at the learner level one of the most significant factors was the age of the learner, in the sense that grade-age-appropriate learners obtained higher scores than overage learners. Learners’ perception of mathematics is extremely important and has a positive effect on mathematics performance. In the current study, mathematics perception refers to learners valuing and liking mathematics, as well as learner confidence in learning mathematics.
Learners who said they were bullied at school generally scored lower than learners who were not bullied. At the school level, the most significant factors were teacher working conditions, teachers’ specialisation in mathematics, school socio-economic status, and general infrastructure. Interestingly, at the school level, when socio-economic status was included in the model as a single variable, the score difference between low socio-economic status and high socio-economic status schools was almost 46 points. However, when the factors mentioned above were added to the model, the difference in scores dropped by almost half.
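The 2-level structure described above can be illustrated with the usual first step of a multilevel analysis: estimating how much of the score variance lies between schools rather than between learners within schools (the intraclass correlation). A sketch on simulated, invented numbers (loosely TIMSS-scaled, not the study data):

```python
import numpy as np

rng = np.random.default_rng(42)
n_schools, n_learners = 50, 30

# Simulated nested data: learner scores within schools. The school effect
# (sd 20) is level-2 variance; the residual (sd 40) is level-1 variance.
school_effect = rng.normal(0.0, 20.0, n_schools)
scores = 400.0 + school_effect[:, None] + rng.normal(0.0, 40.0, (n_schools, n_learners))

# One-way ANOVA decomposition: between-school vs within-school mean squares,
# from which the intraclass correlation (ICC) follows.
grand = scores.mean()
ms_between = n_learners * ((scores.mean(axis=1) - grand) ** 2).sum() / (n_schools - 1)
ms_within = ((scores - scores.mean(axis=1, keepdims=True)) ** 2).sum() / (n_schools * (n_learners - 1))
icc = (ms_between - ms_within) / (ms_between + (n_learners - 1) * ms_within)
print(f"intraclass correlation (share of variance between schools): {icc:.2f}")
```

A non-trivial ICC is what justifies fitting a hierarchical model rather than pooled regression; school-level predictors such as socio-economic status then attempt to explain the between-school share.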
24

Reading between the lines: contributing factors that affect Grade 5 learner reading performance

Van Staden, Surette 24 May 2011
This study aims to identify and explain relationships between some major factors associated with successful reading at Grade 5 level in South African primary schools. In South Africa, grave concerns with regard to low levels of student achievement pervade research initiatives and educational debates. Despite considerable investments in educational inputs (such as policy and resources) and processes (such as curriculum provision and teacher support), outcomes (such as student achievement) remain disappointingly low. The South African population is characterized by great diversity and variation. With 11 official languages, current educational policy in South Africa advocates an additive bilingualism model, and students in Grades 1 to 3 are taught in their mother tongue. Thereafter, when these students progress to Grade 4, the language of learning and teaching changes to a second language, which in most cases is English. At this key developmental stage students are also expected to advance from learning to read to a stage where they can use reading in order to learn. With this complexity of issues in mind, Hierarchical Linear Modeling (HLM) was used to determine the effect of a number of explanatory variables at learner and school level on reading achievement as the outcome variable, while controlling for language, using the South African Progress in International Reading Literacy Study (PIRLS) 2006 data. As an international comparative evaluation of reading literacy involving more than 40 countries, PIRLS 2006 was the second, after PIRLS 2001, in a series of planned five-year cycles of assessment to measure trends in children’s reading literacy achievement and in policy and practices related to literacy. Grade 5 learners in South African primary schools who participated in PIRLS 2006 were not able to achieve satisfactory levels of reading competence.
The gravity of this finding is exacerbated by the fact that these learners were tested in the language in which they had been receiving instruction during the Foundation Phase of schooling. This study found the most significant factors associated with reading literacy at the learner level, but this does not mean that teacher- and school-level factors are unimportant. While some explanatory factors at the learner level can more easily become the target of reading interventions, the higher-level effects of the classroom and school are not diminished by this study. Creemers’ Comprehensive Model of Educational Effectiveness was utilized as the theoretical point of departure. Creemers’ model was adapted for the purposes of this study to reflect a South African model of reading effectiveness, in contrast with Creemers’ original use of it as a model of school effectiveness. Evidence was provided that the conceptual framework was inadequate in identifying factors affecting reading achievement for all South African language groupings. More specifically, the adapted South African reading effectiveness model was appropriate in explaining reading achievement scores for the Afrikaans and English language groupings, but not for the African language groupings. / Thesis (PhD)--University of Pretoria, 2010. / Science, Mathematics and Technology Education / unrestricted
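Controlling for language while estimating learner-level effects, as described above, amounts to including the grouping as a covariate. A minimal single-level simplification with dummy-coded language groups and invented numbers (the actual study used a full HLM):

```python
import numpy as np

rng = np.random.default_rng(11)
n = 600

# Hypothetical data: reading score depends on a learner-level factor
# (e.g. home resources, standardised) and on language grouping.
lang_group = rng.integers(0, 3, n)     # 0=Afrikaans, 1=English, 2=African languages (labels invented)
resources = rng.normal(0, 1, n)
lang_effect = np.array([40.0, 35.0, 0.0])[lang_group]
score = 300 + 12 * resources + lang_effect + rng.normal(0, 30, n)

# Dummy-code the grouping (group 0 as baseline) so the learner-level
# coefficient is estimated net of language differences.
dummies = (lang_group[:, None] == np.arange(1, 3)).astype(float)
X = np.column_stack([np.ones(n), resources, dummies])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
print(f"resources effect controlling for language: {beta[1]:.1f}")
```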
25

Distribution matters: Meeting human needs at sustainable carbon consumption

Barbour, Felix January 2022
To avoid irreversible damage to the climate system and biosphere, the majority of the world’s countries must reduce rates of resource throughput. However, the socio-economic conditions for satisfying basic human needs at low resource use have received scant empirical attention. I apply cross-country panel analysis and dynamic linear modelling to explore how different dimensions of inequality affect countries’ abilities to deliver a good life for all at sustainable levels of carbon consumption. My results suggest that inequalities reduce socio-ecological performance, with income inequality reducing the proportion of carbon channelled into meeting basic needs and wealth inequality increasing the carbon-intensity of expenditure. Overall, this study highlights the importance of reducing inequalities in a resource-constrained world. Social media summary: Income inequality raises the carbon cost of meeting basic human needs at the national and global scales.
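A within (fixed-effects) estimator of the kind used in cross-country panel analysis can be sketched as follows. The data-generating process and coefficients are invented for illustration, not the study's data or results:

```python
import numpy as np

rng = np.random.default_rng(1)
n_countries, n_years = 30, 15

# Hypothetical panel: need satisfaction vs carbon use and income inequality (Gini).
country_fe = rng.normal(0, 1, n_countries)[:, None]     # time-invariant country effects
gini = rng.uniform(0.25, 0.60, (n_countries, n_years))
carbon = rng.uniform(1.0, 10.0, (n_countries, n_years))
# Assumed process: inequality lowers need satisfaction at a given carbon level.
needs = 0.4 * np.log(carbon) - 1.5 * gini + country_fe + rng.normal(0, 0.1, (n_countries, n_years))

# Within estimator: demean each country's series to strip the fixed effects,
# then run pooled OLS on the demeaned data.
def demean(x):
    return x - x.mean(axis=1, keepdims=True)

X = np.column_stack([demean(np.log(carbon)).ravel(), demean(gini).ravel()])
y = demean(needs).ravel()
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"carbon elasticity: {beta[0]:.2f}, Gini effect: {beta[1]:.2f}")
```

Demeaning removes anything constant within a country (geography, institutions), so the coefficients are identified from within-country variation over time.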
26

Modelling the effects of changing habitat characteristics and spatial pattern on woodland songbird distributions in West and Central Scotland

Creegan, Helen P. January 2005
This study investigated bird distributions in relation to local habitat and landscape pattern and the implications which habitat fragmentation may have for woodland birds. There were two sections to the research: an experimental study investigating bird gap crossing behaviour across distances of five to 120m; and an observational study modelling woodland bird distributions in relation to local habitat and landscape scale variables in two study areas (East Loch Lomond and the Central Scotland Forest). In the experimental study it was hypothesised that bird willingness to cross gaps will decrease with increasing gap distance even at home-range scales and that the rate of decline will vary interspecifically in relation to bird morphology. Song thrush mobbing calls played at woodland edges in the West of Scotland were used to attract birds across gaps and results were compared with the response along woodland edges. Data were obtained for four species: chaffinch, coal tit, robin and goldcrest. The decline in response with distance across gaps and along woodland edge was modelled for each species using generalized linear modelling. Maximum gap crossing distances ranged from 46m (goldcrest) to 150m (extrapolated value for the chaffinch). Goldcrests responded more readily through woodlands. There was no difference between woodland edge and gap response for the coal tit. Robins and chaffinches however responded more readily across gaps than through woodland. When different response indices were plotted against bird mass and wing area, results suggested that larger birds with bigger wings responded more readily across gaps than through woodland. It is suggested that this relates to differences in bird manoeuvrability within woodlands and ability to evade a predator in gaps. 
Fragmentation indices were calculated for an area of the Central Scotland Forest to show how willingness to cross different gap distances influences perception of how fragmented the woodlands in a region are. Results are discussed in the context of the creation of Forest Habitat Networks. The data for the observational section of the work were from bird point counts for 200 sample points at East Loch Lomond in 1998 and 2000 and 267 sample points in the Central Scotland Forest in 1999. In addition, a time series of point count data was available for 30 sample points at East Loch Lomond. Additional data were gathered for ten sample points (1998) and two sample points (2000) at East Loch Lomond to investigate effects of observer, time and weather on count data. Generalized linear and generalized additive modelling were carried out on these additional data. Results indicated that biases due to the variation in time and weather conditions between counts existed in the pure count data, but that these were eliminated by reducing the data to presence/absence form for analysis. Species accumulation curves indicated that two counts per sample point were insufficient to determine species richness. However, a sufficiently large proportion of the species was detected consistently in two counts of ten minutes’ duration for it to be valid to model them in relation to habitat and landscape variables. Point count data for East Loch Lomond in 1998 (ELL98) and the Central Scotland Forest in 1999 (CSF99) for the wren, treecreeper, garden warbler, robin, blue tit, blackbird, willow warbler, coal tit, goldcrest, great tit and song thrush were analysed using generalized additive modelling. In addition, models were built for the blackcap (CSF99) and the siskin, redstart and wood warbler (ELL98). Where all relationships were identified as linear, models were rebuilt as GLMs. Models were evaluated using the Area Under the Curve (AUC) of Receiver Operating Characteristic (ROC) plots.
AUC values ranged from 0.84-0.99 for ELL98 and from 0.76-0.93 for CSF99 indicating high predictive accuracy. Habitat variables accounted for the largest proportion of explained variation in all models and could be interpreted in terms of bird nesting and feeding behaviour. However additional variation was explained by landscape scale and fragmentation related (especially edge) variables. ELL98 models were used to predict bird distributions for Loch Lomond in 2000 (ELL00) and for the CSF99. Likewise the CSF99 models were used to predict distributions for ELL98 and ELL00. Predicted distributions had useful application in many cases within the ELL site between years. Fewer cases of useful application arose for predicting distributions between sites. Results are discussed in the context of the generality of bird environment relationships and reasons for low predictive accuracy when models are applied between sites and years. Models which had useful application for ELL00 were used to predict bird distributions for 2025 and 2050 at East Loch Lomond. Habitat and landscape changes were projected based on the proposed management for the site. Since woodland regeneration rates are difficult to predict, two scenarios were modelled, one assuming a modest amount of regeneration and one assuming no regeneration. Predictions derived from the ELL98 models showed broad-leaved species increasing in distribution while coniferous species declined. This was in keeping with the expected changes in the relative extent of broad-leaved and coniferous habitat. However, predictions from the CSF99 models were often less readily explicable. The value of the modelling approach is discussed and suggestions are made for further study to improve confidence in the predictions.
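Model evaluation by the AUC of ROC plots, as used above, reduces to a rank statistic: the probability that a randomly chosen presence site receives a higher predicted score than a randomly chosen absence site. A self-contained sketch on toy presence/absence data (scores invented, not from the bird models):

```python
import numpy as np

def roc_auc(y_true, y_score):
    """AUC of the ROC plot via the rank (Mann-Whitney U) formulation:
    the probability that a random presence outranks a random absence."""
    y_true = np.asarray(y_true, dtype=bool)
    order = np.argsort(y_score)
    ranks = np.empty(len(y_score))
    ranks[order] = np.arange(1, len(y_score) + 1)
    n_pos, n_neg = y_true.sum(), (~y_true).sum()
    return (ranks[y_true].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Toy presence/absence data with scores from a hypothetical habitat model.
rng = np.random.default_rng(7)
presence = rng.random(500) < 0.4
score = np.where(presence, rng.normal(0.7, 0.2, 500), rng.normal(0.4, 0.2, 500))
print(f"AUC = {roc_auc(presence, score):.2f}")
```

An AUC of 0.5 is chance discrimination and 1.0 is perfect, which is why the 0.76–0.99 range reported above indicates good-to-excellent predictive accuracy.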
27

Investigations into the effects of neuromodulations on the BOLD-fMRI signal

Maczka, Melissa May January 2013
The blood oxygen level dependent functional MRI (BOLD-fMRI) signal is an indirect measure of the neuronal activity that most BOLD studies are interested in. This thesis uses generative embedding algorithms to investigate some of the challenges and opportunities that this presents for BOLD imaging. It is standard practice to analyse BOLD signals using general linear models (GLMs) that assume fixed neurovascular coupling. However, this assumption may cause false positive or negative neural activations to be detected if the biological manifestations of brain diseases, disorders and pharmaceutical drugs (termed "neuromodulations") alter this coupling. Generative embedding can help overcome this problem by identifying when a neuromodulation confounds the standard GLM. When applied to the anaesthetic neuromodulations found in preclinical imaging data, this approach shows that Fentanyl has the smallest confounding effect and Pentobarbital the largest, causing extremely significant neural activations to go undetected. Half of the anaesthetics tested caused overestimation of the neuronal activity, while the other half caused underestimation. The variability in biological action between anaesthetic modulations in identical brain regions of genetically similar animals highlights the complexity required to comprehensively account for factors confounding neurovascular coupling in GLMs generally. Generative embedding has the potential to augment established algorithms used to compensate for these variations in GLMs without complicating the standard (ANOVA) way of reporting BOLD results. Neuromodulation of neurovascular coupling can also present opportunities, such as improved diagnosis, monitoring and understanding of brain diseases accompanied by neurovascular uncoupling.
Information theory is used to show that the discriminabilities of neurodegenerative-diseased and healthy generative posterior parameter spaces make generative embedding a viable tool for these commercial applications, boasting sensitivity to neurovascular coupling nonlinearities and biological interpretability. The value of hybrid neuroimaging systems over separate neuroimaging technologies is found to be greatest for early-stage neurodegenerative disease.
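The confounding mechanism described in this entry — a neuromodulation altering neurovascular coupling while the GLM assumes a canonical haemodynamic response — can be illustrated with a toy simulation. The gamma-shaped response and its time constants below are invented for illustration, not the haemodynamic models used in the thesis:

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(0, 30, 1.0)

def hrf(tau):
    """Toy gamma-shaped haemodynamic response; tau stretches its peak."""
    h = (t / tau) ** 2 * np.exp(-t / tau)
    return h / h.sum()

# Neural events, and a BOLD signal generated under *modulated* coupling
# (tau = 6) while the standard GLM assumes a canonical tau = 4.
events = (rng.random(200) < 0.1).astype(float)
bold = np.convolve(events, hrf(6.0))[:200] + rng.normal(0, 0.01, 200)

for tau in (4.0, 6.0):
    reg = np.convolve(events, hrf(tau))[:200]   # GLM regressor under assumed coupling
    beta = (reg @ bold) / (reg @ reg)           # least-squares activation estimate
    print(f"assumed tau {tau}: estimated activation beta = {beta:.2f}")
```

With the correct coupling the estimate recovers the true activation; with the mismatched canonical response it is biased, which is the kind of under- or over-estimation the thesis attributes to anaesthetic neuromodulations.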
28

Assessing EPA + DHA requirements of Sparus aurata and Dicentrarchus labrax: impacts on growth, composition and lipid metabolism

Houston, Sam James Silver January 2018
The gilthead seabream (Sparus aurata) and European seabass (Dicentrarchus labrax) require the n-3 long-chain polyunsaturated fatty acids (LC-PUFA) eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) for optimal growth and health. Due to the rapid growth of global aquaculture, the quantity of marine oils used in aquafeeds has been limited, yet the overall quantity of oil in an aquafeed has increased through the addition of vegetable oil (VO) to supply dietary energy. For aquaculture to continue to grow, more fish must be produced with fewer marine ingredients, while EPA and DHA are maintained at levels above fish requirements. This project set out to re-evaluate the requirement for EPA and DHA in gilthead seabream and European seabass. Two dose-response studies were designed and executed in which juvenile seabream and seabass were fed one of six levels of EPA+DHA (0.2 – 3.2 % as fed). Biometric data were collected and analysed to determine new requirement estimates for EPA+DHA for fish of two weight ranges (24 – 80 g and 80 – 200 g). The effects of the dietary LC-PUFA gradient on lipid composition and metabolism were also considered.
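Requirement estimates from dose-response data of this kind are classically obtained with a broken-line (broken-stick) fit, where growth rises with nutrient level up to a breakpoint and plateaus beyond it; the breakpoint is taken as the requirement. A sketch on invented numbers (not the trial data):

```python
import numpy as np

# Hypothetical dose-response data: weight gain vs dietary EPA+DHA (% as fed),
# six inclusion levels as in the trials; all values are illustrative.
dose = np.array([0.2, 0.8, 1.4, 2.0, 2.6, 3.2])
gain = np.array([41.0, 55.0, 68.0, 74.5, 75.0, 74.8])   # plateaus near the requirement

def broken_line_fit(x, y):
    """Broken-stick regression: a rising line up to a breakpoint, flat after.
    Grid-search the breakpoint; fit the two segments jointly by least squares."""
    best_bp, best_sse = None, np.inf
    for bp in np.linspace(x[1], x[-2], 200):
        z = np.minimum(x, bp)                       # kinked predictor
        A = np.column_stack([np.ones_like(x), z])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        sse = ((A @ coef - y) ** 2).sum()
        if sse < best_sse:
            best_bp, best_sse = bp, sse
    return best_bp

print(f"estimated EPA+DHA requirement: {broken_line_fit(dose, gain):.2f} % of diet")
```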
29

EEG and fMRI studies of the effects of stimulus properties on the control of attention

Mugruza Vassallo, Carlos Andrés January 2015
In this dissertation, the effects of variations in stimulus properties and cue-target onset asynchrony (CTOA) in auditory attention tasks were explored using recently developed approaches to EEG analysis, including LIMO. The last experiment was structured using information theory in order to design an efficient experiment. Four studies were carried out using a number-parity decision task that employed different combinations of cueing tone (T), novel (N) and goal (G) stimuli. In the first EEG study, contrary to previous findings (Polich 2002, 2007), no correlation was found in control participants between the interval from one novel condition to the next and P300 amplitude. However, single-trial across-subject averaging of participants’ data revealed significant correlations (r > .3) of stimulus properties (such as probability, frequency, amplitude and duration) with P300, and even r > .5 when N was an environmental sound in schizophrenic patients. In the second study, with EEG recorded simultaneously with fMRI, the participants who showed significant behavioural distraction exhibited brain activations and differences in both hemispheres (similar to Corbetta, 2002, 2008), while the participants as a whole produced significant activations mainly in left cortical and subcortical regions. A context analysis was run in distracted participants, contrasting the trials immediately prior to the G trials; this resulted in different prefrontal activations, which was consistent with studies of prefrontal control of visual attention (Koechlin 2003, 2007). In the third EEG study, the distractor noise type was manipulated (white vs environmental sounds), as well as the presence or absence of scanner background noise, in a blocked design. Results showed consistent P300, MMN and RON components in response to environmental noise. In addition, using time constants found in MEG results (Lu, Williamson & Kaufman, 1992) and adding the CTOA to the analysis, an information theory framework was calculated.
After simulating the information content of the experiment, a saddle indentation was found, at around 300 ms CTOA, in the curve of the information measure based on the states of the incoming signal. This saddle indentation was evident with more than 60 novel trials. In the fourth study, the CTOA and stimulus properties were manipulated in a parametric experiment that built on the first three studies: the complexity of the task was reduced (first study), more than 60 stimuli were used in the novel conditions (third study), and the CTOA varied randomly between 250 ms and 500 ms. Thirty-eight ANCOVAs with two categorical and one continuous regressor were conducted to determine which times and channels elicited reliable signatures (p < .05) across participants at short CTOA. Results revealed differences in the waveforms of the current condition depending on which condition appeared previously, as well as effects of frequency and duration at frontal scalp electrodes (as in the second study). These results were interpreted as a consequence of switching between modes of attention and alerting states, which resulted in the activation of frontal areas. Moreover, contextual analyses showed that systematic manipulation of stimulus properties allowed visualization of the relationships between CTOA, executive function and orienting of attention.
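The single-trial, LIMO-style analyses described above fit one statistical model per timepoint (and channel), regressing trial-wise amplitudes on stimulus properties. A minimal mass-univariate sketch on simulated data, reduced to a trial-wise correlation at one channel (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials, n_times = 200, 100

# Simulated single-trial EEG at one channel: a stimulus property (e.g. a
# standardised novel-sound duration) modulates amplitude in a P300-like
# window centred on time index 60.
prop = rng.normal(0, 1, n_trials)
effect = np.exp(-0.5 * ((np.arange(n_times) - 60) / 5.0) ** 2)
eeg = prop[:, None] * effect[None, :] + rng.normal(0, 1.0, (n_trials, n_times))

# Mass-univariate approach: one regression (here, correlation) per timepoint.
r = np.array([np.corrcoef(prop, eeg[:, t])[0, 1] for t in range(n_times)])
peak = int(np.argmax(np.abs(r)))
print(f"peak stimulus-property effect at time index {peak}, r = {r[peak]:.2f}")
```

In a full LIMO analysis the correlation is replaced by a multi-regressor GLM (or ANCOVA, as in the fourth study) and the resulting statistical maps are thresholded with multiple-comparison control.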
30

Uncertainty in Aquatic Toxicological Exposure-Effect Models: the Toxicity of 2,4-Dichlorophenoxyacetic Acid and 4-Chlorophenol to Daphnia carinata

Dixon, William J., bill.dixon@dse.vic.gov.au January 2005
Uncertainty is pervasive in risk assessment. In ecotoxicological risk assessments, it arises from such sources as a lack of data, the simplification and abstraction of complex situations, and ambiguities in assessment endpoints (Burgman 2005; Suter 1993). When evaluating and managing risks, uncertainty needs to be explicitly considered in order to avoid erroneous decisions and to be able to make statements about the confidence that we can place in risk estimates. Although informative, previous approaches to dealing with uncertainty in ecotoxicological modelling have been found to be limited, inconsistent and often based on assumptions that may be false (Ferson & Ginzburg 1996; Suter 1998; Suter et al. 2002; van der Hoeven 2004; van Straalen 2002a; Verdonck et al. 2003a). In this thesis a Generalised Linear Modelling approach is proposed as an alternative, congruous framework for the analysis and prediction of a wide range of ecotoxicological effects. This approach was used to investigate the results of toxicity experiments on the effect of 2,4-Dichlorophenoxyacetic Acid (2,4-D) formulations and 4-Chlorophenol (4-CP, an associated breakdown product) on Daphnia carinata. Differences between frequentist Maximum Likelihood (ML) and Bayesian Markov-Chain Monte-Carlo (MCMC) approaches to statistical reasoning and model estimation were also investigated. These approaches are inferentially disparate and place different emphasis on aleatory and epistemic uncertainty (O'Hagan 2004). Bayesian MCMC and Probability Bounds Analysis methods for propagating uncertainty in risk models are also compared for the first time. For simple models, Bayesian and frequentist approaches to Generalised Linear Model (GLM) estimation were found to produce very similar results when non-informative prior distributions were used for the Bayesian models. 
Potency estimates and regression parameters were found to be similar for identical models, signifying that Bayesian MCMC techniques are at least a suitable and objective replacement for frequentist ML for the analysis of exposure-response data. Applications of these techniques demonstrated that Amicide formulations of 2,4-D are more toxic to Daphnia than their unformulated, Technical Acid parent. Different results were obtained from Bayesian MCMC and ML methods when more complex models and data structures were considered. In the analysis of 4-CP toxicity, the treatment of two different factors as fixed or random in standard and Mixed-Effect models was found to affect variance estimates to the degree that different conclusions would be drawn from the same model, fit to the same data. Associated discrepancies in the treatment of overdispersion between ML and Bayesian MCMC analyses were also found to affect results. Bayesian MCMC techniques were found to be superior to the ML ones employed for the analysis of complex models because they enabled the correct formulation of hierarchical (nested) data structures within a binomial logistic GLM. Application of these techniques to the analysis of results from 4-CP toxicity testing on two strains of Daphnia carinata found that between-experiment variability was greater than that within experiments or between strains. Perhaps surprisingly, this indicated that long-term laboratory culture had not significantly affected the sensitivity of one strain when compared to cultures of another strain that had recently been established from field populations. The results from this analysis highlighted the need for repetition of experiments, proper model formulation in complex analyses, and careful consideration of the effects of pooling data on characterising variability and uncertainty.
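The frequentist ML side of the comparison — a binomial logistic GLM fitted to exposure-response counts, with a potency estimate (EC50) derived from the coefficients — can be sketched with a hand-rolled iteratively reweighted least squares (IRLS) loop. The concentrations and counts are invented, not the thesis data:

```python
import numpy as np

# Hypothetical 4-CP acute-toxicity data: animals immobilised out of 20
# per beaker at each concentration (mg/L); counts are illustrative.
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
n = np.full(6, 20)
dead = np.array([1, 3, 7, 13, 18, 19])

# Binomial logistic GLM on log-concentration, fitted by IRLS
# (the standard frequentist ML route the thesis compares to Bayesian MCMC).
X = np.column_stack([np.ones_like(conc), np.log(conc)])
beta = np.zeros(2)
for _ in range(25):
    eta = X @ beta
    p = 1.0 / (1.0 + np.exp(-eta))
    W = n * p * (1 - p)                      # IRLS weights
    z = eta + (dead - n * p) / W             # working response
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))

ec50 = np.exp(-beta[0] / beta[1])            # concentration giving a 50% effect
print(f"slope {beta[1]:.2f}, EC50 ≈ {ec50:.2f} mg/L")
```

A Bayesian MCMC fit of the same model with non-informative priors would, as the abstract reports for simple models, give essentially the same potency and slope estimates.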
The GLM framework was used to develop three-dimensional surface models of the effects of different length pulse exposures, and subsequent delayed toxicity, of 4-CP on Daphnia. These models described the relationship between exposure duration and intensity (concentration) on toxicity, and were constructed for both pulse and delayed effects. Statistical analysis of these models found that significant delayed effects occurred following the full range of pulse exposure durations, and that both exposure duration and intensity interacted significantly and concurrently with the delayed effect. These results indicated that failure to consider delayed toxicity could lead to significant underestimation of the effects of pulse exposure, and therefore increase uncertainty in risk assessments. A number of new approaches to modelling ecotoxicological risk and to propagating uncertainty were also developed and applied in this thesis. In the first of these, a method for describing and propagating uncertainty in conventional Species Sensitivity Distribution (SSD) models was described. This utilised Probability Bounds Analysis to construct a nonparametric 'probability box' on an SSD based on EC05 estimates and their confidence intervals. Predictions from this uncertain SSD and the confidence interval extrapolation methods described by Aldenberg and colleagues (2000; 2002a) were compared. It was found that the extrapolation techniques underestimated the width of uncertainty (confidence) intervals by 63% and the upper bound by 65%, when compared to the Probability Bounds (P-Bounds) approach, which was based on actual confidence estimates derived from the original data. An alternative formulation of ecotoxicological risk modelling was also proposed, based on a Binomial GLM. In this formulation, the model is first fit to the available data in order to derive mean and uncertainty estimates for the parameters.
This 'uncertain' GLM model is then used to predict the risk of effect from possible or observed exposure distributions. This risk is described as a whole distribution, with a central tendency and uncertainty bounds derived from the original data and the exposure distribution (if this is also 'uncertain'). Bayesian and P-Bounds approaches to propagating uncertainty in this model were compared using an example of the risk of exposure to a hypothetical (uncertain) distribution of 4-CP for the two Daphnia strains studied. This comparison found that the Bayesian and P-Bounds approaches produced very similar mean and uncertainty estimates, with the P-Bounds intervals always being wider than the Bayesian ones. This difference is due to the two approaches' different methods for dealing with dependencies between model parameters, and is confirmation that the P-Bounds approach is better suited to situations where data and knowledge are scarce. The advantages of the Bayesian risk assessment and uncertainty propagation method developed are that it allows calculation of the likelihood of any effect occurring, not just the (probability) bounds, and that the same software (WinBUGS) and model construction may be used to fit regression models and predict risks simultaneously. The GLM risk modelling approaches developed here are able to explain a wide range of response shapes (including hormesis) and underlying (non-normal) distributions, and do not involve expression of the exposure-response as a probability distribution, hence solving a number of problems found with previous formulations of ecotoxicological risk. The approaches developed can also be easily extended to describe communities, and to include modifying factors, mixed effects, population growth, carrying capacity and a range of other variables of interest in ecotoxicological risk assessments.
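The nonparametric probability box on an SSD can be sketched directly: one empirical CDF bound is built from the lower confidence limits of the species' EC05s, one from the upper limits, and the hazardous concentration for 5% of species (HC5) is read off both. All values are invented for illustration:

```python
import numpy as np

# Hypothetical EC05 estimates: lower and upper 95% confidence limits (mg/L)
# for six species; values are illustrative, not the thesis data.
lower = np.array([0.8, 1.5, 2.2, 4.0, 6.5, 9.0])
upper = np.array([2.0, 3.5, 4.8, 7.5, 11.0, 15.0])

def pbox_hc5(lo, hi, p=0.05):
    """Nonparametric probability box on an SSD: one empirical CDF bound
    steps up at the sorted lower confidence limits, the other at the sorted
    upper limits. Interpolating both at p brackets the HC-p."""
    lo, hi = np.sort(lo), np.sort(hi)
    probs = np.arange(1, len(lo) + 1) / len(lo)
    return np.interp(p, probs, lo), np.interp(p, probs, hi)

hc5_lo, hc5_hi = pbox_hc5(lower, upper)
print(f"HC5 bounds: [{hc5_lo:.2f}, {hc5_hi:.2f}] mg/L")
```

Because the bounds are built from the actual confidence limits rather than a fitted distribution, the resulting interval is typically wider than a parametric extrapolation, which is the behaviour the comparison above reports.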
While the lack of data on the toxicological effects of chemicals is the most significant source of uncertainty in ecotoxicological risk assessments today, methods such as those described here can assist by quantifying that uncertainty so that it can be communicated to stakeholders and decision makers. As new information becomes available, these techniques can be used to develop more complex models that will help to bridge the gap between the bioassay and the ecosystem.
