541

Volunteer Monitoring of Water Quality in New Zealand: Where does the Value Lie?

Coates, Annabelle January 2013 (has links)
Natural waterways form an integral part of the urban and rural environment. In New Zealand, their uses are generally related to agriculture, drainage, power generation and recreation, but their value also extends to providing ecological services that are vital to the maintenance of a fully functioning environment. In developed regions of the world, several functions and services that waterway systems once provided have been degraded, or lost completely, due to water abstraction, altered flow regimes and the input of pollutants. In New Zealand, and around the world, groups of volunteers give up their time to help monitor the quality and state of waterways. However, distrust of data generated by such groups persists throughout the scientific community. This concern is also voiced by members of these groups, who question the point of their monitoring if the data has no real use. As a result of this uncertainty about data quality and its subsequent uses, data is often simply entered into a database with little, or no, analysis conducted. The purpose of this research was to ascertain the quality of the data generated by volunteer groups in New Zealand by comparing it with data collected by professionals from city and regional councils. Volunteer monitoring methods and tools were also compared with those available to professionals in order to determine whether any differences observed were a product of equipment, or of other factors. However, data generation is not the only purpose of these volunteer groups. By being involved, volunteers gain education, practical skills and knowledge they may not have access to otherwise, and they meet people and strengthen community ties. Volunteers from each group therefore also completed a survey to determine their knowledge of the programme they participate in, of the environment and freshwater, and to collect some basic background information.
The Styx Living Laboratory Trust (SLLT) in Christchurch, the Wakapuaka Rivercare Group (WRG) in Nelson and Wai Care in Auckland were the three New Zealand community water monitoring groups chosen as the subjects of this study. Generally, the volunteer conductivity and pH data was significantly different from that of their professional counterparts, with large differences obvious in the data sets from all three groups. Water temperature was the only variable that was consistently similar between volunteer and professional data. Comparison of the SLLT's methods with professional-level methods, however, revealed that differences in the data sets may be due to a combination of factors, including equipment (e.g., use of pH colour strips instead of meters) and variation in the monitoring protocols, rather than a lack of quality in the volunteer data. New dissolved oxygen and nitrogen monitoring methods utilised by Wai Care did produce some promising results, with some of the comparisons unable to be statistically differentiated from the professional data set. Visual assessment of the SLLT data over time suggests seasonal patterns in pH and conductivity, and possible increases in water clarity over time. Statistical analysis of the individual variables of pH, water temperature, clarity and conductivity in the SLLT data revealed several significant predictors and interactions, including time, date and pH, among others. However, the very small effect sizes suggest these results may simply be a product of the large data set, with very few of the variable interactions having any real meaning for management. Volunteers were predominantly over the age of 40, and were generally either very new recruits to their monitoring programmes (<6 months) or had been involved for a reasonably long time (>5 years).
There were differing patterns of involvement between the groups, with the WRG having volunteers mainly involved for >10 years while the SLLT had a large number of new recruits. Reasons for becoming involved also varied; however, the predominant reason was concern for the environment. Approximately half of the volunteers surveyed proved to be very knowledgeable about their programme and understood the purposes of the monitoring, although most were associated with a science-related industry and therefore likely already had this knowledge. More education and training would be needed to bring all of the other volunteers up to this level. All volunteers had good knowledge of the issues New Zealand's environment and freshwater currently face, with public apathy considered the most pressing issue. In summary, despite the lack of clear statistical similarities between volunteer and professional data sets for some variables, the data do not appear to be randomly inaccurate and could be corrected so as to be combined with professional data. The benefits the volunteers gain appear to outweigh any issues that may be present in the data, as long as the volunteers perceive the data to be ultimately useful. Volunteer-based water quality monitoring has proved to be a valuable way to gather environmental data, educate the community and improve its commitment to local waterways.
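The core comparison described above, volunteer readings against professional readings for the same variable, can be sketched as a paired significance test plus a bias correction. This is only an illustration of the general approach; the thesis's actual statistical procedures are not reproduced here, and all readings below are invented.

```python
import numpy as np
from scipy import stats

# Hypothetical paired pH readings taken at the same sites and dates
volunteer_ph = np.array([7.1, 7.4, 6.9, 7.8, 7.2, 7.5, 7.0, 7.3])
professional_ph = np.array([7.0, 7.2, 7.1, 7.5, 7.2, 7.4, 7.1, 7.2])

# Paired t-test: do the two sets of readings differ significantly?
t_stat, p_value = stats.ttest_rel(volunteer_ph, professional_ph)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")

# If the volunteer error is a consistent offset rather than random noise,
# it can be estimated and subtracted before combining the data sets --
# the sense in which non-random inaccuracy "could be corrected".
bias = np.mean(volunteer_ph - professional_ph)
corrected = volunteer_ph - bias
```

A consistent instrument offset (e.g., pH colour strips reading systematically high) is exactly the kind of error this correction handles; genuinely random error would leave a large residual spread after correction.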
542

The long-term impacts of an aerial 1080 application on non-target forest species

Peterson, Amanda Jane January 2014 (has links)
The control of introduced mammalian predators in New Zealand forests is crucial for the protection of native species and essential ecosystem services. Possum control in the form of aerial 1080 applications is conducted by TbFree New Zealand to prevent the spread of bovine tuberculosis, and often has the added conservation benefit of temporarily reducing levels of other mammalian predators such as rodents and mustelids. However, native non-target species such as birds and weta can also be at risk of direct and secondary poisoning following 1080 applications, as well as increased predation risk through mesopredator release. To determine whether the benefits of 1080 applications outweigh the risks to non-target native species, both short- and long-term monitoring of populations following aerial 1080 applications is needed. For this study, two forest regions in the South Island were selected for pre- and post-treatment monitoring of non-target species following an aerial 1080 application for possum control. Each region contained a treatment site and a paired non-treatment site. Relative indices of possums, rodents and other mammalian predators were obtained using tracking tunnels and chew cards, indices of birds were obtained using five-minute bird counts, and indices of tree weta were obtained using tracking tunnels and artificial shelters. Monitoring was conducted before the aerial 1080 was applied in August 2012, and over the following 2012/13 and 2013/14 summer seasons. The aerial 1080 applications were successful at reducing possums to undetectable levels at both treatment sites for the two seasons following treatment. Mice were significantly reduced at one treatment site relative to the paired non-treatment site immediately following the 1080 operation, but had increased to pre-1080 levels by the second post-treatment monitoring season. Rats were detected at low levels, and showed no response to the treatment.
Mustelids were not detected at either region throughout the monitoring period. No native species showed a decline in a treatment site that was not matched in the non-treatment site. Chaffinches significantly declined at both treatment sites relative to non-treatment sites, likely due to an indirect delayed effect such as competition for food resources. Tomtits showed a positive response to the treatment, significantly increasing in both treatment sites over the post-treatment monitoring periods. Tree weta showed no significant decline in response to the treatment. The reduction of possums to low levels, and the maintenance of possum control with ongoing 1080 operations, is likely to continue to provide an overall net benefit to native non-target species.
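The relative abundance indices mentioned above are typically expressed as the proportion of tracking tunnels (or chew cards) showing sign of the target species. A minimal sketch with hypothetical counts; the thesis's actual monitoring protocol and analysis are not reproduced here.

```python
def tracking_index(tracked_tunnels, total_tunnels):
    """Relative abundance index: percentage of tunnels with footprints
    of the target species on a monitoring line."""
    return 100.0 * tracked_tunnels / total_tunnels

# Hypothetical mouse detections, 50 tunnels per site
pre_treatment = tracking_index(18, 50)   # before the 1080 drop
post_treatment = tracking_index(3, 50)   # treatment site, first season after
non_treatment = tracking_index(17, 50)   # paired non-treatment site

# The treatment effect is read from the treatment site's change
# relative to the paired non-treatment site, not in isolation.
relative_drop = non_treatment - post_treatment
```

The paired-site design matters: a drop that also appears at the non-treatment site would indicate a seasonal or regional effect rather than a treatment effect.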
543

A multiple sensor dual chamber waveform recording diagnostic pacemaker

Edgar, Deborah Rankine January 1997 (has links)
No description available.
544

Real-time monitoring of continuous fermentation by Raman spectroscopy

Krieg, Therese January 2014 (has links)
The production of bio-ethanol from lignocellulosic material requires a more efficient process to be feasible and to compete with products from fossil fuels. There is a need to determine key components rapidly and non-destructively during fermentation. Raman spectroscopy is a technique that can be used to monitor the fermentation process in real time and provide information about key components immediately, thus facilitating process control. A continuous system with membrane cell recycling was set up and fermentations were performed using Saccharomyces cerevisiae ATCC 96581. Fermentations were performed to test for optimal dilution rates and operating times, the effect of different sugar concentrations in the media feed, and which position in the system was optimal for Raman data collection. Raman data and aliquot samples for HPLC validation were collected continuously throughout the fermentations. Raman data was analysed with PLS models to obtain component concentrations, for which RMSE was calculated in order to compare against the HPLC validation set. Fermentations were performed with synthetic glucose media as well as with poplar hydrolysate. It was shown that the continuous system with membrane cell recycling could achieve a glucose-to-ethanol conversion of between 75 and 100%. The process could be sufficiently monitored by Raman spectroscopy, and predicted concentrations were within the range of the validation set in most cases. However, the error of prediction varied between the different fermentations.
545

Time-triggered Runtime Verification of Real-time Embedded Systems

Navabpour, Samaneh January 2014 (has links)
In safety-critical real-time embedded systems, correctness is of primary concern, as even small transient errors may lead to catastrophic consequences. Due to the limitations of well-established methods such as verification and testing, recently runtime verification has emerged as a complementary approach, where a monitor inspects the system to evaluate the specifications at run time. The goal of runtime verification is to monitor the behavior of a system to check its conformance to a set of desirable logical properties. The literature of runtime verification mostly focuses on event-triggered solutions, where a monitor is invoked when a significant event occurs (e.g., change in the value of some variable used by the properties). At invocation, the monitor evaluates the set of properties of the system that are affected by the occurrence of the event. This type of monitor invocation has two main runtime characteristics: (1) jittery runtime overhead, and (2) unpredictable monitor invocations. These characteristics result in transient overload situations and over-provisioning of resources in real-time embedded systems and hence, may result in catastrophic outcomes in safety-critical systems. To circumvent the aforementioned defects in runtime verification, this dissertation introduces a novel time-triggered monitoring approach, where the monitor takes samples from the system with a constant frequency, in order to analyze the system's health. We describe the formal semantics of time-triggered monitoring and discuss how to optimize the sampling period using minimum auxiliary memory and path prediction techniques. Experiments on real-time embedded systems show that our approach introduces bounded overhead, predictable monitoring, less over-provisioning, and effectively reduces the involvement of the monitor at run time by using negligible auxiliary memory. 
We further advance our time-triggered monitor to component-based multi-core embedded systems by establishing an optimization technique that provides the invocation frequency of the monitors and the mapping of components to cores to minimize monitoring overhead. Lastly, we present RiTHM, a fully automated and open source tool which provides time-triggered runtime verification specifically for real-time embedded systems developed in C.
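The time-triggered idea above, sampling the system at a constant period rather than reacting to every event, can be sketched in a few lines. This is a minimal illustration only, not the dissertation's formal semantics or the RiTHM tool; all names and the polling design are hypothetical.

```python
import threading
import time

class TimeTriggeredMonitor:
    """Sample shared state at a fixed period and evaluate properties.

    Per-period work is bounded (one snapshot plus one pass over the
    properties), which is what makes the overhead predictable compared
    with an event-triggered monitor invoked on every variable change.
    """

    def __init__(self, state, properties, period_s=0.01):
        self.state = state              # dict written by the monitored program
        self.properties = properties    # list of (name, predicate) pairs
        self.period_s = period_s
        self.violations = []
        self._stop = threading.Event()

    def _run(self):
        next_deadline = time.monotonic()
        while not self._stop.is_set():
            snapshot = dict(self.state)           # take a sample
            for name, predicate in self.properties:
                if not predicate(snapshot):
                    self.violations.append((name, snapshot))
            next_deadline += self.period_s        # fixed-rate schedule
            time.sleep(max(0.0, next_deadline - time.monotonic()))

    def start(self):
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def stop(self):
        self._stop.set()
        self._thread.join()
```

Note that plain sampling can miss state changes that occur and revert between two samples; this is the gap the dissertation addresses with auxiliary memory and path-prediction techniques when optimizing the sampling period.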
546

Testing the Effectiveness of Citizen Science Using a Volunteer Butterfly Monitoring Program

Moore, Charlotte January 2014 (has links)
An increasingly popular method of collecting scientific data is to use citizen scientists in community-based monitoring programs. Numerous formal and community-based monitoring programs use butterflies as indicator species to detect and understand changes in ecosystems. A butterfly monitoring program was established with the City of Kitchener in 2012 in order to measure the effectiveness of citizen science observations in identifying butterfly assemblages. Two monitoring sites were used: Lakeside Park, which contained relatively simple butterfly habitats, and Huron Natural Area, which included complex butterfly habitats. The program consisted of training volunteers to collect data on the butterfly assemblages. Volunteers were given the choice to monitor an established trail at either natural area once every two weeks from the beginning of May to the end of August using the modified Pollard method. Because it was important to train volunteers in how to monitor butterflies, they were required to attend an instruction workshop covering the methods to be employed, identification of butterflies, use of the recording form, and proper net and butterfly handling techniques. Quality control measures were another important component of the program, and included comparing volunteer observations at each natural area to those collected by an expert. Each species observation was reviewed to determine its likelihood; the species and its habitat were compared to the information and status recorded in The Butterflies of Waterloo Region. The data collected by the participants was analyzed separately by study site and included the calculation of species richness, the Shannon-Wiener Diversity Index (H), evenness value (E), Simpson Index (D), and Simpson Reciprocal Index (1/D). A two-tailed t-test was conducted to compare the data (as represented by the Shannon indices) collected by volunteers and the expert.
The species richness for Lakeside Park was 29, which was slightly higher than the 26 species identified at Huron Natural Area. However, the Shannon-Wiener Diversity Index (H), evenness value (E), Simpson Diversity Index (D), and Simpson Reciprocal Index (1/D) all demonstrated that there was a more even and diverse butterfly population at Huron Natural Area than at Lakeside Park. A high level of validity of volunteer observations was found during this study, as 93% of submitted observations at Lakeside Park and 94% of submitted observations at Huron Natural Area passed the review process. Based on the high level of validity of observations it was determined that the volunteers were successful in characterizing butterfly assemblages, and establishing baseline conditions at each site. The City of Kitchener will be collecting long-term butterfly data, which they can compare over time to provide insight into the diversity at these natural areas. This research program has contributed to the field of science and to the literature by establishing a review process for citizen science, particularly for butterfly programs. It has also provided further validation of citizen science data.
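The diversity indices named above have standard closed forms, and a short sketch shows how they rank an even community above one dominated by a single species. The abundance counts below are hypothetical, not the Kitchener data.

```python
import math

def diversity_indices(counts):
    """Shannon-Wiener H, evenness E, Simpson D, and Simpson reciprocal 1/D
    from a list of per-species abundance counts."""
    n = sum(counts)
    props = [c / n for c in counts if c > 0]
    shannon_h = -sum(p * math.log(p) for p in props)
    richness = len(props)
    evenness = shannon_h / math.log(richness) if richness > 1 else 0.0
    simpson_d = sum(p * p for p in props)  # P(two random individuals match)
    return shannon_h, evenness, simpson_d, 1.0 / simpson_d

# Hypothetical counts: one site dominated by a single species, one even.
uneven = [40, 5, 3, 1, 1]
even = [10, 10, 10, 10, 10]

h1, e1, d1, inv1 = diversity_indices(uneven)
h2, e2, d2, inv2 = diversity_indices(even)
```

Note the direction of each index: H, E and 1/D increase with diversity, while Simpson's D (the probability that two randomly drawn individuals are the same species) decreases, which is why both D and its reciprocal are usually reported together.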
547

On-line monitoring of microbial fermentation end-products synthesized by Clostridium thermocellum using Titrimetric Off-Gas Analysis (TOGA)

Blunt, Warren 04 September 2013 (has links)
Bioprocesses carried out for the production of fuels and other value-added co-products require effective process control strategies. The objective of this research was to apply Titrimetric Off-Gas Analysis (TOGA) for the on-line estimation of fermentation end-products using Clostridium thermocellum. Hydrogen ion production, gaseous H2 and CO2, soluble H2 and CO2, as well as ethanol in the liquid phase and vapour phase, were monitored. All parameters, except the dissolved gases, showed good correlation with concurrent off-line analysis. The resulting mass and electron balances were close to theoretical values, and not significantly different from those determined using off-line analysis. Liquid-to-gas mass transfer limitations caused supersaturation of H2(aq) over a wide range of operating conditions; on average, it ranged between 8 and 14 times the expected value at thermodynamic equilibrium. The supersaturation of CO2(aq) was conditional, and could be alleviated by increased sparging and agitation such that no significant mass transfer limitation was present. Simultaneous data on ethanol, CO2, and H2 could be obtained with the MIMS probe placed adjacent to the liquid surface in the reactor headspace. From these data, a metabolic model was proposed for the on-line estimation of formate and acetate using a mass balance and an electron balance. The model estimated formate concentrations with reasonable accuracy. Acetate predictions agreed with the qualitative trends, but the concentrations were inaccurate in comparison with off-line analysis. It was demonstrated that the sensor could provide on-line information on all major end-products synthesized by C. thermocellum. In conclusion, TOGA is a valuable instrument for the on-line monitoring and study of fermentation processes for cellulosic biofuels production.
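The supersaturation figures quoted above are ratios of the measured dissolved gas concentration to the concentration expected at thermodynamic equilibrium, which for a sparingly soluble gas follows Henry's law. A sketch of that calculation; the Henry constant, partial pressure and measured concentration below are illustrative orders of magnitude, not values from the thesis.

```python
def equilibrium_concentration(partial_pressure_atm, henry_constant_M_per_atm):
    """Henry's law: C_eq = k_H * p, in mol/L."""
    return henry_constant_M_per_atm * partial_pressure_atm

def supersaturation_ratio(measured_M, partial_pressure_atm, henry_constant_M_per_atm):
    """How many times the measured dissolved concentration exceeds
    the Henry's-law equilibrium value."""
    return measured_M / equilibrium_concentration(
        partial_pressure_atm, henry_constant_M_per_atm)

# Illustrative numbers only (rough order of magnitude for H2 in warm water):
k_h_h2 = 7e-4        # M/atm, assumed Henry constant
p_h2 = 0.3           # atm, hypothetical headspace partial pressure
measured_h2 = 2.1e-3 # mol/L, hypothetical dissolved-H2 reading

ratio = supersaturation_ratio(measured_h2, p_h2, k_h_h2)
print(f"H2(aq) supersaturation: {ratio:.1f}x equilibrium")
```

A ratio well above 1 indicates that liquid-to-gas mass transfer cannot keep pace with biological gas production, which is the mechanism the study identifies for H2(aq).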
548

Non-parametric and Non-filtering Methods for Rolling Element Bearing Condition Monitoring

Faghidi, Hamid 12 March 2014 (has links)
Rolling element bearings are among the most significant and frequently used components in mechanical systems. Bearing fault detection and diagnosis is important for preventing productivity loss and averting catastrophic failures of mechanical systems. In industrial applications, bearing life is often difficult to predict due to different application conditions, load and speed variations, as well as maintenance practices. Therefore, reliable fault detection is necessary to ensure productive and safe operations. Vibration analysis is the most widely used method for detection and diagnosis of bearing malfunctions. A measured vibration signal from a sensor is often contaminated by noise and vibration interference components. Over the years, many methods have been developed to reveal fault signatures and to remove noise and vibration interference components. Though many vibration-based methods have been proposed in the literature, the high frequency resonance (HFR) technique is one of the very few methods that have received a degree of industrial acceptance. However, the effectiveness of the HFR methods depends, to a great extent, on parameters such as the bandwidth and centre frequency of the fault-excited resonance, and the window length. Proper selection of these parameters is often a knowledge-demanding and time-consuming process. In particular, a filter designed with an improperly selected bandwidth and centre frequency of the fault-excited resonance can filter out the true fault information and mislead detection/diagnosis decisions. In addition, even if these parameters can be selected properly at the beginning of each process, they may become invalid in a time-varying environment after a certain period of time. Hence, they may have to be re-calculated and updated, which is again a time-consuming and error-prone process. This undermines the practical significance of the above methods for online monitoring of bearing conditions.
To overcome the shortcomings of existing methods, the following four non-parametric and non-filtering methods are proposed: 1. An amplitude demodulation differentiation (ADD) method, 2. A calculus enhanced energy operator (CEEO) method, 3. A higher order analytic energy operator (HO_AEO) approach, and 4. A higher order energy operator fusion (HOEO_F) technique. The proposed methods have been evaluated using both simulated and experimental data.
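The energy-operator family named above builds on the discrete Teager-Kaiser energy operator; the thesis's ADD, CEEO, HO_AEO and HOEO_F variants are not reproduced here. A minimal sketch with a synthetic signal shows the non-parametric appeal: no bandwidth, centre frequency or window length needs to be chosen.

```python
import numpy as np

def teager_kaiser(x):
    """Discrete Teager-Kaiser energy operator:
    psi[x](n) = x(n)^2 - x(n-1) * x(n+1).
    For a pure tone A*sin(W*n) this is the constant A^2 * sin(W)^2, so
    impulsive fault transients stand out sharply against a smooth carrier."""
    x = np.asarray(x, dtype=float)
    psi = np.zeros_like(x)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    return psi

# Synthetic vibration signal: a low-level 50 Hz carrier plus sharp periodic
# impacts, loosely imitating a localized bearing defect (values invented).
fs = 10_000                          # Hz, assumed sampling rate
t = np.arange(0, 0.2, 1 / fs)
signal = 0.1 * np.sin(2 * np.pi * 50 * t)
signal[::103] += 1.0                 # an impact roughly every 103 samples

psi = teager_kaiser(signal)
# Energy spikes at the impact samples reveal the fault with no filter
# design or parameter selection step.
```

Because the operator uses only three adjacent samples, it also adapts instantly in a time-varying environment, which is the weakness of fixed filter parameters that the thesis targets.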
549

An Evaluation of Winnipeg's Electronic Monitoring Pilot Project for Youth Auto Theft Offenders

Pearson, Ashley 09 July 2012 (has links)
In 2008, the Manitoba government implemented an electronic monitoring (EM) project for high-risk automobile theft offenders. To evaluate this program, youth in the program were matched with other high-risk auto theft offenders who had not been put on EM. Dimensions including offender characteristics, daily contacts and criminal histories were compared between the groups. Interviews were also conducted with offenders who had been on EM and with program staff and stakeholders. The results of the evaluation indicated a small change in criminal history for the EM group for auto theft, technical and combined offenses. Since the Winnipeg Auto Theft Suppression Strategy (WATSS) began in 2005, there has been a decrease of approximately 11,000 auto thefts; notably, only a very small part of this number could be attributed to the EM program. Electronic monitoring can be a complementary intervention when offered alongside other WATSS and Manitoba Youth Correctional Services (MYCS) programs.
550

Integrating research root cause analysis tools into a commercial IT service manager

Li, Xiaochun 13 December 2011 (has links)
IT environments are becoming more complex by the day, and this trend is poised to continue in the coming years. To manage IT resources and better maximize productivity, large organizations are striving for better methods to control their current environments. They also have to prepare for future complexity growth as their environments cater to growing IT needs. In the current economic recession, organizations are not only threatened by the growing complexity, but also have to cope with limited personnel due to financial constraints. Organizations are eager to obtain new technology that gives them firmer control over different platforms, vendors, and solutions at a reasonable cost. At the same time, this new technology must deliver quality services that can effectively fulfill customer needs. To deal with these IT management challenges, CA Inc. (formerly Computer Associates) developed Spectrum Service Assurance Manager (SAM) to solve complex IT environment service management problems. SAM can provide organizations with a wide-ranging view of their multi-faceted IT environments by providing vital pieces of information that no other software can perceive. Thus, SAM can monitor and manage systems, databases, networks, applications, and end-user experiences. Although this technology is able to detect many errors and problems, it still lacks a good mechanism to diagnose the detected problems and uncover their root causes for end users to fix. Four research groups from the Universities of Alberta, Toronto, Victoria and Waterloo, under the auspices of the Consortium for Software Engineering Research, built different tools for root-cause analysis and detection. To integrate these solutions, the research groups worked together with CA Inc. to produce a web-based integration tool that integrates these add-ons into the main SAM application.
The resulting framework does not affect any of SAM's existing features, as the additions only involve a new web communication layer that sits alongside the core of the software to detect and present root causes. The detection tools only parse the log files for vital information, and thus the core functionality of the software remains unaffected. My contributions to this research project are presented in this thesis. At the beginning of this thesis, I report on background research on SAM and describe how it is intended to solve the increasing complexity problem in IT environments. Later on, I propose two software integration approaches to integrate root-cause diagnosis tools with SAM and briefly describe CA's latest software integration framework, Catalyst. Towards the end of this thesis, I compare our integration solution with Catalyst, and discuss the advantages and disadvantages of these integration solutions.
