11

Long-term assessment of the surface water quality in the Blesbokspruit Ramsar Wetland

Ambani, Annie Estelle 30 June 2014 (has links)
M.Sc. (Environmental Management) / Surface water quality in the Blesbokspruit Ramsar wetland has been an area of concern since the 1990s, especially following the authorised and subsidised pumping of underground water, high in salts, from Grootvlei Mine Shaft No. 3. The pumping was necessary to keep the mine's underground operations workable and to avert flooding of low-lying areas from the decant of derelict gold mines in the Blesbokspruit catchment. High salt loads, coupled with a change in the flow of the system, contributed to the loss of the ecological character of the Blesbokspruit wetland in 1996 and its listing on the Montreux Record in the same year. In Ramsar terms, the Blesbokspruit was under threat and on the brink of losing its international Ramsar status unless management action was taken to improve its surface water quality. Since 1996 the Blesbokspruit has been a wetland in need of restoration to optimum hydrological conditions, i.e. water quality and quantity. A return to desirable water conditions would enhance aquatic species diversity and abundance, especially the waterfowl species that gave the wetland its international reputation. With the closure of the mine and the cessation of pumping at Grootvlei (Aurora) Mine in December 2010, the surface water in the Blesbokspruit wetland should have improved, benefiting the agricultural activities (irrigation and livestock watering) adjacent to the wetland and contributing to the healthier aquatic conditions needed by local and migratory birds. Surface water quality in the Blesbokspruit wetland was investigated using historical water quality data for the period 2000 to 2011, obtained from Rand Water. The study revealed a distinct seasonal and spatial pattern in the salts (sulphate, chloride, sodium and magnesium) and in the related electrical conductivity and pH values for sites downstream of the underground water pumping point at Grootvlei Mine Shaft No. 3. These observable seasonal and spatial patterns downstream of the mine-water discharge point support previous findings that associated saline pollution with the pumping operations of Grootvlei Mine. Inter-annual trends showed a progressive decline in salt concentrations and the associated electrical conductivity values, with pH readings between neutral and slightly alkaline. The improvements in salinity and acidity in the Blesbokspruit wetland can be associated with a number of water management interventions adopted, particularly by Grootvlei Mine, from the mid-1990s until December 2010, when mining and pumping operations ceased. During 2011, the chemical properties of the Blesbokspruit showed a step change, a substantial drop in the concentrations of sulphate and magnesium, following the cessation of underground pumping the preceding year, further confirming previous investigations that linked saline water contamination to the underground mine-water pumping operations at Grootvlei Mine.
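As a rough illustration of the kind of seasonal and inter-annual trend analysis described above (not taken from the thesis; the file name, column names and season mapping are assumptions), a monitoring time series could be summarized along these lines:

```python
# Illustrative sketch only: summarizing a water-quality time series by site,
# season and year. The CSV layout and column names are assumptions.
import pandas as pd

df = pd.read_csv("blesbokspruit_monitoring.csv", parse_dates=["date"])
# Assumed columns: date, site, sulphate_mg_l, ec_ms_m, ph

# Map calendar months to wet/dry season (summer-rainfall assumption).
df["season"] = df["date"].dt.month.map(
    lambda m: "wet" if m in (10, 11, 12, 1, 2, 3) else "dry"
)
df["year"] = df["date"].dt.year

# Seasonal and spatial pattern: mean sulphate and EC per site and season.
seasonal = df.groupby(["site", "season"])[["sulphate_mg_l", "ec_ms_m"]].mean()
print(seasonal)

# Inter-annual trend: yearly medians per site, to see whether the salts decline.
annual = df.groupby(["site", "year"])[["sulphate_mg_l", "ec_ms_m", "ph"]].median()
print(annual)
```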
12

Digital Provenance Techniques and Applications

Amani M Abu Jabal (9237002) 13 August 2020 (has links)
This thesis describes a data provenance framework and associated frameworks for using provenance to support data quality and reproducibility. We first identify the requirements for a comprehensive provenance framework that is applicable to various applications, supports a rich set of provenance metadata, and is interoperable with other provenance management systems. We then design and develop a provenance framework, called SimP, that addresses these requirements. Next, we present four prominent applications and investigate how provenance data can benefit them. The first application is the quality assessment of access control policies. For this, we design and implement the ProFact framework, which uses provenance techniques to collect comprehensive data about actions triggered either by a network context or by a user (i.e., a human or a device) action. Provenance data are used to determine whether the policies meet the quality requirements. ProFact includes two approaches for policy analysis: structure-based and classification-based. For the structure-based approach, we design tree structures to organize and assess the policy set efficiently. For the classification-based approach, we employ several classification techniques to learn the characteristics of policies and predict their quality. In addition, ProFact supports policy evolution and assesses its impact on policy quality. The second application is workflow reproducibility. For this, we implement ProWS, a provenance-based architecture for retrieving workflows. Specifically, ProWS transforms data provenance into workflows and then organizes the data into a set of indexes to support efficient querying. ProWS supports composite queries on three types of search criteria: keywords of workflow tasks, patterns of workflow structure, and metadata about workflows (e.g., how often a workflow was used). The third application is access control policy reproducibility. For this, we propose a novel framework, Polisma, which generates attribute-based access control policies from data, namely from logs of historical access requests and their corresponding decisions. Polisma combines data mining, statistical, and machine learning techniques, and capitalizes on context information potentially available from external sources (e.g., LDAP directories) to enhance the learning process. The fourth application is policy reproducibility through knowledge and experience transfer. For this, we propose a novel framework, FLAP, which transfers attribute-based access control policies between parties in a collaborative environment, while addressing the challenges of minimal data sharing and supporting policy adaptation to resolve conflicts. All frameworks are evaluated with respect to performance and accuracy.
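As a loose illustration of the general idea of learning attribute-based access control rules from historical request logs (this is not Polisma or ProFact; the attributes, log entries and model choice are hypothetical), one could train a simple interpretable classifier on logged decisions and read its branches as candidate rules:

```python
# Illustrative sketch only: learning permit/deny rules from a log of access
# requests. Attribute names and records are invented for the example.
from sklearn.feature_extraction import DictVectorizer
from sklearn.tree import DecisionTreeClassifier, export_text

log = [
    ({"role": "nurse",  "resource": "record",  "time": "day"},   "permit"),
    ({"role": "nurse",  "resource": "record",  "time": "night"}, "deny"),
    ({"role": "doctor", "resource": "record",  "time": "night"}, "permit"),
    ({"role": "clerk",  "resource": "billing", "time": "day"},   "permit"),
    ({"role": "clerk",  "resource": "record",  "time": "day"},   "deny"),
]

vec = DictVectorizer(sparse=False)
X = vec.fit_transform([request for request, _ in log])
y = [decision for _, decision in log]

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
# The tree's branches read as crude attribute-based rules.
print(export_text(clf, feature_names=list(vec.get_feature_names_out())))
```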
13

Sound Quality Analysis of Sewing Machines

Chatterley, James J. 20 May 2005 (has links) (PDF)
Sound quality analysis is a tool for determining customer preferences, which can help the designer improve product quality. Many industries want to know how the consuming public perceives their products, as this affects product life and success. This research investigates which of six sewing machines provided by the Viking Sewing Machine Group (VSM Group) consumers find most acoustically appealing. The sound quality analysis methods used include both jury-based listening tests and quantitative sound quality metrics computed from empirical equations. The two methods are completely independent, yet their results show a very strong correlation. The procedures and results of both methods, jury listening tests and mathematical metrics, are presented. Near-field sound intensity scans identified acoustic hot spots and gave direction for possible design modifications to improve the acoustic signature of the two top-tier machines, the Designer 1 and Creative 2144 (Husqvarna Viking and Pfaff, respectively). The research determined that the entry-level Pfaff Select 1530 has the most acoustically appealing sound of the six machines evaluated. It was also determined that a reduction in the higher-frequency sounds produced by the machines is preferred over a reduction in the lower-frequency sounds. Further investigations, including an evaluation of machine isolation and startup sounds, were also performed. The machine isolation results are highly dependent on the individual machine being evaluated and would require independent evaluation. In the startup sound assessment, the Pfaff Select 1530 again had the preferred sound. Near-field acoustic intensity scans provided additional information on locations of strong acoustic radiation and yielded valuable design information. The acoustic "hot" spots were found in the lower portions of the machines, near the main stepper motor in the Designer 1 and radiating from the bottom plate in the Pfaff Creative 2144. This analysis led to various design modifications that could be implemented to improve the sound quality of the machines, specifically the Designer 1 and the Creative 2144.
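As a small illustration of how jury rankings can be compared against computed sound quality metrics (not the thesis's data; the ranks and loudness values below are invented), a rank correlation such as Spearman's is a common check:

```python
# Illustrative sketch only: correlating jury preference ranks with a computed
# sound quality metric (e.g. loudness). All values are invented.
from scipy.stats import spearmanr

machines = ["A", "B", "C", "D", "E", "F"]
jury_rank = [1, 4, 2, 6, 3, 5]                      # 1 = most preferred by the jury
loudness_sone = [8.2, 11.5, 9.0, 13.1, 9.8, 12.0]   # hypothetical metric values

rho, p = spearmanr(jury_rank, loudness_sone)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
# A strong positive rho here means the less-preferred machines are the louder
# ones, mirroring the strong jury/metric agreement reported in the abstract.
```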
14

Quality Analysis of UAV based 3D Reconstruction and its Applications in Path Planning

Rathore, Aishvarya 04 October 2021 (has links)
No description available.
15

Efektivnost výstavby nízkoenergetických bytových domů / Construction efficiency of low-energy residential buildings

Zgúthová, Katarína January 2009 (has links)
Construction of low-energy residential buildings (i.e. houses with low energy demand) in the Czech Republic is not as common as in other countries in the region. This thesis examines the reasons for this and which obstacles, if any, potential buyers face. The analysis of the economic choice between a "regular" house and a low-energy building will be based primarily on a comparison of the basic parameters of both types of construction; data for this research will be taken from commercial developers. A qualitative analysis in the form of detailed interviews with developers, architects and potential buyers will follow. The last section of the thesis will deal with the approach of the Czech media towards low-energy development: the presentation of this issue to the public will be examined through the frequency and content of media reports.
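To make the economic comparison concrete (the figures below are purely hypothetical and not taken from the thesis), the basic parameters typically compared are the extra construction cost of the low-energy variant and its annual energy savings, for example via simple payback and net present value:

```python
# Illustrative sketch only: comparing a "regular" and a low-energy house.
# All prices, savings and rates are invented for the example.
extra_cost_czk = 350_000      # additional construction cost of the low-energy variant
annual_saving_czk = 18_000    # assumed yearly saving on heating energy
discount_rate = 0.04
horizon_years = 30

simple_payback = extra_cost_czk / annual_saving_czk

npv = -extra_cost_czk + sum(
    annual_saving_czk / (1 + discount_rate) ** t for t in range(1, horizon_years + 1)
)

print(f"Simple payback: {simple_payback:.1f} years")
print(f"NPV over {horizon_years} years: {npv:,.0f} CZK")
```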
16

Determination of FAME in Gasoline : A Fuel Quality Analysis / Kvantitativ mätning av FAME i bensin : En analys av bränslekvalitet

Fransson, Rasmus January 2018 (has links)
Gasoline is produced by distilling petroleum at a refinery, where many other products are made as well. With the increasing interest in biofuels, fuel companies have started to produce substances such as biodiesel alongside their petroleum-based fuels. These products are then transported to where they will be used or sold; they include both gasoline, a petroleum-based fuel, and biodiesel (FAME), which in Sweden is based on rapeseed oil. If the vessel used to transport gasoline, or the pipelines and connections used to fill and empty the tanks, have previously carried biodiesel, there is a risk of contaminating the gasoline with biodiesel. Such contamination can have many different effects, from clogging filters or injectors in both gasoline- and ethanol-fuelled engines to changing the properties, and therefore the quality, of the fuel. To ensure that results from tests and research involving gasoline can be used and compared with each other, the fuel must have the same properties throughout all tests. This is controlled by taking samples on a regular basis and analyzing the quality and the level of impurities in the fuel used in each specific test. Screening for FAME is therefore necessary, which is where this thesis becomes relevant. The thesis was carried out with the purpose of developing a new method, or verifying an already developed one, for quantifying FAME in gasoline. To determine the FAME content, a standard gas chromatography method, IP 585, was used. It was adapted to fit this application, since it was originally designed to determine the FAME content in diesel, not gasoline. It was concluded that it is possible to determine the FAME content in gasoline with IP 585 used as is. Some possible alternatives to IP 585 exist and are discussed in the literature study.
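As a generic illustration of how a chromatographic quantification like this is typically evaluated (this is not the IP 585 procedure itself; the standard concentrations and peak areas are invented), a linear calibration curve is fitted to standards and then applied to a sample:

```python
# Illustrative sketch only: linear calibration of FAME concentration against
# GC peak area. Standard concentrations and peak areas are invented values.
import numpy as np

std_conc_ppm = np.array([0, 50, 100, 200, 400])        # FAME standards in gasoline
std_peak_area = np.array([3, 510, 1015, 2040, 4050])   # corresponding peak areas

slope, intercept = np.polyfit(std_peak_area, std_conc_ppm, 1)

sample_area = 1380.0
sample_conc = slope * sample_area + intercept
print(f"Estimated FAME content: {sample_conc:.0f} ppm (m/m)")
```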
17

Multi-version software quality analysis through mining software repositories

Kriukov, Illia January 2018 (has links)
The main objective of this thesis is to identify how software repository features influence software quality during software evolution. To do so, techniques from the field of mining software repositories were used. This field analyzes the rich data in software repositories to extract interesting and actionable information about software systems, projects and software engineering. The ability to measure code quality and analyze the impact of software repository features on software quality allows us to better understand project history, the quality state of a project and development processes, and to conduct future project analysis. Existing work in the area of software quality describes software quality analysis without a connection to software repository features, and thus loses important information that can be used for preventing bugs, decision-making and optimizing development processes. To conduct the analysis, a dedicated tool was developed covering quality measurement and repository feature extraction. During the research, a general procedure for software quality analysis was defined, described and applied in practice. It was found that there is no single most influential repository feature; a correlation between software quality and software repository features does exist, but it is too small to have a real influence.
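As a minimal sketch of the kind of correlation analysis described above (not the thesis's tool; the releases, features and metric values are invented), rank correlations between repository features and a quality proxy can be computed per release:

```python
# Illustrative sketch only: correlating per-release repository features with a
# code-quality metric. All numbers are invented.
import pandas as pd

releases = pd.DataFrame({
    "version":        ["1.0", "1.1", "1.2", "2.0", "2.1"],
    "commits":        [120, 340, 210, 560, 410],
    "contributors":   [5, 9, 7, 14, 11],
    "avg_complexity": [6.1, 6.4, 6.3, 7.0, 6.8],   # quality proxy, e.g. cyclomatic complexity
})

for feature in ("commits", "contributors"):
    rho = releases[feature].corr(releases["avg_complexity"], method="spearman")
    print(f"{feature} vs avg_complexity: Spearman rho = {rho:.2f}")
```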
18

Analys av byxors kvalitet genom simulerad användning : En undersökning av vilka kvalitetsparametrar som förkortar en vävd byxas användningstid / Quality analysis of trousers through simulated use

Gustafsson, Jonnah, Nilsson, Marcus January 2022 (has links)
A textile product's production process is estimated to account for 80% of its total climate emissions, and production pollutes both water and land. Recycling, though on the rise, is not yet the answer to the negative climate and environmental impact of the textile value chain: all textile production, including recycling, requires resources in the form of water, chemicals and energy. According to studies, the average Swede consumes 14 kg of textiles every year, while the time the textiles remain in use keeps getting shorter because of poor quality. Re-use is an important part of circular textiles, and for it to happen to a greater extent the textiles produced must be of good quality, making a prolonged use-phase more likely. Today, newly produced, unused and conditioned garments are quality-tested directly after production according to current standards by means of mechanical wear, often to the point where the wear can no longer be considered acceptable from a customer perspective. These tests, however, do not measure real use, which means there is no real basis for deciding whether the material used is of sufficient quality to withstand actual use and wear in the longer term, or merely fulfills the requirement specification that has been drawn up. The purpose of the work in this report has been to identify the quality deficiencies that arise when garments are used and, through simulated use with standardized test methods, to evaluate the physical lifespan of the garments, with the prospect that fast-fashion companies can, if possible, use the approach to ensure the quality of their materials. As a case study, one company was studied and six different pairs of trousers were tested. Qualitative and quantitative studies resulted in a number of tests to simulate use and evaluate the physical lifespan of the garments. The tests chosen to simulate wear were repeated washing, abrasion resistance, tear strength, color change and dimensional stability, which represent the visual quality defects that can occur. The results show that the most common quality deficiencies consumers experience are holes in the material, loss of shape and color changes. One of the six trousers met the requirements of the tests performed to validate the test equipment; after all trousers had been tested according to the selected methods, none met all the set requirements. The conclusion of the work is that simulated use is not impossible as a method, but it cannot fully recreate the mechanical daily wear and tear that a garment undergoes with the consumer. To become a fully adequate method, more extensive work is needed, and the results obtained here do not allow concrete suggestions for improvements that would increase longevity. Detailed requirement specifications covering a garment's entire lifespan are required; these should not be general but drawn up in greater detail for the different product categories, in order to maintain control over the whole production chain.
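As a minimal illustration of checking simulated-use test results against a requirement specification of the kind mentioned above (not the thesis's actual thresholds; every number below is invented), a sketch could look like this:

```python
# Illustrative sketch only: checking simulated-use test results against a
# requirement specification. Thresholds and measurements are invented.
REQUIREMENTS = {
    "abrasion_cycles_min": 20_000,        # Martindale cycles before breakdown
    "tear_strength_N_min": 25.0,
    "dimensional_change_pct_max": 3.0,
    "color_change_grade_min": 4.0,        # grey-scale rating after repeated washing
}

def passes(result: dict) -> bool:
    return (
        result["abrasion_cycles"] >= REQUIREMENTS["abrasion_cycles_min"]
        and result["tear_strength_N"] >= REQUIREMENTS["tear_strength_N_min"]
        and result["dimensional_change_pct"] <= REQUIREMENTS["dimensional_change_pct_max"]
        and result["color_change_grade"] >= REQUIREMENTS["color_change_grade_min"]
    )

trouser = {"abrasion_cycles": 18_500, "tear_strength_N": 27.3,
           "dimensional_change_pct": 2.1, "color_change_grade": 3.5}
print("meets requirements" if passes(trouser) else "fails requirements")
```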
19

Language Engineering for Information Extraction

Schierle, Martin 12 July 2011 (has links)
Accompanied by the cultural development towards an information society and knowledge economy, and driven by the rapid growth of the World Wide Web and decreasing prices for technology and disk space, the world's knowledge is evolving fast, and humans are challenged with keeping up. Despite all efforts at data structuring, a large part of this human knowledge is still hidden behind the ambiguities and fuzziness of natural language. Domain language in particular poses new challenges through its specific syntax, terminology and morphology. Companies willing to exploit the information contained in such corpora are often required to build specialized systems instead of being able to rely on off-the-shelf software libraries and data resources. The engineering of language processing systems is, however, cumbersome, and the creation of language resources, annotation of training data and composition of modules is often more art than science. The scientific field of Language Engineering aims at providing reliable information, approaches and guidelines on how to design, implement, test and evaluate language processing systems. Language engineering architectures have been a subject of scientific work for the last two decades and aim at building universal systems of easily reusable components. Although current systems offer comprehensive features and rest on an architecturally sound basis, there is still little documentation about how to actually build an information extraction application. Selecting modules, methods and resources for a particular use case requires a detailed understanding of state-of-the-art technology, application demands and the characteristics of the input text. The main assumption underlying this work is that a new application can only occasionally be created by reusing standard components from different repositories. This work recapitulates existing literature about language resources, processing resources and language engineering architectures to derive a theory of how to engineer a new system for information extraction from a (domain) corpus. The thesis was initiated by Daimler AG to prepare and analyze unstructured information as a basis for corporate quality analysis. It is therefore concerned with language engineering in the area of Information Extraction, which targets the detection and extraction of specific facts from textual data. While other work in the field of information extraction is mainly concerned with the extraction of location or person names, this work deals with automotive components, failure symptoms, corrective measures and their relations of arbitrary arity. The ideas presented in this work are applied, evaluated and demonstrated on a real-world application dealing with quality analysis of automotive domain language. To achieve this goal, the underlying corpus is examined and scientifically characterized, and algorithms are selected with respect to the derived requirements and evaluated where necessary. The system comprises language identification, tokenization, spelling correction, part-of-speech tagging, syntax parsing and a final relation extraction step. The extracted information is used as input to data mining methods such as an early warning system and a graph-based visualization for interactive root cause analysis. It is finally investigated how the unstructured data facilitates these quality analysis methods in comparison to structured data. The acceptance of these text-based methods in the company's processes further proves the usefulness of the created information extraction system.
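As a toy illustration of the final relation-extraction step in such a pipeline (this is not the thesis's system; the vocabularies and the repair note are invented), a dictionary-based tagger combined with a sentence-level co-occurrence rule can already produce component-symptom pairs:

```python
# Illustrative sketch only: extracting (component, symptom) relations from a
# repair note by simple dictionary lookup and sentence-level co-occurrence.
import re

COMPONENTS = ("fuel pump", "stepper motor", "brake pad")
SYMPTOMS = ("noise", "leak", "vibration", "failure")

def extract_relations(text: str):
    relations = []
    for sentence in re.split(r"[.!?]", text.lower()):
        comps = [c for c in COMPONENTS if c in sentence]
        symps = [s for s in SYMPTOMS if s in sentence]
        # Naive rule: relate every component to every symptom in the sentence.
        relations.extend((c, s) for c in comps for s in symps)
    return relations

note = "Customer reports noise and vibration from the fuel pump."
print(extract_relations(note))
# -> [('fuel pump', 'noise'), ('fuel pump', 'vibration')]
```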
20

Desenvolvimento de sensores químicos de baixo custo visando ao monitoramento da qualidade e da potabilidade de águas / Development of low-cost chemical sensors aiming at monitoring water quality and potability

Silva, José Ricardo da 30 October 2018 (has links)
The lack of access to safe drinking water remains a public health problem in Brazil. The development of new low-cost analytical methods for recognizing contaminated samples is needed, since full laboratory analyses are beyond the socioeconomic reality of the most vulnerable population. To contribute to solving this problem, this thesis presents efforts to develop low-cost methods for assessing the quality of environmental waters. The combination of chemometric tools with voltammetric sensors was studied to discriminate water samples contaminated with electroactive species; the developed model satisfactorily discriminated contaminated samples containing lead(II), copper(II), zinc(II) and nitrite. Efforts to reduce the cost of the analyses also focused on building sensors from low-cost materials. One of the proposed voltammetric devices, made of paper, graphite and wax, was able to quantify heavy metals and pesticides. A sound-based stirring system was also developed, which significantly increased the sensitivity of the portable voltammetric devices, allowing the quantification of lead(II) down to 48 nmol L-1, cadmium(II) down to 370 nmol L-1 and zinc(II) down to 340 nmol L-1. Another voltammetric sensor was fabricated using only cardboard as the raw material, for which a CO2 laser was used for the first time to pyrolyze the cardboard surface and generate conductive carbon structures. Paper-based colorimetric sensors were successfully tested for fluoride quantification in mineral water samples, with a limit of quantification of 500 µmol L-1, using photographs taken with a mobile phone to build the calibration models. With another paper-based colorimetric system it was possible to measure the pH of samples using a multivariate calibration method. As shown in this work, the development and integration of paper-based analytical devices is a comprehensive, reliable and low-cost alternative for the analysis of environmental water quality.
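As a generic sketch of the multivariate calibration idea mentioned for the paper-based pH sensor (not the model used in the thesis; the RGB readings and pH values are invented), a least-squares fit can map color channels extracted from photographs to pH:

```python
# Illustrative sketch only: multivariate linear calibration mapping mean RGB
# values of a paper sensor spot to pH. All numbers are invented.
import numpy as np

# Calibration set: mean R, G, B of the spot photographed at known pH values.
rgb = np.array([
    [210, 120,  60],
    [190, 140,  80],
    [160, 160, 110],
    [130, 175, 140],
    [100, 185, 170],
], dtype=float)
ph = np.array([4.0, 5.0, 6.0, 7.0, 8.0])

# Add an intercept column and solve the least-squares problem.
X = np.hstack([rgb, np.ones((rgb.shape[0], 1))])
coef, *_ = np.linalg.lstsq(X, ph, rcond=None)

new_spot = np.array([145.0, 168.0, 125.0, 1.0])   # photo of an unknown sample
print(f"Predicted pH: {new_spot @ coef:.2f}")
```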
