  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
71

Využití propojených dat na webu ke tvorbě strategické znalostní hry / The use of linked open data for strategic knowledge game creation

Turečková, Šárka January 2015 (has links)
The general theme of this thesis is the use of linked open data for game creation. Specifically, it addresses the use of DBpedia for automatically generating questions suitable for games. It proposes suitable ways of selecting the desired objects from DBpedia and of obtaining and processing relevant information about them, including a method for estimating the renown of individual objects. Some of these methods are then applied in a program that generates questions from DBpedia data retrieved at application runtime. The practical feasibility of using questions generated from DBpedia for gaming purposes is subsequently demonstrated through the design, prototyping and testing of a strategic multiplayer knowledge game. The thesis also summarizes the major issues and possible complications of using data obtained through the DBpedia or DBpedia Live endpoints, and briefly discusses current challenges and opportunities for the mutual utilization of games and LOD.
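The automatic question generation described above can be illustrated with a minimal sketch: turning an RDF-style fact, as might be retrieved from the DBpedia endpoint via SPARQL, into a multiple-choice quiz item. The property phrasings, helper names and sample facts below are illustrative assumptions, not the thesis's actual implementation.

```python
import random

# Hypothetical question phrasings for a few DBpedia-style properties.
PHRASES = {
    "birthPlace": "Where was {s} born?",
    "capital": "What is the capital of {s}?",
}

def make_question(subject, prop, answer, distractors, rng=None):
    """Build a multiple-choice question from one (subject, property, value) fact."""
    rng = rng or random.Random(0)  # fixed seed for reproducible option order
    options = [answer] + list(distractors)
    rng.shuffle(options)
    return {
        "question": PHRASES[prop].format(s=subject),
        "options": options,
        "answer": answer,
    }

q = make_question("Albert Einstein", "birthPlace", "Ulm",
                  ["Bern", "Munich", "Vienna"])
print(q["question"])  # Where was Albert Einstein born?
```

In the thesis's setting the facts would come from SPARQL queries against DBpedia, and the distractors would be drawn from objects of the same type with comparable renown.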
72

Efektivita procesu oceňování stavební výroby / Effectiveness of the process of valuation of construction production

Sofka, Filip January 2019 (has links)
This master's thesis deals with the effectiveness of construction production valuation. The theoretical part covers the valuation of construction production, an analysis of how demanding the valuation process is, the principles of budget preparation, and an introduction to the BIM process in the construction industry. The practical part addresses a specific way of making the valuation of construction production more efficient. Effectiveness is examined in the bill-of-quantities preparation phase using BIM models and compared with a CAD model. The result is a definition of the possible savings in preparing a bill of quantities using a BIM model. The final part is devoted to a description of standardization in the construction industry.
73

Implementace algoritmu LoD terénu / Terrain LoD Algorithm Implementation

Radil, Přemek January 2012 (has links)
This thesis discusses the implementation of the LoD terrain visualization algorithm Seamless Patches for GPU-Based Terrain Rendering as an extension of the Coin3D library. It presents the procedures this algorithm uses to display large terrain datasets. The entire terrain is composed of patches stored in a patch hierarchy. The hierarchy is traversed at runtime to generate the active patches based on the observer's position. Each patch consists of predefined tiles and connection strips, so it does not need to store any geometry. A displacement shader is applied when the tiles and strips are rendered. The thesis also evaluates the results achieved in a sample application and suggests modifications to further increase the algorithm's performance.
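The patch-selection step described above can be sketched as a simple recursive traversal: a patch is refined into its children only while it is large relative to its distance from the observer. The error metric and threshold below are illustrative assumptions, not the exact criteria of the Seamless Patches algorithm.

```python
import math

class Patch:
    """A square terrain patch; leaf patches have no children."""
    def __init__(self, center, size, children=()):
        self.center, self.size, self.children = center, size, children

def active_patches(patch, observer, k=2.0):
    """Collect the patches to render: refine a patch into its children
    while its size is large relative to its distance from the observer."""
    d = math.dist(patch.center, observer)
    if patch.children and patch.size > d / k:
        out = []
        for child in patch.children:
            out.extend(active_patches(child, observer, k))
        return out
    return [patch]

root = Patch((0.0, 0.0), 8.0,
             children=[Patch((x, y), 4.0)
                       for x in (-2.0, 2.0) for y in (-2.0, 2.0)])
print(len(active_patches(root, (100.0, 100.0))))  # 1: distant, coarse root only
print(len(active_patches(root, (0.0, 0.0))))      # 4: nearby, refined children
```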
74

Optimalizace renderování rozsáhlého terénu / Optimization of Large Scale Terrain Rendering

Luner, Radek January 2010 (has links)
This work focuses on the optimization of large-scale terrain rendering. It explains the basic methods and data structures used for optimization and describes the fundamentals of methods such as ROAM, Geometrical Clipmaps, GPU-Based Geometrical Clipmaps, GeoMipMapping and Chunked LOD. It then explains the implementation details of a terrain-optimization system based on the GeoMipMapping method.
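The core idea of GeoMipMapping-style level selection can be sketched as picking a coarser mip level for terrain blocks farther from the camera. The distance thresholds below are illustrative assumptions, not the system's actual tuning.

```python
def mip_level(distance, base=100.0, max_level=4):
    """Pick a detail level: each doubling of distance beyond `base`
    drops one level, clamped to `max_level` (0 = full detail)."""
    level, d = 0, base
    while distance > d and level < max_level:
        d *= 2.0
        level += 1
    return level

print(mip_level(50.0))    # 0  (close: full-resolution block)
print(mip_level(150.0))   # 1
print(mip_level(5000.0))  # 4  (clamped to the coarsest level)
```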
75

Metodutveckling för bestämning av vattenhalten i en frystorkad proteinprodukt / Method development for determination of the water contentin a freeze-dried protein product

Said, Rana January 2018 (has links)
The water content of a lyophilized protein drug was determined using Karl Fischer titration (KF) and loss on drying (LOD), two primary, traditional methods used in several industries to determine the water content of products. Instead of KF and LOD, which are time-consuming and destructive, the secondary, fast and non-destructive method near-infrared spectroscopy (NIR) can be used. To implement NIR, a calibration against a primary method must be performed. The purpose of this study was to find, through quantitative analysis, a relation between NIR and KF and between NIR and LOD for the lyophilized product. The initial step was to establish a calibration set and collect spectra with NIR. Attempts to determine the water content of the lyophilized protein product with KF and LOD were then made in initial tests, which were verified by spiking samples with water. LOD was performed in a desiccator, and the KF analyses were performed coulometrically in two different ways: with and without extraction of the samples with methanol. The last step was to correlate the collected NIR spectra with the determined reference values by creating a calibration model with partial least squares (PLS). Prior to model development, the collected spectra were pre-treated with multiplicative scatter correction (MSC), the second derivative and a Savitzky-Golay filter. The results of the LOD analyses with a desiccator indicated that the approach used was not suitable for the product, and no calibration model was established for it. The water content was determined by KF with extraction of eight samples to 0.144-0.558 % [w/w], and by KF without extraction of eight samples to 2.34-2.972 % [w/w]. The expected value was 0.62-1.61 % [w/w]. A calibration model was established for KF with extraction, for KF without extraction and for the expected reference interval, resulting in correlation factors of 0.9496, 0.97418 and 0.9932, respectively. KF with and without extraction was not suitable for the product in this project; more experiments with larger calibration sets are needed before a final conclusion can be drawn.
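One of the pre-treatments mentioned above, multiplicative scatter correction (MSC), can be sketched as regressing each spectrum against the mean spectrum and correcting it as (x - a) / b. This is a generic pure-Python illustration, not the software used in the study.

```python
def msc(spectra):
    """Multiplicative scatter correction: each spectrum is least-squares
    regressed against the mean spectrum (slope b, intercept a) and
    corrected as (x - a) / b."""
    n = len(spectra[0])
    mean = [sum(s[i] for s in spectra) / len(spectra) for i in range(n)]
    m_avg = sum(mean) / n
    corrected = []
    for s in spectra:
        s_avg = sum(s) / n
        # least-squares slope and intercept of s against the mean spectrum
        b = sum((mean[i] - m_avg) * (s[i] - s_avg) for i in range(n)) \
            / sum((mean[i] - m_avg) ** 2 for i in range(n))
        a = s_avg - b * m_avg
        corrected.append([(x - a) / b for x in s])
    return corrected

# Two spectra differing only by a multiplicative scatter factor collapse
# onto the mean spectrum after correction.
print([round(v, 6) for v in msc([[1.0, 2.0, 3.0], [2.0, 4.0, 6.0]])[1]])  # [1.5, 3.0, 4.5]
```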
76

Linked Open Data Alignment & Querying

Jain, Prateek 27 August 2012 (has links)
No description available.
77

Utvärdering av multikriterieanalys som verktyg för spatial resursallokering av dagvattenåtgärder för tillskottsvatten i spillvattennät / Evaluation of multi criteria analysis as a tool for spatial resource allocation of stormwater measures for inflow and infiltration to the sewage water system

Vallin, Hanna January 2016 (has links)
Urban expansion and an ever-larger share of hardened surfaces lead to problems with stormwater management. Flows become faster and larger volumes than before form runoff, so it is important that the existing systems can handle them. Stormwater from roofs and hardened surfaces and drainage water from properties with basements can be connected to the sewage network, so that water flows in the pipes during heavy rain become considerably larger than the pipes were designed for, with capacity problems as a result. A solution many water utilities have implemented is to use open stormwater solutions instead of the more costly alternative of expanding the pipe network. Reconnecting downspouts, gullies and drainage pipes and digging ditches to redirect the water is both cost- and resource-intensive. The aim was therefore to investigate whether multi-criteria analysis can serve as a suitable tool for allocating resources to the areas that give the most benefit per krona spent. This was done by developing a methodology for this purpose and testing its robustness, to determine whether it is suitable to apply or whether the uncertainties in the parameter values are too large for any conclusions to be drawn. The methodology was tested on Bjursås, a small community twenty kilometres outside Falun, since extensive investigations had been carried out in the area before, which meant that a great deal of data was available. It was investigated where the volumes could be expected to be large, where the risk of basement flooding was high and which areas contributed most to sewer overflows. This was evaluated together with expected measure costs, and an overall assessment was made of where the benefit per krona spent should be greatest. Large uncertainties were found in the assessment. The conclusion was that the method can serve as support for decision-making, but that it is not robust enough to be used on its own: subsequent evaluation and critical review of the results are necessary, suitably through a sensitivity analysis. For the results to constitute a useful tool for the water utility, the costs must also be evaluated more thoroughly than in this study.
The robustness of the method can be questioned since the uncertainties can be substantial. To use the method, a great deal of data is needed, and the method has to be updated relatively often to remain relevant. A sensitivity analysis is recommended alongside the method, since using only one set of parameters can make the result relatively arbitrary. Using a Monte Carlo procedure with the uncertainties defined can reduce the time needed to measure and identify the values. The sensitivity analysis showed that the parameters with the largest impact on the results are the number of residents in each property with a basement, the catchment areas, the roughness parameters of the pipelines and the use of energy and chemicals in the system. In the future, urbanization is expected to increase, as are rainfall amounts, and problems related to inflow and infiltration are expected to become more common. The hope is that the method and its results will be useful in the municipalities' future planning and will inspire further studies on this topic.
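The weighted multi-criteria scoring underlying such an analysis can be sketched as normalising each criterion to [0, 1] and combining the criteria with weights. The criterion names and weights below are illustrative assumptions, not the study's actual parameter set.

```python
def mca_scores(areas, weights):
    """Weighted-sum multi-criteria score per area.
    areas: {name: {criterion: value}} with higher value = more benefit;
    weights: {criterion: weight}. Values are min-max normalised first."""
    crits = list(weights)
    lo = {c: min(a[c] for a in areas.values()) for c in crits}
    hi = {c: max(a[c] for a in areas.values()) for c in crits}
    def norm(c, v):
        return 0.0 if hi[c] == lo[c] else (v - lo[c]) / (hi[c] - lo[c])
    return {name: sum(weights[c] * norm(c, a[c]) for c in crits)
            for name, a in areas.items()}

scores = mca_scores(
    {"A": {"basement_flood_risk": 10, "overflow_contribution": 5},
     "B": {"basement_flood_risk": 2, "overflow_contribution": 1}},
    {"basement_flood_risk": 0.5, "overflow_contribution": 0.5})
print(scores["A"], scores["B"])  # 1.0 0.0
```

A sensitivity analysis, as recommended above, would rerun this scoring while perturbing the weights and criterion values to see how stable the ranking is.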
78

Ovidkommande dagvatten i spillvattenledningar - En fallstudie av dagvattenhantering i ett bostadsområde i Hok / Irrelevant stormwater in wastewater pipes - A case study of stormwater management in a residential area in Hok

El Masry, Josef, Alkazragi, Miher January 2015 (has links)
No description available.
79

Limit of detection for second-order calibration methods

Rodríguez Cuesta, Mª José 02 June 2006 (has links)
Analytical chemistry can be split into two main types, qualitative and quantitative. Most modern analytical chemistry is quantitative. Popular sensitivity to health issues is aroused by the mountains of government regulations that use science to, for instance, provide public health information to prevent disease caused by harmful exposure to toxic substances. The concept of the minimum amount of an analyte or compound that can be detected or analysed appears in many of these regulations (for example, to discard the presence of traces of toxic substances in foodstuffs), generally as part of method validation aimed at reliably evaluating the validity of the measurements. The lowest quantity of a substance that can be distinguished from the absence of that substance (a blank value) is called the detection limit or limit of detection (LOD). Traditionally, in the context of simple measurements where the instrumental signal depends only on the amount of analyte, a multiple of the blank value is taken to calculate the LOD (traditionally, the blank value plus three times the standard deviation of the measurement). However, the increasing complexity of the data that analytical instruments can provide for incoming samples leads to situations in which the LOD cannot be calculated as reliably as before. Measurements, instruments and mathematical models can be classified according to the type of data they use. Tensorial theory provides a unified language that is useful for describing chemical measurements, analytical instruments and calibration methods. Instruments that generate two-dimensional arrays of data are second-order instruments. A typical example is a spectrofluorometer, which provides a set of emission spectra obtained at different excitation wavelengths. The calibration methods used with each type of data have different features and complexity.
In this thesis, the most commonly used calibration methods are reviewed, from zero-order (or univariate) to second-order (or multilinear) calibration models. Second-order calibration models are treated in detail since they are the ones applied in the thesis. Specifically, the following methods are described: PARAFAC (Parallel Factor Analysis), ITTFA (Iterative Target Transformation Factor Analysis), MCR-ALS (Multivariate Curve Resolution-Alternating Least Squares) and N-PLS (Multi-linear Partial Least Squares). Analytical methods should be validated. The validation process typically starts by defining the scope of the analytical procedure, which includes the matrix, target analyte(s), analytical technique and intended purpose. The next step is to identify the performance characteristics that must be validated, which may depend on the purpose of the procedure, and the experiments for determining them. Finally, validation results should be documented, reviewed and maintained (if not, the procedure should be revalidated) as long as the procedure is applied in routine work. The figures of merit of a chemical analytical process are 'those quantifiable terms which may indicate the extent of quality of the process. They include those terms that are closely related to the method and to the analyte (sensitivity, selectivity, limit of detection, limit of quantification, ...) and those which are concerned with the final results (traceability, uncertainty and representativity)' (Inczédy et al., 1998). The aim of this thesis is to develop theoretical and practical strategies for calculating the limit of detection in complex analytical situations. Specifically, I focus on second-order calibration methods, i.e. when a matrix of data is available for each sample. The methods most often used for making detection decisions are based on statistical hypothesis testing and involve a choice between two hypotheses about the sample.
The first hypothesis is the "null hypothesis": the sample is analyte-free. The second hypothesis is the "alternative hypothesis": the sample is not analyte-free. In the hypothesis test there are two possible types of decision errors. An error of the first type occurs when the signal for an analyte-free sample exceeds the critical value, leading one to conclude incorrectly that the sample contains a positive amount of the analyte. This type of error is sometimes called a "false positive". An error of the second type occurs if one concludes that a sample does not contain the analyte when it actually does, and it is known as a "false negative". In zero-order calibration, this hypothesis test is applied to the confidence intervals of the calibration model to estimate the LOD, as proposed by Hubaux and Vos (A. Hubaux, G. Vos, Anal. Chem. 42: 849-855, 1970). One strategy for estimating multivariate limits of detection is to transform the multivariate model into a univariate one. This strategy has been applied in this thesis in three practical applications: 1. LOD for PARAFAC (Parallel Factor Analysis); 2. LOD for ITTFA (Iterative Target Transformation Factor Analysis); 3. LOD for MCR-ALS (Multivariate Curve Resolution-Alternating Least Squares). In addition, the thesis includes a theoretical contribution with the proposal of a sample-dependent LOD in the context of multivariate (PLS) and multi-linear (N-PLS) Partial Least Squares.
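The traditional zero-order decision rule mentioned above (blank value plus three times the standard deviation of the blank measurements, converted to concentration through the slope of a univariate calibration line) can be sketched as follows. The blank readings and slope are invented for illustration.

```python
import statistics

def lod_from_blanks(blanks, slope, intercept=0.0, k=3.0):
    """Classical univariate LOD: signal threshold = mean(blank) + k*sd(blank),
    mapped to concentration via a linear calibration y = intercept + slope*x."""
    y_lod = statistics.mean(blanks) + k * statistics.stdev(blanks)
    return (y_lod - intercept) / slope

blanks = [0.10, 0.12, 0.09, 0.11, 0.08]  # invented blank signals
print(round(lod_from_blanks(blanks, slope=2.0), 4))  # 0.0737
```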
80

Palladium telluride quantum dots biosensor for the determination of indinavir drug

Feleni, Usisipho January 2013 (has links)
Magister Scientiae - MSc / Indinavir is a potent and well-tolerated protease inhibitor used as a component of the highly active antiretroviral therapy (HAART) for HIV/AIDS, whose pharmacokinetics may be favourable or adverse. These drugs work by maintaining a plasma concentration sufficient to inhibit viral replication, thereby suppressing a patient's viral load. A number of antiretroviral drugs, including indinavir, undergo metabolism catalysed by the cytochrome P450-3A4 enzyme found in human liver microsomes. The rate of drug metabolism influences a patient's response to treatment as well as drug interactions that may lead to life-threatening toxic conditions, such as haemolytic anaemia, kidney failure and liver problems. Therapeutic drug monitoring (TDM) during HIV/AIDS treatment has been suggested to have the potential to reduce drug toxicity and optimise individual therapy. A fast and reliable detection technique, such as biosensing, is therefore necessary for determining a patient's metabolic profile for indinavir and for appropriate dosing of the drugs. In this study, the biosensors developed for the determination of ARV drugs comprised cysteamine self-assembled on a gold electrode, onto which was attached 3-mercaptopropionic acid-capped palladium telluride (3-MPA-PdTe) or thioglycolic acid-capped palladium telluride (TGA-PdTe) quantum dots cross-linked to cytochrome P450-3A4 (CYP3A4) in the presence of 1-ethyl-3-(3-dimethylaminopropyl)carbodiimide hydrochloride and N-hydroxysuccinimide. The quantum dots were synthesized in the presence of capping agents (3-MPA or TGA) to improve their stability, solubility and biocompatibility. The capping of the PdTe quantum dots with TGA or 3-MPA was confirmed by FTIR, where the S-H absorption band disappeared from the spectra of 3-MPA-PdTe and TGA-PdTe. The particle size of the quantum dots (< 5 nm) was estimated from high-resolution transmission electron microscopy (HRTEM) measurements. The optical properties of the materials were confirmed by UV-Vis spectrophotometry, which produced absorption bands at ~320 nm corresponding to energy band gap values of 3 eV for TGA-PdTe and 3.87 eV for 3-MPA-PdTe quantum dots. The electrocatalytic properties of the quantum dot biosensor systems were studied by cyclic voltammetry (CV), for which the characteristic reduction peak at 0.75 V was used to detect the response of the biosensor to indinavir. Results for an indinavir biosensor constructed with 3-MPA-SnSe quantum dots are also reported in this thesis. The three biosensor systems were very sensitive towards indinavir and gave low limit of detection (LOD) values of 3.22, 4.3 and 6.2 ng/mL for the 3-MPA-SnSe, 3-MPA-PdTe and TGA-PdTe quantum dot biosensors, respectively. The LOD values are within the 'maximum plasma concentration' (Cmax) range of indinavir (5 - 15 ng/mL) normally observed 8 h after drug intake.
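A common way to obtain LOD values like those quoted above from a sensor calibration line is the 3.3·σ/slope rule, where σ is the standard deviation of the calibration residuals. This is a generic sketch with invented calibration points, not the procedure of the thesis.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

def lod_33(xs, ys):
    """LOD = 3.3 * sd(residuals) / slope of the calibration line."""
    a, b = fit_line(xs, ys)
    resid = [y - (a + b * x) for x, y in zip(xs, ys)]
    sd = (sum(r * r for r in resid) / (len(xs) - 2)) ** 0.5
    return 3.3 * sd / b

# Invented calibration: concentrations (ng/mL) vs sensor response.
print(round(lod_33([0, 5, 10, 15, 20], [0.1, 10.0, 19.9, 30.2, 39.8]), 2))  # 0.28
```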
