  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

The capture and integration of construction site data

Ward, Michael James January 2004
The use of mobile computing on the construction site has been a well-researched area since the early 1990s; however, computing remains scarce on the construction site. Where computers are used on site, it tends to be by knowledge workers using a laptop or PC in the site office, with electronic data collection being the exception rather than the norm. The problems associated with paper-based documentation on the construction site have long been recognised (Baldwin et al., 1994; McCullough, 1993), yet there still seems to be reluctance to replace it with electronic alternatives. Many reasons exist for this, such as low profit margins, perceived high cost, perceived lack of available hardware and perceived inability of the workforce. However, the benefits that can be gained from the successful implementation of IT on the construction site, and the ability to re-use construction site data to improve company performance, whilst difficult to cost, are clearly visible. This thesis presents the development and implementation of a data capture system (SHERPA) for managing the construction of rotary bored piles. Operated by the site workforce, SHERPA comprises a wireless network, a site-based server and web-based data capture using tablet computers. This research intends to show that mobile computing technologies can be implemented on the construction site and that substantial benefits can be gained for the company from the re-use and integration of the captured site data.
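As an illustration only (not taken from the thesis), the following minimal sketch shows the capture pattern the abstract describes: a tablet on the site wireless network posts one pile record to a site-based server, which stores it for later re-use. The endpoint name, record fields and the SQLite store are assumptions.

    import sqlite3
    from flask import Flask, request, jsonify

    app = Flask(__name__)
    DB = "site_records.db"  # hypothetical site-based store

    def init_db():
        with sqlite3.connect(DB) as con:
            con.execute("""CREATE TABLE IF NOT EXISTS piles (
                pile_id TEXT, bored_depth_m REAL, concrete_volume_m3 REAL,
                recorded_by TEXT, recorded_at TEXT)""")

    @app.route("/piles", methods=["POST"])
    def capture_pile():
        # A tablet posts one pile record as JSON over the site wireless network.
        rec = request.get_json(force=True)
        with sqlite3.connect(DB) as con:
            con.execute("INSERT INTO piles VALUES (?, ?, ?, ?, ?)",
                        (rec["pile_id"], rec["bored_depth_m"],
                         rec["concrete_volume_m3"], rec["recorded_by"],
                         rec["recorded_at"]))
        return jsonify({"status": "stored"}), 201

    if __name__ == "__main__":
        init_db()
        app.run(host="0.0.0.0", port=8080)  # served from the site-based server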
2

Seasonal Effects on Soil Drying After Irrigation

Kimball, B. A., Jackson, R. D. 23 April 1971
From the Proceedings of the 1971 Meetings of the Arizona Section - American Water Resources Assn. and the Hydrology Section - Arizona Academy of Science - April 22-23, 1971, Tempe, Arizona / A study was made to determine how the evaporation rate from a bare Adelanto loam soil in Phoenix changes with season and with time since the last irrigation. The evaporation rates were determined by precision lysimeters in a bare field, with measurements taken in every month of the year for at least a week after each irrigation. The data exhibited a cosine-shaped seasonal curve, with a maximum evaporation rate of about 5 mm/day in summer and a minimum rate of about 2 mm/day in winter. By the seventh day, seasonal effects had virtually disappeared and the evaporation rate was the same in both summer and winter, being about 2 mm/day after the 7th day and about 0.75 mm/day after the 21st day. It is generally accepted that soil dries in three stages, with the transition between the first and second stages occurring when atmospheric conditions are no longer critical. In previous laboratory studies of soil drying under constant atmospheric conditions, stage 1 was easily distinguished from stage 2, and those results correlated closely with the equations of Gardner and Hillel. The individual drying curves of this field study were qualitatively different from the laboratory studies and did not confirm the predictions of the equations, suggesting that diurnal variations in temperature and other meteorological parameters caused the difference.
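For orientation only (the fitted form and phase used in the study are not given in this abstract), the stated summer maximum of about 5 mm/day and winter minimum of about 2 mm/day imply a seasonal cosine of the day of year d, peaking at midsummer d_max, of roughly

    E(d) \approx \bar{E} + A\,\cos\!\left(\frac{2\pi\,(d - d_{\max})}{365}\right),
    \qquad \bar{E} = \tfrac{5+2}{2} = 3.5\ \text{mm/day},
    \quad A = \tfrac{5-2}{2} = 1.5\ \text{mm/day}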
3

Field Measurements of Soil-Water Content and Soil-Water Pressure

Reginato, R. J., Jackson, R. D. 23 April 1971
From the Proceedings of the 1971 Meetings of the Arizona Section - American Water Resources Assn. and the Hydrology Section - Arizona Academy of Science - April 22-23, 1971, Tempe, Arizona / Knowledge of the dynamic water content-pressure potential relationship within the soil profile is useful in determining the importance of hysteresis under natural conditions. Continuous monitoring of water content in the field is now possible using recently developed gamma-ray transmission equipment, which allows water content measurements in 1 cm-thick soil layers with an error of 0.0009 g/g. The nuclear equipment and the tensiometer assembly for pressure measurements are described. Soil water content and pressure in the top 10 cm of a field soil profile were measured continuously for a two-week period following an irrigation. The highest water content was measured each day just before sunrise; it declined rapidly from early morning to early afternoon and was followed by a gain during mid-afternoon and evening. The amplitude of this diurnal change diminished with time after irrigation. The pressure potential at a depth of 1.5 cm decreased most rapidly as the water content declined, but not exactly in phase with it; this may have been due to temperature effects on the pressure metering system. A moisture characteristic curve was constructed from the data.
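As background only (not the calibration reported in the paper), gamma-ray transmission gauges rest on Beer-Lambert attenuation: comparing the count rate transmitted through wet soil with that through the same layer when dry recovers the layer's water content. For a layer of thickness x,

    I_{\mathrm{wet}} = I_{\mathrm{dry}}\, e^{-\mu_w\,\theta\,x}
    \quad\Rightarrow\quad
    \theta \approx \frac{1}{\mu_w\,x}\,\ln\frac{I_{\mathrm{dry}}}{I_{\mathrm{wet}}}

where θ is the volumetric water content and μ_w the attenuation coefficient of water; the single-energy form and constant bulk density are assumptions here, and gravimetric content (g/g) follows on dividing by the soil bulk density.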
4

Recharging the Ogallala Formation Using Shallow Holes

Dvoracek, M. J., Peterson, S. H. 23 April 1971
From the Proceedings of the 1971 Meetings of the Arizona Section - American Water Resources Assn. and the Hydrology Section - Arizona Academy of Science - April 22-23, 1971, Tempe, Arizona / The southern bed of the Ogallala aquifer is hydrologically isolated from all outside areas of recharge, so all natural recharge must come from local precipitation. Current withdrawals are so much greater than natural recharge that artificial recharge appears to afford the only means of establishing at least a pseudo-balance. A number of observation wells were drilled at Texas Tech University and subsequently capped until recharge water became available. The initial recharge was 2.5 acre-feet (af) over 12 days, at a rate of about 120 gpm for the first day, after which the rate held relatively constant at about 60 gpm. Approximately one month later, 1.2 af were recharged over 3 days at rates ranging from 140 down to 90 gpm. It became evident that a cavity was present at the bottom of the hole being recharged, and on a later recharge occasion the cavity seemed to have enlarged. Over a period of 2 years, more than 28 af of surface runoff water have been recharged through the shallow hole, with recharge rates increasing in each subsequent recharge period. The nature of this phenomenon and of the cavities is not understood. This may represent the long-sought answer to recharge of the aquifer, but much more extensive research needs to be done.
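As a unit-conversion aid only (not from the paper), the sketch below relates the reported pumping rates in gallons per minute to recharge volumes in acre-feet; the constant 325,851 gal per acre-foot is standard, and the example rate and duration are illustrative.

    # Convert a recharge rate in gallons per minute (gpm), sustained for a
    # number of days, into a volume in acre-feet (af).
    GALLONS_PER_ACRE_FOOT = 325_851.4

    def gpm_to_acre_feet(rate_gpm: float, days: float) -> float:
        gallons = rate_gpm * 60 * 24 * days
        return gallons / GALLONS_PER_ACRE_FOOT

    # Example: roughly one day at 120 gpm
    print(round(gpm_to_acre_feet(120, 1), 2))  # ~0.53 af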
5

Nitrogen Balance for a 23-Square Mile Minnesota Watershed

Johnson, Jack D. 23 April 1971
From the Proceedings of the 1971 Meetings of the Arizona Section - American Water Resources Assn. and the Hydrology Section - Arizona Academy of Science - April 22-23, 1971, Tempe, Arizona / The nitrogen balance of a watershed near the city of New Prague, Minnesota, was evaluated as part of an overall study on lake and stream eutrophication. Although the nitrogen balance of a humid Midwest watershed cannot be expected to be identical to that of an arid watershed, the processes are the same and differences should be mainly quantitative. Sources of input and causes of depletion are reviewed for four points in the nitrogen cycle: the atmospheric zone, the soil-atmosphere interface, the plant-root and soil-water zone, and the surface-water zone. In the New Prague watershed, commercial fertilizer and bulk precipitation were the major sources of input, contributing, respectively, 53% and 34.4% of the total input of 2.34 million lb/yr. Crop yield and soil or groundwater storage accounted for 52.1% and 20.4% of non-enrichment depletions, respectively. The closeness of the values of crop yield and commercial fertilizer application was an unfortunate coincidence and is certainly not an indication that the entire fertilizer supply was taken up by crops. In an arid environment, free from fertilized agriculture, bulk precipitation probably provides the major source of nitrogen compounds.
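A small arithmetic sketch (illustrative only) turning the reported input shares into annual amounts; only the 2.34 million lb/yr total and the quoted percentages come from the abstract.

    # Reported nitrogen inputs for the New Prague watershed.
    total_input_lb_per_yr = 2.34e6
    shares = {"commercial fertilizer": 0.53, "bulk precipitation": 0.344}

    for source, share in shares.items():
        print(f"{source}: {share * total_input_lb_per_yr:,.0f} lb/yr")
    # commercial fertilizer: ~1,240,200 lb/yr; bulk precipitation: ~804,960 lb/yr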
6

Use of Stock Ponds for Hydrologic Research on Southwest Rangelands

Simanton, J. R., Osborn, H. B. 05 May 1973
From the Proceedings of the 1973 Meetings of the Arizona Section - American Water Resources Assn. and the Hydrology Section - Arizona Academy of Science - May 4-5, 1973, Tucson, Arizona / Five livestock watering ponds on the Walnut Gulch Experimental Watershed were instrumented to evaluate the use of these ponds as a method for comparing rainfall amounts with runoff and sediment volumes. Pond drainage area, vegetative cover, soil type, percent slope, and years of record were tested. Instrumentation consisted of water-level recorders and a topographic survey of each stock pond to ascertain its storage capacity. The results to date have been insufficient to reach definite conclusions, owing to instrumentation and surveying problems and to the natural variability of thunderstorm rainfall. Since most of these problems have now been corrected, future records from these instrumented stock ponds should yield valuable hydrologic data for semiarid rangelands.
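As an illustration of how such instrumentation is typically used (details assumed, not from the paper), the sketch below converts a water-level (stage) record into stored volume via a stage-capacity curve of the kind obtained from the topographic survey, so a storm's runoff volume can be estimated from the rise in stage. The curve values are hypothetical.

    import numpy as np

    # Hypothetical stage-capacity curve for one pond:
    # stage above the pond bottom (ft) vs. stored volume (acre-feet).
    stage_ft = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    capacity_af = np.array([0.0, 0.3, 0.9, 1.8, 3.0])

    def storage(stage: float) -> float:
        """Interpolate stored volume (af) from a recorded stage (ft)."""
        return float(np.interp(stage, stage_ft, capacity_af))

    # Runoff volume of a storm ~ rise in storage between pre- and post-storm
    # stages (ignoring seepage and evaporation during the event).
    runoff_af = storage(2.6) - storage(1.1)
    print(round(runoff_af, 2))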
7

Probabilistic methods for multi-source and temporal biomedical data quality assessment

Sáez Silvestre, Carlos 05 April 2016
[EN] Nowadays, biomedical research and decision making depend to a great extent on the data stored in information systems. As a consequence, a lack of data quality (DQ) may lead to suboptimal decisions, or hinder the derived research processes and outcomes. This thesis addresses the research and development of methods for assessing two DQ problems of special importance in Big Data and large-scale repositories based on multi-institutional, cross-border infrastructures and acquired over long periods of time: the variability of data probability distributions (PDFs) among different data sources (multi-source variability) and the variability of data PDFs over time (temporal variability). Variability in PDFs may be caused by differences in data acquisition methods, protocols or health care policies; systematic or random errors during data input and management; demographic differences in populations; or even falsified data. To date, these issues have received little attention as DQ problems, nor do adequate assessment methods exist for them. The developed methods aim to measure, detect and characterize such variability while dealing with multi-type, multivariate and multi-modal data, and without being affected by large sample sizes. To this end, we defined an Information Theory and Geometry probabilistic framework based on inferring non-parametric statistical manifolds from the normalized distances between the PDFs of data sources and over time. Based on this, a number of contributions have been generated. For the multi-source variability assessment we have designed two metrics: the Global Probabilistic Deviation, which measures the degree of global variability among the PDFs of multiple sources (equivalent to the standard deviation among PDFs); and the Source Probabilistic Outlyingness, which measures the dissimilarity of the PDF of a single data source to a global latent average. They are based on the construction of a simplex geometrical figure (the maximum-dimensional statistical manifold) using the distances among sources, and are complemented by the Multi-Source Variability plot, an exploratory visualization of that simplex which permits detecting grouping patterns among sources. The temporal variability method provides two main tools: the Information Geometric Temporal plot, an exploratory visualization of the temporal evolution of PDFs based on the projection of the statistical manifold from temporal batches; and the PDF Statistical Process Control, a monitoring and automatic change-detection algorithm for PDFs. The methods have been applied to repositories in real case studies, including the Public Health Mortality and Cancer Registries of the Region of Valencia, Spain; the UCI Heart Disease dataset; the United States NHDS; and Spanish Breast Cancer and In-Vitro Fertilization datasets. The methods revealed findings such as partitions of the repositories into probabilistically separated temporal subgroups, isolated temporal anomalies due to anomalous data, and outlying or clustered data sources due to differences in populations or in practices. A software toolbox including the methods and the automated generation of DQ reports was developed. Finally, we defined the theoretical basis of a biomedical DQ evaluation framework, which has been used in the construction of quality-assured infant feeding repositories; in the contextualization of data for reuse in Clinical Decision Support Systems using an HL7-CDA wrapper; and in an on-line service for the DQ evaluation and rating of biomedical data repositories.
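An illustrative sketch only, not the thesis's exact construction: it computes pairwise Jensen-Shannon distances between per-source empirical distributions of one variable, embeds the sources with classical multidimensional scaling (a stand-in for the simplex construction), and reports a global dispersion analogous to the Global Probabilistic Deviation together with a per-source outlyingness as distance to the centroid. Function names, the choice of distance and the toy data are assumptions.

    import numpy as np
    from scipy.spatial.distance import jensenshannon

    def histograms(sources, bins=20):
        """Empirical PDFs of one variable, one histogram per data source."""
        edges = np.histogram_bin_edges(np.concatenate(sources), bins=bins)
        return [np.histogram(s, bins=edges, density=True)[0] + 1e-12 for s in sources]

    def distance_matrix(pdfs):
        n = len(pdfs)
        D = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                D[i, j] = D[j, i] = jensenshannon(pdfs[i], pdfs[j])
        return D

    def embed(D):
        """Classical MDS: embed the sources from their pairwise distances."""
        n = D.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n
        B = -0.5 * J @ (D ** 2) @ J
        w, V = np.linalg.eigh(B)
        w, V = w[::-1], V[:, ::-1]
        keep = w > 1e-10
        return V[:, keep] * np.sqrt(w[keep])

    rng = np.random.default_rng(0)
    sources = [rng.normal(0, 1, 500), rng.normal(0, 1, 500), rng.normal(0.8, 1, 500)]
    X = embed(distance_matrix(histograms(sources)))
    centroid = X.mean(axis=0)
    outlyingness = np.linalg.norm(X - centroid, axis=1)   # per-source dissimilarity
    global_dispersion = outlyingness.mean()               # GPD-like summary
    print(np.round(outlyingness, 3), round(global_dispersion, 3))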
The results of this thesis have been published in eight scientific contributions, including top-ranked journals and conferences. One of the journal publications was selected by the IMIA as one of the best publications in Health Information Systems in 2013. Additionally, the results have contributed to several research projects and have led the way to the industrialization of the developed methods and approaches for the audit and control of biomedical DQ.
Sáez Silvestre, C. (2016). Probabilistic methods for multi-source and temporal biomedical data quality assessment [Doctoral thesis]. Editorial Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/62188
