Spelling suggestions: "subject:"desite"" "subject:"17site""
451 |
Status and death at Grasshopper Pueblo: experiments toward an archaeological theory of correlates. Whittlesey, Stephanie Michelle. January 1978
No description available.
|
452 |
Environmental site characterization via artificial neural network approach. Mryyan, Mahmoud. January 1900
Doctor of Philosophy / Department of Civil Engineering / Yacoub M. Najjar / This study explored the potential use of ANNs for the profiling and characterization of various environmental sites. A static ANN with a back-propagation algorithm was used to model the environmental contamination at a hypothetical data-rich contaminated site. The performance of the ANN profiling model was then compared with eight known profiling methods. The comparison showed that the ANN-based models yielded the lowest error values in both the 2-D and 3-D comparison cases. The ANN-based profiling models also produced the best contaminant distribution contour maps when compared to the actual maps. Along with the fact that ANN is the only profiling methodology that allows for efficient 3-D profiling, this study clearly demonstrates that ANN-based methodology, when properly used, has the potential to provide the most accurate predictions and site profiling contour maps for a contaminated site.
ANN with a back-propagation learning algorithm was utilized in the site characterization of contaminants at the Kansas City landfill. The use of ANN profiling models made it possible to obtain reliable predictions about the location and concentration of lead and copper contamination at the associated Kansas City landfill site. The resulting profiles can be used to determine additional sampling locations, if needed, for both groundwater and soil in any contaminated zones.
Back-propagation networks were also used to characterize the MMR Demo 1 site. The purpose of the developed ANN models was to predict perchlorate concentrations at the MMR from appropriate input parameters. To determine the most appropriate input parameters for this model, three different cases were investigated using nine potential input parameters. The ANN modeling used in this case demonstrates the neural network's ability to accurately predict perchlorate contamination using multiple variables. The trends in the ANN-generated data and the actual trends identified in the MMR 2006 System Performance Monitoring Report both indicate that perchlorate levels are decreasing due to the use of the Extraction, Treatment, and Recharge (ETR) systems.
This research demonstrates the advantages of ANN site characterization modeling over traditional modeling schemes. Accordingly, the use of ANN-based models reduced the uncertainties associated with the site contamination characterization task.
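As a rough illustration of the profiling approach described in this abstract, the sketch below fits a small feed-forward network (trained by back-propagation) to synthetic sample points and predicts concentrations over a 3-D grid; the data, column meanings, architecture, and hyperparameters are illustrative assumptions, not the models developed in the study.

# Minimal sketch: ANN-based contaminant profiling from sparse samples.
# Assumptions: synthetic (x, y, z) sample locations and concentrations;
# network size and training settings are illustrative only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic "borehole" samples: coordinates in metres, concentration in mg/kg.
xyz = rng.uniform(0.0, 100.0, size=(200, 3))
conc = (50.0 * np.exp(-((xyz[:, 0] - 40) ** 2 + (xyz[:, 1] - 60) ** 2) / 800.0)
        * np.exp(-xyz[:, 2] / 30.0) + rng.normal(0.0, 1.0, 200))

scaler = StandardScaler().fit(xyz)
model = MLPRegressor(hidden_layer_sizes=(20, 20), activation="tanh",
                     solver="adam", max_iter=5000, random_state=0)
model.fit(scaler.transform(xyz), conc)

# Predict on a coarse 3-D grid; these values would feed the contour maps.
gx, gy, gz = np.meshgrid(np.linspace(0, 100, 25),
                         np.linspace(0, 100, 25),
                         np.linspace(0, 100, 10), indexing="ij")
grid = np.column_stack([gx.ravel(), gy.ravel(), gz.ravel()])
pred = model.predict(scaler.transform(grid)).reshape(gx.shape)
print("predicted peak concentration:", pred.max())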
|
453 |
Detection of safe sites by neural networks for autonomous landing on a planetary body. Belley, Katia. January 2008
For future planetary exploration missions involving a landing, real-time selection of a safe landing site is an increasingly sought-after technology. It increases the scientific return of a mission by giving access to regions with higher scientific potential, and it improves the chances of mission success and allows a larger payload by making the landing safer. Among the methods developed for landing-site selection, the one proposed by Andrew Johnson of the Jet Propulsion Laboratory, which evaluates the safety of landing sites from lidar images acquired during descent, is particularly interesting. This method uses a technique called least median of squares to compute the slope and roughness of candidate landing sites. However, the computation time required by this approach makes it difficult to execute in real time. This master's thesis proposes a system based on ANNs (artificial neural networks) to approximate the least-median-of-squares method. An architecture comprising four ANNs was developed to determine the slope and roughness of a landing site. Three ANNs estimate the parameters of the median plane in order to derive these two terrain properties, and an optional fourth network is specialized in evaluating sites with high roughness. Data pre-processing and post-processing modules are used to improve the performance of the neural networks, and arbitration modules determine the two outputs of the system. A solution is also proposed for pre-selecting a safe landing zone in order to reduce the number of individual sites to evaluate. Several types of neural networks were compared for solving the problem, and guidelines were established for choosing the most efficient networks for each module as a function of the available computation time. The developed system considerably reduces the computation time required to solve the problem, and the proposed solution can easily be adapted to the objectives of the space mission.
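To make the slope and roughness computation concrete, the sketch below fits a plane to a patch of simulated lidar returns by ordinary least squares and derives both properties from it; the thesis approximates the more robust least-median-of-squares fit with neural networks, so this plain least-squares version is only an illustrative stand-in.

# Minimal sketch: slope and roughness of a landing site from lidar points.
# Assumption: an ordinary least-squares plane fit stands in for the robust
# least-median-of-squares fit that the thesis approximates with ANNs.
import numpy as np

def slope_and_roughness(points):
    """points: (N, 3) array of lidar returns (x, y, z) over one candidate site."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    # Fit the plane z = a*x + b*y + c by least squares.
    A = np.column_stack([x, y, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    a, b, _c = coef
    # Slope: angle between the fitted plane's normal and the vertical.
    slope_deg = np.degrees(np.arctan(np.hypot(a, b)))
    # Roughness: largest point-to-plane distance (a simple proxy for rock height).
    residuals = np.abs(z - A @ coef) / np.sqrt(a * a + b * b + 1.0)
    return slope_deg, residuals.max()

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    xy = rng.uniform(-2.0, 2.0, size=(500, 2))
    z = 0.15 * xy[:, 0] - 0.05 * xy[:, 1] + rng.normal(0.0, 0.02, 500)
    print(slope_and_roughness(np.column_stack([xy, z])))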
|
454 |
Study of methane oxidation rates in experimental columns simulating a landfill biocover. Roncato, Camila. January 2009
Methane (CH4) can be oxidized to carbon dioxide (CO2) by methanotrophic bacteria present in landfill covers, in the presence of oxygen. It is possible to design a cover that favours the growth of methane-oxidizing bacteria; such a cover is called a passive methane oxidation biocover (BOPM, from the French acronym). Installing a BOPM reduces CH4 emissions during and after the active gas recovery period, when it is no longer economically viable to extract the biogas. More specifically, this thesis deals with oxidation tests in columns simulating a landfill biocover. Oxidation rates were obtained by varying the substrates, their thicknesses, and their degrees of water saturation. The soils used came from the two BOPMs built at the Saint-Nicéphore landfill (Québec, Canada); in addition, a contaminated soil was tested. The maximum oxidation rates in the column tests ranged from 49 to 141 g CH4/m2/day, corresponding to efficiencies of 81 to 90%. The factors most likely to affect the oxidation rate are the step size of the increases in CH4 loading, the substrate thickness, and the degree of water saturation. Respiration tests were performed, and a method was developed to calculate CH4 oxidation efficiencies along the profile while accounting for respiration. Initial and final characterization tests on the substrates were carried out to assess changes in pH, organic matter content, and degree of saturation; saturation was the only parameter that showed a significant difference between the beginning and the end of the oxidation tests. A comparison was made between laboratory and field oxidation rates. For the substrate of BOPM 3B, the laboratory rates were higher; in contrast, for the substrate of BOPM 2, the field rates were higher thanks to the presence of native vegetation.
|
455 |
Pea carbonic anhydrase: a kinetic study. Johansson, Inga-Maj. January 1994
The enzyme carbonic anhydrase (CA), catalysing the interconversion between CO2 and HCO3−, has long been known to be present in plants as well as in animals. Several of the animal isozymes, but none of the plant CAs, have been extensively studied. When the first plant CA cDNA sequences were published in 1990, it was obvious that the animal and plant CAs represent evolutionarily distinct families with no significant sequence homology between them. Pea CA is synthesised as a precursor and subsequently processed upon import into the chloroplast. When we purified CA from pea leaves, two oligomeric forms with molecular masses around 230 kDa were obtained. One form was homogeneous while the other contained subunits of two different sizes. The larger subunit has an acidic and highly charged N-terminal extension consisting of 37 residues. We propose that the sequence preceding the cleavage site that yields the large subunit represents the functional transit peptide, directing CA to the chloroplast. Neither the transit peptide nor the acidic 37-residue peptide was found to affect the folding, activity or oligomerisation of pea CA. Kinetic investigations showed that pea CA requires a reducing environment and high concentrations of buffer for maximal catalytic activity. High buffer concentrations result in a faster turnover of the enzyme (kcat) while the efficiency (kcat/Km) is not affected. This is consistent with a ping-pong mechanism with the buffer as the second substrate. Both kcat and kcat/Km increase with pH, but the dependences cannot be described by simple titration curves. SCN− is an uncompetitive inhibitor at high pH and a noncompetitive inhibitor at neutral and low pH. This is in accordance with the mechanistic model, previously proposed for human CA II, involving a zinc-bound water molecule as a catalytic group. In this model, the carbon dioxide-bicarbonate interconversion, reflected by kcat/Km, is temporally separated from a rate-limiting proton-transfer step. At high pH, solvent hydrogen isotope effects obtained for pea CA agree with this scheme, while they do not fit at neutral and low pH. Site-specific mutations of cysteine residues at positions 165, 269 and 272 were difficult to study, either because strong deviations from Michaelis-Menten kinetics were observed, or because the mutants were found in inclusion bodies. However, the mutant H208A was found to be a very efficient enzyme with the highest kcat/Km value obtained for any CA so far, 2.9 × 10⁸ M⁻¹s⁻¹. With the H208A mutant, an increased dependence on high buffer concentrations at low pH was observed. At high pH, the mutant is more efficient than the unmutated enzyme. The H208A mutant is also more prone to oxidation than the wild-type enzyme. / Diss. (summary), Umeå: Umeå universitet, 1994; with 4 accompanying papers.
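For readers outside enzymology, the kinetic parameters kcat and kcat/Km cited above follow the standard Michaelis-Menten description; the relations below are textbook definitions supplied for context, not equations taken from the thesis.

% Textbook Michaelis-Menten relations (assumed context; not from the thesis).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% v: initial rate, [E]_0: total enzyme, [S]: substrate (CO2) concentration.
\[
  v = \frac{k_{\mathrm{cat}}\,[\mathrm{E}]_0\,[\mathrm{S}]}{K_m + [\mathrm{S}]},
  \qquad
  \frac{k_{\mathrm{cat}}}{K_m} = \lim_{[\mathrm{S}] \to 0} \frac{v}{[\mathrm{E}]_0\,[\mathrm{S}]}
\]
% k_cat (turnover number) sets the maximal rate per enzyme molecule;
% k_cat/K_m (catalytic efficiency) is the apparent second-order rate constant
% for the CO2-bicarbonate interconversion, the step the abstract reports as
% unaffected by buffer concentration.
\end{document}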
|
456 |
Broken K Pueblo: Prehistoric Social Organization in the American Southwest. Hill, James N. January 1970
This report presents an analysis of a prehistoric Pueblo community in structural, functional, and evolutionary terms; it is a sequel to William A. Longacre's Archaeology as Anthropology. The emphasis is on social organization (including the patterning of community activities) and on understanding changes in this organization in terms of adaptive responses to a shifting environment.
|
457 |
Mimbres Archaeology of the Upper Gila, New Mexico. Lekson, Stephen H. January 1990
This reappraisal of archaeology conducted at the Saige-McFarland site presents for the first time a substantial body of comparative data from a Mimbres period site in the Gila drainage. Lekson offers a new and controversial interpretation of the Mimbres sequence, reintroducing the concept of the Mangas phase first proposed by the Gila Pueblo investigations of the 1930s and demonstrating a more gradual shift from pithouse to pueblo occupance than has been suggested previously.
|
458 |
Gribshunden, a study of the site formation process. Nilsson-Björk, Mikael. January 2016
Today's knowledge of late medieval ships is very limited, especially when it comes to carvel-built warships in northern Europe. In 1971 sport divers found a wreck in the Ronneby archipelago that proved to be the Danish king Hans' great ship Gribshunden, which sank in 1495. The wreck has been investigated in recent years by Kalmar läns museum and MARIS at Södertörn högskola. From an archaeological perspective, the research potential is tremendous: the wreck is in very good condition compared to other finds around the world, and it is unique. My intention with this thesis is to find out why the wreck has not disintegrated and disappeared entirely, as most contemporary wrecks have. To understand the factors involved in the decomposition process, the wreck is analyzed with regard to Keith Muckelroy's maritime version of the "site formation process", which provides a set of analytic tools for assessing how different processes affect a wreck and its find site, in both the short and the long term. Hopefully, this thesis may also be useful for finding still-undiscovered wrecks in similar environments.
|
459 |
TELEMETRY AND JUGGLING. Jones, Charles H. October 2000
International Telemetering Conference Proceedings / October 23-26, 2000 / Town & Country Hotel and Conference Center, San Diego, California / One of the beauties of mathematics is its ability to demonstrate the relationship between apparently unrelated subjects. And this is not only an aesthetic attribute. The insight obtained by seeing relations where they are not obvious often leads to elegant solutions to difficult problems. This paper will demonstrate a mathematical relation between telemetry and juggling. Any given pulse code modulation (PCM) format can be mapped onto a juggling pattern. The Inter-Range Instrumentation Group (IRIG) 106 Class I PCM formats are a subset of all juggling patterns while the Class II PCM formats are equivalent to the set of all juggling patterns (within some mathematically precise definitions). There are actually quite a few mathematical results regarding juggling patterns. This paper will also discuss how these topics relate to tessellations, bin packing, PCM format design, and dynamic spectrum allocation. One of the shortcomings of human nature is the tendency to get caught up in a particular topic or viewpoint. This is true of the telemetry community as well. It is hoped that this paper will increase the awareness that there are a variety of areas of theory outside of telemetry that may be applicable to the field.
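For context, the sketch below checks whether a sequence of throw heights forms a valid juggling pattern using the standard "siteswap" criterion from juggling mathematics; the specific mapping between PCM frame formats and juggling patterns is the paper's contribution and is not reproduced here.

# Minimal sketch: validity check for a siteswap juggling pattern.
# Standard juggling-math facts (not taken from the paper): a sequence of
# throw heights s_0..s_{n-1} is a valid pattern iff the landing beats
# (i + s_i) mod n are all distinct, and the average throw height equals
# the number of balls.
def is_valid_siteswap(throws):
    n = len(throws)
    landings = {(i + t) % n for i, t in enumerate(throws)}
    return len(landings) == n

def ball_count(throws):
    total = sum(throws)
    return total // len(throws) if total % len(throws) == 0 else None

if __name__ == "__main__":
    for pattern in ([3, 3, 3], [5, 3, 1], [4, 3, 2], [5, 4, 3]):
        print(pattern, is_valid_siteswap(pattern), ball_count(pattern))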
|
460 |
Techniques for adaptive and personalized access to web pages. Τσάκου, Αναστασία. 10 June 2014
The large number of web pages on many Web sites has raised navigation problems. As a result, users often miss the goal of their inquiry, or receive ambiguous results when they try to navigate through them. Therefore, the requirement for predicting user needs in order to improve the usability and user retention of a Web site is, more than ever, indispensable. The primary purpose of this thesis is to explore methods and techniques for improving or "personalizing" Web sites. Web personalization includes any action that adapts the information or services provided by a Web site to the needs of a particular user or set of users, taking advantage of the knowledge gained from the users' navigation behavior and interests in combination with the content and structure of the Web site. Secondly, this thesis describes the implementation of a tool (reorganization software) which parses log files and uses specific metrics related to web page accesses in order to reorganize the structure of a web site according to its users' preferences. Finally, the tool is applied to an experimental Web site and the results of this reorganization process are evaluated.
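As a rough illustration of the log-driven approach summarized above, the sketch below counts page accesses in a web server log and orders pages by popularity; the log format, filtering rules, and file name are assumptions, and the thesis's actual metrics and reorganization rules are not reproduced here.

# Minimal sketch: derive page popularity from a web server access log and
# propose a reordered navigation list. Assumptions: Apache "combined" log
# format, a simple hit-count metric, and a hypothetical log file name.
import re
from collections import Counter

LOG_LINE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) '
)

def page_popularity(log_path):
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = LOG_LINE.match(line)
            if m and m.group("method") == "GET" and m.group("status") == "200":
                path = m.group("path").split("?")[0]
                if path.endswith((".html", "/")):   # keep page views, drop assets
                    hits[path] += 1
    return hits

def reorganized_menu(hits, top_n=10):
    """Most-visited pages first: a crude stand-in for structure reorganization."""
    return [path for path, _ in hits.most_common(top_n)]

if __name__ == "__main__":
    popularity = page_popularity("access.log")   # hypothetical log file name
    print(reorganized_menu(popularity))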
|