1

The General Purpose Interface Bus

Baker, Ernest D. 01 January 1980 (has links) (PDF)
The General Purpose Interface Bus, as defined by the IEEE Standard, deals with systems that require digital data to be transferred among a group of instruments. An overview of this standard is presented that summarizes the interface's capabilities, functions, and versatility by explaining the basic interface concepts. In addition, a GPIB testing application and a GPIB-related design example are presented and investigated.
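The bus described here was standardized as IEEE 488 and is still used for instrument control. As a hedged, modern illustration only (the thesis pre-dates these software tools), the sketch below uses the PyVISA library to address an instrument at an assumed GPIB primary address and issue a simple identification query; the resource string and the command are illustrative assumptions, not details from the thesis.

```python
# Hedged, modern illustration of talking to an instrument over GPIB (IEEE 488)
# using PyVISA. The GPIB address (primary address 12) and the "*IDN?" query
# are illustrative assumptions, not details taken from the 1980 thesis.
import pyvisa

rm = pyvisa.ResourceManager()                 # load an available VISA backend
inst = rm.open_resource("GPIB0::12::INSTR")   # controller 0, primary address 12
inst.timeout = 5000                           # I/O timeout in milliseconds

print(inst.query("*IDN?"))                    # ask the instrument to identify itself

inst.close()
rm.close()
```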
2

ADVANTAGES OF GENERAL PURPOSE TELEMETRY DATA AND CONTROL SYSTEMS

Hales, John C. 10 1900 (has links)
International Telemetering Conference Proceedings / October 28-31, 1985 / Riviera Hotel, Las Vegas, Nevada / A key milestone for every telemetry design is the date when everyone agrees on a definition of the design requirements. Unfortunately, specifications often become obscured as test constraints change, additional requirements are uncovered, test objectives are more clearly defined, and budgets are cut in half. Historically, telemetry designs using technology, hardware, and philosophy that pre-date Christopher Columbus have imposed obvious rigidity on the system design and its operation. Once such systems are completed, program managers become ruefully aware that they are difficult (if not impossible) to modify and are always very costly to change. Telemetry systems available today offer the flexibility necessary to accommodate a frequently changing measurement list. Not only can the measurement list be changed, it can be changed during the course of a test in progress. If requirements expand, hardware may be added. If the test is non-destructive, the system can be configured for use on future programs.
3

Implementation of the General Purpose Criterion of the Chemical Weapons Convention

Pearson, Graham S. January 2003 (has links)
4

Enabling high-performance, mixed-signal approximate computing

St Amant, Renee Marie 07 July 2014 (has links)
For decades, the semiconductor industry enjoyed exponential improvements in microprocessor power and performance with the device scaling of successive technology generations. Scaling limitations at sub-micron technologies, however, have ceased to provide these historical performance improvements within a limited power budget. While device scaling provides a larger number of transistors per chip, for the same chip area, a growing percentage of the chip will have to be powered off at any given time due to power constraints. As such, the architecture community has focused on energy-efficient designs and is looking to specialized hardware to provide gains in performance. A focus on energy efficiency, along with increasingly less reliable transistors due to device scaling, has led to research in the area of approximate computing, where accuracy is traded for energy efficiency when precise computation is not required. There is a growing body of approximation-tolerant applications that, for example, compute on noisy or incomplete data, such as real-world sensor inputs, or make approximations to decrease the computation load in the analysis of cumbersome data sets. These approximation-tolerant applications span application domains such as machine learning, image processing, robotics, and financial analysis, among others. Since the advent of the modern processor, computing models have largely presumed the attribute of accuracy. A willingness to relax accuracy requirements, however, with the goal of gaining energy efficiency, warrants the re-investigation of the potential of analog computing. Analog hardware offers the opportunity for fast and low-power computation; however, it presents challenges in terms of accuracy. Where analog compute blocks have been applied to solve fixed-function problems, general-purpose computing has relied on digital hardware implementations that provide generality and programmability. The work presented in this thesis aims to answer the following questions: Can analog circuits be successfully integrated into general-purpose computing to provide performance and energy savings? And what is required to address the historical analog challenges of inaccuracy, programmability, and a lack of generality to enable such an approach? This thesis work investigates a neural approach as a means to address the historical analog challenges of inaccuracy, programmability, and generality, and to enable the use of analog circuits in general-purpose, high-performance computing. The first piece of this thesis work investigates the use of analog circuits at the microarchitecture level in the form of an analog neural branch predictor. The task of branch prediction can tolerate imprecision, as roll-back mechanisms correct for branch mispredictions and application-level accuracy remains unaffected. We show that analog circuits enable the implementation of a highly accurate neural-prediction algorithm that is infeasible to implement in the digital domain. The second piece of this thesis work presents a neural accelerator that targets approximation-tolerant code. Analog neural acceleration provides an application speedup of 3.3x and energy savings of 12.1x with a quality loss of less than 10% for all but one approximation-tolerant benchmark. These results show that, using a neural approach, analog circuits can be applied to provide performance and energy efficiency in high-performance, general-purpose computing.
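The "neural accelerator" mentioned above follows a general pattern: a small neural network is trained offline to mimic the input/output behaviour of an approximation-tolerant code region, and the network then stands in for that region at run time. The sketch below illustrates this pattern in plain NumPy under assumed details (target function, network topology, training schedule); the dissertation's accelerator is a mixed-signal analog design, which this software-only sketch does not attempt to model.

```python
# Minimal sketch of the neural-acceleration idea: an approximation-tolerant
# code region is replaced by a small network trained to mimic it. The target
# function, topology, and training setup are illustrative assumptions, not
# details from the dissertation.
import numpy as np

rng = np.random.default_rng(0)

def target_region(x):
    """The 'precise' code region being approximated: Euclidean norm of each row."""
    return np.sqrt(x[:, 0] ** 2 + x[:, 1] ** 2)[:, None]

# Tiny 2-8-1 multilayer perceptron with tanh hidden units.
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2, h

# Offline training phase: fit the network to sampled inputs/outputs of the region.
X = rng.uniform(-1.0, 1.0, (4096, 2))
Y = target_region(X)
lr = 0.05
for _ in range(3000):
    pred, h = forward(X)
    err = pred - Y                                   # gradient of 0.5 * MSE
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)               # backprop through tanh layer
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# "Accelerated" invocation: the trained network stands in for the precise region.
X_test = rng.uniform(-1.0, 1.0, (1000, 2))
approx, _ = forward(X_test)
exact = target_region(X_test)
print(f"mean absolute error of the neural stand-in: {np.mean(np.abs(approx - exact)):.3f}")
```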
5

Implementation of the General Purpose Criterion in the Biological and Toxin Weapons Convention and Protocol

Robinson, Julian P.P., Whitby, Simon M. January 2000 (has links)
Julian P. Perry Robinson discusses the role of the General Purpose Criterion in the implementation of the Biological and Toxin Weapons Convention.
6

A General Purpose Field-Programmable Digital Microfluidic Biochip with Scannable Electrofluidic Control

Joseph, Rissen Alfonso 23 October 2014 (has links)
No description available.
7

Attitydanalys på ett internetforum med general-purpose-verktyg / Sentiment analysis on an Internet forum with general-purpose tools

Forssner, Mårten, Aldenbäck, Frida January 2022 (has links)
When processing and organizing large amounts of information, people tend to encounter problems. This led to the development of automated text categorization in the late 1990s. Since then, the growth of social media and social networks on the Internet has increased exponentially, and over the last decade a substantial amount of research has been conducted on automated text categorization. The purpose of this study was to investigate what factors may affect how general-purpose tools for sentiment analysis assess attitudes expressed on Internet forums. 240 comments were collected from six different forum threads on the Internet forum Reddit. Sentiment analysis was performed on these comments using four general-purpose tools, and the tools' assessments were then compared with a human's assessments of the same comments. A number of factors were identified that may have affected how the tools assessed the comments. One factor was comment length: TextBlob's and MeaningCloud's assessments were more congruent with the human's assessments for short comments, while Free Sentiment Analyzer's and MonkeyLearn's assessments were more congruent for long comments. Another factor was the categories associated with a comment: all tools had higher congruence with the human's assessments for comments that were not assigned to a category than for comments that fit the criteria for at least one category.
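Of the four tools compared in the study, TextBlob is an openly available Python library. The snippet below is a minimal, hedged illustration of how such a general-purpose tool assigns a polarity score to a comment; the example comments are invented placeholders, not comments from the study's Reddit data.

```python
# Minimal illustration of general-purpose sentiment scoring with TextBlob,
# one of the four tools compared in the study. The sample comments are
# invented placeholders, not comments from the study's dataset.
from textblob import TextBlob

comments = [
    "This release is fantastic, everything just works.",
    "The update broke my setup and support never replied.",
]

for text in comments:
    polarity = TextBlob(text).sentiment.polarity  # -1 (negative) to +1 (positive)
    label = "positive" if polarity > 0 else "negative" if polarity < 0 else "neutral"
    print(f"{polarity:+.2f}  {label}  {text}")
```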
8

Modèles de calcul sur les réels, résultats de comparaison / Computation on the reals. Comparison of some models

Hainry, Emmanuel 07 December 2006 (has links)
Computation on the real numbers can be modelled in several different ways; indeed, many different models of computation over the reals exist. However, there are few results comparing those models, and most of them are incomparability results. The case of computation over the reals is therefore quite different from the classical case, where the Church-Turing thesis claims that all reasonable models compute exactly the same functions. We show that recursively computable functions (in the sense of computable analysis) are equivalent to an adequately defined class of R-recursive functions, and also to GPAC-computable functions. Beyond this analog characterization of recursively computable functions, we show that the limit operator we define can be used to give analog characterizations of the elementarily computable functions and of the functions of Grzegorczyk's hierarchy. These results can be seen as a first step toward a unification of computable functions over the reals.
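As a brief illustration of one of the models compared (a standard textbook characterization, not an excerpt from the thesis): functions generable by Shannon's General Purpose Analog Computer can be described as components of solutions of polynomial initial-value problems, as in the classical example below.

```latex
% GPAC-generable functions as components of solutions of polynomial
% initial-value problems (standard characterization, not thesis material).
\[
  y' = p(t, y), \qquad y(0) = y_0, \qquad p \text{ a vector of polynomials}.
\]
% For instance, sine and cosine arise from
\[
  y_1' = y_2, \quad y_2' = -y_1, \qquad y_1(0) = 0,\ y_2(0) = 1
  \;\Longrightarrow\; y_1(t) = \sin t,\ y_2(t) = \cos t.
\]
```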
9

Development and application of a coupled geomechanics model for a parallel compositional reservoir simulator

Pan, Feng 03 June 2010 (has links)
For a stress-sensitive or stress-dependent reservoir, the interactions between its seepage field and its in situ stress field are complex and affect hydrocarbon recovery. A coupled geomechanics and fluid-flow model can capture these relations between the fluid and the solid, thereby providing more accurate history matches and predictions for better well planning and reservoir management decisions. A traditional reservoir simulator cannot adequately represent the ongoing coupled fluid-solid interactions during production because it uses a simplified update formulation for porosity and a static absolute permeability during simulation. Many researchers have studied multiphase fluid-flow models coupled with geomechanics models during the past fifteen years. The purpose of this research is to develop a coupled geomechanics and compositional model and apply it to problems in oil recovery processes. The General Purpose Adaptive Simulator (GPAS), an equation-of-state compositional simulator developed at The University of Texas at Austin, uses finite-difference/finite-control-volume methods to solve its governing partial differential equations (PDEs). GPAS was coupled with a geomechanics model developed in this research, which uses a finite-element method for the discretization of the associated PDEs. Both an iteratively coupled solution procedure and a fully coupled solution procedure were implemented to couple the geomechanics and reservoir simulation modules in this work. Parallelization, testing, and verification of the coupled model were performed on parallel clusters of high-performance workstations. MPI was used for data exchange in the iteratively coupled procedure. Different constitutive models were coded into GPAS to describe complicated linear and nonlinear deformation behaviors in the geomechanics model. In addition, the geomechanics module was coupled with the dual-porosity model in GPAS to simulate naturally fractured reservoirs. The developed coupled reservoir and geomechanics simulator was verified against analytical solutions, and various reservoir simulation case studies were carried out using the coupled geomechanics and GPAS modules.
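Of the two coupling strategies mentioned, the iteratively coupled (staggered) procedure alternates the flow solve and the mechanics solve within each time step until the feedback between them settles. The sketch below outlines that loop in schematic Python; flow_solve, mechanics_solve, and update_porosity are hypothetical placeholders with toy arithmetic chosen only so the iteration contracts, and do not reflect GPAS's actual interfaces.

```python
# Schematic sketch of one iteratively coupled time step, in the spirit of the
# staggered procedure described above. flow_solve, mechanics_solve, and
# update_porosity are hypothetical placeholders with toy arithmetic; they do
# not reflect GPAS's actual API or any real constitutive model.
import numpy as np

def flow_solve(porosity):
    """Placeholder for the compositional flow solve at fixed porosity."""
    return 20.0 + 10.0 * (porosity - 0.2)          # toy pressure response

def mechanics_solve(pressure):
    """Placeholder for the finite-element stress solve at the new pressure."""
    return -0.05 * (pressure - 20.0)               # toy volumetric strain

def update_porosity(porosity, strain):
    """Placeholder porosity update driven by the computed strain."""
    return porosity + strain

def coupled_time_step(porosity, tol=1e-6, max_iter=20):
    """Alternate flow and mechanics until the porosity change between
    coupling iterations drops below tol (iteratively coupled scheme)."""
    for it in range(max_iter):
        pressure = flow_solve(porosity)            # flow with frozen geomechanics
        strain = mechanics_solve(pressure)         # mechanics with updated pressure
        new_porosity = update_porosity(porosity, strain)
        if np.max(np.abs(new_porosity - porosity)) < tol:
            return pressure, new_porosity, it + 1
        porosity = new_porosity
    return pressure, porosity, max_iter            # not converged: return last iterate

p, phi, iters = coupled_time_step(np.full(4, 0.25))
print(f"converged in {iters} coupling iterations: p={p[0]:.4f}, phi={phi[0]:.6f}")
```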
10

Capturing Information and Communication Technologies as a General Purpose Technology / Les technologies de l'information et de la communication, une technologie générique

Le Hir, Boris 20 November 2012 (has links)
This thesis studies Information and Communication Technologies (ICT) as a General Purpose Technology (GPT) and their role in the evolution of labor productivity in the United States and Europe during recent decades. It is organized in three parts corresponding to the fundamental GPT features: wide possibilities for development, ubiquity of the technology, and the ability to create large technological opportunities. The first part begins by describing innovation in ICT, with a short historical review of ICT inventions followed by an analysis of current data on innovation in this field; in particular, it shows how the United States has so far outperformed the European countries in developing ICT. It then takes stock of the measurement difficulties caused by the rate and nature of the change these technologies create. The second part deals with the ubiquitous nature of ICT. It first describes ICT diffusion across countries and industries and reviews the economic literature on the direct contribution of ICT to labor productivity growth in the US and Europe. The next chapter studies factor demand in sectors that are either ICT producers or intensive ICT users. The third part focuses on ICT's ability to create opportunities for complementary innovations. It first identifies the nature of these complementary innovations and the corresponding efforts, and shows that national accounts must be improved so that these efforts are recorded as investments. It then shows that, among the eleven European countries studied, the problem is highly concentrated in a few countries that invest little both in ICT and in innovative assets, and that these two types of effort are complementary.
