61

Please Light Up and Other Poems

Powers, Ted 01 January 2015 (has links)
This is a collection of poems.
62

The information rate and other parameters of probabilistic context free grammars and their parsers

Kestner, Simon January 1974 (has links)
Probabilistic context-free languages are defined by giving predetermined probabilities (preprobabilities) for the choices that their grammars make when generating. Chapter 1 shows how to carry out the above definition, and how to calculate some parameters of the language; for instance: average word length, mean square length, digraph probabilities, entropy. Chapter 2 introduces generating functions related to grammars. It uses them to derive a condition for which preprobabilities give rise to well-formed probability spaces. Two functions, the length and entropy generating functions, are studied in detail. They are algebraic functions, can in general only be defined implicitly, but can be used to give unified explicit methods of calculating all the parameters of Chapter 1 (and more). Chapter 3 defines and shows how to calculate the information rate of a language. As a by-product, McMillan's theorem is extended (for a small class of processes) to an analogue of the Central Limit Theorem. Chapter 4 compares the efficiencies of different parsing algorithms. In a reasonable sense, all deterministic parsers take equal average time to parse, any backtracking parser is slower, but there is no general algorithm for calculating the speed of a backtracking parser.
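To make one of these parameters concrete, here is a minimal sketch (not taken from the thesis) that computes the average word length of a toy probabilistic context-free grammar by solving the linear expectation equations. The grammar is hypothetical and is assumed to be well-formed (consistent) in the sense of the Chapter 2 condition.

```python
# Average word length of a PCFG: for each nonterminal N,
#   E[N] = sum over productions p of P(p) * (terminals in p + sum of E[M] for nonterminals M in p),
# a linear system solved below with NumPy.  Assumes the preprobabilities yield a
# consistent (well-formed) grammar, so the expectations are finite.
import numpy as np

# Hypothetical grammar: S -> a S b (prob 0.4) | a (prob 0.6)
productions = {
    "S": [(0.4, ["a", "S", "b"]), (0.6, ["a"])],
}
nonterminals = list(productions)
idx = {n: i for i, n in enumerate(nonterminals)}

A = np.eye(len(nonterminals))
c = np.zeros(len(nonterminals))
for n, rules in productions.items():
    for prob, rhs in rules:
        for sym in rhs:
            if sym in productions:
                A[idx[n], idx[sym]] -= prob   # nonterminal contributes prob * E[sym]
            else:
                c[idx[n]] += prob             # terminal contributes prob * 1

avg_len = np.linalg.solve(A, c)
print(dict(zip(nonterminals, avg_len)))       # {'S': 2.333...}, i.e. E[S] = 7/3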
63

Large scale dynamics and fluctuations in non-equilibrium stochastic particle systems

Pizzoferrato, Andrea January 2018 (has links)
Statistical Mechanics (SM) is a branch of theoretical physics which investigates the properties of systems made up of a large number of interacting components. Starting from the details of the dynamics describing the interaction among the smaller components, the goal of statistical mechanics is to determine the emergent collective behaviour of the system resulting from the interaction of its constitutive parts. In this way, SM constitutes a bridge between the microscopic and macroscopic worlds. Since the beginning of the last century, SM has proven successful in describing many phenomena related to condensed matter physics, such as ferromagnetism, the laws of gases and glassy states, to cite just a few. Nevertheless, the fields of application of this subject go far beyond pure physics, reaching artificial intelligence, quantitative sociology, economics and biology. For all these reasons, SM is a vibrant and active discipline which keeps holding the attention of the scientific community. To describe the emergent collective behaviours, we need to define macroscopic observables which are linked to microscopic quantities. In the present thesis, we will mainly focus on the density of particles in a system and their current, that is, the net flow of particles in a specific direction. In particular, we will be interested in the stationary regime of the macroscopic current, which is given by a suitable average over the microscopic and quickly fluctuating particle flow. When the detailed balance condition is satisfied the system is said to be in equilibrium, and on average there is no net stationary current. In general, a system may exhibit many equilibrium configurations, which can be selected by choosing suitable values of a tuning parameter, such as temperature. An abrupt change in the selected configuration as a parameter varies is often called a phase transition, and determining the critical thresholds between phases is one of the fundamental questions which SM tries to answer. Instead, when the macroscopic stationary current has a value different from zero, the system is in a Non-Equilibrium Stationary State (NESS), and this will be of central interest for the present work.
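As an illustration of a non-zero stationary current, the sketch below simulates a totally asymmetric simple exclusion process (TASEP) with open boundaries and measures the time-averaged particle flow through a bulk bond. The model and parameters are generic textbook examples chosen for illustration, not necessarily those studied in the thesis.

```python
# TASEP on a 1D lattice with open boundaries: particles enter on the left at
# rate alpha, hop only to the right, and leave on the right at rate beta.
# The time-averaged flow across a bulk bond is a non-zero stationary current,
# a simple example of a non-equilibrium stationary state (NESS).
import random

def tasep_current(L=100, alpha=0.5, beta=0.5, sweeps=6000, burn_in=1000):
    lattice = [0] * L
    hops = 0
    counted_sweeps = 0
    for sweep in range(sweeps):
        for _ in range(L + 1):
            i = random.randint(0, L)          # pick a bond: 0 = entry, L = exit, else bulk
            if i == 0:                         # injection at the left boundary
                if lattice[0] == 0 and random.random() < alpha:
                    lattice[0] = 1
            elif i == L:                       # extraction at the right boundary
                if lattice[-1] == 1 and random.random() < beta:
                    lattice[-1] = 0
            else:                              # bulk hop from site i-1 to site i
                if lattice[i - 1] == 1 and lattice[i] == 0:
                    lattice[i - 1], lattice[i] = 0, 1
                    if sweep >= burn_in and i == L // 2:
                        hops += 1              # count flow through the middle bond
        if sweep >= burn_in:
            counted_sweeps += 1
    return hops / counted_sweeps               # stationary current per unit time

print(tasep_current())                          # roughly 0.25 for alpha = beta = 0.5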
64

Translating Lucid data flow into message passing actors

Pilgram, Paul Theo January 1983 (has links)
This thesis is the first translation of full Lucid into code for von Neumann machines ("imperative code"). It demonstrates that it is possible to produce efficient code even in the presence of advanced features such as "currenting", recursive functions or operators whose semantics favour concurrency. Earlier compiled implementations stopped well short of this. Lucid is a family of non-procedural programming languages, invented by Wadge and Ashcroft. Lucid is neither tied to any particular data algebra, nor to a particular implementation technique. However, Data Flow (with its variants) lends itself particularly well to the implementation of Lucid. Message Passing Actors is an imperative programming technique which leaves scope for cooperating concurrency. This benefits hardware (multi-computers, transputers) and software technology alike. In this thesis, LUX, a PASCAL-like language with Message Passing Actors, has been chosen as the target language. It is shown that there is a subset of Lucid (a "nucleus") which has the same expressive capacity as full Lucid. The nucleus is easier to implement than full Lucid. As a prerequisite for the translation, a LUX actor equivalent is formulated for each operator of the nucleus, once and for all. The design of these operator-actors is strongly guided by the execution strategy of demand-driven Data Flow ("lazy evaluation"). Their data storage is based on FIFO queues ("pipelines"). The actors operate concurrently, but they harmonise their actions by exchanging messages which follow an agreed protocol. The translation is carried out in successive stages. First the Lucid program is transformed to make it lie entirely within the nucleus. The program is then mapped into LUX, where each operator is represented by an operator-actor and the references to the variables are manifested in the environment setup of these actors. Finally, the LUX code is made more efficient by the application of a variety of analysis and optimisation methods. Lucid programs can be analysed for various properties, and the resulting information can assist the code optimisation (while also revealing program errors). Particularly important among these program analyses is a queue-length determination based on Wadge's Cycle Sum Test.
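As a rough illustration of the demand-driven, pipelined operator style described above (a Python sketch for intuition only, not the thesis's LUX actors), Lucid-like stream operators can be modelled as lazy generators that produce a value only when one is demanded:

```python
# Lucid-style streams as lazy generators: each operator behaves like a small
# "actor" that yields its next value only on demand, loosely mirroring
# demand-driven data flow over pipelined value histories.
from itertools import islice

def fby(first, rest):
    """Lucid's 'followed by': the head of `first`, then the whole of `rest`."""
    yield next(iter(first))
    yield from rest

def plus(xs, ys):
    """Pointwise '+' on two streams."""
    for x, y in zip(xs, ys):
        yield x + y

def ones():
    while True:
        yield 1

def nat():
    # The Lucid definition  nat = 0 fby (nat + 1),
    # written with a recursive generator in place of a feedback pipeline.
    yield from fby(iter([0]), plus(nat(), ones()))

print(list(islice(nat(), 8)))   # [0, 1, 2, 3, 4, 5, 6, 7]
```

The real translation, of course, replaces this recursion with concurrent operator-actors exchanging messages over FIFO queues; the generator version only conveys the demand-driven evaluation order.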
65

Marker-less human body part detection, labelling and tracking for human activity recognition

Azhar, Faisal January 2015 (has links)
This thesis focuses on the development of a real-time and cost-effective marker-less computer vision method for significant body point or part detection (i.e., the head, arm, shoulder, knee, and feet), labelling and tracking, and its application to activity recognition. This work comprises three parts: significant body point detection and labelling, significant body point tracking, and activity recognition. Implicit body models are proposed based on human anthropometry, kinesiology, and human-vision-inspired criteria to detect and label significant body points. The key idea of the proposed method is to fit the knowledge from the implicit body models rather than fitting predefined models in order to detect and label significant body points. The advantages of this method are that it does not require manual annotation, an explicit fitting procedure, or a training (learning) phase, and it is applicable to humans with different anthropometric proportions. The experimental results show that the proposed method robustly detects and labels significant body points in various activities of two different (low and high) resolution data sets. Furthermore, a Particle Filter with memory and feedback is proposed that combines temporal information of the previous observation and estimation with feedback to track significant body points in occlusion. In addition, in order to overcome the problem presented by the most occluded body part, i.e., the arm, a Motion Flow method is proposed. This method considers the human arm as a pendulum attached to the shoulder joint and defines conjectures to track the arm, since it is the most occluded body part. The former method is invoked by default and the latter is used at the user's choice. The experimental results show that the two proposed methods, i.e., the Particle Filter and Motion Flow methods, robustly track significant body points in various activities of the above-mentioned two data sets and also enhance the performance of significant body point detection. A hierarchical relaxed partitioning system is then proposed that employs features extracted from the significant body points for activity recognition when multiple overlaps exist in the feature space. The working principle of the proposed method is based on a relaxed hierarchy (postponing uncertain decisions) and a hierarchical strategy (grouping similar or confusing classes) while partitioning each class at different levels of the hierarchy. The advantages of the proposed method lie in its real-time speed, ease of implementation and extension, and non-intensive training. The experimental results show that it acquires valuable features and outperforms relevant state-of-the-art methods while remaining comparable to other methods, i.e., the holistic and local feature approaches. In this context, the contribution of this thesis is three-fold: pioneering a method for automated human body part detection and labelling; developing methods for tracking human body parts in occlusion; and designing a method for robust and efficient human action recognition.
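For readers unfamiliar with the underlying tracker, the sketch below shows a generic bootstrap particle filter on a single synthetic coordinate. It illustrates only the standard predict/weight/resample cycle and omits the memory and feedback extensions that the thesis adds for occlusion handling; all data and parameters are made up for the example.

```python
# Generic bootstrap particle filter for one coordinate of a tracked body point.
# Illustration only: random-walk motion model, Gaussian observation likelihood.
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(observations, n_particles=500, motion_std=2.0, obs_std=5.0):
    particles = rng.normal(observations[0], obs_std, size=n_particles)
    estimates = []
    for z in observations:
        # Predict: propagate particles with a simple random-walk motion model.
        particles = particles + rng.normal(0.0, motion_std, size=n_particles)
        # Weight: likelihood of the observation under each particle.
        weights = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
        weights /= weights.sum()
        # Estimate: weighted mean of the particle cloud.
        estimates.append(float(np.dot(weights, particles)))
        # Resample: draw particles in proportion to their weights.
        particles = rng.choice(particles, size=n_particles, p=weights)
    return estimates

true_path = np.cumsum(rng.normal(0, 2, size=50)) + 100   # synthetic x-coordinate of a body point
noisy_obs = true_path + rng.normal(0, 5, size=50)        # simulated noisy detections
print(particle_filter(noisy_obs)[:5])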
66

Papers, please : Hur stävjas en humanitär moralisk utveckling i Papers, please? / Papers, please : What hinders a humanitarian moral development in Papers, please?

Malmström, Marcus, Rolf, Alexander January 2021 (has links)
Welcome to Arstotzka. In this game you play as a passport inspector at the border to the newly conquered land of Kolechia. The game Papers, please is named after the phrase your character says to everyone trying to gain access to the country of Arstotzka. The gameplay is that of a simple point-and-click game within the simulation category of video games: a simulation of a bureaucratic job which consists of checking documentation. The gameplay is neither fantasy nor escapism, so in what way is this a compelling game? Papers, please tackles something larger than enjoyable gameplay with satisfying graphics: it tackles the story of how an entire totalitarian state is upheld by regular people, people just like you and me.
67

Pedestrian Leadership and Egress Assistance Simulation Environment (PLEASE)

Feuz, Kyle D. 01 December 2011 (has links)
Over the past decade, researchers have been developing new ways to model pedestrian egress, especially in emergency situations. The traditional methods of modeling pedestrian egress, including flow-based modeling and cellular automata, have been shown to be poor models of human behavior at an individual level, as well as failing to capture many important group social behaviors of pedestrians. This has led to the exploration of agent-based modeling for crowd simulations, including those involving pedestrian egress. Using this model, we evaluate different heuristic functions for predicting good egress routes for a variety of real building layouts. We also introduce reinforcement learning as a means to represent individualized pedestrian route knowledge. Finally, we implement a group formation technique, which allows pedestrians in a group to share route knowledge and reach a consensus in route selection. Using the group formation technique, we consider the effects such knowledge-sharing and consensus mechanisms have on pedestrian egress times.
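As a toy illustration of representing an individual pedestrian's route knowledge with reinforcement learning (a hypothetical building graph and plain tabular Q-learning, not the PLEASE implementation), consider:

```python
# Tabular Q-learning on a made-up building graph: the learned policy plays the
# role of one pedestrian's individualized route knowledge toward the exit.
import random

graph = {                         # rooms/corridors as nodes; 'exit' is the goal
    "office": ["hall"],
    "hall": ["office", "lobby", "stairs"],
    "stairs": ["hall", "lobby"],
    "lobby": ["hall", "stairs", "exit"],
    "exit": [],
}

Q = {(s, a): 0.0 for s, nbrs in graph.items() for a in nbrs}
alpha, gamma, epsilon = 0.1, 0.9, 0.2

for _ in range(2000):             # training episodes from random start rooms
    state = random.choice(["office", "hall", "stairs", "lobby"])
    while state != "exit":
        actions = graph[state]
        if random.random() < epsilon:
            action = random.choice(actions)            # explore
        else:
            action = max(actions, key=lambda a: Q[(state, a)])  # exploit
        reward = 10.0 if action == "exit" else -1.0    # penalise long routes
        future = max((Q[(action, a)] for a in graph[action]), default=0.0)
        Q[(state, action)] += alpha * (reward + gamma * future - Q[(state, action)])
        state = action

# The greedy policy per room is this pedestrian's learned route knowledge.
print({s: max(graph[s], key=lambda a: Q[(s, a)]) for s in graph if graph[s]})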
68

From diversity to convergence : British computer networks and the Internet, 1970-1995

Rutter, Dorian James January 2005 (has links)
The Internet's success in the 21st century has encouraged analysts to investigate the origin of this network. Much of this literature adopts a teleological approach. Works often begin by discussing the invention of packet switching, describe the design and development of the ARPANET, and then examine how this network evolved into the Internet. Although the ARPANET was a seminal computer network, these accounts usually only briefly consider the many other diverse networks that existed. In addition, apart from momentary asides to alternative internetworking solutions, such as the Open Systems Interconnection (OSI) seven-layer reference model, this literature concentrates almost exclusively on the ARPANET, the Internet, and the World Wide Web. While focusing on these subjects is important and therefore justified, it can leave the reader with the impression that the world of networking started with the ARPANET and ended with the Internet. This thesis is an attempt to help correct this misconception. This thesis analyses the evolution of British computer networks and the Internet between the years 1970 and 1995. After an introduction in Chapter 1, the thesis analyses several networks. In Chapters 2 and 3, the focus is on academic networks, especially JANET and SuperJANET. Attention moves to videotex networks in Chapter 4, specifically Prestel, and in Chapter 5, the dissertation examines electronic mail networks such as Telecom Gold and Cable & Wireless Easylink. Chapter 6 considers online services, including CompuServe, America Online, and the Microsoft Network, and the thesis ends with a conclusion in Chapter 7. All of the networks discussed used protocols that were incompatible with each other, which limited the utility of the networks for their users. Although it was possible that OSI or another solution could have solved this problem, it was the Internet's protocols that achieved this objective. This thesis shows how the networks converged around TCP/IP.
69

A conceptual approach to the early learning of algebra using a computer

Thomas, Michael Oliver John January 1988 (has links)
This thesis describes an investigation into the conceptual understanding of algebra by early learners (aged 11-13 years) and how a computer-based approach may be used to improve this understanding, without any consequent loss of manipulative skills. The psychological framework for the investigation centred on the importance of the individual child's construction of a cognitive framework of knowledge, and the relevance of its current state to the facilitation of concept acquisition. As such, it incorporates elements of the developmental psychology of Piaget, Ausubel and Skemp. Furthermore, in order to assist in the synthesis of a sufficiently broad psychological theory of education, it was necessary to postulate a new integrated bi-modal model of learning. This is described, along with details of its application and significance to a theory of cognitive integration which is designed to promote versatile learning (after Brumby, 1982) in mathematics through a relational linking of global/holistic and serialist/analytic schemas. The research comprised two initial investigations followed by the main experiment. The results of the initial investigations with early learners of algebra showed that the dynamic algebra module written for the research produced a significant improvement in the children's conceptual understanding of algebra. The main experiment sought to clarify this improvement further and to compare and contrast it with that produced by a traditional skill-based algebra module. In order to facilitate this comparison, the performance of 57 matched pairs of pupils from two groups of three parallel forms in the first year of a 12+ entry co-educational secondary school was analysed. The results of the investigation confirmed the value of the dynamic algebra module as a generic organiser (in the sense of Tall, 1986) for the understanding of algebraic concepts, producing a significant difference in conceptual understanding without any detrimental effect on manipulative skills. Furthermore, the beneficial effects of the programme were such that it provided a better base than the skill-based approach for extending algebraic understanding past the initial stages and into more involved areas such as linear equations and inequalities. The findings of this research show that the use of a module based on a computer environment, with its many advantages for conceptual learning, prior to the more formal introduction of algebraic techniques, is of great cognitive value. They also provide evidence for the theoretical model of learning proposed in the thesis, and suggest that for the production of a versatile learner in mathematics, more attention should be paid to the integration of the global/holistic abilities of the individual with his/her serialist/analytic abilities. The implication for the future is that such abilities, and hence mathematical competence, may well be improved in other areas of the curriculum by the use of the computer within a similar theoretical framework.
70

Två sidor av samma mynt : att marknadsföra översatt litteratur / Two sides of the same coin : marketing translated literature

Rüegg, Jana January 2014 (has links)
No description available.
