  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
31

Behavior recording with the scoring program MouseClick : A study in cross platform and precise timing developing

Karlsson, Erik January 2010 (has links)
This thesis deals with problems and solutions in cross-platform development using the Mono framework as a replacement for the Microsoft .NET Framework on Linux and Mac OS X. It covers matters ranging from limitations in the file system to problems with deploying released programs. It also deals with the demands of precise timing and the need for efficient code in timing-critical tasks, in order to construct a program for creating data from recordings of animals. The animals are set to perform a task, for example exploring a labyrinth or running on a rod, and are recorded on video. These videos are later reviewed by an observer who transcribes the recordings into data based on predefined behaviors and the time and frequency with which the animal expresses them.
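The core idea of such a scoring program — mapping keypresses to predefined behaviors and timestamping them with a monotonic, high-resolution clock — can be sketched as follows. This is an illustrative Python analogue, not the thesis's actual C#/Mono code; the `BehaviorScorer` class and its key map are hypothetical.

```python
import time

class BehaviorScorer:
    """Minimal analogue of a behavior-scoring timer: each keypress is
    mapped to a predefined behavior and timestamped with a monotonic,
    high-resolution clock (time.perf_counter), the kind of precise,
    cross-platform timer the thesis's design calls for."""

    def __init__(self, key_map):
        self.key_map = key_map          # e.g. {"g": "grooming", "r": "rearing"}
        self.events = []                # list of (elapsed_seconds, behavior)
        self._t0 = time.perf_counter()

    def record(self, key):
        # Ignore keys with no behavior assigned; timestamp the rest.
        if key in self.key_map:
            elapsed = time.perf_counter() - self._t0
            self.events.append((round(elapsed, 4), self.key_map[key]))

    def frequencies(self):
        # Frequency of each behavior, as in the transcribed scoring data.
        counts = {}
        for _, behavior in self.events:
            counts[behavior] = counts.get(behavior, 0) + 1
        return counts

scorer = BehaviorScorer({"g": "grooming", "r": "rearing"})
scorer.record("g"); scorer.record("r"); scorer.record("g")
print(scorer.frequencies())   # {'grooming': 2, 'rearing': 1}
```

`time.perf_counter` is used here because, unlike wall-clock time, it is monotonic and has the highest available resolution on each platform.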
33

A Linguística de Corpus na formação do tradutor: compilação e proposta de análise de um corpus paralelo de aprendizes de tradução / Corpus linguistics in translator education: compilation and analysis proposal of a parallel corpus of translation learners

Oliveira, Joacyr Tupinambás de 01 December 2014 (has links)
Studies on the teaching of translation in Brazil still leave ample room for discussion. With that in mind, one goal of this research is a brief reflection on the classroom, proposing a possible method for teaching translation based on the analysis of material produced by translation learners. The intention is to use Corpus Linguistics to observe how students construct the target text, in the same way we analyze material produced by language learners. For that purpose, we compiled a learner translation corpus consisting of eight source texts in English and about 800 translations into Portuguese, approximately 100 for each text. Aligning so many translations with their source texts in a way that permits analysis was not a simple task. The strategy adopted to overcome this difficulty was the development of a dedicated alignment methodology using spreadsheets as the tool; this methodology became the central focus of the research. Using formulas to manipulate textual data in the spreadsheet resulted in an aligned corpus containing all source texts and their corresponding translations, with headers and every line tagged. This procedure produced a corpus that can be analyzed both in the spreadsheet editor and in programs such as AntConc and WordSmith Tools. In addition, we present the spreadsheet as a didactic tool for use in translation practice classes.
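The alignment described above — one source text against many learner translations, emitted as header plus tagged rows — can be sketched in a few lines. This is a hypothetical miniature with invented sample data, standing in for the spreadsheet-formula methodology the thesis develops.

```python
# Hypothetical miniature of the alignment methodology: one source text,
# several learner translations, output as a header row plus tagged rows,
# the same shape the thesis builds with spreadsheet formulas.
source = ["The cat sat.", "It rained."]
translations = {
    "learner01": ["O gato sentou.", "Choveu."],
    "learner02": ["O gato sentou-se.", "Estava chovendo."],
}

def align(source, translations):
    rows = [("tag", "source", *translations.keys())]     # header row
    for i, seg in enumerate(source, start=1):
        tag = f"s{i:03d}"                                # line label, e.g. s001
        rows.append((tag, seg, *(t[i - 1] for t in translations.values())))
    return rows

for row in align(source, translations):
    print("\t".join(row))
```

Tab-separated output like this pastes directly into a spreadsheet and is also readable by corpus tools such as AntConc, which is the dual use the thesis aims for.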
35

Development of a calculator for estimation and management of GHG emissions from public transit agency operations

Weigel, Brent Anthony 08 July 2010 (has links)
As managers of extensive vehicle fleets and transportation infrastructure, public transit agencies present unique opportunities for reducing greenhouse gas (GHG) emissions from the transportation sector. To achieve substantial and cost-effective GHG emissions reductions, public transit agencies need tools and resources that enable effective GHG emissions management. This thesis presents the background, methodology, and results of the author's development of an agency-level life cycle GHG emissions calculator for public transit. The development involved a series of research efforts aimed at identifying and addressing the needs of transit agency GHG emissions management: a review of background information on climate change and public transit's role in mitigating it; a review of existing GHG emissions calculators for public transit agencies; a review of methodologies for life cycle GHG emissions analysis; integration and adaptation of existing calculation resources; development of calculator spreadsheets for estimating relevant life cycle GHG emissions and quantifying the cost-effectiveness of emission reductions; application of the calculator to a carbon footprint analysis for a typical mid- to large-size transit agency; and application of the calculator to evaluating the cost-effectiveness of various potential strategies for reducing transit agency GHG emissions. The calculator provides an integrative resource for quantifying the GHG emissions and costs of public transit agency activities, including GHG emission reduction strategies. Further research is needed to calibrate the estimation of upstream life cycle GHG emissions, particularly for vehicle manufacture and maintenance.
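The calculator's core arithmetic — activity data times emission factors gives a footprint, and a strategy's cost-effectiveness is its cost divided by the emissions it avoids — can be sketched as below. The emission factors and quantities are illustrative placeholders, not values from the thesis.

```python
# Hedged sketch of an emissions calculator's core arithmetic. The factors
# below are assumed example values (kg CO2e per unit), not thesis data.
EMISSION_FACTORS = {
    "diesel_liters": 2.68,
    "electricity_kwh": 0.45,
}

def footprint(activity):
    """activity: {source: quantity} -> total kg CO2e."""
    return sum(EMISSION_FACTORS[k] * v for k, v in activity.items())

def cost_effectiveness(annual_cost, baseline, with_strategy):
    """Dollars per kg CO2e avoided by a reduction strategy."""
    avoided = footprint(baseline) - footprint(with_strategy)
    return annual_cost / avoided if avoided > 0 else float("inf")

# Hypothetical agency: baseline fleet vs. a diesel-saving strategy.
base = {"diesel_liters": 100_000, "electricity_kwh": 50_000}
hybrid = {"diesel_liters": 70_000, "electricity_kwh": 50_000}
print(round(footprint(base)))                            # 290500
print(round(cost_effectiveness(40_000, base, hybrid), 3))
```

Ranking several candidate strategies by this dollars-per-kg-avoided figure is exactly the comparison the calculator is meant to support.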
36

Investigation of Stress Changes at Mount St. Helens, Washington, and Receiver Functions at the Katmai Volcanic Group, Alaska, with an Additional Section on the Assessment of Spreadsheet-based Modules.

Lehto, Heather L. 01 January 2012 (has links)
Forecasting eruptions using volcano seismology is a subject that affects the lives and property of millions of people around the world. However, there is still much to learn about the inner workings of volcanoes and how this relates to the chance of eruption. This dissertation aims to broaden the knowledge used to understand when a volcano is likely to erupt and how large that eruption might be. Chapters 2 and 3 focus on a technique that uses changes in the local stress field beneath a volcano to determine the source of these changes and help forecast eruptions, while Chapter 4 focuses on receiver functions, a technique with great potential for imaging magma chambers beneath volcanoes. In Chapters 2 and 3 the source mechanisms of shallow volcano-tectonic earthquakes recorded at Mount St. Helens are investigated by calculating hypocenter locations and fault plane solutions (FPS) for shallow earthquakes recorded during two eruptive periods (1981-1986 and 2004-2008) and two non-eruptive periods (1987-2004 and 2008-2011). FPS show a mixture of normal, reverse, and strike-slip faulting during all periods, with a sharp increase in strike-slip faulting observed in 1987-1997 and an increase in normal faulting between 1998 and 2004 and again on September 25-29, 2004. FPS P-axis orientations (a proxy for σ1) show a ~90° rotation with respect to regional σ1 (N23°E) during 1981-1986 and 2004-2008, bimodal orientations (~N-S and ~E-W) during 1987-2004, and bimodal orientations at ~NE and ~SW during 2008-2011. These orientations are believed to result from pressurization accompanying the shallow intrusion and subsequent eruption of magma as domes during 1981-1986 and 2004-2008, and from the buildup of pore pressure beneath a shallow seismogenic volume during 1987-2004 and 2008-2011. Chapter 4 presents a study using receiver functions, which show the relative response of the Earth beneath a seismometer.
Receiver functions are produced by deconvolving the vertical component of a seismogram from the horizontal components. The structure of the ground beneath the seismometer can then be inferred from the arrivals of P-to-S converted phases. Receiver functions were computed for the Katmai Volcanic Group, Alaska, at two seismic stations (KABU and KAKN) between January 2005 and July 2011. Receiver functions from station KABU clearly showed the arrival of the direct P-wave and the arrival from the Moho; however, receiver functions from station KAKN did not show the arrival from the Moho. In addition, changes in the amplitude and polarity of arrivals on receiver functions suggested that the structure beneath both KABU and KAKN is complex. Station KABU is likely underlain by dipping layers and/or anisotropy, while station KAKN may lie over a basin structure, an attenuating body, or some other highly complex structure. It is impossible to say for certain what the structure is under either station, however, because the azimuthal coverage is poor and the structure therefore cannot be modeled. This dissertation also includes a section (Chapter 6) on the assessment of spreadsheet-based modules used in two Introductory Physical Geology courses at the University of South Florida (USF). When faculty at USF began using spreadsheet-based modules to help teach students math and geology concepts, the students complained that they spent more time learning how to use Excel than learning the concepts presented in the modules. To determine whether the sharp learning curve for Excel was hindering learning, we divided the students in two Introductory Physical Geology courses into two groups: one group was given a set of modules instructing them to use Excel for all calculations; the other group was told to complete the calculations but was not instructed which method to use.
The results of the study show that whether or not the students used Excel had very little to do with the level of learning they achieved. Despite complaints that Excel was hindering their learning, students in the study attained high gains for both the math and geology concepts presented in the modules whether they used Excel or not.
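The deconvolution behind receiver functions — dividing the horizontal component's spectrum by the vertical's, with a water-level floor on the denominator for stability — can be sketched with synthetic spike traces. This is a generic illustration of the standard frequency-domain technique, not the dissertation's actual processing; the traces and water-level value are assumptions.

```python
import numpy as np

# Illustrative frequency-domain (water-level) deconvolution: deconvolve
# the vertical component from a horizontal component. Synthetic spike
# traces stand in for real seismograms.
def receiver_function(horizontal, vertical, water_level=0.01):
    n = len(vertical)
    H, V = np.fft.rfft(horizontal), np.fft.rfft(vertical)
    denom = (V * np.conj(V)).real                     # |V|^2 per frequency
    denom = np.maximum(denom, water_level * denom.max())  # floor small values
    return np.fft.irfft(H * np.conj(V) / denom, n)

# Vertical: a single spike; horizontal: the same spike plus a delayed,
# smaller arrival standing in for a P-to-S converted phase.
vert = np.zeros(256); vert[10] = 1.0
horiz = np.zeros(256); horiz[10] = 0.6; horiz[40] = 0.3
rf = receiver_function(horiz, vert)
print(int(np.argmax(rf)))   # 0: direct arrival collapses to zero lag
```

In this toy case the result is a pulse of 0.6 at zero lag (the direct P arrival) and 0.3 at a lag of 30 samples (the converted phase), which is the kind of arrival one reads structure from.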
37

Implementering av ett verktyg för dokumentering av riskanalyser / Implementation of a tool for the documentation of risk analysis

Hariri, Bashar, Sven, Skalleberg January 2014 (has links)
For a machine to be considered safe to use, it must comply with the Machinery Directive. The directive itself is general and brief, so standards specify its requirements for each area; if the standards are not met, the directive is not fulfilled for that area and the machine is not ready for use. This thesis work was carried out at Rapid Granulator AB, today one of the world's largest producers of granulators. Granulators grind plastic waste down into a more compact form, granulate, which can then be reused in injection moulding. The purpose of the thesis is to find a more flexible tool for documenting risk analyses. The client's wish is that the tool offer the user a clearer overview of the constituent parts of a risk analysis, and that the entered information be easy to review and edit. To achieve this purpose and best meet the client's needs, two research questions were formulated: How do other companies perform their risk analyses, and which tools do they use? Which tool is best suited for Rapid Granulator AB, and how is it structured? A tool was selected on the basis of a pre-study consisting of interviews and an observation. The interviews were conducted with five medium-sized companies, and the observation took place at Rapid Granulator AB. The pre-study phase was followed by an implementation phase supported by a literature study, which provided a clearer understanding of the tool and its use. The pre-study identified four candidate tools. Given the needs and the shortcomings of the existing tool, Microsoft Access was selected. Based on the company's earlier risk analyses, the authors created a database in Access and implemented functions grounded in the client's wishes, such as drop-down menus, color coding, and a module-based structure. This resulted in a more flexible tool than the one used previously. The conclusion is that the company benefits from using Microsoft Access going forward; the recommendation is to continue using Access for future risk analyses and to extend the database further.
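The shape of such a risk-analysis database — one table of hazards per machine module, constrained severity and likelihood fields (the drop-down lists), and a computed risk score driving the color coding — can be sketched with SQLite in place of Access. The table layout, hazards, and thresholds below are hypothetical illustrations, not the thesis's actual schema.

```python
import sqlite3

# Hypothetical miniature of a risk-analysis database: hazards per machine
# module, with CHECK constraints playing the role of drop-down lists.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE risk (
        module     TEXT NOT NULL,
        hazard     TEXT NOT NULL,
        severity   INTEGER CHECK (severity BETWEEN 1 AND 5),
        likelihood INTEGER CHECK (likelihood BETWEEN 1 AND 5)
    )""")
rows = [("rotor",  "contact with blades", 5, 2),
        ("hopper", "trapped limb",        4, 1),
        ("drive",  "noise exposure",      2, 4)]
con.executemany("INSERT INTO risk VALUES (?, ?, ?, ?)", rows)

# Risk score = severity x likelihood; thresholds drive the color coding.
for module, hazard, score in con.execute(
        "SELECT module, hazard, severity * likelihood AS score "
        "FROM risk ORDER BY score DESC"):
    color = "red" if score >= 10 else "yellow" if score >= 5 else "green"
    print(f"{module:6} {hazard:20} {score:2} {color}")
```

Keeping the score and color as derived values, rather than stored fields, means every edit to severity or likelihood is reflected immediately, which is the reviewability the client asked for.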
38

Pattern-based data and application integration in service oriented architectures

Kongdenfha, Woralak, Computer Science & Engineering, Faculty of Engineering, UNSW January 2009 (has links)
The success of Web services comes from the benefits they bring in reducing the cost and time needed to develop data and applications by reusing them, and in simplifying their integration through standardization. However, standardization in Web services does not remove the need for adapters, because service interface and protocol definitions may still be heterogeneous. Moreover, current service APIs are targeted at professional programmers and are not accessible to a wider class of users who lack programming expertise but would nevertheless like to build their own integrated applications. In this dissertation, we propose methods and tools to support both service developers and non-expert users in their data and application integration tasks. To support service developers, we propose a framework that enables rapid development of Web service adapters. In particular, we investigate the problem of service adaptation at the business interface and protocol layers. Our study shows that many differences between business interfaces and protocols are recurring. We introduce mismatch patterns to capture these recurring differences and provide solutions to resolve them. We present the notion of adaptation aspects, based on the aspect-oriented programming paradigm, to enable rapid development and deployment of service adapters. We also present a comparative study of standalone and aspect-oriented adapter development, which shows that the aspect-oriented approach is preferable in many cases, especially when adapters need to access the internal state of services. The proposed approach is implemented in a prototype tool, and a case study illustrates how it simplifies adapter development. To support users without programming expertise, we propose a spreadsheet-based Web mashup development framework, which enables users to develop mashups in the familiar spreadsheet environment.
First, we provide a mechanism that makes structured data first-class values of spreadsheet cells. Second, we propose a new component model that can be used to develop fairly sophisticated mashups, involving joining data sources and keeping spreadsheet data up to date. Third, to simplify mashup development, we provide a collection of spreadsheet-based mashup patterns that capture common Web data access and spreadsheet presentation functionality. Users can reuse and customize these patterns to build spreadsheet-based Web mashups instead of developing them from scratch. Fourth, we enable users to manipulate structured data presented in the spreadsheet in a drag-and-drop fashion. Finally, we have developed and tested a prototype tool to demonstrate the utility of the proposed framework.
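The mashup pattern of joining two data sources and writing the result into spreadsheet cells can be illustrated in miniature. The data sources, join helper, and cell-addressing scheme below are invented for illustration and are not the dissertation's component model.

```python
# Toy version of a spreadsheet mashup: two "Web data sources" (here,
# in-memory lists), joined on a key, written back as spreadsheet cells
# so that structured data becomes first-class cell values.
stops = [{"stop": "A", "lat": 59.33}, {"stop": "B", "lat": 59.35}]
arrivals = [{"stop": "A", "next_bus": "12:05"}, {"stop": "B", "next_bus": "12:09"}]

def join(left, right, key):
    """Inner join of two lists of records on a shared key field."""
    index = {r[key]: r for r in right}
    return [{**l, **index[l[key]]} for l in left if l[key] in index]

sheet = {}                               # cell address -> value, e.g. "A2"
for r, row in enumerate(join(stops, arrivals, "stop"), start=2):
    for c, field in enumerate(["stop", "lat", "next_bus"]):
        sheet[f"{chr(65 + c)}{r}"] = row[field]
print(sheet["C2"])   # 12:05
```

Re-running the loop after a source refreshes is what keeps the sheet up to date, the second capability the component model is described as providing.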
40

Basic Expeditionary Airfield Resource (BEAR) Requirements Analysis Tool (BRAT)

Hunt, Andrew W. January 2008 (has links)
Thesis (Master of Military Studies)-Marine Corps Command and Staff College, 2008. / Title from title page of PDF document (viewed on: Jan 5, 2010). Includes bibliographical references.
