11

Aprimorando o corretor gramatical CoGrOO / Refining the CoGrOO Grammar Checker

Silva, William Daniel Colen de Moura 06 March 2013 (has links)
CoGrOO is an open-source Brazilian Portuguese grammar checker currently used by thousands of users of a popular open-source office suite. It can identify mistakes such as incorrect pronoun placement, noun agreement, subject-verb agreement, misuse of the grave accent (crase), and other common errors of Brazilian Portuguese writing. To accomplish this, it performs a hybrid analysis: first the text is annotated using statistical Natural Language Processing (NLP) techniques, and then a rule-based system identifies possible grammar errors.
The goal of this work is to reduce omissions and false alarms while increasing true positives, without adding new error-detection rules. The last rigorous evaluation of the grammar checker was carried out in 2006, and since then there has been no detailed study of its performance, even though the code has evolved substantially. This work also contributes a detailed evaluation of the low-level statistical NLP modules, with results compared to the state of the art. Since these modules are available as open-source software, improvements to them will make them robust, free, ready-to-use alternatives to proprietary systems.
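The hybrid pipeline described above (statistical annotation followed by rule-based checking) can be sketched in a few lines. The toy lexicon, the tags, and the single determiner-noun agreement rule below are illustrative inventions, not CoGrOO's actual annotators or rule set:

```python
# Minimal sketch of a hybrid grammar checker: an annotation pass followed by
# a rule-based pass. The lexicon and the agreement rule are hypothetical.

# Toy "annotator": maps each token to (lemma, tag, gender, number).
LEXICON = {
    "as":      ("o",      "DET",  "f", "pl"),
    "menino":  ("menino", "NOUN", "m", "sg"),
    "meninas": ("menino", "NOUN", "f", "pl"),
}

def annotate(tokens):
    # Unknown tokens get a placeholder analysis.
    return [(t,) + LEXICON.get(t, (t, "X", None, None)) for t in tokens]

def check_det_noun_agreement(annotated):
    """Rule: a determiner must agree in gender and number with the next noun."""
    errors = []
    for (w1, _, t1, g1, n1), (w2, _, t2, g2, n2) in zip(annotated, annotated[1:]):
        if t1 == "DET" and t2 == "NOUN" and (g1, n1) != (g2, n2):
            errors.append(f"agreement error: '{w1} {w2}'")
    return errors

print(check_det_noun_agreement(annotate(["as", "menino"])))   # flags a mismatch
print(check_det_noun_agreement(annotate(["as", "meninas"])))  # agrees: no errors
```

In a real checker the annotation pass is a trained statistical model rather than a dictionary lookup, which is why the thesis's improvements to the annotators directly affect rule precision.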
13

Korektor anglické gramatiky: určité a neurčité členy / English grammar checker and corrector: the determiners

Auersperger, Michal January 2017 (has links)
Correction of articles in English texts is approached as an article-generation task: each noun phrase is assigned a class corresponding to the definite, indefinite, or zero article. Supervised machine learning methods are used first to replicate and then to improve upon the best result reported in the literature known to the author. Through feature engineering and a different choice of learning method, an error reduction of about 34% is achieved. The resulting model is further compared to the performance of expert annotators. Although the comparison is not straightforward due to differences in the data, the results indicate that the trained model is comparable to human performance on in-domain data. On the other hand, the model does not generalize well to other types of data. Using a large-scale language model to predict an article (or no article) for each word of the text did not prove successful.
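The three-way classification setup described above can be sketched with a standard classifier. The features and training examples below are invented for illustration; the thesis's actual feature set is far richer:

```python
# Sketch of article generation as 3-way classification: each noun phrase
# receives the label "definite", "indefinite", or "zero". Features and data
# are hypothetical placeholders.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Each noun phrase is described by a few contextual features.
train_X = [
    {"head": "sun",   "countable": False, "previously_mentioned": True},
    {"head": "dog",   "countable": True,  "previously_mentioned": False},
    {"head": "water", "countable": False, "previously_mentioned": False},
    {"head": "dog",   "countable": True,  "previously_mentioned": True},
]
train_y = ["definite", "indefinite", "zero", "definite"]

model = make_pipeline(DictVectorizer(), LogisticRegression())
model.fit(train_X, train_y)

# Predict the article class for an unseen noun phrase.
print(model.predict([{"head": "dog", "countable": True,
                      "previously_mentioned": True}]))
```

Evaluation then reduces to per-class precision and recall against annotator labels, which is what makes the comparison with expert annotators possible.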
14

En utredande jämförelse av programvaror vid BIM-samordning / BIM: an Investigative Comparison of Software for BIM Coordination

Nyström, Victor January 2018 (has links)
BIM is a term that has been used increasingly often in the construction industry in recent years; its meaning varies depending on who is asked. The number of BIM tools on the market has also grown, and the tools serve different purposes depending on the user. BIM projects use so-called coordination tools, among other things to check the models of the various disciplines, and the choice of software is decisive because functionality and users differ. This thesis examines two software packages that can be used for BIM coordination. Through a case study, the author investigated which of Autodesk Navisworks Manage 2015 and Solibri Model Checker 9.5 best suits Grontmij AB in Eskilstuna given its needs and circumstances. After a thorough study of both packages, the report recommends Solibri Model Checker 9.5 for Grontmij AB in Eskilstuna, in line with the company's needs and circumstances.
15

Kvalitetssäkring av objektsbaserade IFC-filer för mängdavtagningar och kostnadskalkyler : En fallstudie av levererade konstruktionsmodeller, vid Tyréns AB Stockholm / Quality assurance of object-based IFC-files for quantity takeoff and cost estimating : A case study of delivered construction models, at Tyréns AB Stockholm

Gustavsson, Sam, Johansson, Filip January 2013 (has links)
Performing quantity takeoffs and cost estimates from object-based 3D models has proved problematic when the models do not meet the expected level of information. The deficient models create uncertainty when the estimator prepares cost estimates, which forms the background to the questions this thesis addresses. The Biomedicum project, a state-of-the-art laboratory for Karolinska Institutet, is run as a full-scale BIM project in which the object-based 3D models form the basis for, among other things, cost estimates in the system documentation phase. The project's estimator has expressed dissatisfaction with the quality of the models, arguing that major flaws increase the risk of unreliable estimates. This thesis therefore examines how an object-based 3D model in IFC format can be quality-assured in the software Solibri Model Checker, based on the project-specific requirements.
The thesis consists of a case study, a literature review, and, to some extent, interviews. The case study examines whether Solibri Model Checker is capable of checking the models against the project-specific requirements, and analyzes the results of the checks carried out. The interviews identified a number of problems considered relevant to the thesis: problems traceable to legal difficulties, the lack of a common industry standard, disagreements about the content of an object-based 3D model, and so on. The case study showed that the 3D models are inadequate from an estimating point of view, which leads to difficulties and uncertainties when Skanska prepares the estimates. There are several reasons for this. Among them, as emerged during the interviews, the structural engineer uses a modeling technique that is not suited to estimating, and there are no clear guidelines on what information the objects should contain in the system documentation phase. The interviews also revealed that a decisive cause of the models' poor quality from an estimating point of view is that requirements were not set early enough in the process. To achieve a working process in which object-based 3D models are used for quantity takeoffs and cost estimates, it is important to establish at an early stage what information the objects must contain for credible estimates to be possible. A clear IT manual is also essential, since the manual defines what information the models must include.
16

En granskning av IFC-exporter från Tekla Structures / An examination of IFC-exports from Tekla Structures

Karemsijan, Sanna, Kasabian, Maral January 2019 (has links)
Today BIM (Building Information Modeling) is applied as a leading tool in structural design; it enables sustainable, optimized solutions across disciplines and efficient management of information flows. When transferring data between software packages, a neutral file format, IFC (Industry Foundation Classes), is used; this format is exported from the 3D modeling program Tekla Structures. The purpose of this thesis is to examine why information goes missing in IFC exports from Tekla Structures. The study comprises a case study, a comparative study, and interviews. The case study examines the settings of an IFC export from a complex model in Tekla Structures. The comparative study compares two coordination programs, Solibri Model Checker and Navisworks Manage, assessing how well each handles IFC files. The interviews provided clear guidelines throughout the work. The results indicate that the causes of missing information in IFC files are the human factor and insufficient knowledge of the settings of an IFC export. The comparison of the coordination programs shows that both are suitable for the IFC format, though the comparative study indicates that Solibri is better suited to IFC exports from Tekla Structures. In light of these results, the human factor is decisive for information loss: it manifests as a lack of communication and a lack of knowledge about IFC exports.
17

Explanation of the Model Checker Verification Results

Kaleeswaran, Arut Prakash 20 December 2023 (has links)
Whenever new requirements are introduced for a system, the correctness and consistency of the system specification must be verified, which in industrial practice is usually done manually. One viable way to overcome the drawbacks of this manual analysis is contract-based design, which can automate the verification of whether the refinements of top-level requirements are consistent. Verification can thus be performed iteratively to ensure the system's correctness and consistency in the face of any change in the specifications. That said, deploying formal approaches in industry remains challenging because of their limited usability and the difficulty of interpreting verification results. For instance, if the model checker identifies an inconsistency during verification, it generates a counterexample while indicating that the given input specifications are inconsistent. The formidable challenge is then to comprehend the generated counterexample, which is often lengthy, cryptic, and complex. Furthermore, it is the engineer's responsibility to identify the inconsistent specification among a potentially huge set of specifications. This PhD thesis proposes a counterexample explanation approach for formal methods that simplifies and encourages their use by presenting user-friendly explanations of the verification results. The approach identifies relevant information in the verification result and presents it in something close to a natural-language statement: it extracts the inconsistent specifications from the set of specifications, as well as the erroneous states and variables from the counterexample. The approach is evaluated in two ways: (1) with different application examples, and (2) through a user study in the form of a one-group pretest-posttest experiment.
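One ingredient of such an explanation, identifying which variables change along the counterexample trace, can be sketched as follows. The trace format here is a simplified invention; real model checkers such as NuSMV emit richer traces:

```python
# Sketch of one step of counterexample explanation: given a linear trace
# (a list of states, each a variable assignment), report which variables
# change at each step, as a first cut at a natural-language explanation.
def explain_trace(trace):
    explanations = []
    for i, (prev, curr) in enumerate(zip(trace, trace[1:]), start=1):
        changed = {v: (prev[v], curr[v]) for v in curr if prev[v] != curr[v]}
        for var, (old, new) in sorted(changed.items()):
            explanations.append(
                f"step {i}: variable '{var}' changes from {old} to {new}")
    return explanations

trace = [
    {"request": False, "grant": False},
    {"request": True,  "grant": False},
    {"request": True,  "grant": False},  # grant never issued: property violated
]
for line in explain_trace(trace):
    print(line)
```

A full explanation approach would additionally map each erroneous variable back to the specification (contract) that constrains it, which is the part that singles out the inconsistent specification for the engineer.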
18

A language-independent static checking system for coding conventions

Mount, Sarah January 2013 (has links)
Despite decades of research aiming to ameliorate the difficulties of creating software, programming remains an error-prone task. Much work in Computer Science deals with the problem of specification, or writing the right program, rather than the complementary problem of implementation, or writing the program right. However, many desirable software properties (such as portability) are obtained via adherence to coding standards, and therefore fall outside the remit of formal specification and automatic verification. Moreover, code inspections and manual detection of standards violations are time-consuming. To address these issues, this thesis describes Exstatic, a novel framework for the static detection of coding-standards violations. Unlike many other static checkers, Exstatic can examine code in a variety of languages, including program code, in-line documentation, markup languages, and so on. This means that checkable coding standards adhered to by a particular project or institution can be handled by a single tool. Consequently, a major challenge in the design of Exstatic has been to invent a way of representing code from a variety of source languages. This thesis therefore describes ICODE, an intermediate language suitable for representing code from a number of different programming paradigms. To substantiate the claim that ICODE is a universal intermediate language, a proof strategy has been developed: for a number of different programming paradigms (imperative, declarative, etc.), a proof is constructed to show that a semantics-preserving translation exists from an exemplar language (such as IMP or PCF) to ICODE. The usefulness of Exstatic has been demonstrated by the implementation of a number of static analysers for different languages.
These include a checker for technical documentation written in Javadoc, which validates documents against the Sun Microsystems (now Oracle) Coding Conventions, and a checker for HTML pages against a site-specific standard. A third system targets a variant of the Python language written by the author, called python-csp, based on Hoare's Communicating Sequential Processes.
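The idea of one tool checking conventions across several source languages can be illustrated with a deliberately simplified, purely line-based checker. This skips Exstatic's intermediate representation (ICODE) entirely, so it is far weaker than the real system, and the two example rules are invented:

```python
# Sketch of a language-independent coding-convention checker in the spirit
# of Exstatic: each convention is a named regex applied per line of source.
import re

CONVENTIONS = {
    "python": [("no tabs for indentation", re.compile(r"^\t"))],
    "html":   [("img tags need an alt attribute",
                re.compile(r"<img(?![^>]*\balt=)[^>]*>", re.IGNORECASE))],
}

def check(source, language):
    """Return (line_number, rule_name) pairs for every violated convention."""
    violations = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for rule_name, pattern in CONVENTIONS.get(language, []):
            if pattern.search(line):
                violations.append((lineno, rule_name))
    return violations

print(check("<p>hi</p>\n<img src='x.png'>", "html"))
# → [(2, 'img tags need an alt attribute')]
```

The point of an intermediate representation like ICODE is precisely to go beyond such line-level pattern matching: rules can then be written once against a common structure instead of per-language regexes.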
19

L'apport des correcticiels pour la correction de textes d'élèves du secondaire / The Contribution of Grammar-Checking Software to the Correction of Secondary-School Students' Texts

Mireault, Marie-Hélène January 2009 (has links)
Thesis digitized by the Document and Archives Management Division of the Université de Montréal.
20

User-Defined XML-to-Relational Mapping / User-Defined XML-to-Relational Mapping

Kohan, Tomáš January 2007 (has links)
In this work we study approaches to mapping XML data into relational systems. The first part describes the basic terminology used in the work and the basic techniques for mapping XML data into a relational database. The next part examines academic methods such as MXM and ShreX, which were proposed in university or research-group settings. The third part describes the mapping methods used in commercial systems such as Oracle, DB2, and MS SQL. The entire second half of the work proposes a new mapping method (XRM), which introduces several new features while preserving the positive features of earlier approaches. Finally, we analyze a prototype implementation of the proposed mapping method.
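The simplest generic baseline among the basic techniques surveyed in such work is "edge shredding", where every XML element becomes a row recording its id, parent, tag, and text. The sketch below shows that baseline (not the thesis's XRM method) using Python's standard library:

```python
# Generic edge-table shredding of XML into a relational database:
# each element becomes a row (id, parent_id, tag, text).
import sqlite3
import xml.etree.ElementTree as ET

def shred(xml_text, conn):
    conn.execute("CREATE TABLE edge (id INTEGER PRIMARY KEY, "
                 "parent_id INTEGER, tag TEXT, text TEXT)")
    counter = 0
    def walk(elem, parent_id):
        nonlocal counter
        counter += 1
        node_id = counter
        conn.execute("INSERT INTO edge VALUES (?, ?, ?, ?)",
                     (node_id, parent_id, elem.tag, (elem.text or "").strip()))
        for child in elem:
            walk(child, node_id)  # recurse, preserving parent-child edges
    walk(ET.fromstring(xml_text), None)

conn = sqlite3.connect(":memory:")
shred("<books><book><title>XML</title></book></books>", conn)
print(conn.execute("SELECT id, parent_id, tag, text FROM edge").fetchall())
# → [(1, None, 'books', ''), (2, 1, 'book', ''), (3, 2, 'title', 'XML')]
```

Schema-aware methods such as ShreX improve on this by deriving one relation per element type from the XML schema, which makes typical path queries joins over narrow tables instead of self-joins over one edge table.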
