71. A Study of Backward Compatible Dynamic Software Update

January 2015
abstract: Dynamic software update (DSU) enables a program to be updated while it is running, minimizing the losses caused by downtime during updates. DSU is usually performed in three steps: suspending the execution of the old program, mapping the execution state from the old program to the new one, and resuming execution of the new program with the mapped state. The semantic correctness of DSU depends largely on the state mapping, which today is mostly constructed manually by developers. Manual construction, however, does not guarantee a sound and dependable state mapping. This dissertation presents a methodology that assists developers by automating the construction of a partial state mapping with a guarantee of correctness. It includes a detailed study of DSU correctness and automatic state mapping for server programs with an established user base. The dissertation first gives a formal treatment of DSU correctness and the state-mapping problem. It then argues that, for programs with an established user base, dynamic updates must be backward compatible, and presents a general definition of backward compatibility that specifies the allowed changes in program interaction between an old version and a new version. It identifies patterns of code evolution that result in backward compatible behavior, gives formal definitions of these patterns, and proves that any change to a program following these patterns results in a backward compatible update. To show the applicability of the results, the dissertation presents SitBack, a program analysis tool that takes an old and a new version of a program as input and computes a partial state mapping under the assumption that the new version is backward compatible with the old one. SitBack does not handle all kinds of changes; it reports the incomplete parts of a state mapping to the user.
A detailed evaluation shows that the methodology of automatic state mapping is promising for dealing with real-world program updates: SitBack produces state mappings for 17-75% of the changed functions, and the generated mappings lead to successful dynamic updates. In conclusion, the study assists developers in constructing state mappings for DSU by automating their construction with a correctness guarantee, which ultimately helps the adoption of DSU. / Doctoral Dissertation, Computer Science, 2015
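The three update steps and the notion of a partial state mapping described above can be sketched in miniature. This is a hypothetical illustration, not SitBack's output; the state types and the mapping function are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class OldState:
    # v1 of the server keeps a single request counter.
    requests: int = 0

@dataclass
class NewState:
    # v2 splits the counter by outcome; the total is still recoverable
    # as ok + failed, so the change is backward compatible in the sense
    # discussed above.
    ok: int = 0
    failed: int = 0

def map_state(old: OldState) -> NewState:
    # Partial state mapping: the ok/failed split did not exist in v1,
    # so a conservative choice is made here; a tool like SitBack would
    # report such incomplete parts of the mapping to the developer.
    return NewState(ok=old.requests, failed=0)

# 1) suspend the old program (here: simply stop using old_state)
old_state = OldState(requests=41)
# 2) map the execution state to the new version's layout
new_state = map_state(old_state)
# 3) resume execution of the new program with the mapped state
new_state.ok += 1
assert new_state.ok + new_state.failed == 42
```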
72. Teoria da relatividade restrita : uma introdução histórico-epistemológica e conceitual voltada ao ensino médio / The special theory of relativity: a historical-epistemological and conceptual introduction for high school

Fuchs, Eduardo Ismael, January 2016
This work narrates a didactic experience of applying a module that addressed a topic of Modern and Contemporary Physics, the Special Theory of Relativity, in high school. The proposal was carried out in a private school in the municipality of Arroio do Meio, RS, with a third-grade class of the regular secondary level, under the theoretical framework of the cognitive theory of Jean William Fritz Piaget (1896-1980) and the epistemological framework of Thomas Samuel Kuhn (1922-1996).
The work describes the planning of the classes, the implementation of the proposal, and the results obtained from its application in the classroom as an extracurricular course. The way the module was designed and the depth that could be reached are discussed throughout the text, which also provides a review of the literature on the importance of including Modern and Contemporary Physics in the high school curriculum. The results indicate that it is possible to teach topics of Modern and Contemporary Physics in regular education, that the students enjoyed the subject and showed willingness to learn current topics, and that the effort to introduce small curriculum updates is worthwhile and should be encouraged as one of the possible ways to improve teaching quality in basic education. Finally, an educational product in the form of a support, guidance, and motivation text for physics teachers is presented.
73. Aktualizace obsahových, metodických a didaktických prvků e-learningové učebnice Mikroekonomie / Updating the content, methodological, and didactic elements of the e-learning textbook Microeconomics

SVOBODOVÁ, Šárka, January 2008
The practical part of this work, whose product is called Project "ME08", is based on updating the content, methodological, and didactic elements of the distance-education textbook Základy pro porozumění tržní ekonomice (Basic Principles for Understanding the Market Economy) by PhDr. Pavel Hejtman, CSc. The main task was to become acquainted with the principles of content, methodology, and didactics for writing and preparing a training publication, and to draft the updates. A central part concerns how to make the best use of Internet information sources, how to apply knowledge about study materials designed for the web together with the graphical aspects of documents, and how to use illustrative and documentary materials written by students as student projects. The outcome of the practical part is a modern distance-learning e-book that is broader than an ordinary textbook; it accompanies the theoretical part as a CD-ROM (demo version). The theoretical part focuses on writing and preparing publications for distance education and on how this knowledge was applied in the practical part of Project "ME08". A further part describes updates to the older digital version "EDEN3", which until now has been deployed on the Internet education platform e-Amos. The last part describes the new parts of the e-book in Project "ME08".
74. Ajuste de modelos numericos de elementos finitos usando metodos de otimização / Finite element model updating using optimization methods

Araújo, Aldecir Alves de (1975-), 16 July 2007
Advisor: Jose Maria Campos dos Santos / Master's dissertation, Universidade Estadual de Campinas, Faculdade de Engenharia Mecânica / Abstract: Predictions of structural dynamic responses based on numerical or experimental techniques are subject to errors and uncertainties. Numerical models are usually idealized and do not take into account certain factors found in real structures.
Numerical model updating based on experimental measurements is essentially a mathematical adjustment of the results obtained from prediction techniques. In this work, mechanical structures modeled by the Finite Element Method are updated from experimental modal data (natural frequencies and mode shapes) using two optimization methods: the Objective Function Method (OFM) and the Subproblem Approximation Method (SAM). The OFM results were obtained with MATLAB code, while the SAM results were obtained using the ANSYS design optimization module. An example of a clamped-free beam was solved with both methods using simulated experimental data. A second example, a spatial truss, was also evaluated with both methods, but using measured experimental data. A simplified airplane model was evaluated using only SAM with measured experimental data. The results are presented, and the advantages and disadvantages of each method are pointed out and discussed. / Mestre em Engenharia Mecânica (Mecânica dos Sólidos e Projeto Mecânico)
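The core idea of model updating as an optimization problem can be sketched in a toy setting: adjust a model parameter (here a spring stiffness k) until the predicted natural frequency matches a measured one. The single degree-of-freedom model, the measured value, and the bisection search are illustrative assumptions, not the dissertation's methods or data:

```python
import math

M = 2.0                 # known mass [kg]
F_MEASURED = 5.0        # "experimental" natural frequency [Hz]

def predicted_frequency(k: float) -> float:
    # Natural frequency of a single degree-of-freedom spring-mass system.
    return math.sqrt(k / M) / (2.0 * math.pi)

def objective(k: float) -> float:
    # Squared residual between model prediction and measurement; this is
    # the quantity an objective-function method would minimize.
    return (predicted_frequency(k) - F_MEASURED) ** 2

def update_model(lo: float = 1.0, hi: float = 1e5, iters: int = 200) -> float:
    # Bisection works here because frequency grows monotonically with k;
    # a real tool would use gradient-based optimization over many
    # parameters and many modes simultaneously.
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if predicted_frequency(mid) < F_MEASURED:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

k_updated = update_model()
# Closed-form check: k = M * (2*pi*F)^2
```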
75. Sumarização Automática de Atualização para a língua portuguesa / Update Summarization for the Portuguese language

Fernando Antônio Asevêdo Nóbrega, 12 December 2017
The huge amount of textual data available on the web is the main motivation for many Natural Language Processing tasks, such as Update Summarization (US), which aims to produce a summary from a collection of related texts under the assumption that the reader has some previous knowledge of their subject. A good update summary must therefore contain the most relevant, new, and updated content with respect to the reader's previous knowledge. The task poses many research challenges, mainly in the content selection and synthesis stages. Although there are several approaches to US, with different levels of theoretical and computational complexity, most do not use deep linguistic knowledge that may help identify the most relevant and most recent content. Furthermore, US methods frequently apply extractive synthesis, in which the summary is produced by selecting and ordering sentences from the source texts without rewriting them; since segments of the selected sentences may contain redundant or irrelevant content, this can limit the summary's informativeness. Recent efforts have therefore turned to compressive synthesis, in which segments of the selected sentences are removed before insertion into the summary.
Against this background, this PhD research investigated the use of linguistic knowledge, namely Cross-document Structure Theory (CST), subtopic segmentation, and named entity recognition, in distinct content selection approaches using both extractive and compressive synthesis, in order to produce more informative update summaries. With Portuguese as the main object of study, three new corpora were compiled: CSTNews-Update, which enables US experiments for Portuguese, and PCSC-Pares and G1-Pares, which contain pairs of original and compressed sentences for developing and evaluating sentence compression methods. The summarization experiments were also carried out for English. The experiments showed that subtopic segmentation was the most effective for producing informative summaries, although only in a few content selection approaches. In addition, simplifications of the DualSum method based on the distribution of subtopics were proposed; these methods achieved very satisfactory results at lower computational cost. For compressive summaries, several sentence compression methods based on machine learning were developed; the best of them outperformed a state-of-the-art approach based on deep learning algorithms. Before this work, most research on automatic summarization for Portuguese addressed extractive single-document or multi-document summarization, largely because of the lack of resources for the language. The contributions of this work thus span three areas: the proposed linguistically informed US methods, the sentence compression methods, and the resources developed for Portuguese.
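The content selection problem at the heart of update summarization can be illustrated with a deliberately simple extractive scheme: rank candidate sentences by relevance to the new texts while penalizing overlap with the reader's previous knowledge. The bag-of-words scoring below is a toy assumption; the thesis's systems rely on much richer linguistic information such as CST relations and subtopics:

```python
from collections import Counter
import math

def vec(text):
    # Bag-of-words vector for a text.
    return Counter(text.lower().split())

def cosine(a, b):
    common = set(a) & set(b)
    num = sum(a[w] * b[w] for w in common)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def update_summary(sentences, prior_knowledge, k=1, novelty_weight=0.5):
    # Score = relevance to the new collection minus a penalty for
    # similarity to what the reader already knows.
    doc = vec(" ".join(sentences))
    prior = vec(" ".join(prior_knowledge))
    def score(s):
        v = vec(s)
        return cosine(v, doc) - novelty_weight * cosine(v, prior)
    return sorted(sentences, key=score, reverse=True)[:k]

prior = ["the storm hit the coast on monday causing floods"]
new_texts = [
    "the storm hit the coast on monday causing floods",
    "rescue teams evacuated two thousand residents on tuesday",
]
# The first sentence is old news, so the second one is selected.
summary = update_summary(new_texts, prior)
```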
76. Evaluation and Implementation for Pushing Automatic Updates to IoT Devices

Min, Menglei, January 2017
In recent years, the Internet of Things (IoT) has developed rapidly and has penetrated both everyday life and industrial production. IoT devices are expected to become ubiquitous, which brings a series of problems. First, the large number of devices means that operating system and software updates consume a great deal of manpower and resources. Second, the IoT faces security issues: attacks on IoT devices, and the tools for carrying them out, have increased substantially in recent years. Achieving secure automatic updates for the Internet of Things is therefore essential. This report is built around such an automatic update system. It first elaborates the motivation for the problem, then analyzes three existing related works and three secure communication methods. Combining the results of this analysis, it proposes a secure automatic update solution: a manager and the devices connect and mutually authenticate in real time, while the manager regularly checks a database for new application versions. When the administrator uploads a new version, the manager downloads it and sends it to all devices; each device installs the update and finally restarts itself. The report then describes the implementation of this system in detail, evaluates it, and closes with a summary and future work.
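The update flow just described, with a manager that polls a version database and pushes new releases to registered devices, can be sketched as follows. The class names, the in-memory database, and the record layout are assumptions for illustration; the thesis's system additionally performs mutual authentication and transfers real binaries:

```python
class Device:
    def __init__(self, name, version="1.0"):
        self.name, self.version, self.restarts = name, version, 0

    def install(self, version, payload):
        # A real device would verify a signature on `payload` first.
        self.version = version
        self.restarts += 1          # restart to run the new version

class Manager:
    def __init__(self, version_db):
        self.version_db = version_db  # e.g. {"app": ("1.1", b"binary")}
        self.devices = []
        self.deployed = {}

    def register(self, device):
        self.devices.append(device)

    def poll_and_push(self, app):
        # Periodic check: push only when the database holds a version
        # that has not been deployed yet.
        version, payload = self.version_db[app]
        if self.deployed.get(app) != version:
            for d in self.devices:
                if d.version != version:
                    d.install(version, payload)
            self.deployed[app] = version

db = {"app": ("1.1", b"new-binary")}
mgr = Manager(db)
dev = Device("sensor-01")
mgr.register(dev)
mgr.poll_and_push("app")   # deploys 1.1; the device installs and restarts
mgr.poll_and_push("app")   # no change in the database: nothing happens
```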
77. Att engagera användare på Facebook : Uppdateringsrutiner och rekommendationer för Philips Sonicare / To engage Facebook users : Update routines and recommendations for Philips Sonicare

Garnås, Amelie, January 2015
This report describes a project whose purpose was to create update routines for Philips Sonicare's Facebook page in order to increase follower engagement. Two focus groups were held; with their input and previous research, update routines were created for the page and followed for three weeks. After this period, the results were compiled to see which routines had worked and which needed further adjustment. The results showed that posts with entertaining content received more engagement from followers, while posts with informational content reached a larger audience. Pictures increased engagement, and the best times to publish were midday or early evening.
78. Wrapping XML-Sources to Support Update Awareness

Thuresson, Marcus, January 2000
Data warehousing is a generally accepted method of providing corporate decision support. Today, the majority of the information in these warehouses originates from sources within a company, although changes often come from the outside. Companies need to look outside their enterprises for valuable information, increasing their knowledge of customers, suppliers, competitors, and so on. The largest and most frequently accessed information source today is the Web, which holds more and more useful business information. The Web currently relies primarily on HTML, making mechanical extraction of information a difficult task. In the near future, XML is expected to replace HTML as the language of the Web, bringing more structure and a stronger focus on content. One problem when considering XML sources in a data warehouse context is their lack of update awareness capabilities, which restricts the eligible data warehouse maintenance policies. In this work, we wrap XML sources in order to provide update awareness capabilities. We have implemented a wrapper prototype that provides update awareness for autonomous XML sources, in particular change awareness, change activeness, and delta awareness. The prototype complies with recommendations and working drafts proposed by the W3C, and is therefore compatible with most off-the-shelf XML tools. In particular, the change information produced by the wrapper is based on methods defined by the DOM, so any DOM-compliant software, including most off-the-shelf XML processing tools, can be used to incorporate the identified changes in a source into an older version of it. For the delta awareness capability, we have investigated the possibility of using change detection algorithms proposed for semi-structured data. We have identified similarities and differences between XML and semi-structured data that affect delta awareness for XML sources. As a result of this effort, we propose an algorithm for change detection in XML sources.
We also propose matching criteria for XML documents, to which documents must conform in order to be covered by the change awareness extension.
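A naive form of the delta awareness discussed above can be sketched by comparing two versions of a document and reporting inserted, deleted, and text-modified elements. Elements are matched by positional path here, which is a toy assumption; the thesis proposes proper matching criteria and DOM-based change representations that this diff does not implement:

```python
import xml.etree.ElementTree as ET

def flatten(elem, path="", out=None):
    # Map each element to a positional path like /doc/1/body together
    # with its text content.
    if out is None:
        out = {}
    out[path + "/" + elem.tag] = (elem.text or "").strip()
    for i, child in enumerate(elem):
        flatten(child, f"{path}/{elem.tag}/{i}", out)
    return out

def xml_delta(old_xml, new_xml):
    # Set-based diff over the flattened element maps.
    old = flatten(ET.fromstring(old_xml))
    new = flatten(ET.fromstring(new_xml))
    return {
        "inserted": sorted(set(new) - set(old)),
        "deleted": sorted(set(old) - set(new)),
        "modified": sorted(p for p in set(old) & set(new)
                           if old[p] != new[p]),
    }

v1 = "<doc><title>Report</title><body>draft</body></doc>"
v2 = "<doc><title>Report</title><body>final</body><note>ok</note></doc>"
delta = xml_delta(v1, v2)
# body changed, note was added, nothing was deleted
```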
79. Using a Rule-System as Mediator for Heterogeneous Databases, exemplified in a Bioinformatics Use Case

Schroiff, Anna, January 2005
Databases are nowadays used in all kinds of application areas and often differ greatly in a number of properties. These differences add complexity to the handling of databases, especially when two or more databases are dependent on one another. The approach described here for propagating updates in a scenario with heterogeneous, dependent databases is the use of a rule-based mediator. The system EruS (ECA rules updating SCOP) applies active database technologies in a bioinformatics scenario: reactive, rule-based behavior is used for databases holding protein structures. The inherent heterogeneities of the Structural Classification of Proteins (SCOP) database and the Protein Data Bank (PDB) cause inconsistencies in the SCOP data derived from PDB, which complicates research on protein structures. EruS solves this problem by establishing rule-based interaction between the two databases. The system is built on the rule engine ruleCore, using Event-Condition-Action rules to process PDB updates, and is complemented with wrappers that access the databases to generate the events and execute the actions. The resulting system processes deletions and modifications of existing PDB entries and updates the SCOP flat files with the relevant information. This is the first step in the development of EruS, which is to be extended in future work. The project improves bioinformatics research by providing SCOP users with easy access to up-to-date information from PDB. The system can also be considered a model for rule-based mediators in other application areas.
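The Event-Condition-Action pattern that EruS applies can be sketched in a few lines: an update event from one database triggers rules whose actions keep a dependent store consistent. The rule contents, entry identifiers, and record layout below are invented for illustration, and the dict stands in for the real ruleCore engine and SCOP flat files:

```python
class Rule:
    def __init__(self, event, condition, action):
        self.event, self.condition, self.action = event, condition, action

class RuleEngine:
    def __init__(self):
        self.rules = []

    def register(self, rule):
        self.rules.append(rule)

    def notify(self, event, payload):
        # On each event, fire every matching rule whose condition holds.
        for r in self.rules:
            if r.event == event and r.condition(payload):
                r.action(payload)

scop = {"1abc": {"class": "alpha"}}   # dependent, derived data
engine = RuleEngine()

# ECA rule: when PDB deletes an entry (Event) that SCOP still holds
# (Condition), remove it from SCOP as well (Action).
engine.register(Rule(
    "pdb_delete",
    condition=lambda p: p["id"] in scop,
    action=lambda p: scop.pop(p["id"]),
))

engine.notify("pdb_delete", {"id": "1abc"})  # propagated: entry removed
engine.notify("pdb_delete", {"id": "zzzz"})  # unknown entry: no rule fires
```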
80. Architecture et processus de développement permettant la mise à jour dynamique de systèmes embarqués automobiles / Architecture and Development Process for Dynamic Updates within Automotive Embedded Systems

Martorell, Hélène, 9 December 2014
The current standard for embedded ECUs (Electronic Control Units) in the automotive industry is AUTOSAR. One of the major drawbacks of this architecture is its lack of flexibility, yet updates and customization of embedded systems are increasingly demanded and necessary: systems are growing more complex and therefore require additional means to ease their maintenance and evolution. Starting from these observations, this work studies how partial updates can be supported within the AUTOSAR standard, considering the necessary modifications at the architectural level, within the development process, and from a real-time point of view. This thesis describes the AUTOSAR software development process together with a number of improvements designed in this work. It defines concepts that introduce placeholders for future updates within the embedded ECU, and these spaces are then used to integrate partial updates into the ECU. The development process is modified accordingly to integrate these concepts along with the mechanisms required for updating. Real-time issues raised by partial updates in automotive embedded systems are also addressed: an appropriate task model is set up within the AUTOSAR framework, and sensitivity analysis is used specifically to determine the flexibility available in a given system. Implementation aspects are detailed as well, in particular the creation of updates in a given context, the management of the possible versions of an application, the use of and writing to embedded memory, and the means required to account for dependability. Finally, all the concepts developed in this work are applied to a proof of concept based on an embedded application provided by Renault, showing how the proposed approach can be used in practice.
