151

Scalable Predictive Maintenance through the Eclipse Arrowhead Framework

Johansson, Anton January 2022 (has links)
With the rise of Industry 4.0, the fourth industrial revolution driven by the Internet of Things, infrastructures that connect and monitor many different systems within an industrial setting have become more prevalent. With many candidates for this IoT infrastructure, there is a need to evaluate them and determine their respective strengths and weaknesses. This thesis investigates the use of the Eclipse Arrowhead framework as a scalable infrastructure for predictive maintenance. The investigation is conducted by converting an existing predictive maintenance implementation that uses Amazon Web Services as the IoT infrastructure into one that uses the Eclipse Arrowhead framework instead. The design science artifact resulting from this thesis shows that the Eclipse Arrowhead framework is suitable for a scalable infrastructure, though some shortcomings of the framework were found during the implementation. Even so, adoption of the framework should depend on the specific needs of the infrastructure; it should not be treated as a “one size fits all” solution.
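The conversion described above hinges on the service-oriented core of frameworks like Arrowhead, in which producers register services and consumers look them up at runtime. The sketch below illustrates only the generic service-registry pattern; the class and method names are invented for illustration and are not the actual Eclipse Arrowhead API.

```python
# Minimal sketch of the service-registry pattern underlying SOA frameworks
# such as Eclipse Arrowhead. All names here are illustrative, not the real API.

class ServiceRegistry:
    def __init__(self):
        self._services = {}  # service definition -> list of provider endpoints

    def register(self, definition, endpoint):
        """A provider (e.g. a vibration-sensor system) announces its service."""
        self._services.setdefault(definition, []).append(endpoint)

    def lookup(self, definition):
        """A consumer (e.g. the predictive-maintenance analytics system)
        discovers providers at runtime instead of hard-coding endpoints."""
        return self._services.get(definition, [])


registry = ServiceRegistry()
registry.register("sensor-readings", "http://machine-a.local:8080/readings")
registry.register("sensor-readings", "http://machine-b.local:8080/readings")

providers = registry.lookup("sensor-readings")
print(len(providers))  # 2
```

Scalability in this pattern comes from late binding: adding a machine requires only one more `register` call, with no change to any consumer.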
152

Towards Digitization and Machine learning Automation for Cyber-Physical System of Systems

Javed, Saleha January 2022 (has links)
Cyber-physical systems (CPS) connect the physical and digital domains and are often realized as spatially distributed. CPS are built on the Internet of Things (IoT) and the Internet of Services, which use cloud architecture to link a swarm of devices over a decentralized network. Modern CPSs are undergoing a foundational shift as Industry 4.0 continually expands its boundaries of digitization. From automating the industrial manufacturing process to interconnecting sensor devices within buildings, Industry 4.0 is about developing solutions for the digitized industry. Extensive engineering effort goes into designing dynamically scalable and robust automation solutions that have the capacity to integrate heterogeneous CPS. Such heterogeneous systems must be able to communicate and exchange information with each other in real time even if they are based on different underlying technologies, protocols, or semantic definitions in the form of ontologies. This development is subject to interoperability challenges and knowledge gaps that are addressed by engineers and researchers; in particular, machine learning approaches are considered to automate costly engineering processes. For example, challenges related to predictive maintenance operations and the automatic translation of messages transmitted between heterogeneous devices are investigated using supervised and unsupervised machine learning approaches. In this thesis, a machine learning-based, collaboration- and automation-oriented IIoT framework named Cloud-based Collaborative Learning (CCL) is developed. CCL is based on a service-oriented architecture (SOA) offering a scalable CPS framework that provides Machine Learning as a Service (MLaaS). Furthermore, interoperability in the context of the IIoT is investigated. I consider the ontology of an IoT device to be its language, and the structure of that ontology to be its grammar.
In particular, the use of aggregated language and structural encoders is investigated to improve the alignment of entities in heterogeneous ontologies. Existing entity alignment techniques integrate structural information in ways that overlook the fact that even if a node pair has similar entity labels, the nodes may not belong to the same ontological context, and vice versa. To address these challenges, a model based on a modification of the BERT_INT model on graph triples is developed. The developed model is an iterative model for the alignment of heterogeneous IIoT ontologies, enabling alignment of nodes as well as relations. Compared to the state-of-the-art BERT_INT on the DBPK15 language dataset, the developed model exceeds the baseline by 2.1% in (HR@1/10, MRR). This motivated the development of a proof of concept for an empirical investigation of the developed model on alignment between heterogeneous IIoT ontologies. For this purpose, a dataset was generated from smart building systems and from the SOSA and SSN ontology graphs. Experiments and analysis, including an ablation study on the proposed language and structural encoders, demonstrate the effectiveness of the model. The suggested approach also highlights prospective future studies that extend beyond the scope of a single thesis. For instance, to strengthen the ablation study, a generalized IIoT ontology designed for any type of IoT device (beyond sensors), such as SAREF, could be tested for ontology alignment. Another potential line of future work is to conduct a crowdsourcing process to generate a validation dataset for IIoT ontology alignment and annotation.
Lastly, this work can be considered a step towards enabling translation between heterogeneous IoT sensor devices; the proposed model can therefore be extended to a translation module in which, based on the ontology graph of any device, the model interprets the messages transmitted from that device. This idea is at an abstract level as of now and needs extensive effort and empirical study to reach full maturity.
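The aligner described above combines a language encoder (entity labels) with a structural encoder (graph context) precisely because label similarity alone is misleading. A toy sketch of that combination, with hand-made two-dimensional embeddings standing in for the BERT_INT-style encoders of the thesis:

```python
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def alignment_score(label_a, label_b, ctx_a, ctx_b, w=0.5):
    """Blend label similarity with neighbourhood (structural) similarity.
    Two entities with identical labels but different graph contexts score low."""
    return w * cosine(label_a, label_b) + (1 - w) * cosine(ctx_a, ctx_b)

# "temperature" in a weather ontology vs "temperature" in a CPU ontology:
# identical labels, different structural context.
label = [1.0, 0.0]
weather_ctx, cpu_ctx = [1.0, 0.0], [0.0, 1.0]

matching = alignment_score(label, label, weather_ctx, weather_ctx)
clashing = alignment_score(label, label, weather_ctx, cpu_ctx)
print(matching, clashing)  # 1.0 0.5 — structure breaks the tie between equal labels
```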
153

Revisiter les paramètres physiques de la naine brune LHS 6343 C grâce à des observations d’éclipses secondaires HST/WFC3 / Revisiting the physical parameters of the brown dwarf LHS 6343 C through HST/WFC3 secondary eclipse observations

Frost, William 03 1900 (has links)
Brown dwarfs are defined as substellar objects that are generally more massive than giant planets but less massive than the smallest stars. Unable to fuse hydrogen into helium like main-sequence stars due to their low mass, brown dwarfs have no access to a long-term energy source. They therefore radiate only their initial heat of formation and cool continuously over time. This perpetual decrease in luminosity introduces a degeneracy between their physical properties, making it impossible to distinguish a young massive brown dwarf from an older, less massive one based on luminosity and spectra alone. Atmospheric and evolutionary modelling therefore becomes necessary to obtain other properties (e.g. mass, radius, age, effective temperature) of field brown dwarfs, since only their luminosity can be measured directly. Fortunately, the luminosities and spectra of the best models reproduce observations well. However, the physical parameters they predict (i.e. mass, radius, effective temperature, metallicity) lack an empirical calibration; there are not enough independent measurements of these parameters to meaningfully confirm the predictive power of the models.
One scenario allowing the direct measurement of several physical characteristics is provided by brown dwarf eclipsing binaries (BDEBs), i.e. a brown dwarf orbiting and eclipsing a star. With radial velocity, transit, and secondary eclipse analyses, all parameters of a BDEB but its age can be determined independently of models. This thesis pertains to the study of a minimally irradiated BDEB, LHS 6343 C, discovered in 2011 via transit photometry by the Kepler telescope. Since its discovery, additional transit (Kepler) observations, together with radial velocity (Keck/HIRES) and secondary eclipse (Kepler, HST, Spitzer) observations, allow everything but an age measurement to be obtained. This thesis provides a first analysis of the HST secondary eclipse data to obtain an emission spectrum of the brown dwarf in the WFC3-G141 bandpass (1.1 to 1.7 µm), identifying it as a T1.5 dwarf. In addition, this thesis updates the physical parameters of previous studies using a Gaia DR3 distance and empirical stellar relations. This new set of parameters is then compared to those predicted by atmospheric models, where models in chemical nonequilibrium reproduce the observed flux better than chemical-equilibrium or cloud models. Finally, evolutionary models are used to determine the age of the brown dwarf.
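A secondary eclipse measures the companion's own emission: when the brown dwarf passes behind the star, the system flux drops by the dwarf-to-star flux ratio in that bandpass. A minimal sketch of that relation, with illustrative numbers rather than the thesis's measurements:

```python
def eclipse_depth(f_star, f_companion):
    """Fractional drop in system flux when the companion is hidden
    behind the star: depth = F_c / (F_star + F_c)."""
    return f_companion / (f_star + f_companion)

def companion_flux_ratio(depth):
    """Invert the relation: recover F_c / F_star from a measured depth."""
    return depth / (1.0 - depth)

# A companion 1000x fainter than its host produces a ~0.1% secondary eclipse.
d = eclipse_depth(f_star=1.0, f_companion=0.001)
print(round(d, 6))                          # ~0.000999
print(round(companion_flux_ratio(d), 6))    # recovers the input ratio 0.001
```

Measuring this depth in several bandpasses (here, across the WFC3-G141 grism) is what turns eclipse photometry into an emission spectrum.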
154

Changes within localization practices : A case study of the Fatal Frame series

Norén, Daniella January 2024 (has links)
Japanese video games have existed for decades; the early games, however, often lacked cultural elements and were thus easy to localize for other countries. As video games gained popularity and became more advanced, more Japanese cultural elements started appearing in them, which gave rise to translation problems. Localizers had to decide whether to preserve or delete foreign cultural elements when these games were released in the West. As the game localization of (Japanese) cultural elements is under-researched, this paper focuses solely on cultural terms. It examines whether translation techniques changed, and whether there was a shift between foreignization and domestication, from the first game to the latest release in the Fatal Frame series. The games were played in both the original Japanese and the English localization to gather data consisting of the cultural terms that appear in-game. Their translations were then analyzed to determine whether they were foreignized or domesticated, followed by a comparison to see whether there was a shift across the 22-year gap between the first and latest release. The results indicate that both games used the adaptation and equivalence (paraphrasing) techniques to a similar extent. The latest release favored the omission technique compared to the first, while the first game favored equivalence in conjunction with the borrowing technique. However, against expectation, the first game is considered more foreignized than the latest release, which goes against the trend seen in other media, where translations have become more foreignized in modern times. This was also the result of a previous study in this area.
155

Controle eletrônico das etapas do processo de planejamento de pacientes em radioterapia / Electronic control of the steps of the planning process of patients in radiotherapy

Brito, Carlos de 20 October 2017 (has links)
In Brazilian radiotherapy departments there is a pressing need for the systematization of processes that, even today, with access to information technology, are performed manually. This is the case for the control of patients in radiotherapy planning, where the data for every step of the process, from the scheduling of the planning CT scan to the release of the patient for treatment, are written on a physical board as the steps are executed. The objective of this study is the development of software for the electronic control of patients in the radiotherapy planning process, replacing the physical board currently used. A requirements elicitation technique was used for this study, resulting in a requirements document that serves as the basis for modeling the system through UML diagrams and conceptual modeling of a MySQL database, followed by the use of object-relational mapping (ORM). Consistent data allows secure and reliable access, facilitates the collection of statistics, and streamlines the planning process, since consistent statistics make it possible to identify bottlenecks in the process and resolve them faster. The result of this study is a software application and database that integrate all the information of the radiotherapy planning process.
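The electronic board described above is, at its core, a per-patient checklist of planning steps with timestamps, which is what makes bottleneck statistics possible. A minimal sketch of such a data model; the step names and structure are invented for illustration, not taken from the thesis:

```python
from datetime import datetime, timedelta

# Hypothetical planning steps; the actual workflow in the thesis may differ.
STEPS = ["ct_scheduled", "ct_done", "contouring", "plan_approved", "released"]

class PatientPlan:
    def __init__(self, patient_id):
        self.patient_id = patient_id
        self.completed = {}  # step name -> completion timestamp

    def complete(self, step, when):
        if step not in STEPS:
            raise ValueError(f"unknown step: {step}")
        self.completed[step] = when

    def step_durations(self):
        """Time elapsed between consecutive completed steps — the raw
        material for spotting bottlenecks across many patients."""
        done = [s for s in STEPS if s in self.completed]
        return {
            f"{a}->{b}": self.completed[b] - self.completed[a]
            for a, b in zip(done, done[1:])
        }

plan = PatientPlan("p001")
t0 = datetime(2017, 10, 1, 8, 0)
plan.complete("ct_scheduled", t0)
plan.complete("ct_done", t0 + timedelta(days=2))
print(plan.step_durations())  # one 2-day interval between the two steps
```

Aggregating `step_durations()` over all patients yields exactly the kind of statistics the study uses to locate and resolve process bottlenecks.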
157

Análise do campo elétrico atmosférico durante tempo bom e distúrbios geofísicos / Analysis of the atmospheric electric field during fair weather and geophysical disturbances

Anaya, José Carlos Tacza 19 January 2015 (has links)
In this dissertation, we present the capability of a new network of sensors to monitor the atmospheric electric field at various locations in South America. The main goal is to obtain the characteristic Universal Time daily curve of the atmospheric electric field in fair weather, known as the Carnegie curve, which is related to the currents flowing in the global atmospheric electric circuit. This was accomplished using monthly, seasonal, and annual averages. After obtaining our standard curve of the fair-weather electric field variation, deviations related to phenomena such as solar flares, solar proton events, geomagnetic storms, a total solar eclipse, and seismic activity are analyzed and discussed.
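The Carnegie-style standard curve described above is built by averaging many days of fair-weather measurements into a single diurnal curve in Universal Time. A minimal sketch of that hourly-mean reduction, using synthetic values rather than the network's data:

```python
from collections import defaultdict

def diurnal_curve(measurements):
    """Average electric-field samples by UT hour.
    measurements: iterable of (ut_hour, field_value) pairs,
    typically spanning many fair-weather days."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for hour, value in measurements:
        sums[hour] += value
        counts[hour] += 1
    return {h: sums[h] / counts[h] for h in sorted(sums)}

# Two synthetic "days" of 6-hourly samples (V/m):
day1 = [(0, 110.0), (6, 120.0), (12, 140.0), (18, 160.0)]
day2 = [(0, 130.0), (6, 100.0), (12, 160.0), (18, 200.0)]
curve = diurnal_curve(day1 + day2)
print(curve)  # {0: 120.0, 6: 110.0, 12: 150.0, 18: 180.0}
```

Deviations from the resulting standard curve are then what flags disturbed periods (flares, storms, eclipses) for closer analysis.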
158

用於圖形化編輯器開發之專屬模型語言設計 / A Modeling Language of Domain Specific Editors Based on Graphical Modeling Framework

呂宗龍, Lv,Zong-Long Unknown Date (has links)
The availability of a visual graphical editor for a target domain is the prerequisite of visual graphical modeling, which has been adopted by classical software development for decades and is especially emphasized in today's model-driven engineering. However, compared with traditional textual editors, developing a visual graphical editor from scratch is not easy. As a result, frameworks were developed to simplify the construction of graphical editors: Eclipse.org provides both GEF, a graphical editing framework, and GMF, which combines EMF with GEF and simplifies the design flow. Even so, it is still hard for an average programmer to construct a visual graphical editor using these frameworks without a long period of learning. To lower this barrier, this work explores simplifying the GMF development flow from a model-driven (MDA) perspective. Our result is a modeling language for graphical editors called GM3, serving as a bridge between developers of graphical editors and the GMF framework. With GM3, the developer can specify the model of the editing domain, how each model element should be presented by which kind of graphical element in the editor, and how the tool palette should be filled with various kinds of creation and manipulation tools for model elements. After the GM3 specification of an editor is produced, the GM3 transformation engine, developed by us using the JavaCC parser generator, can generate all the files required by the GMF framework; finally, a subsequent application of the standard GMF code-generation procedure produces a complete graphical editor on the Eclipse platform.
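GM3's job is a model-to-model transformation: one compact editor specification fans out into the several artifacts that GMF expects (domain model, graphical definition, tooling definition). A toy sketch of that fan-out; the spec format and output shapes are invented for illustration and are not the actual GM3 or GMF file formats:

```python
# Toy model-to-model transformation in the spirit of GM3 -> GMF:
# one editor spec expands into the separate definitions a GMF-like
# framework expects. All formats here are invented for illustration.

spec = {
    "domain": "StateMachine",
    "elements": [
        {"name": "State", "shape": "rectangle"},
        {"name": "Transition", "shape": "arrow"},
    ],
}

def to_domain_model(spec):
    """The abstract syntax: which element types exist."""
    return [e["name"] for e in spec["elements"]]

def to_graphical_definition(spec):
    """The concrete syntax: how each element is drawn."""
    return {e["name"]: e["shape"] for e in spec["elements"]}

def to_tooling_definition(spec):
    """The palette: one creation tool per element."""
    return [f"Create{e['name']}Tool" for e in spec["elements"]]

print(to_domain_model(spec))          # ['State', 'Transition']
print(to_graphical_definition(spec))  # {'State': 'rectangle', 'Transition': 'arrow'}
print(to_tooling_definition(spec))    # ['CreateStateTool', 'CreateTransitionTool']
```

Keeping the three outputs consistent automatically — instead of hand-editing each GMF definition — is the productivity gain such a DSL aims for.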
159

Identifying exoplanets and unmasking false positives with NGTS

Günther, Maximilian Norbert January 2018 (has links)
In my PhD, I advanced the scientific exploration of the Next Generation Transit Survey (NGTS), a ground-based wide-field survey operating at ESO’s Paranal Observatory in Chile since 2016. My original contribution to knowledge is the development of novel methods to 1) estimate NGTS’ yield of planets and false positives; 2) disentangle planets from false positives; and 3) accurately characterise planets. If an exoplanet passes (transits) in front of its host star, we can measure a periodic decrease in brightness. The study of transiting exoplanets gives insight into their size, formation, bulk composition, and atmospheric properties. Transit surveys are limited by their ability to identify false positives, which can mimic planets and outnumber them a hundredfold. First, I designed a novel yield simulator to optimise NGTS’ observing strategy and identification of false positives (published in Günther et al., 2017a). This showed that NGTS’ prime targets, Neptune- and Earth-sized signals, are frequently mimicked by blended eclipsing binaries, allowing me to quantify these contaminants and prepare strategies for candidate vetting and follow-up. Second, I developed a centroiding algorithm for NGTS, achieving a precision of 0.25 milli-pixel in a CCD image (published in Günther et al., 2017b). With this, one can measure a shift of light during an eclipse, readily identifying unresolved blended objects. Third, I developed a joint Bayesian fitting framework for photometry, centroids, and radial velocity cross-correlation function profiles. This makes it possible to disentangle which object (target or blend) is causing the signal, and to characterise the system. My method has already unmasked numerous false positives. Most importantly, I confirmed that a signal which was almost erroneously rejected is in fact an exoplanet (published in Günther et al., 2018).
The presented achievements reduce the contamination of NGTS candidates by blended false positives by 80%, and demonstrate a new approach for unmasking hidden exoplanets. This research enhanced the success of NGTS, and can provide guidance for future missions.
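The centroiding method above detects blends because a background eclipsing binary shifts the photometric centre of light during its eclipse, even when the two sources are unresolved. A minimal flux-weighted centroid on a toy pixel stamp — not the NGTS pipeline, which reaches 0.25 milli-pixel with far more careful modelling:

```python
def flux_weighted_centroid(image):
    """Centre of light of a 2-D pixel stamp: sum(flux * position) / sum(flux)."""
    total = sum(sum(row) for row in image)
    cy = sum(y * sum(row) for y, row in enumerate(image)) / total
    cx = sum(x * v for row in image for x, v in enumerate(row)) / total
    return cx, cy

# Target star centred on a 3x3 stamp, plus a faint blended neighbour
# contributing flux to the right-hand pixel:
out_of_eclipse = [
    [0, 1, 0],
    [1, 8, 3],
    [0, 1, 0],
]
# During the neighbour's eclipse its flux drops, so the centre of light
# shifts back toward the target; comparing the two epochs reveals the blend.
in_eclipse = [
    [0, 1, 0],
    [1, 8, 1],
    [0, 1, 0],
]
cx_out, _ = flux_weighted_centroid(out_of_eclipse)
cx_in, _ = flux_weighted_centroid(in_eclipse)
print(round(cx_out, 3), cx_in)  # centroid sits further right outside eclipse
```

A planet transiting the target itself produces no such centroid shift, which is what separates genuine candidates from blended false positives.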
160

Ondes de gravité atmosphériques observées par un réseau mondial de microbaromètres / Atmospheric gravity waves observed by a global network of microbarometers

Marty, Julien 12 October 2010 (has links) (PDF)
The Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) currently operates a global network of forty-two operational stations that record the atmospheric pressure fluctuations produced at ground level by infrasound waves. This thesis demonstrates that most of these stations also detect, with high precision, pressure fluctuations across the entire gravity-wave band. The work includes the design and analysis of several laboratory metrology experiments and of a temporary measurement campaign (M2008) deployed in Mongolia in 2008. The planning and subsequent interpretation of the data from these experiments led to the development of a new linear spectral numerical model for simulating the generation and propagation of gravity waves. This model is used in particular to quantify the gravity waves induced by the atmospheric cooling produced during a solar eclipse. The ground-level pressure fluctuations expected during the eclipse of 1 August 2008 are estimated and compared with the data recorded by the CTBTO infrasound stations and the M2008 network. A detailed analysis of these data revealed two waves with time-frequency characteristics similar to the waves simulated for stratospheric and tropospheric cooling, which is, to our knowledge, an original result. The validation of the pressure records across the entire gravity-wave band made it possible to carry out multi-year, global-scale statistical studies of gravity-wave spectra and atmospheric tides. This thesis work has led to the publication of two articles, with a third in preparation.
