151

Towards Digitization and Machine learning Automation for Cyber-Physical System of Systems

Javed, Saleha January 2022
Cyber-physical systems (CPS) connect the physical and digital domains and are often realized as spatially distributed systems. They build on the Internet of Things (IoT) and the Internet of Services, which use cloud architectures to link swarms of devices over decentralized networks. Modern CPSs are undergoing a foundational shift as Industry 4.0 continually expands the boundaries of digitization. From automating industrial manufacturing processes to interconnecting sensor devices within buildings, Industry 4.0 is about developing solutions for the digitized industry. Extensive engineering effort goes into designing dynamically scalable and robust automation solutions capable of integrating heterogeneous CPSs. Such heterogeneous systems must be able to communicate and exchange information with each other in real time even when they are based on different underlying technologies, protocols, or semantic definitions in the form of ontologies. This development faces interoperability challenges and knowledge gaps that engineers and researchers are addressing; in particular, machine learning approaches are being considered to automate costly engineering processes. For example, challenges related to predictive-maintenance operations and to the automatic translation of messages transmitted between heterogeneous devices are investigated using supervised and unsupervised machine learning. In this thesis, a machine-learning-based, collaboration- and automation-oriented IIoT framework named Cloud-based Collaborative Learning (CCL) is developed. CCL is based on a service-oriented architecture (SOA), offering a scalable CPS framework that provides Machine Learning as a Service (MLaaS). Furthermore, interoperability in the context of the IIoT is investigated. I consider the ontology of an IoT device to be its language, and the structure of that ontology to be its grammar. In particular, the use of aggregated language and structural encoders is investigated to improve the alignment of entities in heterogeneous ontologies. Existing entity-alignment techniques integrate structural information in ways that overlook the fact that even if a node pair has similar entity labels, the nodes may not belong to the same ontological context, and vice versa. To address these challenges, a model based on a modification of the BERT_INT model for graph triples is developed. The developed model iteratively aligns heterogeneous IIoT ontologies, enabling alignment of nodes as well as relations. Compared to the state-of-the-art BERT_INT on the DBP15K language dataset, the developed model exceeds the baseline by 2.1% in (HR@1/10, MRR). This motivated the development of a proof of concept for an empirical investigation of the model on alignment between heterogeneous IIoT ontologies; for this purpose, a dataset was generated from smart-building systems and from the SOSA and SSN ontology graphs. Experiments and analysis, including an ablation study on the proposed language and structural encoders, demonstrate the effectiveness of the model. The approach also highlights prospective future work extending beyond the scope of a single thesis. For instance, to strengthen the ablation study, a generalized IIoT ontology designed for any type of IoT device (beyond sensors), such as SAREF, could be tested for ontology alignment.
Another avenue of future work is a crowdsourcing process for generating a validation dataset for IIoT ontology alignment and annotation. Lastly, this work can be seen as a step towards enabling translation between heterogeneous IoT sensor devices: the proposed model could be extended into a translation module that, based on a device's ontology graph, interprets the messages transmitted by that device. This idea is still at an abstract level and needs extensive empirical study to reach full maturity.
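The aggregation of a language encoder with a structural encoder that this abstract describes can be illustrated with a toy sketch. The snippet below is a minimal, hypothetical illustration, not the thesis's BERT_INT-based model: the random label embeddings stand in for a real BERT encoder, and the names, weighting scheme, and tiny ontology fragments are all assumptions made for illustration.

```python
# Toy sketch: aggregated language + structural encoders for ontology
# entity alignment. embed_label() is a stand-in for a real BERT encoder.
import numpy as np

rng = np.random.default_rng(0)
VOCAB: dict[str, np.ndarray] = {}

def embed_label(label: str, dim: int = 16) -> np.ndarray:
    """Toy 'language encoder': a fixed random vector per label."""
    if label not in VOCAB:
        VOCAB[label] = rng.normal(size=dim)
    return VOCAB[label]

def structural_embedding(entity: str, triples: list[tuple[str, str, str]]) -> np.ndarray:
    """Toy 'structural encoder': mean label embedding of one-hop neighbours."""
    neigh = [t for (h, r, t) in triples if h == entity] + \
            [h for (h, r, t) in triples if t == entity]
    if not neigh:
        return np.zeros(16)
    return np.mean([embed_label(n) for n in neigh], axis=0)

def alignment_score(e1, triples1, e2, triples2, alpha=0.5):
    """Aggregate label similarity and neighbourhood similarity."""
    def cos(a, b):
        n = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b / n) if n else 0.0
    return alpha * cos(embed_label(e1), embed_label(e2)) + \
           (1 - alpha) * cos(structural_embedding(e1, triples1),
                             structural_embedding(e2, triples2))

# Two tiny ontology fragments (a SOSA-like one vs. a building-sensor one).
onto_a = [("TemperatureSensor", "observes", "Temperature")]
onto_b = [("TempSensor", "measures", "Temperature")]
print(alignment_score("TemperatureSensor", onto_a, "TempSensor", onto_b))
```

Here the two sensor entities have dissimilar labels but share the neighbour `Temperature`, so the structural term raises the alignment score — the complementary effect that aggregating the two encoders is meant to exploit.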
152

Revisiter les paramètres physiques de la naine brune LHS 6343 C grâce à des observations d’éclipses secondaires HST/WFC3 / Revisiting the physical parameters of the brown dwarf LHS 6343 C with HST/WFC3 secondary eclipse observations

Frost, William 03 1900
Brown dwarfs are defined as substellar objects that are generally more massive than giant planets but less massive than the smallest stars. Unable to fuse hydrogen into helium like main-sequence stars because of their low mass, brown dwarfs lack a long-term energy source: they radiate only their initial heat of formation and cool continuously over time. This perpetual decrease in luminosity introduces a degeneracy between their physical properties, making it impossible to distinguish a young massive brown dwarf from an older, less massive one from luminosity and spectra alone. Atmospheric and evolutionary modelling therefore becomes necessary to obtain the other properties (e.g. mass, radius, age, effective temperature) of field brown dwarfs, since only their luminosity can be measured directly. Fortunately, the luminosities and spectra of the best models reproduce observations well. However, the physical parameters they predict (i.e. mass, radius, effective temperature, metallicity) lack empirical calibration: there are not enough independent measurements of these parameters to meaningfully confirm the predictive power of the models. One scenario allowing the direct measurement of several physical characteristics is provided by brown dwarf eclipsing binaries (BDEBs), i.e. a brown dwarf eclipsing a star. With radial velocity, transit, and secondary eclipse analyses, all but the age of a BDEB can be determined independently of models. This thesis studies a minimally irradiated BDEB, LHS 6343 C, discovered in 2011 via transit photometry with the Kepler telescope. Since its discovery, additional transit (Kepler), radial velocity (Keck/HIRES), and secondary eclipse (Kepler, HST, Spitzer) observations allow everything but an age measurement to be obtained. This thesis provides a first analysis of the HST secondary eclipse data, yielding a brown dwarf emission spectrum in the WFC3-G141 bandpass (1.1 to 1.7 µm) that identifies it as a T1.5 dwarf. In addition, this thesis updates the mass and radius of LHS 6343 C using a Gaia DR3 distance and empirical stellar relations. This new set of parameters is then compared to those predicted by atmospheric models, where chemical-nonequilibrium models reproduce the observed flux better than chemical-equilibrium or cloud models. Finally, evolutionary models are used to determine the age of the brown dwarf.
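For readers unfamiliar with how these three observation types pin down the parameters, the standard textbook relations are sketched below. These are generic relations, not formulas quoted from the thesis: the transit depth gives the radius ratio, the radial-velocity semi-amplitude gives the mass, and the wavelength-dependent secondary-eclipse depth gives the emission spectrum.

\[
\delta_{\mathrm{tr}} \simeq \left(\frac{R_C}{R_\star}\right)^{2}, \qquad
K_\star = \left(\frac{2\pi G}{P}\right)^{1/3} \frac{M_C \sin i}{(M_\star + M_C)^{2/3}}\,\frac{1}{\sqrt{1-e^{2}}}, \qquad
\delta_{\mathrm{occ}}(\lambda) \simeq \left(\frac{R_C}{R_\star}\right)^{2} \frac{S_C(\lambda)}{S_\star(\lambda)},
\]

where \(S\) denotes surface brightness. Age is the one quantity with no such direct relation, which is why evolutionary models must supply it.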
153

Generación automática de servicios WEB a partir de modelos conceptuales / Automatic generation of Web services from conceptual models

Ruiz Server, Marta 27 May 2010
Web services provide access to application functionality over the Internet. They offer major advantages: services can be reached from any point on the network, access to functionality is simplified, and interoperability between services and between applications is eased, allowing the functionality of different enterprise applications to be integrated. They also provide standards and mechanisms for electronic commerce and have turned the Web into a framework for developing distributed applications across all application domains. Web services are the ideal mechanism for implementing Service-Oriented Architectures (SOA) for integrated, distributed information systems. The natural evolution of software production methods, and of OO-Method/OOWS in particular, raises the need to improve the software production process. This improvement requires support for developing Web applications on SOA through a strategy of automatic application generation, one that transparently supports the different technological approaches currently found in the Web services arena. This thesis presents a method, within the field of Web Engineering, that automatically designs and implements Web services from conceptual models. The method is built on Model-Driven Development (MDD): a set of models represents the application to be developed, and Model-to-Text transformations produce the design and implementation of the Web services that realize the modelled system. The method proposed in this thesis is supported by a tool called DISWOOM. / Ruiz Server, M. (2010). Generación automática de servicios WEB a partir de modelos conceptuales [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/8346
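As a rough illustration of the Model-to-Text idea, a conceptual-model element can be turned into service skeleton code by a simple template rule. This is a hedged sketch only: the entity and rule names below are invented for illustration and are not DISWOOM's actual transformation rules.

```python
# Minimal Model-to-Text sketch in the MDD spirit described above.
# Entity/rule names are illustrative assumptions, not DISWOOM's rules.
from dataclasses import dataclass, field

@dataclass
class Entity:                         # a conceptual-model element
    name: str
    operations: list[str] = field(default_factory=list)

def to_service_stub(entity: Entity) -> str:
    """Transformation rule: one service class per entity, one method per operation."""
    lines = [f"class {entity.name}Service:"]
    for op in entity.operations:
        lines.append(f"    def {op}_{entity.name.lower()}(self, payload):")
        lines.append("        raise NotImplementedError  # body generated later")
    return "\n".join(lines)

print(to_service_stub(Entity("Invoice", ["get", "create", "cancel"])))
```

The appeal of the approach is that the model is the single source of truth: regenerating the text after a model change keeps design and implementation in step.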
154

Étude de l'atmosphère de la Jupiter chaude WASP-80 b par spectroscopie d'éclipse secondaire avec JWST/NIRISS / Study of the atmosphere of the hot Jupiter WASP-80 b via secondary eclipse spectroscopy with JWST/NIRISS

Morel, Kim 07 1900
Secondary eclipse spectroscopy in the near-infrared characterizes exoplanet atmospheres by detecting both their thermal emission and their reflected light, providing insight into their thermal structure and aerosols. Aerosols affect a planet's energy balance, which can be quantified with albedo measurements. This thesis analyzes the secondary eclipse spectrum of WASP-80 b between 0.69 and 2.83 \(\mu\)m obtained with JWST NIRISS/SOSS data, including the first measurements below 1.1 \(\mu\)m for this exoplanet. WASP-80 b is one of the rare hot Jupiters orbiting an M-dwarf star. The resulting spectrum is dominated by reflected light, suggesting the presence of aerosols on the planet's dayside, in agreement with previous studies. The observations constrain the geometric albedo from reflected light over the SOSS wavelength range to \(0.17\pm0.05\). This measurement yields an estimated Bond albedo of \(A_\mathrm{B}<0.33\), in agreement with predictions of efficient heat recirculation in the atmospheres of planets with temperatures similar to that of WASP-80 b. Comparing the reflected-light spectrum with cloud models rules out silicate clouds, while Cr or Na\(_\mathrm{2}\)S compositions could explain the observations.
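The connection between the quoted numbers can be sketched with the standard reflected-light relations (a generic sketch, not the thesis's exact derivation): the occultation depth at wavelengths where reflection dominates yields the geometric albedo \(A_g\), and a Lambertian-sphere assumption scales it to the Bond albedo.

\[
\delta_{\mathrm{refl}} = A_g \left(\frac{R_p}{a}\right)^{2}, \qquad
A_B = \tfrac{3}{2}\,A_g \ \text{(Lambertian case)}.
\]

Under that assumption, the \(1\sigma\) upper value \(A_g = 0.22\) maps to \(\tfrac{3}{2}\times 0.22 \approx 0.33\), consistent with the quoted bound \(A_\mathrm{B}<0.33\).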
155

Changes within localization practices : A case study of the Fatal Frame series

Norén, Daniella January 2024
Japanese video games have existed for decades; the early games, however, often lacked cultural elements and were thus easy to localize for other countries. As video games gained popularity and became more advanced, more Japanese cultural elements started appearing in them, which gave rise to translation problems: when these games were released in the West, localizers had to decide whether to preserve or delete foreign cultural elements. As the game localization of (Japanese) cultural elements is under-researched, this paper focuses solely on cultural terms. It examines whether translation techniques changed, and whether there was a shift between foreignization and domestication, from the first game to the latest release in the Fatal Frame series. The games were played both in their original form (Japanese) and in the English localization to gather data consisting of the cultural terms that appear in-game. Their translations were then analyzed to judge whether they were foreignized or domesticated, followed by a comparison to see whether there was a shift across the 22-year gap between the first and latest releases. The results indicate that both games used the adaptation and equivalence (paraphrasing) techniques to a similar extent. The latest release preferred the omission technique, while the first game favored equivalence in conjunction with the borrowing technique. However, against expectation, the first game is more foreignized than the latest release, which goes against the trend seen in other media, where translations have become more foreignized in modern times. This matches the finding of a previous study in this area.
156

Controle eletrônico das etapas do processo de planejamento de pacientes em radioterapia / Electronic control of the steps of the planning process of patients in radiotherapy

Brito, Carlos de 20 October 2017
Many Brazilian radiotherapy services have a pressing need to systematize processes that, even today, with information technology readily available, are still carried out manually. This is the case for tracking patients through radiotherapy planning, where the data for every step of the process are written on a physical board as the steps are executed, from scheduling the planning CT scan to releasing the patient for treatment. The objective of this study is the development of software for the electronic tracking of patients through the radiotherapy planning process, replacing the physical board currently in use. A requirements elicitation technique was used, producing a requirements document that serves as the basis for modelling the system with UML diagrams and for the conceptual modelling of a MySQL database, followed by object-relational mapping (ORM). Consistent data allow secure and reliable access, make statistics easier to obtain, and speed up the planning process, since reliable statistics make it possible to identify bottlenecks in the process and resolve them more quickly. The result of this study is a software system and database that integrate all the information of the radiotherapy planning process.
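To make the data model concrete, here is a minimal object-relational mapping sketch, assuming SQLAlchemy as the ORM layer. The table and column names are illustrative guesses, not the thesis's actual schema, and the real system targets MySQL rather than the in-memory SQLite used here.

```python
# Hypothetical ORM sketch: one row per planning step per patient.
# Assumes SQLAlchemy 1.4+; schema names are illustrative only.
from sqlalchemy import Column, Integer, String, DateTime, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker
import datetime

Base = declarative_base()

class PlanningStep(Base):
    __tablename__ = "planning_steps"
    id = Column(Integer, primary_key=True)
    patient_id = Column(Integer, nullable=False)
    step = Column(String(64), nullable=False)      # e.g. "planning CT scheduled"
    completed_at = Column(DateTime)                # NULL until the step is done

engine = create_engine("sqlite:///:memory:")       # MySQL in the real system
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

with Session() as session:
    session.add(PlanningStep(patient_id=1, step="planning CT scheduled",
                             completed_at=datetime.datetime.now()))
    session.commit()
```

Replacing the physical board with rows like these is what makes the bottleneck statistics the abstract mentions a simple query instead of a manual count.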
158

Análise do campo elétrico atmosférico durante tempo bom e distúrbios geofísicos / Analysis of the atmospheric electric field during fair weather and geophysical disturbances

Anaya, José Carlos Tacza 19 January 2015
This dissertation presents the capability of a new network of sensors to monitor the atmospheric electric field at various locations in South America. The main goal is to obtain the characteristic Universal Time daily curve of the atmospheric electric field in fair weather, computed from monthly, seasonal, and annual averages. That curve is compared with the Carnegie curve, which is related to the currents flowing in the Global Atmospheric Electric Circuit. After obtaining this standard fair-weather curve, deviations related to phenomena such as solar flares, solar proton events, geomagnetic storms, a total solar eclipse, and seismic activity are analyzed and discussed.
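The fair-weather averaging itself is simple to sketch: field samples are binned by Universal Time hour and averaged, and the resulting mean curve is what gets compared to the Carnegie curve. The snippet below is a toy version on synthetic data; the 130 V/m level and the sinusoidal shape are illustrative assumptions, not measured values.

```python
# Toy sketch: recover a mean UT daily curve from fair-weather field samples.
import numpy as np

hours_ut = np.random.default_rng(1).uniform(0, 24, size=5000)   # sample times (UT)
field = 130 + 30 * np.sin(2 * np.pi * (hours_ut - 3) / 24)      # synthetic V/m signal
field += np.random.default_rng(2).normal(0, 10, size=hours_ut.size)  # sensor noise

bins = np.arange(25)                         # 24 one-hour UT bins
idx = np.digitize(hours_ut, bins) - 1        # bin index 0..23 per sample
daily_curve = np.array([field[idx == h].mean() for h in range(24)])
print(np.round(daily_curve, 1))              # mean field per UT hour, V/m
```

Averaging over months or seasons is the same operation restricted to the relevant subset of timestamps, which is how the monthly, seasonal, and annual curves are built.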
159

用於圖形化編輯器開發之專屬模型語言設計 / A Modeling Language of Domain Specific Editors Based on Graphical Modeling Framework

呂宗龍, Lv, Zong-Long Unknown Date
Graphical model editors have long been a goal of model-editor development. Eclipse.org provides two powerful frameworks to help developers build them: GEF, a graphical editor framework, and GMF, which combines EMF with GEF and simplifies the design flow. The availability of a visual graphical editor for a target domain is the prerequisite of visual graphical modeling, which classical software development has used for decades and which today's model-driven engineering especially emphasizes. Compared with traditional textual editors, however, developing a visual graphical editor from scratch is not easy, and the steep learning curve of frameworks such as GEF and GMF remains a major obstacle: it is still hard for an average programmer to construct a visual graphical editor with them without a long period of learning. This work addresses simplifying the GMF development flow by providing a model-driven, domain-specific language through which programmers can describe the composition of their graphical model editors from a familiar design perspective. The result is a modeling language for graphical editors called GM3, serving as a bridge between developers and the GMF framework. With GM3, the developer can specify the model of the editing domain, how each model element should be presented by which kind of graphical element in the editor, and how the tool palette should be filled with creation and manipulation tools for model elements. Once the GM3 specification of an editor is written, a GM3 transformation engine, built with the JavaCC parser generator, generates all the files the GMF framework requires; a subsequent application of the standard GMF code-generation procedure then produces a complete graphical editor on the Eclipse platform.
160

Identifying exoplanets and unmasking false positives with NGTS

Günther, Maximilian Norbert January 2018
In my PhD, I advanced the scientific exploration of the Next Generation Transit Survey (NGTS), a ground-based wide-field survey operating at ESO's Paranal Observatory in Chile since 2016. My original contribution to knowledge is the development of novel methods to 1) estimate NGTS' yield of planets and false positives; 2) disentangle planets from false positives; and 3) accurately characterise planets. If an exoplanet passes (transits) in front of its host star, we can measure a periodic decrease in brightness. The study of transiting exoplanets gives insight into their size, formation, bulk composition and atmospheric properties. Transit surveys are limited by their ability to identify false positives, which can mimic planets and outnumber them a hundredfold. First, I designed a novel yield simulator to optimise NGTS' observing strategy and identification of false positives (published in Günther et al., 2017a). This showed that NGTS' prime targets, Neptune- and Earth-sized signals, are frequently mimicked by blended eclipsing binaries, allowing me to quantify the problem and prepare strategies for candidate vetting and follow-up. Second, I developed a centroiding algorithm for NGTS, achieving a precision of 0.25 milli-pixel in a CCD image (published in Günther et al., 2017b). With this, one can measure a shift of light during an eclipse, readily identifying unresolved blended objects. Third, I developed a joint Bayesian fitting framework for photometry, centroids, and radial-velocity cross-correlation-function profiles. This makes it possible to disentangle which object (target or blend) is causing the signal, and to characterise the system. My method has already unmasked numerous false positives. Most importantly, I confirmed that a signal which had almost been erroneously rejected is in fact an exoplanet (published in Günther et al., 2018). The presented achievements reduce the contamination of NGTS candidates by blended false positives by 80%, and demonstrate a new approach for unmasking hidden exoplanets. This research enhanced the success of NGTS and can provide guidance for future missions.
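The core quantity behind the second contribution, the flux-weighted centroid, can be sketched in a few lines. This is a toy illustration only; the published NGTS pipeline is far more elaborate than this.

```python
# Toy sketch: flux-weighted centroid shift as a blend diagnostic.
import numpy as np

def centroid(image: np.ndarray) -> tuple[float, float]:
    """Flux-weighted mean pixel position (x, y)."""
    total = image.sum()
    ys, xs = np.indices(image.shape)
    return float((xs * image).sum() / total), float((ys * image).sum() / total)

# A target star plus a faint blended neighbour on a 32x32 stamp.
yy, xx = np.indices((32, 32))
star  = np.exp(-((xx - 15.5)**2 + (yy - 15.5)**2) / 8.0)
blend = 0.1 * np.exp(-((xx - 18.5)**2 + (yy - 15.5)**2) / 8.0)

x_out, _ = centroid(star + blend)          # out of eclipse
x_in,  _ = centroid(star + 0.5 * blend)    # neighbour half as bright in eclipse
print(f"centroid shift: {abs(x_out - x_in):.4f} px")   # non-zero => blend
```

If only the blended neighbour dims during an eclipse, the centroid moves even though the summed-flux dip looks planet-like — exactly the signature used to unmask blended eclipsing binaries.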
