611

Etude des propriétés physiques des aérosols de la moyenne et haute atmosphère à partir d'une nouvelle analyse des observations du GOMOS-ENVISAT pour la période 2002-2006 / Study of the physical properties of aerosols in the middle and high atmosphere from a new analysis of GOMOS-ENVISAT observations for the 2002-2006 period

Salazar, Veronica 13 December 2010 (has links)
L'étude des aérosols de la stratosphère est primordiale pour modéliser précisément le bilan radiatif terrestre, et pour évaluer l'influence des particules sur le cycle de destruction de l'ozone. Depuis la découverte de la couche de Junge, ce domaine de recherche connaît différents décors, du plus important contenu en aérosols du dernier siècle après l'éruption du Mont Pinatubo en 1991, à un rétablissement vers les faibles niveaux atteints dans les années 2000, qui permet l'étude des particules autres que celles d'origine volcanique. Cependant, à ce jour, le degré de connaissance est faible quant à la distribution spatiale et verticale de ces aérosols dans la moyenne et haute stratosphère. Leur détection présente plusieurs difficultés: les particules ont une grande variété d'origines, compositions, tailles et formes, et leurs faibles épaisseurs optiques rendent indispensables des résultats précis. Un algorithme d'inversion développé au LPC2E a été adapté à l'analyse des données de niveau 1b de l'instrument GOMOS à bord d'ENVISAT, qui emploie la technique d'occultation stellaire, et fournit une bonne (mais irrégulière) couverture géographique et temporelle des mesures; un critère de sélection est d'ailleurs nécessaire du fait de l'utilisation de sources lumineuses de propriétés différentes. La méthode mise au point est validée pour l'étude de l'extinction induite par les aérosols; une climatologie globale est alors établie pour la période allant d'août 2002 à juillet 2006, et indique la présence permanente de particules dans l'ensemble du globe, jusqu'à environ 45 km d'altitude. La variabilité temporelle de l'extinction montre une augmentation progressive du contenu moyen depuis 2002 aux latitudes tropicales dans la basse stratosphère, et a permis d'évaluer l'effet de l'oscillation quasi-biennale et d'étudier d'autres variations saisonnières. La dépendance spectrale permet de déduire certaines spécificités concernant la taille et la nature des aérosols, majoritairement des particules sulfatées, mais également des suies en provenance de la troposphère et des particules d'origine interplanétaire. / The study of stratospheric aerosols is crucial for precisely modeling the Earth's radiative budget and for assessing their influence on ozone depletion. Since the discovery of the Junge layer, this area of research has been through very different situations: from the largest volcanic aerosol loading of the last century after the Mount Pinatubo eruption in 1991 to a slow recovery towards the background levels reached in the 2000s, which allows the study of particles other than those of volcanic origin. However, the vertical and spatial distribution of these aerosols in the middle and high stratosphere is still poorly documented and not yet fully understood. Their detection presents many difficulties: the particles have a great variety of origins, compositions, shapes and sizes, and their low optical thicknesses make accurate results necessary. An inversion algorithm developed at the LPC2E has been adapted to the analysis of level 1b data from the GOMOS instrument on board ENVISAT. The star occultation technique provides good (but irregular) spatial and temporal sampling, and a data selection criterion, made necessary by the use of light sources with different properties, ensures accurate results; the method is validated for the study of aerosol extinction. A global climatology is then established for the August 2002 to July 2006 period, showing the permanent presence of aerosol particles around the globe, up to about 45 km altitude. The temporal variability shows a progressive enhancement of the mean content since 2002 in the tropics, and was useful for studying the influence of the quasi-biennial oscillation in the middle stratosphere, as well as some seasonal features. The spectral dependence provides information about the size and nature of the particles: mainly sulfate aerosols, but also soot coming from the troposphere and aerosols of extra-terrestrial origin.
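The last sentence above refers to the spectral dependence of the aerosol extinction as an indicator of particle size. One common way to summarise that dependence, shown here only as an illustration and not as the retrieval performed in the thesis, is the Ångström exponent computed from extinction measured at two wavelengths; the wavelengths and extinction values below are made up for the example.

```python
import math

def angstrom_exponent(k1, k2, lambda1, lambda2):
    """Angstrom exponent from aerosol extinction k1, k2 at wavelengths lambda1, lambda2
    (same units). Larger values generally indicate smaller particles; this is an
    illustrative diagnostic only, not the GOMOS retrieval itself.
    """
    return -math.log(k1 / k2) / math.log(lambda1 / lambda2)

# Example with made-up extinction values at 500 nm and 870 nm.
print(angstrom_exponent(2.0e-4, 1.1e-4, 500.0, 870.0))
```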
612

Como chegamos até aqui: a migração da memória e os instrumentos de controle / How we got here: the migration of memory and the instruments of control

Silva, Silvio Ferreira da 26 October 2016 (has links)
This dissertation provides an overview of the relationship between man and machine, specifically the computer. It is a reflective work, addressing the expanded world of culture in an exploratory approach to memory and the evolution of its artificial supports. It begins by narrating the historical evolution up to the present, focusing on the inventive geniuses and their devices, whose improvements resulted in today's technologies. The text is supported by considerations regarding data collection and quality, the design and content management of algorithms, and artificial intelligence. It highlights the ubiquity of this intelligence in everyday society, as well as the submission of society to its designs. It analyzes the impact that technology has on citizens from the perspective of contemporary thinkers, especially Vilém Flusser and Kurt Kurzweil, and their visions of the future. It also addresses surveillance, ethics, error and fraud, and exposes the risks to which society is subject when it submits itself to processes whose managers it does not know, with practical examples of damage that occurred in the recent past. Finally, it proposes the challenge of creating a code of ethics for software developers, in view of the social and legal dilemmas that emerge all the time and for which there is not yet a clear approach. / Esta dissertação traça um panorama sobre o relacionamento entre o homem e a máquina, especificamente o computador. Trata-se de um trabalho reflexivo, abordando o mundo expandido da cultura numa abordagem exploratória sobre a memória e a evolução de seus suportes artificiais. Inicia-se narrando a evolução histórica até os tempos atuais, apresentando os gênios inventores e seus engenhos, engenhos esses cujos aprimoramentos resultaram nas tecnologias do presente. O texto é subsidiado por considerações a respeito de coleta e qualidade dos dados, concepção e gestão de conteúdo dos algoritmos, inteligência artificial e destaca a onipresença desta inteligência no dia a dia da sociedade, bem como a submissão dessa sociedade aos seus desígnios. Analisa o impacto que a tecnologia exerce sobre o cidadão sob a ótica de pensadores contemporâneos com destaque a Vilém Flusser e Kurt Kurzweil e suas visões de futuro. Fala de vigilância, ética, erro e fraude. Expõe os riscos a que a sociedade está sujeita ao se submeter a processos cujos gestores desconhece, com exemplos práticos de danos ocorridos em passado recente. Finaliza propondo o desafio de se criar um código de ética para os desenvolvedores dos softwares, em razão dos dilemas, sociais e legais, que emergem a cada momento e para os quais ainda não existe clareza de abordagem.
613

Algorithmes d'algèbre linéaire pour la cryptographie / Linear algebra algorithms for cryptography

Delaplace, Claire 21 November 2018 (has links)
Dans cette thèse, nous discutons d’aspects algorithmiques de trois différents problèmes, en lien avec la cryptographie. La première partie est consacrée à l’algèbre linéaire creuse. Nous y présentons un nouvel algorithme de pivot de Gauss pour matrices creuses à coefficients exacts, ainsi qu’une nouvelle heuristique de sélection de pivots, qui rend l’entière procédure particulièrement efficace dans certains cas. La deuxième partie porte sur une variante du problème des anniversaires, avec trois listes. Ce problème, que nous appelons problème 3XOR, consiste intuitivement à trouver trois chaînes de caractères uniformément aléatoires de longueur fixée, telles que leur XOR soit la chaîne nulle. Nous discutons des considérations pratiques qui émanent de ce problème et proposons un nouvel algorithme plus rapide à la fois en théorie et en pratique que les précédents. La troisième partie est en lien avec le problème learning with errors (LWE). Ce problème est connu pour être l’un des principaux problèmes difficiles sur lesquels repose la cryptographie à base de réseaux euclidiens. Nous introduisons d’abord un générateur pseudo-aléatoire, basé sur la variante dé-randomisée learning with rounding de LWE, dont le temps d’évaluation est comparable avec celui d’AES. Dans un second temps, nous présentons une variante de LWE sur l’anneau des entiers. Nous montrerons que dans ce cas le problème est facile à résoudre et nous proposons une application intéressante en re-visitant une attaque par canaux auxiliaires contre le schéma de signature BLISS. / In this thesis, we discuss algorithmic aspects of three different problems related to cryptography. The first part is devoted to sparse linear algebra. We present a new Gaussian elimination algorithm for sparse matrices whose coefficients are exact, along with a new pivot-selection heuristic, which makes the whole procedure particularly efficient in some cases. The second part deals with a variant of the birthday problem with three lists. This problem, which we call the 3XOR problem, intuitively consists in finding three uniformly random bit-strings of fixed length such that their XOR is the zero string. We discuss practical considerations arising from this problem, and propose a new algorithm which is faster in theory as well as in practice than previous ones. The third part is related to the learning with errors (LWE) problem. This problem is known for being one of the main hard problems on which lattice-based cryptography relies. We first introduce a pseudorandom generator, based on the de-randomised learning with rounding variant of LWE, whose evaluation time is comparable to that of AES. Second, we present a variant of LWE over the ring of integers. We show that in this case the problem is easy to solve, and we propose an interesting application, revisiting a side-channel attack against the BLISS signature scheme.
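As an aside, the 3XOR problem described above has a simple quadratic baseline that work like this improves upon. The sketch below (assuming the lists hold fixed-length bit-strings represented as integers) stores one list in a hash set and scans pairs from the other two; it is for illustration only and is not the algorithm proposed in the thesis.

```python
import os

def three_xor(A, B, C):
    """Baseline 3XOR search: find (a, b, c) in A x B x C with a ^ b ^ c == 0.

    Hashes C, then scans all pairs (a, b): O(|A|*|B|) time, O(|C|) memory.
    """
    lookup = set(C)
    for a in A:
        for b in B:
            if a ^ b in lookup:      # a ^ b ^ c == 0  <=>  c == a ^ b
                return a, b, a ^ b
    return None

# Toy usage with random 16-bit words (a solution is not guaranteed to exist).
rand16 = lambda: int.from_bytes(os.urandom(2), "big")
A = [rand16() for _ in range(256)]
B = [rand16() for _ in range(256)]
C = [rand16() for _ in range(256)]
print(three_xor(A, B, C))
```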
614

Eye tracking scanpath trend analysis on Web pages

Eraslan, Sukru January 2016 (has links)
Web pages typically comprise different kinds of visual elements such as menus, headers and footers. To improve user experience, eye tracking has been widely used to investigate how users interact with such elements. In particular, eye movement sequences, called scanpaths, have been analysed to understand the path that people follow in terms of these elements. However, individual scanpaths are typically complicated and tied to specific users, so any processing done with them will be specific to individuals and will not be representative of multiple users. Therefore, those scanpaths should be clustered to provide a general direction followed by users. This direction will allow researchers to better understand user interactions with web pages, and then improve the design of the pages accordingly. Existing research tends to provide a very short scanpath which is not representative enough for understanding user behaviours. This thesis introduces a new algorithm for clustering scanpaths, called Scanpath Trend Analysis (STA). In contrast to existing research, in STA, if a particular element is not shared by all users but it gets at least the same attention as the fully shared elements, it is included in the resulting scanpath. Thus, this algorithm provides a richer understanding of how users interact with web pages. The STA algorithm was evaluated with a series of eye tracking studies where the web pages used were automatically segmented into their visual elements by using different approaches. The results show that the outputs of the STA algorithm are significantly more similar to the input scanpaths than the outputs of other existing work, and this is not limited to a particular segmentation approach. The effect of the number of users on the STA algorithm was also investigated, as the number of users required for scanpath analysis has not been studied in depth in the literature. The results show that it is possible to reach the same results with a smaller group of users. The research presented in this thesis should be of value to eye tracking researchers, to whom the STA algorithm has been made available to analyse scanpaths, and to behaviour analysis researchers, who can use the algorithm to understand user behaviours on web pages, and then design, develop and present the pages accordingly.
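The core inclusion rule described above (keep an element not shared by all users if it receives at least as much attention as the fully shared elements) can be sketched as follows. This is only an approximation of the idea as summarised in the abstract, not the published STA algorithm; the AOI labels, fixation counting and ordering by first-visit position are assumptions made for the example.

```python
from collections import defaultdict

def trending_scanpath(scanpaths):
    """Illustrative trend aggregation over AOI-level scanpaths (one list of AOI labels
    per user): keep AOIs visited by all users, plus AOIs whose total attention is at
    least that of the least-attended fully shared AOI; order by mean first-visit position.
    """
    n_users = len(scanpaths)
    attention = defaultdict(int)      # total number of visits per AOI
    users = defaultdict(set)          # which users visited each AOI
    first_visit = defaultdict(list)   # first-visit index of each AOI, per user

    for u, path in enumerate(scanpaths):
        seen = {}
        for pos, aoi in enumerate(path):
            attention[aoi] += 1
            users[aoi].add(u)
            seen.setdefault(aoi, pos)
        for aoi, pos in seen.items():
            first_visit[aoi].append(pos)

    shared = [a for a in attention if len(users[a]) == n_users]
    if not shared:
        return []
    threshold = min(attention[a] for a in shared)
    kept = [a for a in attention if attention[a] >= threshold]
    return sorted(kept, key=lambda a: sum(first_visit[a]) / len(first_visit[a]))

# Example: three users scanning elements of a page.
print(trending_scanpath([["menu", "header", "content"],
                         ["header", "content", "footer", "content"],
                         ["menu", "content", "header"]]))
```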
615

Cognitive smart agents for optimising OpenFlow rules in software defined networks

Sabih, Ann Faik January 2017 (has links)
This research provides a robust solution based on artificial intelligence (AI) techniques to overcome the challenges in Software Defined Networks (SDNs) that can jeopardise the overall performance of the network. The proposed approach, presented in the form of an intelligent agent appended to the SDN network, comprises a new hybrid intelligent mechanism that optimises the performance of SDN based on heuristic optimisation methods under an Artificial Neural Network (ANN) paradigm. Evolutionary optimisation techniques, including Particle Swarm Optimisation (PSO) and Genetic Algorithms (GAs), are deployed to find the best set of inputs that give the maximum performance of an SDN-based network. The ANN model is trained and applied as a predictor of SDN behaviour according to effective traffic parameters. The parameters used in this study include round-trip time and throughput, which were obtained from the flow table rules of each switch. A POX controller and OpenFlow switches, which characterise the behaviour of an SDN, have been modelled with three different topologies. Generalisation of the prediction model has been tested with new raw data that were unseen in the training stage. The simulation results show a reasonably good performance of the network in terms of obtaining a Mean Square Error (MSE) of less than 10⁻⁶. Once the trained ANN model had been obtained, it was used with the PSO and GA optimisers to achieve the best performance of the SDN-based network. The PSO approach combined with the predicted SDN model was identified as better than the GA approach in terms of performance indices and computational efficiency. Overall, this research demonstrates that building an intelligent agent will enhance the overall performance of the SDN network. Three different SDN topologies have been implemented to study the impact of the proposed approach, with the findings demonstrating a reduction in the packets dropped ratio (PDR) of 28-31%. Moreover, the packets sent to the SDN controller were also reduced by 35-36%, depending on the generated traffic. The developed approach reduced the round-trip time (RTT) by 23% and enhanced the throughput by 10%. Finally, in the event that the SDN controller fails, the optimised intelligent agent can immediately take over control of the entire network.
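To make the optimisation step concrete, here is a minimal particle swarm optimiser over a black-box objective. It is a generic sketch, not the thesis implementation: the objective `f` stands in for the trained ANN predictor of an SDN metric, and the parameter values and toy objective are assumptions.

```python
import random

def pso_minimize(f, dim, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimiser for a black-box objective f over a box domain.
    In the setting described above, f would be a trained network predicting an SDN
    performance metric (e.g. round-trip time) from traffic parameters."""
    lo, hi = bounds
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    pval = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            v = f(X[i])
            if v < pval[i]:
                pbest[i], pval[i] = X[i][:], v
                if v < gval:
                    gbest, gval = X[i][:], v
    return gbest, gval

# Toy usage with a stand-in objective (sphere function) in place of the ANN predictor.
print(pso_minimize(lambda x: sum(xi * xi for xi in x), dim=3, bounds=(-5.0, 5.0)))
```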
616

The complexity of greedoid Tutte polynomials

Knapp, Christopher N. January 2018 (has links)
We consider the computational complexity of evaluating the Tutte polynomial of three particular classes of greedoid, namely rooted graphs, rooted digraphs and binary greedoids. Furthermore, we construct polynomial-time algorithms to evaluate the Tutte polynomial of these classes of greedoid when they are of bounded tree-width. We also construct a Möbius function formulation for the characteristic polynomial of a rooted graph and determine the computational complexity of computing the coefficients of the Tutte polynomial of a rooted graph.
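For readers unfamiliar with the object being evaluated, one commonly used two-variable greedoid Tutte polynomial (in the Gordon–McMahon style) is reproduced below as an assumed reference form; the thesis may use a different normalisation.

```latex
% Assumed reference form of the two-variable greedoid Tutte polynomial,
% for a greedoid on ground set E with rank function r:
f(G; t, z) \;=\; \sum_{S \subseteq E} t^{\,r(E) - r(S)}\, z^{\,|S| - r(S)}
```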
617

Um algoritmo genético de chaves aleatórias viciadas para o problema de atracamento molecular / A biased random key genetic algorithm for the molecular docking problem

Oliveira, Eduardo Spieler de January 2016 (has links)
O Atracamento Molecular é uma importante ferramenta utilizada no descobrimento de novos fármacos. O atracamento com ligante flexível é um processo computacionalmente custoso devido ao número alto de graus de liberdade do ligante e da rugosidade do espaço de busca conformacional representando a afinidade entre o receptor e uma molécula ligante. O problema é definido como a busca pela solução de menor energia de ligação proteína-ligante. Considerando uma função suficientemente acurada, a solução ótima coincide com a melhor orientação e afinidade entre as moléculas. Assim, o método de busca e a função de energia são partes fundamentais para a resolução do problema. Muitos desafios são enfrentados para a resolução do problema, o tratamento da flexibilidade, algoritmo de amostragem, a exploração do espaço de busca, o cálculo da energia livre entre os átomos, são alguns dos focos estudados. Esta dissertação apresenta uma técnica baseada em um Algoritmo Genético de Chaves Aleatórias Viciadas, incluindo a discretização do espaço de busca e métodos de agrupamento para a multimodalidade do problema de atracamento molecular. A metodologia desenvolvida explora o espaço de busca gerando soluções diversificadas. O método proposto foi testado em uma seleção de complexos proteína-ligante e foi comparado com softwares existentes: AutodockVina e Dockthor. Os resultados foram estatisticamente analisados em termos estruturais. O método se mostrou eficiente quando comparado com outras ferramentas e uma alternativa para o problema de Atracamento Molecular. / Molecular docking is a valuable tool for drug discovery. Docking a flexible ligand to a receptor is a very computationally expensive process due to the large number of degrees of freedom of the ligand and the roughness of the molecular binding search space. A molecular docking simulation starts with the unbound receptor and ligand structures, and the algorithm tests hundreds of thousands of ligand conformations and orientations to find the best receptor-ligand binding affinity by assigning and optimizing an energy function. Despite the advances in the conception of methods and computational strategies for searching for the best protein-ligand binding affinity, the development of new strategies, the adaptation and investigation of new approaches, and the combination of existing state-of-the-art computational methods and techniques for the molecular docking problem are clearly needed. We developed a Biased Random-Key Genetic Algorithm as a sampling strategy to search the protein-ligand conformational space. The proposed method has been tested on a selection of protein-ligand complexes and compared with the existing tools AutodockVina and Dockthor. Compared with other traditional docking software, the proposed method has the best average Root-Mean-Square Deviation. Structural results were statistically analyzed. The proposed method proved to be efficient and a good alternative for the molecular docking problem.
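To illustrate the encoding idea behind a biased random-key genetic algorithm in this setting, the sketch below decodes a vector of keys in [0,1) into a ligand pose (translation, rotation, torsions) and performs one BRKGA generation. The decoding scheme, parameter values and stand-in energy function are assumptions for illustration, not the thesis implementation.

```python
import random

def decode_pose(keys, box=20.0, n_torsions=4):
    """Decode a random-key chromosome (floats in [0,1)) into a ligand pose:
    3 translation components, 3 rotation angles, and n_torsions torsion angles."""
    assert len(keys) == 6 + n_torsions
    translation = [k * box - box / 2 for k in keys[:3]]    # centred in a cubic box
    rotation = [k * 360.0 for k in keys[3:6]]              # Euler angles in degrees
    torsions = [k * 360.0 for k in keys[6:]]
    return translation, rotation, torsions

def brkga_step(population, fitness, elite_frac=0.2, mutant_frac=0.1, rho=0.7):
    """One BRKGA generation: keep the elite, add random mutants, and fill the rest with
    biased crossover (each gene copied from the elite parent with probability rho)."""
    ranked = sorted(population, key=fitness)
    n = len(ranked)
    n_elite = max(1, int(elite_frac * n))
    n_mutants = max(1, int(mutant_frac * n))
    elite, rest = ranked[:n_elite], ranked[n_elite:]
    children = []
    while len(children) < n - n_elite - n_mutants:
        e, o = random.choice(elite), random.choice(rest or elite)
        children.append([e[i] if random.random() < rho else o[i] for i in range(len(e))])
    mutants = [[random.random() for _ in range(len(elite[0]))] for _ in range(n_mutants)]
    return elite + children + mutants

# Toy usage with a stand-in "energy": distance of the decoded translation from the origin.
pop = [[random.random() for _ in range(10)] for _ in range(50)]
energy = lambda keys: sum(x * x for x in decode_pose(keys)[0])
for _ in range(20):
    pop = brkga_step(pop, energy)
print(min(energy(ind) for ind in pop))
```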
618

Low Complexity Beamformer structures for application in Hearing Aids

Koutrouli, Eleni January 2018 (has links)
Background noise is particularly damaging to speech intelligibility for people with hearing loss. The problem of reducing noise in hearing aids is one of great importance and great difficulty. Over the years, many solutions and different algorithms have been implemented in order to provide the optimal solution to the problem. Beamforming has been used for a long time and has therefore been extensively researched. Studying the performance of Minimum Variance Distortionless Response (MVDR) beamforming with three- and four-microphone arrays compared to the conventional two-microphone array, the aim is to implement a speech signal enhancement and noise reduction algorithm. By using multiple microphones, it is possible to achieve spatial selectivity, which is the ability to select certain signals based on the angle of incidence, and improve the performance of noise reduction beamformers. This thesis proposes the use of beamforming, an existing technique, to create a new way to reduce the noise transmitted by hearing aids. In order to reduce the complexity of that system, we use hybrid cascades, which are simpler beamformers with two inputs each, connected in series. The configurations that we consider are a three-microphone linear array (monaural beamformer), a three-microphone configuration with a two-microphone linear array and the third microphone in the ear (monaural beamformer), a three-microphone configuration with a two-microphone linear array and the third microphone on the contralateral ear (binaural beamformer), and finally four-microphone configurations. We also investigate the performance improvement of the beamformer with more than two microphones for the different configurations, against the two-microphone beamformer reference. This can be measured using objective measurements, such as the amount of noise suppression, target energy loss, output SNR, speech intelligibility index and speech quality evaluation. These objective measurements are good indicators of subjective performance. In this project, we show that most hybrid structures perform satisfactorily compared to the full-complexity beamformer. The low-complexity beamformer is designed with a fixed target location (azimuth), where its weights are calibrated with respect to a target signal located in front of the listener and for a diffuse noise field. Both second- and third-order beamformers are tested in different acoustic scenarios, such as a car environment, a meeting room, a party and a restaurant. In those scenarios, the target signal does not arrive at the hearing aid directly from the front of the listener and the noise field is not always diffuse. We thoroughly investigate the performance limitations in that case and how well the different cascades can perform. We show that there are some critical factors that affect the performance of the fixed beamformer across all the hybrid structures examined. Finally, we show that lower-complexity cascades for both second- and third-order beamformers can perform as well as the full-complexity beamformers when tested with a set of Head-Related Transfer Functions (HRTFs) corresponding to a real head shape.
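For reference, the MVDR weights behind the beamformers discussed above can be computed directly from a noise covariance matrix and a steering vector. The sketch below is a generic illustration (with assumed diagonal loading and a toy covariance), not the hearing-aid implementation evaluated in the thesis.

```python
import numpy as np

def mvdr_weights(R, d, diag_load=1e-6):
    """MVDR (Capon) beamformer weights: w = R^{-1} d / (d^H R^{-1} d).

    R: (M, M) noise (or noise-plus-interference) covariance matrix of the M microphones.
    d: (M,) steering vector towards the target (e.g. from array geometry or an HRTF).
    Diagonal loading is added for numerical robustness.
    """
    M = R.shape[0]
    Rl = R + diag_load * np.trace(R).real / M * np.eye(M)
    Rinv_d = np.linalg.solve(Rl, d)
    return Rinv_d / (d.conj() @ Rinv_d)

# Toy usage: a 3-microphone array with a random Hermitian noise covariance.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
R = A @ A.conj().T / 3
d = np.ones(3, dtype=complex)      # broadside steering vector for a simple example
w = mvdr_weights(R, d)
print(w.conj() @ d)                # distortionless constraint: ~1 towards the target
```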
619

Escalonamento de workflow com anotações de tarefas sensitivas para otimização de segurança e custo em nuvens / Workflow scheduling with sensitive task annotations for security and cost optimization in clouds

Shishido, Henrique Yoshikazu 11 December 2018 (has links)
A evolução dos computadores tem possibilitado a realização de experimentos in-silico, incluindo aplicações baseadas no modelo de workflow. A execução de workflows é uma atividade que pode ser computacionalmente dispendiosa, onde grades e nuvens são adotadas para a sua execução. Inserido nesse contexto, os algoritmos de escalonamento de workflow permitem atender diferentes critérios de execução como o tempo e o custo monetário. Contudo, a segurança é um critério que tem recebido atenção, pois diversas organizações hesitam em implantar suas aplicações em nuvens devido às ameaças presentes em um ambiente aberto e promíscuo como a Internet. Os algoritmos de escalonamento direcionados à segurança consideram dois cenários: (a) nuvens híbridas: mantêm as tarefas que manipulam dados sensitivos/confidenciais na nuvem privada e exportam as demais tarefas para nuvens públicas para satisfazer alguma restrição (ex.: tempo), e; (b) nuvens públicas: considera o uso de serviços de segurança disponíveis em instâncias de máquinas virtuais para proteger tarefas que lidam com dados sensitivos/confidenciais. No entanto, os algoritmos de escalonamento que consideram o uso de serviços de segurança selecionam as tarefas de forma aleatória sem considerar a semântica dos dados. Esse tipo de abordagem pode acabar atribuindo proteção a tarefas não-sensitivas e desperdiçando tempo e recursos, e deixando dados sensitivos sem a proteção necessária. Frente a essas limitações, propõe-se nesta tese duas abordagens de escalonamento de workflow: o Workflow Scheduling - Task Selection Policies (WS-TSP) e a Sensitive Annotation for Security Tasks (SAST). O WS-TSP é uma abordagem de escalonamento que usa um conjunto de políticas para a proteção de tarefas. A SAST é outra abordagem que permite utilizar o conhecimento prévio do Desenvolvedor de Aplicação para identificar quais tarefas devem receber proteção. O WS-TSP e a SAST consideram a aplicação de serviços de segurança como autenticação, verificação de integridade e criptografia para proteger as tarefas sensitivas do workflow. A avaliação dessas abordagens foi realizada através de uma extensão do simulador WorkflowSim que incorpora a sobrecarga dos serviços de segurança no tempo, do custo e do risco de execução do workflow. As duas abordagens apresentaram menor risco de segurança do que as abordagens da literatura, sob um custo e makespan razoáveis. / The evolution of computers has enabled in-silico experiments to take place, including applications based on the workflow model. The execution of workflows is an activity that can be computationally expensive, and grids and clouds are adopted for their execution. In this context, workflow scheduling algorithms make it possible to meet different execution criteria such as time and monetary cost. However, security is a criterion that has received attention because several organizations hesitate to deploy their applications in clouds due to the threats present in an open and promiscuous environment like the Internet. Security-oriented scheduling algorithms consider two scenarios: (a) hybrid clouds: keep tasks that manipulate sensitive data in the private cloud and export the other tasks to public clouds to satisfy some constraint (e.g., time); (b) public clouds: consider the use of security services available in virtual machine instances to protect tasks that deal with sensitive data. However, scheduling algorithms that consider the use of security services select tasks randomly, without considering data semantics. This type of approach may end up assigning protection to non-sensitive tasks, wasting time and resources, and leaving sensitive data without the necessary protection. In view of these limitations, two workflow scheduling approaches are proposed in this thesis: Workflow Scheduling - Task Selection Policies (WS-TSP) and Sensitive Annotation for Security Tasks (SAST). WS-TSP is a scheduling approach that uses a set of policies for task protection. SAST is another approach that allows using the Application Developer's prior knowledge to identify which tasks should be protected. WS-TSP and SAST consider applying security services such as authentication, integrity verification and encryption to protect the sensitive tasks of the workflow. The evaluation of these approaches was carried out through an extension of the WorkflowSim simulator that incorporates the overhead of the security services in the workflow's execution time, cost and risk. The two approaches presented a lower security risk than approaches from the literature, at a reasonable cost and makespan.
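The annotation idea can be sketched as follows: tasks carry a developer-provided sensitivity flag, and only flagged tasks receive the security services, whose overheads are added to the runtime estimate. The service list, overhead values and task names are assumptions made for illustration; this is not the WS-TSP/SAST scheduler itself.

```python
from dataclasses import dataclass, field

# Assumed per-service runtime overheads (fractions of task runtime); purely illustrative.
SECURITY_OVERHEAD = {"authentication": 0.02, "integrity": 0.05, "encryption": 0.15}

@dataclass
class Task:
    name: str
    runtime: float                   # base runtime estimate in seconds
    sensitive: bool = False          # developer-provided annotation (the SAST idea)
    services: list = field(default_factory=list)

def apply_security(tasks):
    """Attach all security services to sensitive tasks only and return the total
    estimated runtime including the security overheads."""
    total = 0.0
    for t in tasks:
        t.services = list(SECURITY_OVERHEAD) if t.sensitive else []
        overhead = sum(SECURITY_OVERHEAD[s] for s in t.services)
        total += t.runtime * (1.0 + overhead)
    return total

workflow = [Task("align", 120, sensitive=True), Task("stats", 40), Task("report", 10, sensitive=True)]
print(apply_security(workflow))      # runtime estimate with protection on sensitive tasks only
```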
620

A Semi-Automated Algorithm for Segmenting the Hippocampus in Patient and Control Populations

Muncy, Nathan McKay 01 June 2016 (has links)
Calculating hippocampal volume from Magnetic Resonance (MR) images is an essential task in many studies of neurocognition in healthy and diseased populations. The 'gold standard' method involves hand tracing, which is accurate but laborious, requiring expertly trained researchers and significant amounts of time. As such, segmenting large datasets with the standard method is impractical. Current automated pipelines are inaccurate at hippocampal demarcation and volumetry. We developed a semi-automated hippocampal segmentation pipeline based on the Advanced Normalization Tools (ANTs) suite of programs. We applied the semi-automated segmentation pipeline to 70 participant scans (26 female) from groups that included participants diagnosed with autism spectrum disorder, healthy older adults (mean age 74) and healthy younger controls. We found that hippocampal segmentations obtained with the semi-automated pipeline more closely matched the segmentations of an expert rater than those obtained using FreeSurfer or the segmentations of novice raters. Further, we found that the pipeline performed best when manually placed landmarks were included and when using a template generated from a heterogeneous sample (one that included the full variability of group assignments) rather than a template generated from more homogeneous samples (using only individuals within a given age range or with a specific neuropsychiatric diagnosis). Additionally, the semi-automated pipeline required much less time (5 minutes per brain) than manual segmentation (30-60 minutes per brain) or FreeSurfer (8 hours per brain).
