About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
321

Controlled Semi-Markov Processes With Partial Observation

Goswami, Anindya 03 1900 (has links) (PDF)
No description available.
322

Amostragem assíncrona baseada em cruzamentos por zero / Asynchronous sampling based on zero crossings

Santos, Jefferson França 31 January 2017 (has links)
Conselho Nacional de Pesquisa e Desenvolvimento Científico e Tecnológico - CNPq / Synchronous sampling is currently the most widely used analog-to-digital conversion method, mainly because it is easy to implement, since it employs a constant sampling rate. However, a fixed sampling rate can cause unnecessary activations of the sample-and-hold circuit, increasing power consumption. Asynchronous analog-to-digital converters can be used to solve this problem, sampling only when particular events occur, such as amplitude level crossings (Level Crossing). This approach has been intensely studied over the last decades and arose as an alternative to synchronous sampling. Another asynchronous sampling approach is the one proposed by Voelcker (1966), in which the sampling instants are the zero crossings of the signal. According to Voelcker, perfect signal reconstruction requires sampling the complex zeros in addition to the real zeros. Although complex zeros are physically undetectable, Voelcker proposed that they can be “transformed” into real zeros using the real zeros of all nth-order derivatives of the signal. Since this can be unfeasible, this work proposes a Zero Crossing method that uses only up to the third derivative. Such an approach is justified because, in consecutive derivatives, the real zeros tend to lie close together or even repeat, adding no further information about the original signal. This work therefore verifies the applicability of the proposed method for systems that require low power consumption and good reconstruction of the sampled signal; the results suggest it as a good compromise between synchronous sampling and Level Crossing.
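
As a rough illustration of the zero-crossing idea (not the implementation from the thesis), the sketch below detects sign changes of a test signal and of its first three numerical derivatives; the signal, the reference grid, and the derivative estimator are all assumptions chosen for the example.

```python
import numpy as np

def zero_crossings(x, t):
    """Return times where the sequence x changes sign (linear interpolation)."""
    s = np.sign(x)
    idx = np.where(s[:-1] * s[1:] < 0)[0]          # sign change between idx and idx+1
    # linearly interpolate the crossing instant between the two samples
    return t[idx] - x[idx] * (t[idx + 1] - t[idx]) / (x[idx + 1] - x[idx])

# Hypothetical test signal: sum of two tones on a fine reference grid.
t = np.linspace(0.0, 1.0, 10_000)
x = np.sin(2 * np.pi * 5 * t) + 0.4 * np.sin(2 * np.pi * 13 * t)

# Collect the real zeros of the signal and of its first three numerical derivatives.
samples = {}
d = x
for order in range(4):                             # 0th .. 3rd derivative
    samples[order] = zero_crossings(d, t)
    d = np.gradient(d, t)                          # next derivative estimate

for order, zc in samples.items():
    print(f"derivative {order}: {len(zc)} zero crossings")
```
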
323

Protocolo de Identificação baseado em Polinômios Multivariáveis Quadráticos / Multivariate Quadratic Polynomials Identification Protocol

Fabio de Salles Monteiro 03 December 2012 (has links)
The public-key cryptosystems widely used today base their security on the assumed intractability of integer factorization and of the discrete logarithm, and both problems have been shown to be insecure in the advent of quantum computers. Cryptographic systems based on Multivariate Quadratic polynomials (MQ) rely on the MQ problem, which consists in solving a system of multivariate quadratic polynomial equations over a finite field. The MQ problem has been proven NP-complete, and no polynomial-time algorithm, not even a quantum one, is known to solve it, which makes cryptosystems based on this primitive worth investigating and developing as real candidates for post-quantum cryptography. At CRYPTO'2011, Sakumoto, Shirai and Hiwatari introduced two new identification protocols based on multivariate quadratic polynomials, which we call MQID-3 and MQID-5 and which, for the first time, have their security reduced only to the MQ problem. Building on these proposals, we present an improved version of the MQID-3 protocol that reduces the required communication by approximately 9%.
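
For illustration only, the snippet below states a toy instance of the MQ problem over GF(2): given quadratic polynomials, decide whether some assignment of the variables makes them all vanish. The brute-force search is exponential and is meant only to make the problem statement concrete; the instance size and the random coefficients are assumptions, not parameters of the MQID protocols.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, m = 6, 6                     # 6 variables, 6 quadratic equations over GF(2)

# Random instance: p_k(x) = x^T A_k x + b_k . x + c_k  (mod 2)
A = rng.integers(0, 2, size=(m, n, n))
b = rng.integers(0, 2, size=(m, n))
c = rng.integers(0, 2, size=m)

def evaluate(x):
    """Evaluate all m quadratic polynomials at x over GF(2)."""
    x = np.asarray(x)
    return (np.einsum('i,kij,j->k', x, A, x) + b @ x + c) % 2

# Exponential brute force -- feasible only because n is tiny.
solution = next((x for x in itertools.product((0, 1), repeat=n)
                 if not evaluate(x).any()), None)
print("solution:", solution)
```
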
324

Zero-one law for (a,k)-regularized resolvent families and the Blackstock-Crighton-Westervelt equation on Banach spaces

Gambera, Laura Rezzieri. January 2020 (has links)
Advisor: Andréa Cristina Prokopczyk Arita / Abstract: This work presents some results of the theory of (a,k)-regularized resolvent families, which are the main tool used in this thesis. Related to these families, one result proved in this work is the zero-one law, which provides new insight into the structural properties of the theory of (a,k)-regularized resolvent families, including strongly continuous semigroups, strongly continuous cosine families, and integrated semigroups, among others. Moreover, an abstract nonlinear degenerate hyperbolic equation is considered, which includes the semilinear Blackstock-Crighton-Westervelt equation. By proposing a new approach based on strongly continuous semigroups and resolvent families of operators, an explicit representation of the strong and mild solutions of the linearized model is proved by means of a kind of variation-of-parameters formula. In addition, under nonlocal initial conditions, a mild solution of the nonlinear equation is established. / Doctorate
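
For readers unfamiliar with the main tool, the display below recalls the definition of an (a,k)-regularized resolvent family as it is commonly stated in the literature; the notation and the choice of special cases are assumptions here, and the thesis may use an equivalent but differently phrased formulation.

```latex
% Definition as commonly stated in the literature (notation is an assumption,
% not necessarily the one used in the thesis).
% Let $X$ be a Banach space, $A$ a closed linear operator on $X$,
% $a \in L^{1}_{\mathrm{loc}}(\mathbb{R}_{+})$ and $k \in C(\mathbb{R}_{+})$ with $k \not\equiv 0$.
% A strongly continuous family $\{R(t)\}_{t \ge 0} \subset \mathcal{B}(X)$ commuting with $A$,
% with $R(0) = k(0)I$, is an $(a,k)$-regularized resolvent family generated by $A$ if
\[
  R(t)x \;=\; k(t)\,x \;+\; \int_{0}^{t} a(t-s)\, A\,R(s)x \,\mathrm{d}s,
  \qquad x \in D(A),\; t \ge 0 .
\]
% Special cases: $a \equiv 1,\ k \equiv 1$ gives a $C_0$-semigroup;
% $a(t) = t,\ k \equiv 1$ gives a strongly continuous cosine family.
```
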
325

Metriky pro detekci útoků v síťovém provozu / Metrics for Intrusion Detection in Network Traffic

Homoliak, Ivan January 2012 (has links)
This thesis aims to propose and apply new metrics for intrusion detection in network traffic, based on an analysis of existing metrics, of network traffic, and of the behavioral characteristics of known attacks. The main goal of the thesis is to propose and implement a new collection of metrics capable of detecting zero-day attacks.
326

A world without packaging? : How can food retailers reframe the practice of packaging?

Röjning, Fredrik, Petersson, Fredrik January 2020 (has links)
Considering the increasing competition between brands and products, packaging has become an important framing tool for influencing customers' purchasing decisions. However, given the growing environmental concerns, zero packaging has emerged as a new practice that challenges the conventional use of packaging. With the introduction of zero packaging, marketers have been forced to reframe the practice of packaging, as the artifacts used to create identification and familiarity and to form a state of resonance have been removed. To extend the research on resonance within marketing communication, the study employed a qualitative approach to explore how food retailers utilize the framing concept of resonance as a means to revamp traditional packaging into zero packaging. To reframe the practice of packaging, the study embraces the concepts of cognitive and emotional resonance. The findings indicate that food retailers need to create personal alignments with product artifacts, environmental values and containers. By reviewing the contextual marketing communication field of zero packaging, a third resonance was identified to understand how food retailers adequately attract, change and retain customers. Subsequently, affirmation was found to be the key mechanism for achieving motivational resonance, by engaging with customers' intrinsic and personalized values and desires.
327

Game contingent claims

Eliasson, Daniel January 2012 (has links)
Game contingent claims (GCCs), introduced by Kifer (2000), are a generalization of American contingent claims in which the writer also has the opportunity to terminate the contract, and must then pay the intrinsic option value plus a penalty. In complete markets, GCCs are priced by no-arbitrage arguments as the value of a zero-sum stochastic game of the type described in Dynkin (1969). In incomplete markets, the neutral pricing approach of Kallsen and Kühn (2004) can be used. In Part I of this thesis, we introduce GCCs and their pricing, and also cover some basics of mathematical finance. In Part II, we present a new algorithm for valuing game contingent claims. This algorithm generalizes the least-squares Monte Carlo method of Longstaff and Schwartz (2001) for pricing American options. Convergence proofs are obtained, and the algorithm is tested against certain GCCs. A more efficient algorithm is derived from the first one using the computational complexity analysis technique of Chen and Shen (2003). The algorithms were found to give good results with reasonable time requirements. Reference implementations of both algorithms are available for download from the author's GitHub page: https://github.com/del/Game-option-valuation-library
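
As a toy illustration of the pricing recursion behind a game contingent claim (not the least-squares Monte Carlo algorithm developed in the thesis), the sketch below values a game put on a Cox-Ross-Rubinstein binomial tree: at every node the holder may exercise and the writer may cancel at the exercise value plus a penalty, so the continuation value is clipped between the two. All market parameters are made-up assumptions.

```python
import numpy as np

def game_put_crr(S0, K, r, sigma, T, steps, penalty):
    """Value a game (Israeli) put option on a Cox-Ross-Rubinstein binomial tree.

    At each node the value is the continuation value clipped between the
    holder's exercise payoff (lower bound) and the writer's cancellation
    cost = exercise payoff + penalty (upper bound).
    """
    dt = T / steps
    u = np.exp(sigma * np.sqrt(dt))
    d = 1.0 / u
    q = (np.exp(r * dt) - d) / (u - d)          # risk-neutral up probability
    disc = np.exp(-r * dt)

    # Terminal stock prices and payoffs.
    j = np.arange(steps + 1)
    S = S0 * u**j * d**(steps - j)
    V = np.maximum(K - S, 0.0)

    # Backward induction through the tree.
    for n in range(steps - 1, -1, -1):
        j = np.arange(n + 1)
        S = S0 * u**j * d**(n - j)
        exercise = np.maximum(K - S, 0.0)
        cont = disc * (q * V[1:n + 2] + (1 - q) * V[:n + 1])
        # Dynkin-game value: holder exercises from below, writer cancels from above.
        V = np.clip(cont, exercise, exercise + penalty)
    return V[0]

print(game_put_crr(S0=100, K=100, r=0.03, sigma=0.2, T=1.0, steps=500, penalty=5.0))
```

Setting the penalty to a very large value makes the upper clip inactive and recovers the ordinary American put, which is a quick sanity check for the recursion.
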
328

Controlling the Properties of Homogeneous Epsilon Near Zero Materials and Their Switching Behavior

Mustafa Goksu Ozlu (12476655) 28 April 2022 (has links)
One of the longstanding goals of photonics research has been to obtain strong optical nonlinearities. A promising method to achieve this goal is to operate in the so-called epsilon near zero (ENZ) spectral regime, where the real part of the dielectric permittivity changes sign. If accompanied by low losses, this region provides a platform to achieve an extraordinarily high nonlinear response, along with many other interesting optical phenomena. In this work, some of the common all-optical switching structures employing homogeneous ENZ materials are investigated under varying conditions of frequency, incidence angle, and polarization. The optimum switching conditions are highlighted to pave the way toward the best experimental configurations in future studies. Moreover, the properties of some emerging novel plasmonic materials, such as aluminum-doped zinc oxide (AZO) and titanium nitride (TiN), are investigated, specifically for ENZ applications. Their thickness-dependent crystalline structure and carrier densities are employed as a method to control their optical properties. A near-perfect absorption scheme is demonstrated utilizing the Ferrell-Berreman mode occurring at the ENZ region of ultrathin AZO and TiN films. The ENZ frequency and the associated absorption peak of AZO are engineered through thickness dependence to cover most of the telecom range. This work covers the theoretical background for ENZ nonlinearities and looks into the materials aspect for better control of nonlinearities in experimental realizations.
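
As a side note on where the ENZ point comes from (not material taken from the thesis), the snippet below evaluates a simple Drude permittivity, eps(omega) = eps_inf - omega_p^2 / (omega^2 + i*gamma*omega), and locates the wavelength at which its real part crosses zero; eps_inf, the plasma frequency, and the damping rate are made-up, loosely AZO-like assumptions.

```python
import numpy as np

# Simple Drude model: eps(w) = eps_inf - wp**2 / (w**2 + 1j*gamma*w)
# Hypothetical, loosely AZO-like parameters -- assumptions, not measured values.
eps_inf = 3.8
wp = 2 * np.pi * 420e12       # plasma frequency [rad/s]
gamma = 2 * np.pi * 10e12     # damping rate [rad/s]
c = 2.998e8                   # speed of light [m/s]

wavelengths = np.linspace(1.0e-6, 2.0e-6, 5000)   # scan 1.0-2.0 um
omega = 2 * np.pi * c / wavelengths
eps = eps_inf - wp**2 / (omega**2 + 1j * gamma * omega)

# ENZ point: wavelength where Re(eps) crosses zero.
i = np.argmin(np.abs(eps.real))
print(f"ENZ wavelength ~ {wavelengths[i] * 1e9:.0f} nm, Im(eps) there = {eps[i].imag:.3f}")
```
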
329

Green stabilization of nanoscale zero-valent iron (nZVI) with rhamnolipids produced by agro-industrial waste : application on nitrate reduction

Moura, Cinthia Cristine de. January 2019 (has links)
Advisor: Jonas Contiero / Abstract: Environmental contamination caused by organic compounds is a major problem affecting soils and surface waters. To reduce or remove these pollutants, contaminated sites are usually treated with physical and chemical methods. However, most of these remediation techniques are expensive and commonly lead to incomplete removal and to the production of secondary wastes. Nanotechnology is the production and application of extremely small structures, with dimensions in the range of 1 to 100 nm, and nanoscale zero-valent iron represents a new generation of environmental remediation technologies: it is non-toxic, abundant, cheap, easy to produce, and its reduction process requires little maintenance. Nonetheless, in order to diminish its tendency to aggregate, nanoscale zero-valent iron is often coated with surfactants. Most surfactants are chemically synthesized from petrochemical sources and are only slowly or partially biodegradable; while they pose low risk to humans, such compounds can harm plants and animals. To decrease the use of chemical methods, green synthesis and stabilization of metallic nanomaterials offer a less hazardous option. Biosurfactants can potentially replace virtually any synthetic surfactant; they are extracellular compounds produced by microorganisms such as bacteria, grown on different carbon sources containing hydrophobic/hydrophilic substrates. Biosurfactants have a wide variety of chemical structures and surface properties, and among them is the ... (Complete abstract: click electronic access below) / Doctorate
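
For context on the application side, a commonly cited overall stoichiometry for nitrate reduction by zero-valent iron under acidic conditions is reproduced below; it is an illustrative textbook pathway, not necessarily the mechanism adopted in this work.

```latex
% Commonly cited overall reaction for nitrate reduction by zero-valent iron
% (acidic conditions, with ammonium as the main product); an illustrative
% pathway, not necessarily the one followed in this thesis.
\[
  4\,\mathrm{Fe}^{0} + \mathrm{NO}_{3}^{-} + 10\,\mathrm{H}^{+}
  \;\longrightarrow\;
  4\,\mathrm{Fe}^{2+} + \mathrm{NH}_{4}^{+} + 3\,\mathrm{H}_{2}\mathrm{O}
\]
```
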
330

Statistical Inferences on Inflated Data Based on Modified Empirical Likelihood

Stewart, Patrick 06 August 2020 (has links)
No description available.
