101

Resonant Frequency: Artefacts in Response to Time

Moulton, Clay Robert 13 November 2007 (has links)
This Industrial Design graduate thesis is a response to the discussion surrounding the question, "How can Design move from Green to Good?" Three artefacts have been designed. These artefacts respond to a context; in this body of work, that context is time. Time as context is about knowing the before-now and applying it to the now in order to positively affect the after-now. The artefacts respond to three distinct lifetimes: 5 minutes 45 seconds, 8 hours 45 minutes, and 10 years. The intent is to utilize a Natural system, time, in a manner beyond typical product life-cycle analysis. Also included is a series of essays that investigate and comment on issues and insights encountered during the Design process of this thesis. / Master of Science
102

Alternative Shared Kitchen

Akintade, Temitope January 2024 (has links)
This master’s thesis aims to contribute to the design and innovation management literature by exploring an alternative shared social kitchen that is people-driven, organized around shared interests, and promotes social well-being. The goal is to examine the potential of systemic, regenerative, speculative sustainable design in addressing challenges faced by shared kitchens, while understanding diverse user needs within enabling contexts. Through a design process of ethnographic research, frameworks, imperatives and solution finding, the kitchen was explored from different participants’ perspectives and contexts. The resulting scenarios from the ethnographic research highlighted technology, food waste, circularity and food system themes. The study contributes to future dialogue and understanding of the different processes and perspectives in imaginative shared kitchens using an explorative approach. It is vital to understand the holistic picture of the shared kitchen and not only focus on the functional, environmental and economic aspects, but also on social development for a more inclusive design.
103

Détection et conciliation d'erreurs intégrées dans un décodeur vidéo : utilisation des techniques d'analyse statistique / Error detection and concealment integrated in a video decoder : using techniques of statistical analysis

Ekobo Akoa, Brice 31 October 2014 (has links)
Ce manuscrit présente les travaux de recherche réalisés au cours de ma thèse, dont le but est de développer des algorithmes de correction d'erreurs dans un décodage numérique d'images et d'assurer un haut niveau de la qualité visuelle des images décodées. Nous avons utilisé des techniques d'analyse statistique pour détecter et dissimuler les artefacts. Une boucle de contrôle de la qualité est implémentée afin de surveiller et de corriger la qualité visuelle de l'image. Le manuscrit comprend six chapitres. Le premier chapitre présente les principales méthodes d'évaluation de la qualité des images trouvées dans l'état de l'art et introduit notre proposition. Cette proposition est en fait un outil de mesure de la qualité des vidéos (OMQV) qui utilise le système visuel humain pour indiquer la qualité visuelle d'une vidéo (ou d'une image). Trois modèles d'OMQV sont conçus. Ils sont basés sur la classification, les réseaux de neurones artificiels et la régression non linéaire, et sont développés dans le deuxième, troisième et quatrième chapitre respectivement. Le cinquième chapitre présente quelques techniques de dissimulation d'artefacts présents dans l'état de l'art. Le sixième et dernier chapitre utilise les résultats des quatre premiers chapitres pour mettre au point un algorithme de correction d'erreurs dans les images. La démonstration considère uniquement les artefacts flou et bruit et s'appuie sur le filtre de Wiener, optimisé sur le critère du minimum linéaire local de l'erreur quadratique moyenne. Les résultats sont présentés et discutés afin de montrer comment l'OMQV améliore les performances de l'algorithme mis en œuvre pour la dissimulation des artefacts. / This report presents the research conducted during my PhD, which aims to develop an efficient algorithm for correcting errors in a digital image decoding process and to ensure a high level of visual quality in the decoded images. Statistical analysis techniques are studied to detect and conceal the artefacts. A control loop is implemented for the monitoring of image visual quality. The manuscript consists of six chapters. The first chapter presents the principal state-of-the-art image quality assessment methods and introduces our proposal. This proposal consists of a video quality measurement tool (VQMT) that uses the Human Visual System to indicate the visual quality of a video (or an image). Three statistical learning models of the VQMT are designed. They are based on classification, artificial neural networks and non-linear regression, and are developed in the second, third and fourth chapters, respectively. The fifth chapter presents the principal state-of-the-art image error concealment techniques. The sixth and final chapter uses the results of the four former chapters to design an algorithm for error concealment in images. The demonstration considers blur and noise artefacts and is based on the Wiener filter, optimized on the criterion of local linear minimum mean square error. The results are presented and discussed to show how the VQMT improves the performance of the implemented error concealment algorithm.
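For readers unfamiliar with the locally adaptive Wiener filtering mentioned in this abstract, the following is a minimal sketch of local linear minimum mean square error denoising. It is an illustration only, not the thesis's implementation; the window size, noise-variance estimate and synthetic test image are placeholder assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_wiener_denoise(image, window=5, noise_var=None):
    """Locally adaptive Wiener (local linear MMSE) filter for additive noise.

    Each pixel is shrunk toward its local mean in proportion to how much the
    local variance exceeds the estimated noise variance.
    """
    img = image.astype(np.float64)
    local_mean = uniform_filter(img, size=window)
    local_sq_mean = uniform_filter(img * img, size=window)
    local_var = np.maximum(local_sq_mean - local_mean ** 2, 0.0)

    # If no noise variance is supplied, use the median local variance as a rough estimate.
    if noise_var is None:
        noise_var = np.median(local_var)

    # Wiener gain: near 0 in flat regions (mostly noise), close to 1 on strong structure.
    gain = np.maximum(local_var - noise_var, 0.0) / np.maximum(local_var, 1e-12)
    return local_mean + gain * (img - local_mean)

# Example: denoise a synthetic noisy frame.
rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0, 255, 64), (64, 1))
noisy = clean + rng.normal(0, 10, clean.shape)
denoised = local_wiener_denoise(noisy, window=5)
```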
104

Activity Space in a Terminal Classic Maya Household, Xuenkal, Yucatan, Mexico

Coakley, Corrine 29 July 2014 (has links)
No description available.
105

Constructs: Truth, Lies, and Humanity

Hansen, Nathan 30 April 2010 (has links)
This thesis is a discussion of my ideas, struggles and outcomes experienced during the making of my two bodies of work, Devices and Relics. These two bodies of work explore fleeting moments and the intrinsic values of labor and imagination, with reference to sedentary living and labor in contemporary American society.
106

Wearable Forehead Pulse Oximetry: Minimization of Motion and Pressure Artifacts

Dresher, Russell Paul 03 May 2006 (has links)
Although steady progress has been made towards the development of a wearable pulse oximeter to aid in remote physiological status monitoring (RPSM) and triage operations, the ability to extract accurate physiological data from a forehead pulse oximeter during extended periods of activity and in the presence of pressure disturbances acting on the sensor remains a significant challenge. This research was undertaken to assess whether the attachment method used to secure a pulse oximeter sensor affects arterial oxygen saturation (SpO2) and heart rate (HR) accuracy during motion. Additionally, two sensor housings were prototyped to assess whether isolating the sensor from external pressure disturbances could improve SpO2 and HR accuracy. The research revealed that measurement accuracy during walking is significantly affected by the choice of attachment method. Specifically, the research indicated that an elastic band providing a contact pressure of 60 mmHg can result in decreased measurement error and improved reliability. Furthermore, the research showed that the two isolating housings we investigated significantly reduce SpO2 and HR errors at pressures as high as 1200 mmHg (160 kPa) compared with current commercial housings. This information may be helpful in the design of a more robust pulse oximeter sensor for use in RPSM.
107

Um artefato para execução da estratégia em uma empresa varejista / An artifact for strategy execution in a retail company

Voos, Jerri Sidnei 30 May 2017 (has links)
A execução da estratégia ainda contempla um dos principais desafios existentes nas organizações - em especial, para redes varejistas - em razão dos pontos de vendas serem distantes da matriz e também pelas constantes mudanças que atingem a forma como o cliente consome. Esta realidade do setor gera necessidade de grande alinhamento organizacional, uma vez que a execução ocorre distante do centro administrativo, o que dificulta a presente identificação de gaps que possam vir a ocorrer. A complexidade na execução da estratégia se deve principalmente ao fato de não ser possível promover uma estratégia corporativa por meio de uma pessoa ou de um departamento, ou seja, uma execução de sucesso consiste em um grande alinhamento organizacional. Nessa perspectiva, o presente trabalho teve como objetivo propor um artefato: por artefato entende-se a organização de recursos internos para alcançar objetivos externos. Nesta proposta, o artefato é composto por alavancas estruturais e processos, que visam contribuir para elevar as chances de sucesso na execução da estratégia em redes varejistas. Para tanto, partiu-se da seguinte problemática: quais aspectos estruturais e processos podem contribuir para elevar as chances de sucesso na execução da estratégia? Foi objeto da pesquisa uma empresa varejista. Neste ambiente, entrevistas foram aplicadas ao diretor, aos supervisores, ao gerente de logística e aos gerentes comerciais. Como resultado desta etapa, conheceu-se a estrutura e os processos atuais da empresa relacionados à execução da estratégia, identificando-se os gaps entre as recomendações da literatura e a realidade diagnosticada. De posse dessas informações, elaborou-se um artefato customizado - um ciclo sistêmico - voltado para a execução da estratégia, considerando-se as recomendações da literatura e a realidade da empresa, visando assim à superação dos gaps detectados no ambiente empresarial. / Executing strategy remains one of the main challenges in organizations - especially retail chains - because the points of sale are distant from the head office and because of the constant changes that affect the way customers consume. This reality generates the need for strong organizational alignment, since execution occurs far from the administrative center, which makes it difficult to promptly identify gaps as they occur. The complexity of strategy execution is mainly due to the fact that it is not possible to carry out a corporate strategy through a single person or department; that is, successful execution depends on broad organizational alignment. From this perspective, the present work aimed to propose an artifact: an artifact is understood here as the organization of internal resources to achieve external objectives. In this proposal, the artifact is composed of structural levers and processes, which aim to increase the chances of successfully executing the strategy in retail chains. To that end, we started from the following problem: which structural aspects and processes can help increase the chances of success in executing the strategy? The object of the research was a retail company. In this environment, interviews were conducted with the director, the supervisors, the logistics manager and the commercial managers. As a result of this stage, the company's current structure and processes related to strategy execution were mapped, and the gaps between the recommendations of the literature and the diagnosed reality were identified. With this information, a customized artifact - a systemic cycle - oriented toward strategy execution was developed, taking into account the recommendations of the literature and the reality of the company, in order to overcome the gaps detected in the business environment.
108

Avaliação do comportamento de seladores endodônticos e a influência destes materiais sobre o diagnóstico de fraturas radiculares por meio de tomografia computadorizada por feixe cônico / Endodontic fillings behavior evaluation and their influence on root fracture diagnosis by means of cone beam computed tomography

Salineiro, Fernanda Cristina Sales 20 April 2018 (has links)
A tomografia computadorizada por feixe cônico (TCFC) tem apresentado melhores resultados em relação à radiografia convencional no diagnóstico da região de cabeça e pescoço. Porém a presença de artefatos nas imagens tomográficas pode interferir na acurácia do diagnóstico. O presente estudo teve como objetivo avaliar a influência de diferentes tipos de preenchimentos endodônticos na formação de artefato nas imagens de TCFC e avaliar a influência destes no diagnóstico de fratura radicular. Foram utilizados 200 dentes (100 incisivos centrais superiores e 100 pré-molares inferiores) subdivididos em 10 grupos diferenciando o tipo de preenchimento endodôntico. Os dentes foram escaneados utilizando um tomógrafo computadorizado por feixe cônico. Inicialmente foi realizada uma medição nas imagens tomográficas com a ferramenta de informação sobre a área (ROI) e avaliação subjetiva das imagens para realizar uma comparação entre os cimentos endodônticos. Após essa fase, 100 dentes foram fraturados para realizar a última análise que foi o diagnóstico de fratura radicular. Para análise dos dados da ferramenta ROI foram utilizados a análise de Kruskal-Wallis e o teste post hoc Dunn, para comparar os dados subjetivos foi utilizado o teste de Kruskal-Wallis e para os dados de diagnóstico de fratura radicular foi calculada a curva de característica de operação do receptor (ROC) e o teste ANOVA e o teste post hoc Tukey para os valores de área sob a curva (ASC). Resultados: O terço médio foi a região que gerou mais variações na escala de cinza para ambos os dentes. O Pulp Canal Sealer e o Sealer 26 foram os cimentos que mais alteraram as imagens segundo a análise subjetiva. Os valores de ASC variaram de 0.530 a 0.745. Conclusão: Todos os materiais produziram artefato nas imagens por TCFC, dificultando o diagnóstico de fratura radicular. / Cone beam computed tomography (CBCT) has shown better results than conventional radiography in head and neck diagnosis. However, the presence of artifacts in CBCT images can interfere with diagnostic accuracy. The aim of this study was to evaluate the influence of different types of endodontic fillings on artifact formation in CBCT images and to analyze the influence of these materials on root fracture diagnosis. Two hundred teeth (100 maxillary central incisors and 100 mandibular premolars) were divided into 10 groups according to the type of endodontic filling. All teeth were scanned using a CBCT device. Initially, a measurement was performed on the CBCT images using the region of interest (ROI) tool and a subjective analysis of the images was carried out to compare the different endodontic fillings. After this phase, 100 teeth were fractured to perform the last analysis: root fracture diagnosis. Kruskal-Wallis analysis with the post hoc Dunn test was used for the ROI data, the Kruskal-Wallis test was used for the subjective assessment, and receiver operating characteristic (ROC) curves were used for the root fracture diagnosis analysis, with ANOVA and the post hoc Tukey test applied to the area under the curve (AUC) values. The middle third was the region that produced the most grayscale variation for both tooth types. In the subjective analysis, Pulp Canal Sealer and Sealer 26 were the sealers that most altered the images. AUC values ranged from 0.530 to 0.745. Conclusion: All materials produced artifacts in CBCT images, making root fracture diagnosis more difficult.
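The statistical workflow described in this abstract (Kruskal-Wallis tests on ROI grayscale data, ROC curves and AUC for diagnostic accuracy) can be sketched as follows. This is an illustrative outline only, not the thesis's analysis script; the measurement arrays, group labels and observer scores are placeholder values.

```python
import numpy as np
from scipy.stats import kruskal
from sklearn.metrics import roc_auc_score, roc_curve

# Placeholder grayscale (ROI) measurements for three sealer groups.
group_a = np.array([1210, 1175, 1302, 1250, 1198])
group_b = np.array([980, 1010, 995, 1023, 1001])
group_c = np.array([1405, 1389, 1420, 1398, 1411])

# Kruskal-Wallis: do the groups differ in grayscale values?
h_stat, p_value = kruskal(group_a, group_b, group_c)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.4f}")

# Placeholder fracture diagnosis data: 1 = fractured, 0 = intact,
# with observer confidence scores on a 1-5 scale.
truth = np.array([1, 1, 1, 0, 0, 1, 0, 0, 1, 0])
scores = np.array([5, 4, 3, 2, 1, 4, 3, 1, 5, 2])

# ROC curve and area under the curve for diagnostic accuracy.
fpr, tpr, thresholds = roc_curve(truth, scores)
auc = roc_auc_score(truth, scores)
print(f"AUC = {auc:.3f}")
```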
109

Proposition d'un modèle pour la représentation de contexte d'exécution de simulations informatiques à des fins de reproductibilité / Proposing a representation model of computational simulations’ execution context for reproducibility purposes

Congo, Faïçal Yannick Palingwendé 19 December 2018 (has links)
La reproductibilité en informatique est un concept incontournable au 21ème siècle. Les évolutions matérielles des calculateurs font que le concept de reproductibilité connaît un intérêt croissant au sein de la communauté scientifique. Pour les experts en simulation, ce concept est indissociable de celui de vérification, de confirmation et de validation, que ce soit pour la crédibilité des résultats de recherches ou pour l’établissement de nouvelles connaissances. La reproductibilité est un domaine très vaste. Dans le secteur computationnel et numérique, nous nous attacherons, d’une part, à la vérification de la provenance et de la consistance des données de recherches. D’autre part, nous nous intéressons à la détermination précise des paramètres des systèmes d’exploitation, des options de compilation et de paramétrage des modèles de simulation permettant l’obtention de résultats fiables et reproductibles sur des architectures modernes de calcul. Pour qu’un programme puisse être reproduit de manière consistante, il faut un certain nombre d’informations de base. On peut citer entre autres le système d’exploitation, l’environnement de virtualisation, les diverses librairies utilisées ainsi que leurs versions, les ressources matérielles utilisées (CPU, GPU, accélérateurs de calcul multi-coeurs tel que le précédent Intel Xeon Phi, mémoires, ...), le niveau de parallélisme et éventuellement les identifiants des threads, le statut du ou des générateurs pseudo-aléatoires et le matériel auxquels ils accèdent, etc. Dans un contexte de calcul scientifique, même évident, il n’est actuellement pas possible d’avoir de manière cohérente toutes ces informations du fait de l’absence d’un modèle standard commun permettant de définir ce que nous appellerons ici contexte d'exécution. Un programme de simulation s'exécutant sur un ordinateur ou sur un noeud de calcul, que ce soit un noeud de ferme de calcul (cluster), un noeud de grille de calcul ou de supercalculateur, possède un état et un contexte d'exécution qui lui sont propres. Le contexte d'exécution doit être suffisamment complet pour qu’à partir de celui-ci, hypothétiquement, l'exécution d’un programme puisse être faite de telle sorte que l’on puisse converger au mieux vers un contexte d’exécution identique à l’original dans une certaine mesure. Cela, en prenant en compte l'architecture de l’environnement d’exécution ainsi que le mode d'exécution du programme. Nous nous efforçons, dans ce travail, de faciliter l'accès aux méthodes de reproductibilité et de fournir une méthode qui permettra d’atteindre une reproductibilité numérique au sens strict. En effet, de manière plus précise, notre aventure s’articule autour de trois aspects majeurs. Le premier aspect englobe les efforts de collaboration, qui favorisent l'éveil des consciences vis à vis du problème de la reproductibilité, et qui aident à implémenter des méthodes pour améliorer la reproductibilité dans les projets de recherche. Le deuxième aspect se focalise sur la recherche d’un modèle unifiant de contexte d'exécution et un mécanisme de fédération d’outils supportant la reproductibilité derrière une plateforme web pour une accessibilité mondiale. Aussi, nous veillons à l’application de ce deuxième aspect sur des projets de recherche. Finalement, le troisième aspect se focalise sur une approche qui garantit une reproductibilité numérique exacte des résultats de recherche. / Computational reproducibility is an unavoidable concept in the 21st century. Computer hardware evolutions have driven a growing interest in the concept of reproducibility within the scientific community. Simulation experts stress that this concept is strongly correlated with those of verification, confirmation and validation, whether for the credibility of research results or for the establishment of new knowledge. Reproducibility is a very large domain. Within the area of numerical and computational science, we aim to ensure the verification of research data provenance and integrity. Furthermore, we focus on the precise identification of operating system parameters, compilation options and simulation model parameterization, with the goal of obtaining reliable and reproducible results on modern computer architectures. To be able to consistently reproduce a piece of software, some basic information must be collected. Among those we can cite the operating system, the virtualization environment, the software packages used with their versions, the hardware used (CPU, GPU, many-core architectures such as the former Intel Xeon Phi, memory, ...), the level of parallelism and possibly the thread identifiers, the status of pseudo-random number generators, etc. In the context of scientific computing, even though this seems obvious, it is currently not possible to consistently gather all this information due to the lack of a common model and standard defining what we call here the execution context. A scientific software program that runs on a computer or a computing node, be it a cluster node, a grid node or a supercomputer, possesses a unique state and execution context. The information gathered about the latter must be complete enough that it can hypothetically be used to reconstruct an execution context that will at best be identical to the original, while considering the execution environment and the execution mode of the software. Our effort during this journey can be summarized as seeking an optimal way both to ease genuine access to reproducibility methods for scientists and to deliver a method that provides strict scientific numerical reproducibility. Our journey is laid out around three aspects. The first aspect involves collaborative efforts, either to raise awareness or to implement approaches for better reproducibility in research projects. The second aspect focuses on delivering a unifying execution context model and a mechanism to federate existing reproducibility tools behind a web platform for worldwide access; we also investigate applying the outcome of this aspect to research projects. Finally, the third aspect focuses on completing the previous one with an approach that guarantees exact numerical reproducibility of research results.
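As a rough illustration of what an execution context record might contain, the following sketch collects some of the items listed above (operating system, hardware, package versions, pseudo-random generator state) into a JSON document. The field names and the use of Python's standard library are assumptions made for illustration, not the model proposed in the thesis.

```python
import json
import platform
import random
import sys
from importlib import metadata

def capture_execution_context():
    """Collect a minimal snapshot of the environment a run executes in."""
    context = {
        "operating_system": platform.platform(),
        "machine": platform.machine(),
        "processor": platform.processor(),
        "python_version": sys.version,
        # Installed packages and their versions.
        "packages": {
            dist.metadata["Name"]: dist.version
            for dist in metadata.distributions()
        },
        # State of the standard-library pseudo-random generator (illustrative only).
        "prng_state_hash": hash(random.getstate()),
    }
    return context

if __name__ == "__main__":
    print(json.dumps(capture_execution_context(), indent=2, default=str))
```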
110

Feature selection and artifact removal in sleep stage classification

Hapuarachchi, Pasan January 2006 (has links)
The use of electroencephalograms (EEG) is essential to the analysis of sleep disorders in patients. With the use of electroencephalograms, electro-oculograms (EOG), and electromyograms (EMG), doctors and EEG technicians can draw conclusions about the sleep patterns of patients. In particular, the classification of the sleep data into various stages, such as NREM I-IV, REM, and Awake, is extremely important. The EEG signal itself is highly sensitive to physiological and non-physiological artifacts. Trained human experts can accommodate these artifacts while they analyze the EEG signal.

However, if some of these artifacts are removed prior to analysis, their job becomes easier. Furthermore, one of the biggest motivations of our team's research is the construction of a portable device that can analyze the sleep data as they are being collected. For this task, the sleep data must be analyzed completely automatically in order to make the classifications.

The research presented in this thesis concerns itself with the denoising and the feature selection aspects of the team's goals. Since humans are able to process artifacts and ignore them prior to classification, an automated system should have the same capabilities, or close to them. As such, the denoising step is performed to condition the data prior to any other stage of sleep stage classification. As mentioned before, the denoising step, by itself, is useful to human EEG technicians as well.

The denoising step in this research mainly addresses EOG artifacts and artifacts isolated to a single EEG channel, such as electrode pop artifacts. The first two algorithms use wavelets exclusively (BWDA and WDA), while the third algorithm is a mixture of wavelets and Independent Component Analysis (IDA). With the BWDA algorithm, determining consistent thresholds proved to be a difficult task. With the WDA algorithm, the performance was better, since the selection of the thresholds was more straightforward and since there was more control over defining the duration of the artifacts. The IDA algorithm performed worse than the WDA algorithm. This could have been due to the small number of measurement channels or to the automated sub-classifier used to select the denoised EEG signal from the set of ICA-demixed signals.

The feature selection stage is extremely important, as it selects the most pertinent features for a particular classification. Without such a step, the classifier has to process useless data, which might result in poorer classification. Furthermore, unnecessary features take up valuable computer cycles. In a portable device, due to battery consumption, wasting computer cycles is not an option. The research presented in this thesis shows the importance of a systematic feature selection step in EEG classification. The feature selection step produced excellent results using a maximum of just 5 features. During automated classification, this is extremely important, as the automated classifier only has to calculate 5 features for each given epoch.
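The wavelet-based denoising this abstract refers to can be sketched roughly as follows, assuming the PyWavelets library and a single-channel EEG segment stored as a NumPy array. The wavelet family, decomposition level and threshold rule are illustrative choices, not the parameters used in the thesis's BWDA/WDA algorithms.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(signal, wavelet="db4", level=5):
    """Denoise a 1-D EEG segment by soft-thresholding its wavelet detail coefficients."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)

    # Estimate the noise level from the finest detail coefficients (median absolute deviation).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    # Universal threshold (Donoho-Johnstone rule).
    threshold = sigma * np.sqrt(2.0 * np.log(len(signal)))

    # Keep the approximation coefficients, soft-threshold the detail coefficients.
    denoised_coeffs = [coeffs[0]] + [
        pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]
    ]
    return pywt.waverec(denoised_coeffs, wavelet)[: len(signal)]

# Example: a noisy 2-second segment sampled at 256 Hz.
fs = 256
t = np.arange(0, 2, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(len(t))
cleaned = wavelet_denoise(eeg)
```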
