41

Středověká šlechtická sídla v severovýchodních Čechách (Archeologie hradu Kumburku) / Medieval aristocratic residences in the north-eastern Czech Republic (Archaeology of The Kumburk Castle)

WITKOWSKI, Martin January 2012
This thesis is divided into two related parts. The first part deals with the issue of castle buildings in a specified area of the north-east Czech Republic; its aim is to offer an overview of the development and typology of the individual residences, and it then introduces a model of the structure and layout of castle buildings within the specified area. The second part focuses on the material culture of Kumburk Castle, situated in the investigated area. Its purpose is to analyse, synthesise and interpret the non-stratified finds obtained by field walking in the castle area. On the basis of a comparative study of the scholarly literature and of written and iconographic evidence, the author attempts to place the results in the historical context of the castle. By processing these so far neglected archaeological finds, the author also attempts to reconstruct everyday life in the castle from the 14th to the 16th century.
42

Development towards a focus variation based micro-co-ordinate measuring machine

Hiersemenzel, Florine January 2014
The increasing number of small and fragile parts manufactured using micromachining technology has raised the demand for co-ordinate measuring machines (CMMs) that can measure on a micro- and millimetric scale without contacting the part, thus avoiding damage to its surface. These instruments are expected to achieve measurement uncertainties in the nanometre range. A number of non-contact surface measurement techniques exist, such as focus variation (FV), which can perform measurements on the micro- and millimetric scale in a short amount of time. Such instruments may have the potential to be implemented in a non-contact micro-CMM platform.
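As a rough illustration of the focus variation principle the abstract refers to, the sketch below estimates a height map from a z-stack of images by taking, at each pixel, the focal position that maximises a local contrast measure. It is a minimal sketch under stated assumptions: local variance stands in for the focus measure, and the function and parameter names are hypothetical; commercial FV instruments use considerably more refined measures.

```python
# Illustrative sketch only: focus variation height reconstruction from a
# pre-acquired z-stack of grayscale images. Names and the choice of local
# variance as the focus measure are assumptions for this example.
import numpy as np
from scipy.ndimage import uniform_filter

def focus_variation_heights(stack, z_positions, window=5):
    """Estimate a height map from an image z-stack.

    stack: array of shape (n_z, rows, cols), one image per focal position.
    z_positions: array of n_z focal heights (e.g. in micrometres).
    """
    stack = np.asarray(stack, dtype=float)
    focus = np.empty_like(stack)
    for i, img in enumerate(stack):
        mean = uniform_filter(img, window)
        mean_sq = uniform_filter(img * img, window)
        focus[i] = mean_sq - mean * mean  # local variance as the focus measure
    # Each pixel's height is the focal position with maximum local contrast.
    return np.asarray(z_positions)[np.argmax(focus, axis=0)]
```

On a real instrument the stack is far denser and the maximum is interpolated between focal steps, which is what pushes the repeatability toward the nanometre range.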
43

Identification et réduction de l’artefact métallique en tomographie à rayons X / Metal artifact identification and reduction in X-ray computed tomography

Frederique, Louis 13 February 2017
In the field of non-destructive testing of materials, X-ray computed tomography has become a means of verifying the validity and quality of industrial production parts. TomoAdour, a private service provider in 3D digitisation and industrial X-ray tomography, uses X-ray acquisition technologies such as medical and industrial CT scanners to inspect and analyse its customers' products. The nature of the scanned objects (mostly very dense and of large dimensions) introduces defects into the images produced by these scanners, which can make the images difficult or even impossible to analyse. Of particular note is the metal artifact, which is due to the presence of very dense materials in the scanned object and results from the strong attenuation of the rays in the metal and/or their scattering in all directions. The reconstructed data are therefore often hard to interpret directly, because these artifacts can mask important information. Many methods have been developed in recent years, and the problem of metal artifact reduction has been studied extensively, yielding a good number of solutions. However, the proposed algorithms address only medical applications and thus do not take into account the physical limitations specific to industrial samples. Moreover, these methods all base their processing on raw data, that is, data taken from the scanner before the reconstruction step (sets of radiographs or projections). In our case, only the reconstructed data (tomograms, i.e. volumes representing the object) are available, and there is currently no metal artifact reduction approach based on these reconstructions. The goal of this research is therefore, first, to propose a tool for identifying and quantifying the metal artifact in reconstructed data so that the data can be corrected, and second, to propose a set of approaches suited to the analysis of materials in an industrial context [1; 2].
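As a loose illustration of a first step such a tool might take, the sketch below flags candidate metal regions in a single reconstructed slice by intensity thresholding and connected-component filtering. It is a minimal sketch under assumed units and threshold values, not the method developed in the thesis.

```python
# Illustrative sketch only: locating probable metal regions in one
# reconstructed CT slice. The threshold value, minimum component size and
# all names are assumptions made for this example.
import numpy as np
from scipy import ndimage

def metal_mask(slice_2d, threshold=3000.0, min_size=10):
    """Return a boolean mask of probable metal voxels in a reconstructed slice.

    slice_2d: 2D array of reconstructed attenuation values (HU-like units).
    threshold: attenuation above which a voxel is treated as metal.
    min_size: discard connected components smaller than this (noise).
    """
    mask = slice_2d > threshold
    labels, n = ndimage.label(mask)                      # connected components
    sizes = ndimage.sum(mask, labels, range(1, n + 1))   # voxels per component
    keep = np.isin(labels, 1 + np.flatnonzero(sizes >= min_size))
    # Dilate slightly so the mask also covers the bright rim around the metal.
    return ndimage.binary_dilation(keep, iterations=2)
```

The resulting mask is the kind of spatial prior on which artifact quantification and correction in the reconstructed volume can then be built.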
44

Opacité des artefacts d'un système Workflow / Opacity of artifacts in a workflow system

Diouf, Mohamadou Lamine 10 October 2014
A property of an object is said to be opaque to an observer if the observer cannot deduce that the property holds from his or her observation of the object. Suppose a number of properties (called secrets) are attached to each observer of a system; the system itself is then said to be opaque if every secret is opaque to the corresponding observer, that is, no observer can uncover any of the secrets attached to him or her. Opacity has previously been studied in the context of discrete event systems, where various sets of assumptions were identified under which opacity is decidable, and where techniques from control theory were designed to diagnose and/or enforce it. This thesis is the first contribution to the problem of the opacity of artifacts in data-centric workflow systems. We therefore formalise this problem, identify the assumptions that must be placed on such systems for opacity to be decidable, and indicate some techniques for enforcing opacity.
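The standard notion of current-state opacity for finite systems can be made concrete with a short sketch: the observer's knowledge is tracked as a belief set over states (a subset construction driven by the observable events), and the secret stays opaque as long as no reachable belief set lies entirely inside the secret states. The encoding below is a minimal, hypothetical one for finite transition systems, not the artifact-based formalisation developed in the thesis.

```python
# Illustrative sketch only: current-state opacity of a finite transition
# system. All data structures are hypothetical encodings for this example.
from collections import deque

def unobservable_closure(states, trans, observable):
    """All states reachable from `states` via unobservable events only."""
    stack, seen = list(states), set(states)
    while stack:
        s = stack.pop()
        for (src, ev), dsts in trans.items():
            if src == s and ev not in observable:
                for d in dsts:
                    if d not in seen:
                        seen.add(d)
                        stack.append(d)
    return frozenset(seen)

def is_opaque(initial, trans, observable, secret):
    """Secret is opaque iff no observation pins the system inside `secret`.

    trans: dict mapping (state, event) -> set of successor states.
    observable: set of events the observer can see; secret: set of states.
    """
    start = unobservable_closure({initial}, trans, observable)
    queue, visited = deque([start]), {start}
    while queue:
        belief = queue.popleft()
        if belief <= secret:          # observer is certain: secret revealed
            return False
        for ev in observable:         # advance the belief by one observation
            step = {d for s in belief for d in trans.get((s, ev), ())}
            if step:
                nxt = unobservable_closure(step, trans, observable)
                if nxt not in visited:
                    visited.add(nxt)
                    queue.append(nxt)
    return True
```

The subset construction is simply the determinisation of the system as seen through the observer's projection, which is why decidability hinges on the state space being finite, one of the assumptions the thesis has to secure for workflow artifacts.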
45

Metodika testování webových aplikací / Methodology of Testing Web Applications

Šplíchalová, Marcela January 2008
The principal aim of this thesis is to create a unified methodological framework for a smaller software testing department. A further aim is to define and describe an important element of testing, the software defect, to define the way it is reported, and with this in view to identify the troubleshooting areas of web applications. The last aim is to find a solution for how to publish this methodology. These aims were reached by studying the available theoretical findings and applying principles known from well-established and proven methodologies that comprehensively cover software development. These principles were confronted with the author's practical experience, and on this basis the methodology described above was created. The contribution of the thesis lies in the inner structure of the methodology, the summarisation of the most important information, the application of practical personal experience, and the adaptation of some elements of the methodology for use in a small team. Other strong points are the proposals and recommendations on how to improve the situation in the testing department of a particular company, how to publish the methodology, and how to maintain it in the future. The thesis is composed of three main parts. The first chapter presents the essential characteristics of testing, models of the software development life cycle, and the kinds and levels of tests. The second chapter is the crucial part of the thesis. It describes the whole methodology: the main workflow and its details (processes), the activities performed during these processes, the roles occurring in the methodology and their responsibilities (for activities and artefacts), the artefacts produced by the testing department, a full description of a defect and its reporting, and finally a summary of the defects that appear in the web application environment. The last chapter deals with putting the methodology into operation: the current technical coverage of particular parts of the methodology, suggestions for improving testing in the future, and options for publishing the methodology.
46

The democratisation of decision-makers in data-driven decision-making in a Big Data environment: The case of a financial services organisation in South Africa

Hassa, Ishmael January 2020
Big Data refers to large unstructured datasets from multiple dissimilar sources. Using Big Data Analytics (BDA), insights can be gained that cannot be obtained by other means, allowing better decision-making. Big Data is disruptive, and because it is vast and complex, it is difficult to manage from technological, regulatory, and social perspectives. Big Data can provide decision-makers (knowledge workers) with bottom-up access to information for decision-making, thus providing potential benefits due to the democratisation of decision-makers in data-driven decision-making (DDD). The workforce is enabled to make better decisions, thereby improving participation and productivity. Enterprises that enable DDD are more successful than firms that depend solely on management's perception and intuition. Understanding the links between the key concepts (Big Data, democratisation, and DDD) and decision-makers is important, because the use of Big Data is growing, the workforce is continually evolving, and effective decision-making based on Big Data insights is critical to a firm's competitiveness. This research investigates the influence of Big Data on the democratisation of decision-makers in data-driven decision-making. A Grounded Theory Method (GTM) was adopted due to the scarcity of literature on the interrelationships between the key concepts. An empirical study was undertaken, based on a case study of a large and leading financial services organisation in South Africa. The case study participants were diverse and represented three different departments. GTM facilitates the emergence of novel theory that is grounded in empirical data. Theoretical elaboration of new concepts against the existing literature permits comparison of the emergent or substantive theory for similarities, differences, and uniqueness. By applying the GTM principles of constant comparison, theoretical sampling, and emergence, decision-makers (people, knowledge workers) became the focal point of study rather than organisational decision-making processes or decision support systems. The focus of the thesis is therefore on the democratisation of decision-makers in a Big Data environment. The findings suggest that the influence of Big Data on the democratisation of the decision-maker in relation to DDD depends on the completeness and quality of the Information Systems (IS) artefact. The IS artefact results from, and is comprised of, information extracted from Big Data through BDA and decision-making indicators (DMI). DMI are contributions of valuable decision-making parameters by actors that include Big Data, people, the organisation, and organisational structures. DMI is an aspect of knowledge management, as it contains both the story behind the decision and the knowledge that was used to decide. The IS artefact is intended to provide a better and more complete picture of the decision-making landscape, which adds to the confidence of decision-makers and promotes participation in DDD, which in turn exemplifies the democratisation of the decision-maker. The main theoretical contribution is therefore that the democratisation of the decision-maker in DDD is based on the completeness of the IS artefact, which is assessed at the democratisation inflection point (DIP), the point at which the decision-maker evaluates the IS artefact.
When the IS artefact is complete, meaning that all the parameters pertinent to a decision on specific information are available, democratisation of the decision-maker is realised. When the IS artefact is incomplete, meaning that some of those parameters are unavailable, democratisation of the decision-maker breaks down. The research contributes new knowledge, in the form of a substantive theory grounded in empirical findings, to the academic field of IS. The IS artefact constitutes a contribution to practice: it highlights the importance of the interrelationships and contributions of DMI by actors within an organisation, based on information extracted through BDA, which promote decision-maker confidence and participation in DDD. DMI within the IS artefact are critical to decision-making, and their absence has implications for the democratisation of the decision-maker in DDD. The study has uncovered the need to investigate further the extent of each actor's contribution (agency) to DMI, the implications of generational characteristics for the adoption and use of Big Data, and an in-depth understanding of the relationships between individual differences, Big Data, and decision-making. Research is also recommended to better explain democratisation as it relates to data-driven decision-making processes.
47

Shaping proactivity: Designing interactions with proactive ambient artefacts

Zomer, Manuel January 2021
This thesis explores how to shape proactive interactions between ambient artefacts and humans, where ambient artefacts are represented by smart personal assistants and interactive tabletop robots. The research is explorative in nature, relying on a critical and speculative design approach. Key aspects of proactivity, and their influence in shaping it, were identified. Insights into different means of feedback, namely coloured light, motion, and sound, are discussed in connection with the spatial placement of a proactive artefact and the user experience it provides. The notion of gaining versus losing control to an artefact is presented, along with a reflection on the concept of annoyance as a design material. A proactive system featuring a collection of physical and digital interactive artefacts was created as part of this research. The insights and conclusions are based on interviews conducted after user-testing those artefacts.
48

Blockchain technology within the Swedish healthcare sector

Szilagyi, Kristoffer, Glennfalk, Carl January 2018
Sweden is one of the most prominently digitised countries in the European Union, but some sectors have fallen behind in the digitisation process, one of them being healthcare. Healthcare is one of the most information-intensive sectors in Swedish society, and it is critical that its IT systems are integrated and able to communicate with one another, i.e. interoperable. Today's healthcare IT systems fail precisely in terms of interoperability, and healthcare as an organisation also lacks any uniform standard for how care information is to be documented. These deficiencies lead to impaired quality of care for patients and a worse working environment for healthcare professionals. The purpose of this study is to develop an artefact for how the properties of blockchain technology can be used to improve interoperability in the Swedish healthcare systems. The study was conducted using a design science method: we developed a model based on the properties of the blockchain and on problems identified in interviews with six people who work with IT in healthcare. Our findings show that the blockchain has properties that can support interoperability in healthcare. They also show that achieving interoperability requires a balance between security and flexibility, as well as some form of unified standard, at either national or regional level, for how healthcare data is to be documented.
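One blockchain property such a model draws on, tamper evidence through hash chaining, can be illustrated with a minimal sketch. The record fields and function names below are invented for the example and do not represent the artefact developed in the study.

```python
# Illustrative sketch only: a minimal hash-linked record chain. Altering any
# record invalidates the hashes of every later block, making shared records
# tamper-evident. Fields and names are hypothetical.
import hashlib
import json

def chain_records(records):
    """Link records so that altering any one invalidates all later hashes."""
    chain, prev_hash = [], "0" * 64
    for record in records:
        payload = json.dumps(record, sort_keys=True)
        block_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        chain.append({"record": record, "prev": prev_hash, "hash": block_hash})
        prev_hash = block_hash
    return chain

def verify(chain):
    """Recompute every hash; returns False if any record was tampered with."""
    prev_hash = "0" * 64
    for block in chain:
        payload = json.dumps(block["record"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        if block["prev"] != prev_hash or block["hash"] != expected:
            return False
        prev_hash = block["hash"]
    return True
```

Running verify after editing any single record returns False; it is this auditability across organisational boundaries, rather than any cryptocurrency aspect, that makes the technique relevant to interoperable health records.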
49

Vêtement, corps, musée : l'objet-sujet ou le patrimoine incarné / Clothing, body, museum: the object-subject, or embodied heritage

Dubuc, Élise January 2002
Thesis digitised by the Direction des bibliothèques de l'Université de Montréal.
50

Raman spectroscopic and structural studies of indigo and its four 6,6'-dihalogeno analogues

Bowen, Richard D., Edwards, Howell G.M., Jorge Villar, Susana E., Karapanayiotis, Thanassis January 2004
The Raman and electron impact mass spectra of synthetic indigo and its four 6,6'-dihalogeno analogues are reported and discussed. The influence of varying the halogen on these Raman spectra is considered. Particular emphasis is laid on distinguishing indigo from 6,6'-dibromoindigo and on differentiating between the dihalogeno compounds, so as to develop protocols for determining whether artefacts are coloured with dyes of marine or terrestrial origin, and whether such artefacts are dyed with genuine Tyrian Purple or with dihalogenoindigo substitutes that do not contain bromine. The value of even low-resolution electron impact mass spectrometry in a forensic context, as a means of identifying authentic 6,6'-dibromoindigo and distinguishing it from its dihalogeno analogues, is emphasised.
