1

What is integrity and how do we use it? – Enhancing the validity of integrity by reviewing integrity tests, expanding the nomological network, and reducing faking / Was ist Integrität und wie nutzen wir sie? – Verbesserung der Validität von Integrität durch Überprüfung von Integritätstests, Erweiterung des nomologischen Netzwerkes und Reduzierung von Faking

Schmitt, Nadine J. B. January 2022 (PDF)
This dissertation focuses on the construct and criterion validity of integrity tests and aims to enhance both. To accomplish this goal, three approaches were adopted: First, an overview and systematic comparison of integrity tests was conducted with reference to the construction and application of the tests. Second, the nomological network of integrity tests was expanded with reference to honesty-humility and organizational citizenship behavior at their factor and facet levels. Third, two promising methods of reducing faking on integrity tests were examined: the double rating method (Hui, 2001) and the indirect questioning technique. In line with previous research, the overview and comparison of integrity measures confirmed that integrity tests are multidimensional and heterogeneous; a clear definition of integrity is urgently needed. The personality trait of honesty-humility and its facets of fairness and modesty showed the strongest significant relationships with integrity. Organizational citizenship behavior and its facets of altruism, conscientiousness, and sportsmanship were also significantly related to integrity. Furthermore, integrity tests not only predicted organizational citizenship behavior but also incrementally predicted job performance and organizational citizenship behavior beyond the factor and facet levels of conscientiousness and honesty-humility. In contrast to the indirect questioning technique, the double rating method, which combines an other-rating with a self-rating of each item, significantly reduced faking on integrity tests in an anonymous survey setting. This dissertation contributes to a better understanding of the construct and nomological network of integrity, provides a more detailed view of integrity tests and their protection against faking, and extends the evidence for the predictive and incremental validity of these tests. Implications for future research and practice are discussed.
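As an illustration of the incremental-validity analysis summarized above, the following sketch runs a hierarchical regression on simulated data and reports the gain in R² when an integrity score is added to conscientiousness and honesty-humility as predictors of job performance. The data, variable names, and effect sizes are hypothetical and are not taken from the dissertation.

```python
# Illustrative sketch (hypothetical data): incremental validity of an integrity
# score over conscientiousness and honesty-humility when predicting job performance.
import numpy as np

rng = np.random.default_rng(0)
n = 300

# Simulated standardized predictor scores; names are placeholders, not the
# scales used in the dissertation.
conscientiousness = rng.normal(size=n)
honesty_humility = rng.normal(size=n)
integrity = 0.5 * conscientiousness + 0.4 * honesty_humility + rng.normal(scale=0.8, size=n)
job_performance = (0.3 * conscientiousness + 0.2 * honesty_humility
                   + 0.25 * integrity + rng.normal(size=n))

def r_squared(X, y):
    """R^2 of an ordinary least squares fit with intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_base = r_squared(np.column_stack([conscientiousness, honesty_humility]), job_performance)
r2_full = r_squared(np.column_stack([conscientiousness, honesty_humility, integrity]),
                    job_performance)
print(f"Delta R^2 attributable to integrity: {r2_full - r2_base:.3f}")
```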
2

Konsistenzerhaltende Techniken für generierbare Wissensbasen zum Entwurf eingebetteter Systeme / Consistency-Preserving Techniques for Generatable Knowledge Bases for the Design of Embedded Systems

Sporer, Mathias 18 February 2008 (PDF)
The design process of information-processing systems is characterized by the description of storing, processing, and transmitting components at different levels of abstraction. In the past, tools have been developed for specific application domains as well as for the respective abstraction levels; they support the system designer from the requirements specification down to implementation and functional testing. When designing complex systems in general and embedded systems in particular, additional problems arise: reusing components from earlier designs, transforming design knowledge across the boundaries of the abstraction levels, and integrating a variable number of domain-specific tools into the design process. A precondition for a correct design is the application-invariant preservation of the integrity of all design data involved, regardless of whether they come from databases, XML files, or conventional host file systems. After a discussion of the notion of integrity for conventional information systems and the extensions required for embedded systems, approaches for modelling the design process are presented. They allow a knowledge base to be generated that is optimally adapted to a particular design task and can be continuously adjusted to new requirements from external tools and design methods, without requiring the user to have detailed knowledge of the underlying data model. The capability to generate the knowledge base and its tools rests on a meta-model that is based on an extensible object algebra for describing the structure and behaviour of information-processing systems and that can be transformed into domain-specific target systems.
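A rough, hypothetical sketch of the application-invariant integrity preservation described above: declarative constraints are registered with a small knowledge base and checked on every insertion, independently of where the design data were imported from. The class names and example constraints are invented for illustration and do not reflect the system developed in the thesis.

```python
# Hypothetical sketch: declarative integrity constraints checked on every insert,
# independent of where the design data were imported from.
from dataclasses import dataclass, field

@dataclass
class DesignObject:
    name: str
    abstraction_level: str          # e.g. "register-transfer", "gate"
    attributes: dict = field(default_factory=dict)

class KnowledgeBase:
    def __init__(self):
        self.objects = {}           # name -> DesignObject
        self.constraints = []       # callables returning an error message or None

    def add_constraint(self, check):
        """Register a constraint; it is evaluated against every new object."""
        self.constraints.append(check)

    def insert(self, obj):
        errors = [msg for c in self.constraints if (msg := c(obj, self.objects))]
        if errors:
            raise ValueError(f"integrity violation for '{obj.name}': {errors}")
        self.objects[obj.name] = obj

# Example constraints (invented): unique names, known abstraction levels.
LEVELS = {"system", "register-transfer", "gate"}
kb = KnowledgeBase()
kb.add_constraint(lambda o, objs: "duplicate name" if o.name in objs else None)
kb.add_constraint(lambda o, objs: "unknown abstraction level"
                  if o.abstraction_level not in LEVELS else None)

kb.insert(DesignObject("uart_tx", "register-transfer"))
kb.insert(DesignObject("uart_rx", "register-transfer", {"clock": "50 MHz"}))
# kb.insert(DesignObject("uart_tx", "gate"))  # would raise: duplicate name
```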
3

On the Integrity of the World of Sounds: Montage and Organic Unity

Zenkin, Konstantin 24 October 2023
The purpose of this paper is to trace the history of the interaction of two fundamental principles of creating sounding musical texts, organic unity on the one hand and editing on the other, in both composers' and performers' work. The idea of organic unity arguably reached the highest point of its development in Mozart's oeuvre; later, Beethoven and composers of the following generations began to comprehend the idea of a process (or, in philosophical terms, "the becoming") as something organically integral. In the view of musicians with a Romantic cast of mind, sound engineers' work in general, and editing in particular, tends to break the ideal of integrity as the instantaneous and inimitable life of an artwork. At the same time, as will be shown, the principle of editing an artwork, from the level of motifs to the entire structure, also reached its highest, though often implicit, expression in the music of Romanticism (Schumann, Chopin). By contrast, musicians of the post-Romantic era, such as Stravinsky or Gould, preferred the method of montage, which readily explains their general preference for the audio recording, with its almost unavoidable, essentially "cinematographic" editing, over the live sound. Having disavowed the idea of process as a kind of organic development, composers of the avant-garde and of subsequent trends in new music have made virtuosic play with the very principle of editing, including the editing of every separate sound, their most important creative method. There are thus various ways of comprehending the idea of integrity in different periods of art history, as well as fundamentally different methods of implementing this idea in accordance with particular artistic purposes.
4

Konsistenzerhaltende Techniken für generierbare Wissensbasen zum Entwurf eingebetteter Systeme / Consistency-Preserving Techniques for Generatable Knowledge Bases for the Design of Embedded Systems

Sporer, Mathias 16 July 2007
The design process of information-processing systems is characterized by the description of storing, processing, and transmitting components at different levels of abstraction. In the past, tools have been developed for specific application domains as well as for the respective abstraction levels; they support the system designer from the requirements specification down to implementation and functional testing. When designing complex systems in general and embedded systems in particular, additional problems arise: reusing components from earlier designs, transforming design knowledge across the boundaries of the abstraction levels, and integrating a variable number of domain-specific tools into the design process. A precondition for a correct design is the application-invariant preservation of the integrity of all design data involved, regardless of whether they come from databases, XML files, or conventional host file systems. After a discussion of the notion of integrity for conventional information systems and the extensions required for embedded systems, approaches for modelling the design process are presented. They allow a knowledge base to be generated that is optimally adapted to a particular design task and can be continuously adjusted to new requirements from external tools and design methods, without requiring the user to have detailed knowledge of the underlying data model. The capability to generate the knowledge base and its tools rests on a meta-model that is based on an extensible object algebra for describing the structure and behaviour of information-processing systems and that can be transformed into domain-specific target systems.
5

Von der Gastarbeit zur Identitätsarbeit / Integritätsverhandlungen in türkisch-deutschen Texten von Senocak, Özdamar, Agaoglu und der Online Community vaybee! / From Guest-Work to Identity-Work / Negotiating Integrity in Turkish-German texts by Senocak, Özdamar, Agaoglu and the Online Community vaybee!

Dayioglu-Yücel, Yasemin 06 July 2004
No description available.
6

Neuroscientific approaches to general intelligence and cognitive ageing

Penke, Lars 28 October 2011
After an extensive review of what is known about the genetics and neuroscience of general intelligence, and a methodological note emphasising the necessity of considering latent variables in cognitive neuroscience studies (exemplified by a re-analysis of published results), the most well-established brain correlate of intelligence, brain size, is revisited from an evolutionary genetic perspective. Estimates of the coefficient of additive genetic variation in brain size suggest that there has been no recent directional selection on brain size, questioning its validity as a proxy for intelligence in evolutionary analyses. Instead, correlations of facial fluctuating asymmetry with intelligence and information processing speed in older men suggest that organism-wide developmental stability might be an important cause of individual differences in cognitive ability. The second half of the thesis focuses on cognitive ageing, beginning with a general review. In a sample of over 130 subjects, the integrity of different white matter tracts in the brain was found to be highly correlated, allowing the extraction of a general factor of white matter tract integrity that is correlated with information processing speed. The only tract not loading highly on this general factor is the splenium of the corpus callosum, which is correlated with changes in intelligence over six decades and mediates the effect of the beta2 adrenergic receptor gene (ADRB2) on cognitive ageing, possibly due to its involvement in neuronal compensation processes. Finally, using a novel analytic method for magnetic resonance data, it is shown that greater iron deposition in the brain, presumably a marker of a history of cerebral microbleeds, is associated with both lifelong-stable intelligence differences and age-related decline in cognitive functioning.
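The general factor of white matter tract integrity mentioned above can be pictured as the shared variance of correlated tract measures. The sketch below extracts such a factor from simulated tract values using the first principal component as a simple stand-in for the factor-analytic approach; the data, tract count, and loadings are illustrative only.

```python
# Illustrative sketch (simulated data): a general factor from correlated
# white-matter tract integrity measures via the first principal component.
import numpy as np

rng = np.random.default_rng(1)
n_subjects, n_tracts = 130, 8

# Simulate tract integrity measures (e.g. fractional anisotropy) that share a
# common latent factor plus tract-specific noise.
g = rng.normal(size=n_subjects)                    # latent general factor
loadings = rng.uniform(0.5, 0.8, size=n_tracts)    # how strongly each tract loads
tracts = np.outer(g, loadings) + rng.normal(scale=0.6, size=(n_subjects, n_tracts))

# First principal component of the standardized tract matrix.
z = (tracts - tracts.mean(0)) / tracts.std(0)
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(z, rowvar=False))
pc1 = z @ eigvecs[:, -1]                           # subject scores on the general factor

# The extracted factor should track the simulated latent factor closely.
print(f"variance explained by the general factor: {eigvals[-1] / eigvals.sum():.2f}")
print(f"|correlation with latent g|: {abs(np.corrcoef(pc1, g)[0, 1]):.2f}")
```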
7

Novel proapoptotic p63 isoforms are driven by an endogenous retrovirus in the male germ line of humans and great apes, likely increasing genome stability / Neue proapoptotische p63-Isoformen werden von einem endogenen Retrovirus in den männlichen Keimbahnen von Mensch und Menschenaffen gesteuert und erhöhen wahrscheinlich die genomische Stabilität

Beyer, Ulrike 29 October 2010
No description available.
8

Bayesian Approach for Reliable GNSS-based Vehicle Localization in Urban Areas / Zuverlässige satellitengestützte Fahrzeuglokalisierung in städtischen Gebieten

Obst, Marcus 20 March 2015 (PDF)
Nowadays, satellite-based localization is a well-established technical solution for numerous navigation tasks in daily life. Besides its use in portable devices, satellite-based positioning is employed in in-vehicle navigation systems as well. Moreover, thanks to its global coverage and the availability of inexpensive receiver hardware, it is an appealing technology for numerous applications in the area of Intelligent Transportation Systems (ITSs). However, most of the aforementioned examples either have modest accuracy requirements or are not sensitive to temporary integrity violations. Although technical concepts for Advanced Driver Assistance Systems (ADASs) based on Global Navigation Satellite Systems (GNSSs) have been successfully demonstrated under open-sky conditions, practice reveals that such systems suffer from degraded satellite signal quality when operated in urban areas. The main research objective of this thesis is therefore to provide a reliable vehicle positioning concept that can be used in urban areas without the aforementioned limitations. To this end, an integrated probabilistic approach is proposed that performs fault detection and exclusion, localization, and multi-sensor data fusion within one unified Bayesian framework. From an algorithmic perspective, the presented concept is based on a probabilistic data association technique with explicit handling of the outlier measurements that are typical of urban areas. This approach improves accuracy, integrity, and availability at the same time, that is, it provides a consistent positioning solution. In addition, a comprehensive and in-depth analysis of typical errors in urban areas in the pseudorange domain is performed. Based on this analysis, probabilistic models are proposed and subsequently used to support the positioning algorithm. Moreover, the presented concept clearly targets mass-market applications based on low-cost receivers and hence aims to replace costly sensors with smart algorithms. The benefits of these theoretical contributions are implemented and demonstrated in a real-time vehicle positioning prototype used in the European research project GAlileo Interactive driviNg (GAIN). This work describes all necessary parts of this system, including GNSS signal processing, fault detection, and multi-sensor data fusion, within one processing chain. Finally, the performance and benefits of the proposed concept are examined and validated with both simulated and comprehensive real-world sensor data from numerous test drives.
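One way to picture the explicit handling of outlier measurements within a Bayesian framework is a measurement likelihood that mixes a nominal Gaussian error model with a broad outlier component, so that implausible pseudoranges are down-weighted rather than trusted. The sketch below is a simplified, hypothetical illustration in that spirit (toy 2D geometry, assumed noise parameters), not the estimator developed in the thesis.

```python
# Simplified sketch: down-weighting pseudorange outliers with a Gaussian/uniform
# mixture likelihood inside a grid-based Bayesian position update (2D, toy scale).
import numpy as np

SIGMA = 5.0             # nominal pseudorange noise in metres (assumed)
P_OUTLIER = 0.1         # prior probability that a measurement is an outlier (assumed)
OUTLIER_DENSITY = 1e-3  # flat density of the outlier component (assumed)

def mixture_loglik(predicted, measured):
    """log p(measurement | position) with an explicit outlier component."""
    gauss = np.exp(-0.5 * ((measured - predicted) / SIGMA) ** 2) / (SIGMA * np.sqrt(2 * np.pi))
    return np.log((1 - P_OUTLIER) * gauss + P_OUTLIER * OUTLIER_DENSITY)

# Toy scenario: three satellites projected to 2D, true position at the origin,
# one range corrupted by a multipath-like bias.
sats = np.array([[20_000.0, 0.0], [0.0, 20_000.0], [-15_000.0, 5_000.0]])
true_pos = np.array([0.0, 0.0])
ranges = np.linalg.norm(sats - true_pos, axis=1)
ranges[2] += 120.0

# Posterior over a coarse position grid (uniform prior).
xs = ys = np.arange(-50, 51, 1.0)
grid = np.array([[x, y] for x in xs for y in ys])
log_post = np.zeros(len(grid))
for sat, rho in zip(sats, ranges):
    predicted = np.linalg.norm(grid - sat, axis=1)
    log_post += mixture_loglik(predicted, rho)

# The biased measurement is absorbed by the outlier component, so the maximum
# a posteriori estimate stays close to the true position.
print("MAP position estimate:", grid[np.argmax(log_post)])
```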
9

Digitale Erinnerungskultur und das Gedenkjahr 1938 / Digital Remembrance Culture and the 1938 Commemorative Year

Bistrovic, Miriam 06 December 2019
No description available.
10

Bayesian Approach for Reliable GNSS-based Vehicle Localization in Urban Areas

Obst, Marcus 19 December 2014
Nowadays, satellite-based localization is a well-established technical solution for numerous navigation tasks in daily life. Besides its use in portable devices, satellite-based positioning is employed in in-vehicle navigation systems as well. Moreover, thanks to its global coverage and the availability of inexpensive receiver hardware, it is an appealing technology for numerous applications in the area of Intelligent Transportation Systems (ITSs). However, most of the aforementioned examples either have modest accuracy requirements or are not sensitive to temporary integrity violations. Although technical concepts for Advanced Driver Assistance Systems (ADASs) based on Global Navigation Satellite Systems (GNSSs) have been successfully demonstrated under open-sky conditions, practice reveals that such systems suffer from degraded satellite signal quality when operated in urban areas. The main research objective of this thesis is therefore to provide a reliable vehicle positioning concept that can be used in urban areas without the aforementioned limitations. To this end, an integrated probabilistic approach is proposed that performs fault detection and exclusion, localization, and multi-sensor data fusion within one unified Bayesian framework. From an algorithmic perspective, the presented concept is based on a probabilistic data association technique with explicit handling of the outlier measurements that are typical of urban areas. This approach improves accuracy, integrity, and availability at the same time, that is, it provides a consistent positioning solution. In addition, a comprehensive and in-depth analysis of typical errors in urban areas in the pseudorange domain is performed. Based on this analysis, probabilistic models are proposed and subsequently used to support the positioning algorithm. Moreover, the presented concept clearly targets mass-market applications based on low-cost receivers and hence aims to replace costly sensors with smart algorithms. The benefits of these theoretical contributions are implemented and demonstrated in a real-time vehicle positioning prototype used in the European research project GAlileo Interactive driviNg (GAIN). This work describes all necessary parts of this system, including GNSS signal processing, fault detection, and multi-sensor data fusion, within one processing chain. Finally, the performance and benefits of the proposed concept are examined and validated with both simulated and comprehensive real-world sensor data from numerous test drives.
