201

Datová komunikace v distribučních systémech / Data communication in distribution systems

Sirotný, Miroslav January 2011 (has links)
This thesis gives a basic overview of concepts such as communication, data communication and distributed systems. It then focuses on technologies used for remote data collection, with a part dedicated to PLC (power-line communication) technologies employed by remote data collection systems. The thesis also discusses the concept of electrical power quality and the standard ČSN EN 50160. The main part is focused on the design, implementation and measurement of a PLC network.
202

Optimalizace e-komunikace ÚMČ Praha 9 / The Optimalization of E-Communication in the UMC Praha 9

Hladík, Jiří January 2010 (has links)
The subject of this Master's thesis is to assess the current situation and find optimal solutions for communication, especially e-communication, between Municipal District Prague 9 and its citizens, based on the outcomes of surveys conducted with citizens and employees and on observation of other municipal offices. The theoretical part focuses mainly on the definition of terms, the relevant types of e-communication, and the methodology and process of market research. The practical part depicts the current situation, analyzes the outcomes of the conducted survey research, and presents the resulting suggestions and recommendations. I believe that the outcomes of this Master's thesis will help Municipal District Prague 9 achieve its desired objective: the optimal satisfaction of its citizens.
203

Datalogger pro stavební konstrukce / Datalogger for building constructions

Štábl, Martin January 2017 (has links)
This thesis deals with the development of a dedicated autonomous device for data collection. It describes the design approaches, covers the selection of sensors with regard to their properties, and compares the device with similar products on the market in terms of behavior and function. A general concept of the proposed datalogger is presented, and the thesis documents both the hardware and the software of the device.
204

DEPOSIT : une approche pour exprimer et déployer des politiques de collecte sur des infrastructures de capteurs hétérogènes et partagées / DEPOSIT : an approach to model and deploy data collection policies on heterogeneous and shared sensor networks

Cecchinel, Cyril 08 November 2017 (has links)
Sensing infrastructures are classically used in the IoT to collect data. However, deep knowledge of these infrastructures is needed to interact properly with the deployed systems, and targeting them is tedious for software engineers. First, the specificities of the platforms composing the infrastructure force engineers to work at a low level of abstraction and with heterogeneous devices, which can lead to code that exploits the network infrastructure poorly. Moreover, being infrastructure-specific, these applications cannot easily be reused across different systems. Second, deploying an application is outside the domain expertise of a software engineer, who needs to identify the platform(s) required to support the application. Lastly, the sensing infrastructure might not be designed to support the concurrent execution of multiple applications, leading to redundant deployments when a new application is contemplated. In this thesis we present an approach that supports (i) the definition of reusable data collection policies at a high level of abstraction, (ii) their deployment over a heterogeneous infrastructure, driven by models designed by network experts, and (iii) the automatic composition of policies on top of heterogeneous sensing infrastructures. Based on these contributions, a software engineer can exploit sensor networks without knowing the associated details, reusing off-the-shelf architectural abstractions when expressing policies; the network is also shared automatically between policies.
205

An Evaluation of the Individualized Behavior Rating Scale Tool (IBRST) in Inclusive Classroom Settings

Moore, Jessica L. 03 April 2019 (has links)
One of the greatest challenges facing school staff is problem behavior in the classroom (Public Agenda, 2004). Children who engage in problem behavior in the classroom setting pose a great challenge to teachers and diminish students' ability to learn. This study evaluated the effects of self-monitoring using the Individualized Behavior Rating Scale Tool (IBRST) on problem behavior and on-task behavior in a classroom setting using a multiple-baseline across participants design. This study also evaluated the extent to which students' self-ratings on the IBRST correlated with direct observation data. Results indicate that self-monitoring using the IBRST was an effective strategy for increasing on-task behavior and decreasing problem behavior for all three students. Results also indicate that the IBRST may be an accurate and reliable means of collecting data when direct observation is not feasible. There were 56/60 perfect agreements, with the remaining four opportunities being only 1 point apart. Limitations and future research are discussed.
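The agreement analysis above (56/60 perfect agreements, the rest within 1 point) can be sketched as follows. This is an illustrative reconstruction, not the study's code; the rating values and the 1-5 scale are assumptions for the example.

```python
# Hypothetical sketch: comparing IBRST self-ratings against ratings derived
# from direct observation. The rating values below are illustrative only.
def agreement_summary(self_ratings, observer_ratings):
    """Count exact matches and near-misses (within 1 point) between two rating series."""
    assert len(self_ratings) == len(observer_ratings)
    pairs = list(zip(self_ratings, observer_ratings))
    exact = sum(1 for s, o in pairs if s == o)
    within_one = sum(1 for s, o in pairs if abs(s - o) == 1)
    return {"exact": exact, "within_one": within_one, "total": len(pairs)}

summary = agreement_summary([5, 4, 4, 3, 5], [5, 4, 3, 3, 5])
print(summary)  # {'exact': 4, 'within_one': 1, 'total': 5}
```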
206

Surveillance? : The influence of information asymmetry on consumers’ perceptions of online personalization

Toivonen, Elisa January 2019 (has links)
Data collection and online personalization have become an essential part of modern marketing and are thus embedded in consumers' everyday lives. This has attracted considerable negative attention in the media and raised privacy concerns among consumers; however, their attitudes towards privacy appear contradictory, given their lack of privacy-enhancing behavior. The purpose of this study was to find out what consumers think of online personalization, data collection and the GDPR. To uncover the underlying causes of these perceptions, focus group discussions were conducted. The emerging themes were analyzed through the concepts of the privacy paradox and information asymmetry: how the structural imbalance between advertisement networks, companies and consumers shaped participants' thinking about personalization, and which factors caused the unwillingness to protect one's privacy despite attitudes that would predict different behavior. The results showed that many respondents do not mind personalization if they perceive it as relevant. However, the intrusive nature of its practices made participants, directly or indirectly, reluctant towards it; it was not personalization per se that made respondents uncomfortable, but how it was done. Due to the opaque nature of advertisement networks, participants found it challenging to comprehend how personalization was performed. Conspiracy theories about surveillance, such as eavesdropping via smartphones, were therefore brought up to explain companies' ability to know and target them so well. The main channel through which companies inform consumers about their privacy practices is their terms and conditions; however, for several reasons, decision-making about one's privacy faces many obstacles that may influence how consumers perceive their privacy and how their personal data is collected and used.
A discrepancy between the GDPR's, companies' and consumers' views on privacy self-management is evident, as both the regulation and companies rely too heavily on consumers' own responsibility.
207

Zpověď jako technika subjektivace v politickém myšlení Michela Foucaulta / Confession as a Technique of Subjectification in the Political Thought of Michel Foucault

Doležal, Kryštof January 2017 (has links)
This thesis interprets the constitutive components of Michel Foucault's political thought (power, knowledge, subjectivity) from the viewpoint of subjectification. Its first goal is to examine the technique of confession as a crucial mechanism participating in the construction of subjectivity in Foucault's work: to uncover the contexts in which it is usually examined and the forms that, according to Foucault, it took throughout history depending on the various configurations of the bonds between power and knowledge. The second goal is to trace Foucault's technique of confession in the contemporary human sciences, specifically in the foundations of data collection processes, and to demonstrate that confession is a necessary element in the creation of an empirical type of knowledge.
208

Applying Dynamic Data Collection to Improve Dry Electrode System Performance for a P300-Based Brain-Computer Interface

Clements, J. M., Sellers, E. W., Ryan, D. B., Caves, K., Collins, L. M., Throckmorton, C. S. 07 November 2016 (has links)
Objective. Dry electrodes have an advantage over gel-based 'wet' electrodes by providing quicker set-up time for electroencephalography recording; however, the potentially poorer contact can result in noisier recordings. We examine the impact that this may have on brain-computer interface communication and potential approaches for mitigation. Approach. We present a performance comparison of wet and dry electrodes for use with the P300 speller system in both healthy participants and participants with communication disabilities (ALS and PLS), and investigate the potential for a data-driven dynamic data collection algorithm to compensate for the lower signal-to-noise ratio (SNR) in dry systems. Main results. Performance results from sixteen healthy participants obtained in the standard static data collection environment demonstrate a substantial loss in accuracy with the dry system. Using a dynamic stopping algorithm, performance may have been improved by collecting more data in the dry system for ten healthy participants and eight participants with communication disabilities; however, the algorithm did not fully compensate for the lower SNR of the dry system. An analysis of the wet and dry system recordings revealed that delta and theta frequency band power (0.1-4 Hz and 4-8 Hz, respectively) are consistently higher in dry system recordings across participants, indicating that transient and drift artifacts may be an issue for dry systems. Significance. Using dry electrodes is desirable for reduced set-up time; however, this study demonstrates that online performance is significantly poorer than for wet electrodes for users with and without disabilities. We test a new application of dynamic stopping algorithms to compensate for poorer SNR. Dynamic stopping improved dry system performance; however, further signal processing efforts are likely necessary for full mitigation.
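The delta and theta band-power comparison described above can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the study's analysis: the sampling rate, the signal, and the use of a plain periodogram (rather than whatever spectral estimator the authors used) are all assumptions.

```python
import numpy as np

# Synthetic "EEG channel": a strong 2 Hz drift component plus broadband noise,
# mimicking the low-frequency artifacts suspected in dry-electrode recordings.
fs = 256                      # Hz, assumed sampling rate
t = np.arange(0, 10, 1 / fs)  # 10 s of data
rng = np.random.default_rng(0)
signal = 5 * np.sin(2 * np.pi * 2 * t) + rng.standard_normal(t.size)

# One-sided periodogram PSD estimate (power per Hz).
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * signal.size)
psd[1:-1] *= 2  # double all bins except DC and Nyquist for a one-sided PSD

def band_power(freqs, psd, lo, hi):
    """Integrate the PSD over [lo, hi] Hz (rectangle rule)."""
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].sum() * (freqs[1] - freqs[0])

delta = band_power(freqs, psd, 0.1, 4)  # delta band, as in the abstract
theta = band_power(freqs, psd, 4, 8)    # theta band
print(delta > theta)  # True: the 2 Hz drift component dominates
```

In a recording-quality comparison like the one described, this computation would be repeated per channel and per system (wet vs. dry), with consistently higher delta/theta power flagging transient and drift artifacts.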
209

The Mayo Clinic Study of Aging: Design and Sampling, Participation, Baseline Measures and Sample Characteristics

Roberts, Rosebud, Geda, Yonas E., Knopman, David S., Cha, Ruth H., Pankratz, V. Shane, Boeve, Bradley F., Ivnik, Robert J., Tangalos, Eric G., Petersen, Ronald C., Rocca, Walter A. 01 February 2008 (has links)
Background: The objective of this study was to establish a prospective population-based cohort to investigate the prevalence, incidence and risk factors for mild cognitive impairment (MCI) and dementia. Methods: The Olmsted County, Minn., population, aged 70-89 years on October 1, 2004, was enumerated using the Rochester Epidemiology Project. Eligible subjects were randomly selected and invited to participate. Participants underwent a comprehensive in-person evaluation including the Clinical Dementia Rating Scale, a neurological evaluation and neuropsychological testing. A consensus diagnosis of normal cognition, MCI or dementia was made by a panel using previously published criteria. A subsample of subjects was studied via telephone interview. Results: Four hundred and two subjects with dementia were identified from a detailed review of their medical records but were not contacted. At baseline, we successfully evaluated 703 women aged 70-79 years, 769 women aged 80-89 years, 730 men aged 70-79 years and 517 men aged 80-89 years (total n = 2,719). Among the participants, 2,050 subjects were evaluated in person and 669 via telephone. Conclusions: Strengths of the study are that the subjects were randomly selected from a defined population, the majority of the subjects were examined in person, and MCI was defined using published criteria. Here, we report the design and sampling, participation, baseline measures and sample characteristics.
210

Efficient Realistic Cloud Rendering using the Volumetric Rendering Technique : Science, Digital Game Development

Bengtsson, Adam January 2022 (has links)
With high-quality graphics in great demand in modern video games, realistic clouds are no exception. In many video games, cloud rendering is implemented as a collection of 2D cloud images rendered into the scene. Previously published work shows that, while other techniques can be more appropriate depending on the project, volumetric rendering is the state of the art in cloud rendering. The only lacking feature of this technique is its performance, as it is very expensive: either the high quality of the clouds is not achievable in real-time rendering, or the quality is scaled back to the point where the clouds lack accuracy or realism in shape. Three objectives were formulated to satisfy the aim of the project:
Aim: create a cloud generator using the volumetric rendering technique.
Objective 1: create a 3D engine in OpenGL that generates clouds with volumetric rendering in real time.
Objective 2: create different scenes that increase the computational cost of rendering.
Objective 3: arrange tests across different computers running the engine and document the results in terms of performance.
The project was created in C++ with the OpenGL library in Visual Studio. The code combines previously published projects on real-time cloud rendering; to save time, two projects created by Federico Vaccaro and Sébastien Hillaire were used as references to quickly reach a solid foundation for experimenting with the performance of volumetric clouds. The resulting implementation supports three of the many cloud types and updates in real time: density, coverage, light absorption and other parameters can be configured at runtime to blend between the three cloud types, and the clouds also update in real time when the bounding box, the coloring, and the positions of the clouds and the global light are changed.
In conclusion, the goal of rendering the clouds above 60 FPS was somewhat successful, if the results are limited to the high-end computer. The clouds looked realistic enough in the scene, and the efforts to improve performance did not affect overall quality. The high-end computer was able to render the clouds, but the low-end computer struggled with the clouds on their own.
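The core of the volumetric technique described above is a raymarching loop that accumulates light absorption through a density field. The sketch below illustrates the idea in Python (the thesis engine itself is C++/OpenGL); the spherical density field, step count and absorption coefficient are assumptions for the example, not values from the thesis.

```python
import math

# Illustrative raymarching sketch: march a ray through a scalar density field,
# attenuating light by Beer-Lambert absorption at each step.
def march_ray(density_at, start, direction, steps=64, step_size=0.1, absorption=1.0):
    """Return the transmittance in [0, 1]: the fraction of background light
    that survives passage along the ray through the density field."""
    transmittance = 1.0
    for i in range(steps):
        # Sample position i steps along the ray.
        px = start[0] + direction[0] * step_size * i
        py = start[1] + direction[1] * step_size * i
        pz = start[2] + direction[2] * step_size * i
        d = density_at(px, py, pz)
        # Beer-Lambert: light decays exponentially with optical depth.
        transmittance *= math.exp(-d * absorption * step_size)
    return transmittance

# A hypothetical spherical "cloud" of uniform density centred at the origin.
cloud = lambda x, y, z: 1.0 if x * x + y * y + z * z < 1.0 else 0.0

# A ray entering the sphere head-on; prints the surviving light fraction.
print(march_ray(cloud, start=(-2.0, 0.0, 0.0), direction=(1.0, 0.0, 0.0)))
```

A real renderer runs this loop per pixel in a fragment or compute shader, samples density from 3D noise textures rather than an analytic sphere, and adds a nested march toward the light for in-scattering, which is where the performance cost discussed in the abstract comes from.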
