351

Community ecology of ants (Hymenoptera: Formicidae) in the central sand hills of Alberta, and a key to the ants of Alberta.

Glasier, James RN Unknown Date
No description available.
352

Improved monocular videogrammetry for generating 3D dense point clouds of built infrastructure

Rashidi, Abbas 27 August 2014 (has links)
Videogrammetry is an affordable and easy-to-use technology for spatial 3D scene recovery. When applied to the civil engineering domain, a number of issues have to be taken into account. First, videotaping large-scale civil infrastructure scenes usually results in large video files filled with blurry, noisy, or simply redundant frames. This is often due to a higher-than-necessary ratio of frame rate to camera speed, camera and lens imperfections, and uncontrolled motions of the camera that result in motion blur. Only a small percentage of the collected video frames are required to achieve robust results; however, choosing the right frames is a tough challenge. Second, the point cloud generated by a monocular videogrammetric pipeline is determined only up to scale, i.e. the user has to know at least one dimension of an object in the scene to scale up the entire scene. This issue significantly narrows the applications of generated point clouds in the civil engineering domain, since measurement is an essential part of every as-built documentation technology. Finally, for various reasons, including a lack of sufficient coverage during videotaping of the scene or the existence of texture-less areas, which are common in most indoor/outdoor civil engineering scenes, the quality of the generated point clouds is sometimes poor. This deficiency appears in the form of outliers, or of holes and gaps on the surfaces of point clouds. Several researchers have focused on this particular problem; however, the major issue with all of the currently existing algorithms is that they essentially treat holes and gaps as part of a smooth surface. This approach is not robust enough at the intersections of different surfaces or at corners where there are sharp edges. A robust algorithm for filling holes/gaps should maintain sharp edges/corners, since these usually contain useful information, particularly for applications in the civil and infrastructure engineering domain.
To tackle these issues, this research presents and validates an improved videogrammetric pipeline for as-built documentation of indoor/outdoor applications in civil engineering. The research consists of three main components:
1. Optimized selection of key frames for processing. It is necessary to choose a number of informative key frames to get the best results from the videogrammetric pipeline. This step is particularly important for outdoor environments, as it is impossible to process all of the frames in a large video clip.
2. Automated calculation of the absolute scale of the scene. A novel approach for obtaining the absolute scale of the point cloud using 2D and 3D patterns is proposed and validated.
3. Point cloud data cleaning and hole filling on the surfaces of generated point clouds. The proposed algorithm fills holes/gaps on surfaces of point cloud data while maintaining sharp edges.
To narrow the scope of the research, the main focus is on two specific applications: (1) as-built documentation of bridges and buildings as outdoor case studies, and (2) as-built documentation of offices and rooms as indoor case studies. Other potential applications of monocular videogrammetry in the civil engineering domain are out of the scope of this research. Three metrics, i.e. accuracy, completeness, and processing time, are utilized for evaluation of the proposed algorithms.
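The abstract does not specify how key frames are scored, so the sketch below is only a rough illustration of the kind of filtering component 1 involves: it drops motion-blurred frames (low variance of the Laplacian) and near-duplicate frames (small gray-level difference from the last kept frame) using Python and OpenCV. The metrics and thresholds are illustrative assumptions, not the thesis's method.

```python
import cv2
import numpy as np

def select_key_frames(video_path, blur_threshold=100.0, diff_threshold=12.0):
    """Pick sharp, non-redundant frames from a video clip.

    blur_threshold: minimum variance of the Laplacian (sharpness proxy).
    diff_threshold: minimum mean absolute gray-level difference from the
                    last kept frame (redundancy proxy).
    Both thresholds are illustrative and would need tuning per camera.
    """
    cap = cv2.VideoCapture(video_path)
    key_frames, last_kept = [], None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Low Laplacian variance indicates motion blur or defocus.
        if cv2.Laplacian(gray, cv2.CV_64F).var() < blur_threshold:
            continue
        # Skip frames too similar to the last kept frame (redundant views).
        if last_kept is not None and np.mean(cv2.absdiff(gray, last_kept)) < diff_threshold:
            continue
        key_frames.append(frame)
        last_kept = gray
    cap.release()
    return key_frames
```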
353

Biodegradation of chlorinated compounds at interfaces and biodegradation of 4-nitroaniline

Kurt, Zohre 12 November 2012 (has links)
Most microbial activity in nature takes place at interfaces where redox discontinuities are present. Organic pollutants in groundwater encounter oxic/anoxic interfaces when they emerge into surface water bodies or volatilize above the plume. Such oxic/anoxic interfaces are key habitats for aerobic bacteria and are in turn created by the bacteria that degrade organic electron donors. In the absence of biodegradation, synthetic pollutants can migrate from the plume and impact a variety of receptors. The aims of our study were to determine whether microbes at oxic/anoxic interfaces can use synthetic chemicals as electron donors and protect the overlying vadose zone or surface water from groundwater pollutants. The approach was to design columns representing the interfaces and to measure the activities of the microbial communities responsible for the biodegradation of synthetic compounds. Taken together, these studies clearly established that contaminants recalcitrant under anaerobic conditions but degradable under aerobic conditions can be biodegraded at the narrow oxic/anoxic interface, resulting in the protection of the overlying soil or water. The findings provide the basis for new approaches to natural attenuation that can serve to dramatically reduce the cost of bioremediation actions. Synthetic chemicals are widespread in the environment because of their extensive use in industry. These chemicals were recalcitrant until their microbial degradation pathways evolved. Currently, the biodegradation pathways of many synthetic chemicals are known and serve as the basis for bioremediation strategies. The second part of the research described here involved the discovery of the aerobic degradation pathway of a dye additive, 4-nitroaniline (4NA). Annotation of the whole genome sequence, coupled with assays and supported by cloned enzymes, revealed that the 4NA biodegradation pathway contains two monooxygenase steps prior to ring cleavage. Because nitroaniline degradation was not previously understood, our work advanced the understanding of metabolic diversity in the degradation of amino and nitro compounds by providing enzymes with unique activities.
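As a purely illustrative aside (not from the thesis), the narrowness of such a reaction zone can be visualized with a toy one-dimensional steady-state model: oxygen diffuses down from the oxic boundary, the pollutant diffuses up from the plume, and a bimolecular degradation term consumes both where the profiles overlap. All geometry, diffusivities, concentrations, and the rate constant below are invented.

```python
import numpy as np

n, L = 100, 0.20                  # grid points, column length (m)
dz = L / (n - 1)
D_o, D_p = 2e-9, 1e-9             # diffusivities (m^2/s), invented
k = 1e-3                          # bimolecular rate constant, invented
o2 = np.linspace(0.25, 0.0, n)    # initial guesses (arbitrary units)
p = np.linspace(0.0, 1.0, n)

for _ in range(20000):            # relax toward the steady state
    o2[0], p[0] = 0.25, 0.0       # oxic boundary (surface side)
    o2[-1], p[-1] = 0.0, 1.0      # anoxic boundary (plume side)
    r = k * o2 * p                # local degradation rate
    # Discretized D*c'' = k*o2*p: diffusion balanced by a consumption sink.
    o2[1:-1] = 0.5 * (o2[:-2] + o2[2:]) - 0.5 * dz**2 * r[1:-1] / D_o
    p[1:-1] = 0.5 * (p[:-2] + p[2:]) - 0.5 * dz**2 * r[1:-1] / D_p
    np.clip(o2, 0.0, None, out=o2)   # concentrations stay non-negative
    np.clip(p, 0.0, None, out=p)

zone = np.argmax(k * o2 * p)      # degradation peaks in a narrow band
print(f"reaction zone centred ~{zone * dz * 100:.1f} cm below the oxic boundary")
```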
354

Young people with low level literacy skills in the school and post-school environment

Macrae, Vera January 1999 (has links)
No description available.
355

On Experimental Quantum Communication and Cryptography

Erven, Christopher January 2012 (has links)
One of the most fascinating recent developments in research has been how different disciplines have become more and more interconnected, so much so that fields as disparate as information theory and fundamental physics have combined to produce ideas for the next generation of computing and secure information technologies, both of which have far-reaching consequences. For more than fifty years Moore's law, which describes the trend of the transistor's size shrinking by half every two years, has proven to be uncannily accurate. However, the computing industry is now approaching a fundamental barrier as the size of a transistor approaches that of an individual atom and the laws of physics and quantum mechanics take over. Rather than look at this as the end, quantum information science has emerged to ask what additional power and functionality might be realized by harnessing some of these quantum effects.
This thesis presents work in the sub-field of quantum cryptography, which seeks to use quantum means to assure the security of one's communications. The beauty of quantum cryptographic methods is that they can be proven secure, now and indefinitely into the future, relying solely on the validity of the laws of physics for their proofs of security. This is something which is impossible for nearly all current classical cryptographic methods to claim.
The thesis begins by examining the first implementation of an entangled quantum key distribution (QKD) system over two free-space optical links. This system represents the first test-bed of its kind in the world, and while its practical importance in terrestrial applications is limited to a smaller university or corporate campus, the system mimics the setup for an entangled satellite system, aiding in the study of distributing entangled photons from an orbiting satellite to two earthbound receivers. Having completed the construction of a second free-space link and the automation of the alignment system, I securely distribute keys to Alice and Bob at two distant locations separated by 1,575 m with no direct line of sight between them. I examine all of the assumptions necessary for my claims of security, something which is particularly important for moving these systems out of the lab and into commercial industry. I then go on to describe the free-space channel over which the photons are sent and the implementation of each of the major system components. I close with a discussion of the experiment, which saw raw detected entangled photon rates of 565 s^{-1} and a quantum bit error rate (QBER) of 4.92%, resulting in a final secure key rate of 85 bits/s. Over the six-hour night-time experiment I was able to generate 1,612,239 bits of secure key.
With a successful QKD experiment completed, the thesis then turns to the problem of making the technology more practical by increasing the key rate of the system, and thus the speed at which it can securely encrypt information. It does so in three different ways, involving each of the major disciplines comprising the system: measurement hardware, source technology, and software post-processing. First, I experimentally investigate a theoretical proposal for biasing the measurement bases in the QKD system, showing a 79% improvement in the secret key generated from the same raw key rates. Next, I construct a second-generation entangled photon source, based on a Sagnac interferometer, with rates two orders of magnitude higher than the previous source. More importantly, the new source has a QBER as low as 0.93%, which is not only important for the security of the QKD system but will be required for the implementation of a new cryptographic primitive later. Lastly, I study the free-space link transmission statistics and the use of a signal-to-noise ratio (SNR) filter to improve the key rate by 25.2% from the same amount of raw key. The link statistics have particular relevance for a current project with the Canadian Space Agency to exchange a quantum key with an orbiting satellite, a project for which I have participated in two feasibility studies.
To study the usefulness of more recent ideas in quantum cryptography, the thesis then looks at the first experimental implementation of a new cryptographic primitive, oblivious transfer (OT), in the noisy storage model. This primitive has obvious important applications, as it can be used to implement a secure identification scheme that is provably secure in a quantum scenario. Such a scheme could one day be used, for example, to authenticate a user over short distances, such as at ATMs, which have proven to be particularly vulnerable to hacking and fraud. Over a four-hour experiment, Alice and Bob measure 405,642,088 entangled photon pairs with an average QBER of 0.93%, allowing them to create a secure OT key of 8,939,150 bits. As a first implementer, I examine many of the pressing issues currently preventing the scheme from being more widely adopted, such as the need to relax the dependence of the OT rate on the loss of the system and the need to extend the security proof to cover a wider range of quantum communication channels and memories. It is important to note that OT is fundamentally different from QKD with respect to security, as the information is never physically exchanged over the communication line; rather, the joint equality function f(x) = f(y) is evaluated. Thus, security in QKD does not imply security for OT.
Finally, the thesis concludes with the construction and initial alignment of a second-generation free-space quantum receiver, useful for increasing QKD key rates but designed for a fundamental test of quantum theory, namely a violation of Svetlichny's inequality. Svetlichny's inequality is a generalization of Bell's inequality to three particles, where any two of the three particles may be non-locally correlated. Even so, a violation of Svetlichny's inequality shows that certain quantum mechanical states are incompatible with this restricted class of non-local yet realistic theories. Svetlichny's inequality is particularly important because, while an overwhelming number of Bell experiments have been performed testing two-body correlations, experiments on many-body systems have been few and far between. Experiments of this type are particularly valuable to explore, since we live in a many-body world. The new receiver incorporates an active polarization analyzer capable of switching between measurement bases on a microsecond time-scale through the use of a Pockels cell, while maintaining high-fidelity measurements. Some of the initial alignment and analysis results are detailed, including final measured contrasts of 1:25.2 and 1:22.6 in the rectilinear and diagonal bases, respectively.
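For readers unfamiliar with how a QBER translates into a secure key rate, a minimal sketch follows. It uses the standard asymptotic bound for BB84-type protocols with one-way post-processing, r = 1 - 2h(Q) (the Shor-Preskill bound), rather than the thesis's own finite-size analysis, and plugs in the figures reported above; the gap between this naive estimate and the reported 85 bits/s is where basis sifting, finite-key corrections, and error-correction inefficiency enter.

```python
from math import log2

def h2(q: float) -> float:
    """Binary entropy h(q) = -q log2 q - (1-q) log2(1-q)."""
    if q <= 0.0 or q >= 1.0:
        return 0.0
    return -q * log2(q) - (1 - q) * log2(1 - q)

def asymptotic_key_fraction(qber: float) -> float:
    """Shor-Preskill bound: secret key fraction r = 1 - 2*h(QBER)."""
    return max(0.0, 1.0 - 2.0 * h2(qber))

raw_rate = 565    # detected entangled pairs per second (reported above)
qber = 0.0492     # reported QBER
print(asymptotic_key_fraction(qber) * raw_rate)
# ~0.43 * 565 ≈ 245 bit/s if every coincidence contributed; the reported
# 85 bit/s additionally accounts for sifting, finite statistics, and
# realistic error-correction overhead.
```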
356

Towards Real-World Adoption of Quantum Key Distribution using Entangled Photons

Holloway, Catherine 01 August 2012 (has links)
In order for quantum key distribution (QKD) to move from the lab to widespread adoption, it will need to be compatible with existing infrastructure. To that end, I demonstrate an implementation of QKD with entangled photons on active, standard telecommunications fiber. By using a wavelength outside of the conventional band used by telecommunications traffic, I achieve minimal disruption to either the quantum or classical signals. In an attempt to extend the reach of QKD with entangled photons I studied the parameters of these systems. I developed a model for the number of measured two-fold coincidences that maximizes the secure key rate (SKR), for any combination of system parameters, using a symbolic regression algorithm based on simulated data. I validated this model against experimental data, and demonstrated its usefulness by applying it to simulations of QKD between the ground and a satellite and in optical fibers. Finally, I worked on a step towards a new entangled photon source that is a hybrid between visible and telecommunications wavelengths by building a hybrid single photon source.
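The symbolic-regression model itself is not reproduced in the abstract, but the trade-off it optimizes is standard: raising the pair-production rate increases true coincidences linearly while accidental coincidences grow quadratically with the singles rates, so the QBER climbs and the secure key rate peaks at a finite rate. The sketch below illustrates that trade-off; the detector efficiencies, coincidence window, and intrinsic error are invented parameters, not values from the thesis.

```python
import numpy as np

def h2(q):
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

# Hypothetical system parameters (not from the thesis).
eta_a, eta_b = 0.10, 0.10    # total detection efficiencies, Alice/Bob
tau = 1e-9                   # coincidence window (s)
e_intrinsic = 0.01           # baseline polarization error

pair_rate = np.logspace(4, 9, 500)         # produced pairs per second
true_cc = eta_a * eta_b * pair_rate        # genuine coincidences/s
accidentals = (eta_a * pair_rate) * (eta_b * pair_rate) * tau
total_cc = true_cc + accidentals
# Accidental coincidences are uncorrelated, so they contribute QBER 1/2.
qber = (e_intrinsic * true_cc + 0.5 * accidentals) / total_cc
skr = total_cc * np.maximum(0.0, 1 - 2 * h2(qber)) / 2   # /2 for sifting

best = np.argmax(skr)
print(f"optimum near {pair_rate[best]:.2e} pairs/s, "
      f"QBER {qber[best]:.3f}, SKR {skr[best]:.0f} bit/s")
```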
357

The performance measurement of healthcare facility management: A way forward

Hamwi, Tayfe, Built Environment, Faculty of Built Environment, UNSW January 2009 (has links)
Facilities management (FM) is emerging rapidly as a distinct and critical field of study and professional practice for all sectors of the economy. In the past it has tended to be regarded as a service function and subsumed within other areas such as engineering, the built environment, and business management. The health sector is an important part of the overall FM industry because it represents a driver of capital works that is both extensive and expensive: the sector's building requirements are growing, demand regular maintenance and upgrading, and are critical to core business. An extensive literature review revealed that there has been minimal published research into the development and implementation of a central feature of effective FM, namely performance management. Performance management provides the primary evaluation and planning tool for FM, in that it identifies the performance indicators that are meaningful to FM and core business, provides measures for those indicators, and enables projective planning and benchmarking to be undertaken.
This research study assesses the possibility of developing a comprehensive performance measurement system for FM in the healthcare sector. The research has two main aims:
- to identify the knowledge gaps in current FM performance measurement systems in general, and for the healthcare sector in particular; and
- to investigate the possibility of benchmarking FM performance across the healthcare sector using a weighted assessment across all categories of performance.
In order to meet the research purpose and the data collection requirements from the relatively limited number of FM experts who specialise in the health sector, the research methodology employs a combination of a case study approach as an exploratory tool and a phenomenological approach as the main qualitative design. In achieving its aims and objectives, this research helps improve FM practice in the healthcare sector by:
- providing facility managers with a comprehensive study that highlights the achievements and knowledge gaps in FM and its performance measurement;
- providing top management with a study of the effectiveness of developing a decision-support strategy for making effective and efficient changes to their facility management practices; and
- facilitating the implementation of benchmarking techniques in the health sector, in order to improve the performance of healthcare-sector FM over time.
The study concludes that developing aggregate/collective KPIs for each category of possible measures (social/quality, financial, environmental, functional, and technical) is feasible. However, before that approach can be developed and implemented, an improvement to current FM practices is required. The improvement can be achieved via either a voluntary or a compulsory commitment from top management towards FM in the organization. This commitment should be implemented with an appropriate hierarchy from the bottom to the top. The key components include: establishing the contribution of FM to the success of the overall business in financial terms; setting agreed FM objectives; deriving meaningful general KPIs; defining the data required for each KPI; establishing the system for collecting, analyzing, and interpreting the data; and conducting the benchmarking process for continual improvement.
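The thesis leaves the exact weighting scheme to future development; as a purely hypothetical illustration of what "a weighted assessment across all categories of performance" could look like, the sketch below aggregates normalized per-category KPI scores with invented weights. The five category names come from the abstract; the weights and scores do not.

```python
# Invented weights; the five categories are the abstract's, the numbers are not.
CATEGORY_WEIGHTS = {
    "social/quality": 0.25,
    "financial": 0.25,
    "environmental": 0.15,
    "functional": 0.20,
    "technical": 0.15,
}

def composite_fm_score(scores: dict[str, float]) -> float:
    """Aggregate per-category KPI scores (each normalized to 0-100) into a
    single weighted score usable for benchmarking across facilities."""
    assert abs(sum(CATEGORY_WEIGHTS.values()) - 1.0) < 1e-9
    return sum(CATEGORY_WEIGHTS[c] * scores[c] for c in CATEGORY_WEIGHTS)

hospital_a = {"social/quality": 72, "financial": 65, "environmental": 80,
              "functional": 58, "technical": 70}
print(f"Hospital A composite FM score: {composite_fm_score(hospital_a):.1f}")
```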
358

Flexible certificate management in public key infrastructures

Karatsiolis, Evangelos. Unknown Date (has links) (PDF)
Darmstadt, Techn. University, Diss., 2007.
359

Sicheres Übergangsloses Roaming [Secure Seamless Roaming]

Haisch, Michael. Unknown Date (has links)
Techn. Universität, Diss., 2007--Darmstadt.
360

Formale Modellierung von Authentifizierungs- und Autorisierungsinfrastrukturen: Authentizität von deskriptiven Attributen und Privilegien auf der Basis digitaler Zertifikate [Formal modelling of authentication and authorization infrastructures: authenticity of descriptive attributes and privileges based on digital certificates]

Wölfl, Thomas. January 2006 (has links)
Universität, Diss., 2006--Regensburg.
