  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

An Examination of Site Response in Columbia, South Carolina: Sensitivity of Site Response to "Rock" Input Motion and the Utility of Vs(30)

Lester, Alanna Paige 21 July 2005 (has links)
This study examines the sensitivity of calculated site response to alternative assumptions regarding input motions and the procedures prescribed in the IBC 2000 building code, particularly the use of the average shear wave velocity in the upper 30 meters as an index for engineering design response spectra. Site-specific subsurface models are developed for four sites in and near Columbia, South Carolina, using shear wave velocity measurements from cone penetrometer tests. The four sites are underlain by thin coastal plain sedimentary deposits overlying high-velocity Paleozoic crystalline rock. An equivalent-linear algorithm is used to estimate site response for vertically incident shear waves in a horizontally layered Earth model. Non-linear mechanical behavior of the soils is analyzed using previously published strain-dependent shear modulus and damping degradation models. Two models for the material beneath the investigated near-surface deposits are used: B-C outcrop conditions and hard rock outcrop conditions. The rock outcrop model is considered the more geologically realistic one: a velocity gradient, representing a transition zone of partially weathered and fractured rock, overlies a rock half-space. Synthetic earthquake input motions are generated using the deaggregations from the 2002 National Seismic Hazard Maps, representing the characteristic Charleston source. The U.S. Geological Survey (2002) uniform hazard spectra are used to develop 2%-in-50-year probability of exceedance input ground motions for both B-C boundary and hard rock outcrop conditions. An initial analysis was made for all sites using an 8 meter thick velocity gradient for the rock input model. Sensitivity of the models to uncertainty in the weathered zone thickness was assessed by randomizing the thickness of the velocity gradient. The velocity gradient representing the weathered rock zone increases site response at high frequencies.
Both models (B-C outcrop conditions and rock outcrop conditions) are compared with the International Building Code (IBC 2000) maximum credible earthquake spectra. The results for both models exceed the IBC 2000 spectra at some frequencies between 3 and 10 Hz at all four sites. However, Site 2, which is classified as a C site and is therefore assumed to be the most competent of the four sites according to IBC 2000 design procedures, has the highest calculated spectral acceleration of the four sites analyzed. Site 2 has the highest response because a low-velocity zone exists at the bottom of the geotechnical profile in immediate contact with the higher-velocity rock material, producing a very large impedance contrast. An important shortcoming of the IBC 2000 building code is that it does not account for cases in which there is a strong rock-soil velocity contrast at depths of less than 30 meters. It is suggested that other site-specific parameters, specifically depth to bedrock and near-surface impedance ratio, should be included in the IBC design procedures. / Master of Science
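The Vs(30) index discussed in this record is a travel-time (harmonic) average rather than an arithmetic mean, so slow layers dominate it; even so, it cannot see a sharp velocity contrast at depth, which is the shortcoming raised above. A minimal sketch of the two quantities involved, Vs(30) and the near-surface impedance ratio, with illustrative layer values that are not data from the study:

```python
def vs30(thicknesses_m, velocities_mps):
    """Time-averaged shear wave velocity of the upper 30 m (IBC site class index).

    Vs30 = 30 / sum(h_i / v_i): 30 m divided by the total vertical
    shear-wave travel time, so slow layers dominate the average.
    Layer thicknesses are assumed to sum to exactly 30 m.
    """
    travel_time_s = sum(h / v for h, v in zip(thicknesses_m, velocities_mps))
    return 30.0 / travel_time_s

def impedance_ratio(rho_rock, vs_rock, rho_soil, vs_soil):
    """Seismic impedance contrast (density x velocity) at the rock-soil interface."""
    return (rho_rock * vs_rock) / (rho_soil * vs_soil)

# Illustrative profile (not data from the study): 30 m of slow sediments
# resting directly on crystalline rock, as at the Columbia sites
print(vs30([10, 15, 5], [200, 350, 700]))      # 300.0 m/s
print(impedance_ratio(2600, 2500, 1900, 250))  # ~13.7: a very strong contrast
```

Two profiles with the same Vs(30) can thus produce very different site response if one of them ends in a large impedance jump, which is the study's argument for adding depth-to-bedrock and impedance ratio to the design procedure.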
12

Computational strategies for impedance boundary condition integral equations in frequency and time domains / Stratégies computationelles pour des équations intégrales avec conditions d'impédance aux frontières en domaines fréquentiel et temporel

Dély, Alexandre 15 March 2019 (has links)
L'équation intégrale du champ électrique (EFIE) est très utilisée pour résoudre des problèmes de diffusion d'ondes électromagnétiques grâce à la méthode aux éléments de frontière (BEM). En domaine fréquentiel, les systèmes matriciels émergeant de la BEM souffrent, entre autres, de deux problèmes de mauvais conditionnement : l'augmentation du nombre d'inconnues et la diminution de la fréquence entrainent l'accroissement du nombre de conditionnement. En conséquence, les solveurs itératifs requièrent plus d'itérations pour converger vers la solution, voire ne convergent pas du tout. En domaine temporel, ces problèmes sont également présents, en plus de l'instabilité DC qui entraine une solution erronée en fin de simulation. La discrétisation en temps est obtenue grâce à une quadrature de convolution basée sur les méthodes de Runge-Kutta implicites. Dans cette thèse, diverses formulations d'équations intégrales utilisant notamment des conditions d'impédance aux frontières (IBC) sont étudiées et préconditionnées. Dans une première partie en domaine fréquentiel, l'IBC-EFIE est stabilisée pour les basses fréquences et les maillages denses grâce aux projecteurs quasi-Helmholtz et à un préconditionnement de type Calderón. Puis une nouvelle forme d'IBC est introduite, ce qui permet la construction d'un préconditionneur multiplicatif. Dans la seconde partie en domaine temporel, l'EFIE est d'abord régularisée pour le cas d'un conducteur électrique parfait (PEC), la rendant stable pour les pas de temps larges et immunisée à l'instabilité DC. Enfin, une résolution efficace de l'IBC-EFIE est recherchée, avant de stabiliser l'équation pour les pas de temps larges et les maillages denses. / The Electric Field Integral Equation (EFIE) is widely used to solve wave scattering problems in electromagnetics using the Boundary Element Method (BEM).
In the frequency domain, the linear systems stemming from the BEM suffer, amongst others, from two ill-conditioning problems: the low frequency breakdown and the dense mesh breakdown. Consequently, iterative solvers require more iterations to converge to the solution, or do not converge at all in the worst cases. These breakdowns are also present in the time domain, in addition to the DC instability, which causes the solution to be completely wrong in the late time steps of the simulations. The time discretization is achieved using a convolution quadrature based on Implicit Runge-Kutta (IRK) methods, which yields a system that is solved by Marching-On-in-Time (MOT). In this thesis, several integral equation formulations, most of them involving Impedance Boundary Conditions (IBC), are derived and subsequently preconditioned. In a first part dedicated to the frequency domain, the IBC-EFIE is stabilized for low frequencies and dense meshes by leveraging the quasi-Helmholtz projectors and a Calderón-like preconditioning. Then, a new IBC is introduced to enable the development of a multiplicative preconditioner for the new IBC-EFIE. In the second part, on the time domain, the EFIE is regularized for the Perfect Electric Conductor (PEC) case, making it stable in the large time step regime and immune to the DC instability. Finally, the solution of the time domain IBC-EFIE is investigated by developing an efficient solution scheme and by stabilizing the equation for large time steps and dense meshes.
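The Marching-On-in-Time scheme mentioned above solves the convolution system sum_{k=0..n} Z_k j_{n-k} = e_n one time step at a time: past currents are known, so each step is a single linear solve against Z_0. A toy sketch with small random matrices standing in for the actual IRK-discretized EFIE operators (names, sizes, and values are illustrative only):

```python
import numpy as np

def march_on_in_time(Z, e):
    """Toy Marching-On-in-Time: solve sum_{k=0..n} Z[k] @ j[n-k] = e[n] for j[n].

    Z : list of (N, N) convolution weight matrices (Z[0] must be invertible);
    e : list of (N,) excitation vectors, one per time step.
    Past currents are already known, so each step is one linear solve.
    """
    Z0_inv = np.linalg.inv(Z[0])
    j = []
    for n, e_n in enumerate(e):
        # Move the contribution of already-computed currents to the right-hand side
        rhs = e_n - sum(Z[k] @ j[n - k] for k in range(1, min(n, len(Z) - 1) + 1))
        j.append(Z0_inv @ rhs)
    return j

# Toy system: 4 unknowns, 3 nonzero convolution weights, 10 time steps
rng = np.random.default_rng(0)
Z = [np.eye(4) + 0.1 * rng.standard_normal((4, 4)) for _ in range(3)]
e = [rng.standard_normal(4) for _ in range(10)]
j = march_on_in_time(Z, e)

# Verify the last time step satisfies the original convolution equation
res = e[9] - sum(Z[k] @ j[9 - k] for k in range(3))
print(np.linalg.norm(res))  # ~ 0 (machine precision)
```

The stability issues the thesis addresses live precisely in this recursion: errors injected at one step are fed back through the convolution tail at every later step, which is how the DC instability grows in the late time steps.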
13

Utilisation d'identifiants cryptographiques pour la sécurisation IPv6

Combes, Jean-Michel 28 September 2012 (has links) (PDF)
IPv6, the successor protocol to IPv4, is currently being deployed across the Internet. It relies heavily on the Neighbor Discovery Protocol (NDP) mechanism. NDP not only allows two IPv6 nodes to communicate, much as the Address Resolution Protocol (ARP) does in IPv4, but also provides new features such as IPv6 address autoconfiguration. Securing it is therefore critical to the proper functioning of the IPv6 Internet. Its security mechanism, standardized at the Internet Engineering Task Force (IETF), is called Secure Neighbor Discovery (SEND). SEND relies both on cryptographic identifiers, IPv6 addresses called Cryptographically Generated Addresses (CGA) that are generated from a public/private key pair, and on X.509 certificates. The subject of this thesis is the study of these cryptographic identifiers, CGA addresses, together with the SEND mechanism that employs them, and their potential reuse for securing IPv6. In the first part of this thesis, we lay out the state of the art. In the second part, we examine the reliability of the main known mechanism employing CGA addresses, SEND. In the third and final part, we present uses of cryptographic identifiers for securing IPv6.
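As a rough illustration of how a CGA binds an IPv6 interface identifier to a public key, the sketch below follows the spirit of RFC 3972 but deliberately omits the Hash2 proof-of-work tied to the sec parameter; field sizes and inputs are simplified assumptions, not the full specification:

```python
import hashlib

def cga_interface_id(modifier: bytes, prefix: bytes, public_key: bytes,
                     sec: int = 0, collision_count: int = 0) -> bytes:
    """Simplified CGA interface identifier, in the spirit of RFC 3972.

    Hash1 is the first 8 bytes of SHA-1 over (modifier | prefix | count | key).
    The three leftmost bits are overwritten with the 'sec' parameter and the
    universal/group ('u'/'g') bits are cleared. The Hash2 proof-of-work that
    makes high 'sec' values expensive to forge is omitted in this sketch.
    """
    digest = hashlib.sha1(
        modifier + prefix + bytes([collision_count]) + public_key
    ).digest()
    iid = bytearray(digest[:8])
    iid[0] = (iid[0] & 0x1C) | (sec << 5)  # set sec bits, clear u/g bits
    return bytes(iid)

# Illustrative inputs: a zero modifier, the 2001:db8::/64 prefix, a dummy key
iid = cga_interface_id(b"\x00" * 16, b"\x20\x01\x0d\xb8" + b"\x00" * 4,
                       b"dummy-public-key", sec=1)
print(iid.hex())  # 64-bit interface identifier bound to the public key
```

The security property is that any node can recompute the hash from the advertised public key and verify that the address really was generated from it, without any pre-shared secret or certificate.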
14

Založení firmy v daňovém ráji / Foundation of the Company in Tax Haven

Pospíšil, Miloš January 2013 (has links)
This thesis focuses on the possibility of setting up business entities in tax havens and their use in tax optimization. The reader is introduced to the most interesting offshore and onshore localities, their advantages, risks, and possibilities of use. I clarify the nature of the tax advantages of various types of companies, including explanations of terms originating from Anglo-American jurisdictions. I describe the specific steps required to set up a company in Grenada, including sample documents and a model example demonstrating the tax optimization of a hypothetical business company. The resulting company structure is subject to a lower tax burden, which consequently increases the competitiveness of its products in global markets.
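The arithmetic behind such a model example can be sketched in a few lines; the rates and administrative costs below are invented for illustration and are not figures from the thesis:

```python
def retained_profit(profit, corporate_rate, admin_costs=0.0):
    """After-tax profit under a flat corporate rate, net of running costs."""
    return profit * (1.0 - corporate_rate) - admin_costs

# Purely illustrative figures: a 19% home-country rate versus a zero-rate
# offshore jurisdiction with 25,000 in annual incorporation/administration fees
home = retained_profit(1_000_000, 0.19)
offshore = retained_profit(1_000_000, 0.00, admin_costs=25_000)
print(home, offshore)  # 810000.0 975000.0
```

The offshore structure only pays off when the tax saved exceeds the setup and administration overhead, which is why the thesis weighs localities by costs and risks, not by headline rates alone.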
15

Looking for the high-mass progenitors of stripped-envelope supernovae

Karamehmetoglu, Emir January 2018 (has links)
Stripped-envelope supernovae were thought to be the explosions of very massive stars (≳ 20 M☉) that lost their outer layers of hydrogen and/or helium in strong stellar winds. However, recent studies have highlighted that most stripped-envelope supernovae seem to arise from relatively lower-mass progenitor stars in the 12–20 M☉ range, creating a mystery about the fate of the higher-mass stars. In this licentiate thesis, we review our knowledge of stripped-envelope supernovae and present the astrophysical problem of their missing high-mass progenitors. The thesis focuses on observations of unique and rare stripped-envelope supernovae classified with modern optical surveys such as the intermediate Palomar Transient Factory (iPTF) and the Public European Southern Observatory Spectroscopic Survey of Transient Objects (PESSTO). In these surveys we have discovered stripped-envelope supernovae with long-lasting broad lightcurves, which are thought to be a marker for highly massive (≳ 20 M☉) progenitor stars. Despite this exciting association, there are only a handful of examples of stripped-envelope supernovae with broad lightcurves published in the literature, not numerous enough to account for the missing high-mass stars. The first object we focused on was OGLE-2014-SN-131, a long-lasting supernova in the southern sky initially classified by PESSTO. We re-classified it as a Type Ibn supernova interacting with a helium-rich circumstellar environment. Unlike all other Type Ibn supernovae in the literature, OGLE-2014-SN-131 was found to have a long rise-time and a broad lightcurve. By modeling its bolometric lightcurve, we concluded that OGLE-2014-SN-131 must have had an unusually massive progenitor star.
Furthermore, since an ordinary radioactive-decay model could not reproduce the lightcurve, we investigated both a magnetar and circumstellar interaction as potential powering scenarios and favored the latter due to the signatures of interaction present in the spectra. Next, we looked for similar objects in the supernova dataset of the iPTF, which contains over 200 stripped-envelope supernovae. Searching in a sub-sample of 100 well-observed supernovae, we identified 11 with unusually broad lightcurves. We also constrained the distribution of lightcurve broadness for iPTF stripped-envelope supernovae. The 11 supernovae with broad lightcurves will be studied carefully in a forthcoming paper. The first part of this forthcoming paper, which describes the careful statistical identification of these supernovae, is included in this thesis. In it we find that 10% of the iPTF stripped-envelope supernova sample have broad lightcurves, which is a surprisingly high fraction given their rarity in the published literature. Finally, we evaluate whether our estimate of the fraction of broad stripped-envelope supernovae could help explain the missing high-mass progenitors, and conclude that they can account for only a small fraction of them.
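The ordinary radioactive-decay model referred to above powers the lightcurve with the 56Ni → 56Co → 56Fe chain. A sketch of the full-trapping decay power, using standard literature values for the energy rates and decay timescales (quoted from memory, so treat them as approximate):

```python
import math

# Decay energy rates per gram of initial 56Ni (erg/s/g) and e-folding
# times (days); approximate standard literature values
EPS_NI, EPS_CO = 3.9e10, 6.8e9
TAU_NI, TAU_CO = 8.8, 111.3
M_SUN_G = 1.989e33  # grams per solar mass

def decay_luminosity(t_days, m_ni_msun):
    """Instantaneous 56Ni -> 56Co -> 56Fe decay power in erg/s.

    Full-trapping approximation: the supernova radiates the decay energy
    deposited at time t, roughly valid near and after lightcurve peak.
    """
    m_ni = m_ni_msun * M_SUN_G
    l_ni = EPS_NI * math.exp(-t_days / TAU_NI)
    l_co = EPS_CO * (math.exp(-t_days / TAU_CO) - math.exp(-t_days / TAU_NI))
    return m_ni * (l_ni + l_co)

# The decay input for a fixed nickel mass fades by roughly a factor of three
# between day 20 and day 100; a lightcurve that stays bright and broad
# therefore needs a large ejecta mass or a different power source
print(decay_luminosity(20, 0.2), decay_luminosity(100, 0.2))
```

This fixed decline rate is why a lightcurve too broad for any plausible nickel mass pushes the analysis toward magnetar or circumstellar-interaction powering, as in the OGLE-2014-SN-131 case above.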
16

Secure collection and data management system for WSNs

Drira, Wassim 10 December 2012 (has links) (PDF)
Nowadays, each user or organization is connected to a large number of sensor nodes which generate a substantial amount of data, making their management a non-trivial issue. In addition, these data can be confidential. For these reasons, developing a secure system for managing the data from heterogeneous sensor nodes is a real need. In the first part, we developed a composite-based middleware for wireless sensor networks to communicate with the physical sensors and to store, process, index, and analyze sensor data and generate alerts on it. Each composite is connected to a physical node or used to aggregate data from different composites, and each physical node communicating with the middleware is set up as a composite. The middleware was used in the context of the European project Mobesens to manage data from a sensor network for monitoring water quality. In the second part of the thesis, we propose a new hybrid authentication and key establishment scheme between sensor nodes (SN), gateways (MN) and the middleware (SS). It is based on two protocols. The first protocol provides mutual authentication between SS and MN, an asymmetric key pair for MN, and a pairwise key established between them. The second protocol authenticates SN to the other two parties and establishes a group key and pairwise keys between SN and each of them. In the third part, the middleware was generalized to provide a private space for multiple organizations or users to manage their sensor data using cloud computing. Finally, we extended the composites with gadgets to securely share sensor data, in order to provide a secure social sensor network.
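The two protocols themselves are not specified in this abstract, so the sketch below shows only the generic ingredient they build on: two parties that share a master secret deriving the same pairwise session key from exchanged nonces. The HMAC construction and identifiers are illustrative assumptions, not the thesis's actual scheme:

```python
import hashlib
import hmac
import os

def derive_pairwise_key(master_secret: bytes, my_id: bytes, peer_id: bytes,
                        my_nonce: bytes, peer_nonce: bytes) -> bytes:
    """HKDF-like pairwise key derivation (illustrative, not the thesis scheme).

    Both parties know the master secret and have exchanged fresh nonces;
    sorting the (id | nonce) pairs makes the derivation symmetric, so the
    sensor node and the gateway compute the same 32-byte session key.
    """
    first, second = sorted([my_id + my_nonce, peer_id + peer_nonce])
    return hmac.new(master_secret, first + second, hashlib.sha256).digest()

master = os.urandom(32)                    # pre-shared with the middleware
na, nb = os.urandom(16), os.urandom(16)    # nonces exchanged in the handshake
k_at_sensor = derive_pairwise_key(master, b"SN-1", b"MN-7", na, nb)
k_at_gateway = derive_pairwise_key(master, b"MN-7", b"SN-1", nb, na)
print(k_at_sensor == k_at_gateway)  # True: both ends hold the same key
```

Fresh nonces give each session a distinct key even though the master secret is fixed, which is the usual reason for separating long-term authentication material from per-session pairwise keys.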
17

Les Zones franches d'exportation au carrefour entre les politiques d'exception et les cycles d'accumulation du capital

Falaise, Cynthia January 2015 (has links)
En érigeant un cadre théorique réunissant les concepts de fixation spatiale et d'état d'exception, la présente thèse vise à brosser un portrait des Zones franches d'exportation (ZF) à la lumière du cas salvadorien entre les années 1980 et 2000. L'objectif est de proposer des pistes de réflexion quant au caractère exceptionnel des ZF et leur insertion dans les cycles d'accumulation capitalistes. Sont ainsi abordés les discours légitimant la création des ZF et leur maintien, les moyens financiers, juridiques et logistiques déployés pour permettre leur construction, les relations de pouvoir exceptionnelles qui caractérisent les rapports intra et extra-ZF ainsi que les intérêts que les ZF servent. Cette thèse se veut donc une tentative de compréhension du processus général à l'oeuvre derrière les ZF, dans l'objectif de mieux cibler les possibilités de résistance. *** This thesis aims to analyse the emergence of export processing zones (EPZ) through a theoretical framework that combines the concepts of spatial fix and state of exception. Taking as an illustration the Salvadoran case in the 1980-2000 period, the objective is to suggest means for reflecting on the exceptional character of the EPZ and their insertion in cycles of capitalist accumulation. The thesis also explores the discourses legitimizing the creation and retention of EPZ, the logistical, juridical and financial means deployed to allow their construction, the exceptional power relations that characterize interactions both inside and outside the EPZ, and the interests served by EPZ. This thesis is thus an attempt to grasp the general process behind the EPZ, in order to identify possibilities of resistance.
18

Assessing the Repercussions of a Mass Departure of Building Inspectors from the Code Professional Industry in Utah

Williams, George Reese 01 June 2015 (has links) (PDF)
National studies suggest that eighty percent of existing code professionals are expected to retire within the next fifteen years. As part of this research, it was determined that approximately half of all licensed building inspectors in the State of Utah will reach retirement age within the next ten years. Because building inspectors make up a large part of the Code Professional Industry, this demographic was selected as the focus of this research. The purpose of this research project was to assess the urgent need for new entrants into the Code Professional Industry in Utah. As part of this research, trends within the local industry over a 20-year period were evaluated. A statewide survey of over 300 licensed building inspectors was conducted to investigate the demographics of the industry and gain first-hand insight from individual code professionals. This research was successful in quantifying the size of the Code Professional Industry in Utah and numbering the populations of certified professionals in each individual code discipline. In addition, projected losses were established within each code discipline, revealing many disciplines in which over 50% of current professionals would be lost within a ten-year period. Projections were also made contrasting the number of code professionals leaving the industry with the small anticipated number of individuals entering it. This research conclusively predicts a steady and dramatic decline in the number of licensed code professionals unless the industry actively works toward addressing the issue. The group of aging code professionals possesses a level of knowledge and experience not easily replaced. This study was based on an extensive statewide survey of licensed building inspectors in Utah, and collected opinions, concerns and insights directly from the Code Professional Industry. The findings of this study provide a unique look at this specialized industry within a single state.
The lessons learned likely apply to populations of code professionals in other locations. This study concluded that a combination of phased retirement, modified work duties and mentoring programs would be of great benefit to the Code Professional Industry, by allowing the transfer of knowledge between the outgoing generation and the future generation of code professionals.
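The attrition dynamic described in this record, a fixed cohort retiring faster than new entrants arrive, can be sketched as a simple linear projection; the headcounts and rates below are illustrative, not the survey's figures:

```python
def project_workforce(current, annual_retirees, annual_entrants, years):
    """Year-by-year headcount under constant retirement and entry rates.

    A deliberately simple linear model of the attrition dynamic the study
    describes; headcount cannot fall below zero.
    """
    counts = [current]
    for _ in range(years):
        counts.append(max(0, counts[-1] - annual_retirees + annual_entrants))
    return counts

# Illustrative: 300 inspectors, half retiring over ten years (15/yr),
# with only a third as many new entrants (5/yr)
print(project_workforce(300, 15, 5, 10))
# ends at 200: a one-third decline in a decade under these assumed rates
```

Phased retirement and mentoring, the remedies the study proposes, act on this model by lowering the effective annual_retirees figure and raising annual_entrants through knowledge transfer.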
19

Exploring the Dimensions of the Learning Organization Questionnaire (DLOQ) for Startup Learning Environments

Morris, Mark Orlando 07 1900 (has links)
Although the Dimensions of the Learning Organization Questionnaire (DLOQ) has been widely accepted by the HRD community, it has not been tested for reliability in a startup context. The purpose of the current study is to explore whether the DLOQ is a reliable instrument for helping startups be more successful, addressing some of the questions posed by previous researchers. The study utilizes a mixed-method design, applying Cronbach alpha values to check the reliability of the instrument in a startup learning environment with more than 600 participants and 42 startup businesses at a university in the Mountain West. The study uses objective financial measures for startup firms to explore the correlation between the seven dimensions of the DLOQ and startup performance at the university. Cronbach alphas for the instrument measured at the .80 level or higher. Four of the dimensions were found to be statistically significant, resulting in a model that accounted for 30% of the variance in predicted Operating Income (p<.004) and 29% of the variance in predicted Net Income (p<.003). The study also uses qualitative analysis to explore what activities relate to the seven dimensions of the DLOQ, and whether those activities would be considered complex and disruptive. Thirteen activity clusters were identified and found to be relevant to startups and the seven dimensions of the DLOQ.
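Cronbach's alpha, the reliability statistic used throughout this study, compares the sum of the individual item variances with the variance of the total score: when items move together, the total's variance dwarfs the item variances and alpha approaches 1. A minimal sketch with made-up survey data (not scores from the study):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score),
    using population (n-denominator) variances throughout. Each element of
    `items` is one questionnaire item's scores across all respondents.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Three highly consistent survey items from five respondents (made-up data)
items = [[4, 5, 3, 5, 2], [4, 4, 3, 5, 2], [5, 5, 3, 4, 2]]
print(round(cronbach_alpha(items), 2))  # 0.94 - above the common .80 threshold
```

The ".80 level or higher" reported above is the conventional threshold for treating an instrument's scale as internally consistent.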
20

A 3D-printed Fat-IBC-enabled prosthetic arm : Communication protocol and data representation

Engstrand, Johan January 2020 (has links)
The aim of this thesis is to optimize the design of the Fat-IBC-based communication of a novel neuroprosthetic system in which a brain-machine interface is used to control a prosthetic arm. Fat-based intra-body communication (Fat-IBC) uses the fat tissue inside the body of the bearer as a transmission medium for low-power microwaves. Future projects will use the communication system and investigate ways to control the prosthetic arm directly from the brain. The finished system was able to individually control all movable joints of multiple prosthesis prototypes using information that was received wirelessly through Fat-IBC. Simultaneous transmission in the other direction was possible, with the control data then being replaced by sensor readings from the prosthesis. All data packets were encoded with the COBS/R algorithm and the wireless communication was handled by Digi Xbee 3 radio modules using the IEEE 802.15.4 protocol at a frequency of 2.45 GHz. The Fat-IBC communication was evaluated with the help of so-called "phantoms" which emulated the conditions of the human body fat channel. During said testing, packet loss measurements were performed for various combinations of packet sizes and time intervals between packets. The packet loss measurements showed that the typical amount of transmitted data could be handled well by the fat channel test setup. Although the transmission system was found to be well-functioning in its current state, increasing the packet size to achieve a higher granularity of the movement was perceived to be viable considering the findings from the packet loss measurements.
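COBS encoding removes all zero bytes from a packet so that 0x00 can serve as an unambiguous frame delimiter on the link. The sketch below implements classic COBS rather than the COBS/R variant the thesis actually uses (COBS/R adds a refinement that can save one byte per packet); the framing idea is the same:

```python
def cobs_encode(data: bytes) -> bytes:
    """Consistent Overhead Byte Stuffing: the output contains no 0x00 bytes.

    Each zero byte is replaced by a 'code' byte giving the offset to the
    next zero; runs of 254 non-zero bytes get an extra code byte (0xFF).
    """
    out = bytearray()
    code_idx, code = 0, 1
    out.append(0)  # placeholder for the first code byte
    for byte in data:
        if byte == 0:
            out[code_idx], code_idx, code = code, len(out), 1
            out.append(0)
        else:
            out.append(byte)
            code += 1
            if code == 0xFF:  # maximum group length reached
                out[code_idx], code_idx, code = code, len(out), 1
                out.append(0)
    out[code_idx] = code
    return bytes(out)

def cobs_decode(enc: bytes) -> bytes:
    """Inverse of cobs_encode: restore the zero bytes between groups."""
    out = bytearray()
    i = 0
    while i < len(enc):
        code = enc[i]
        out.extend(enc[i + 1:i + code])
        i += code
        if code < 0xFF and i < len(enc):
            out.append(0)  # implicit zero separating the groups
    return bytes(out)

packet = b"\x11\x22\x00\x33"
print(cobs_encode(packet).hex())                   # 0311220233 (no zero bytes)
print(cobs_decode(cobs_encode(packet)) == packet)  # True
```

Because the overhead is at most one byte per 254 bytes of payload, the encoding fits the small, timing-sensitive control packets sent over the fat channel.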
