541

Performance Assessment of Shear-critical Reinforced Concrete Plane Frames

Guner, Serhan 19 January 2009
Current analysis procedures for new reinforced concrete structures are typically based on linear-elastic principles. However, under certain conditions, it may be necessary to analyze a structure more rigorously to predict its structural behaviour accurately. Such an analysis can be performed using nonlinear analysis procedures, which typically require specialized software. Few such programs exist, and most of those available do not adequately capture shear-related influences, potentially severely overestimating the strength and ductility of shear-critical structures. The purpose of this study is to develop and verify an analytical procedure for the nonlinear analysis of frame structures, with the aim of capturing shear-related mechanisms as well as flexural and axial effects. A previously developed analysis program, VecTor5, is further developed for this purpose. Originally formulated in the early 1980s at the University of Toronto, VecTor5 is based on the Modified Compression Field Theory (MCFT) and is capable of performing nonlinear frame analyses under temperature and monotonic loading conditions. Although it provides generally satisfactory simulations, its computational algorithms contain a number of deficiencies. This study consists of three major parts: improvement of the original analysis procedure for monotonic loading conditions; expansion of the procedure to general loading conditions, including the special cases of cyclic and reversed-cyclic loading; and further development of the procedure for dynamic loading conditions, including time-varying base accelerations, impulse, impact and blast forces, initial mass velocities, and constant mass accelerations. Each part is supported by verification studies performed on a large number and variety of previously tested structures available in the literature. In addition, considerations in nonlinear modelling are discussed with the aim of providing guidelines for general modelling applications. Analyses of 63 previously tested structures, half of which are shear-critical, demonstrate that the developed analytical procedure is highly successful in simulating the experimental responses in terms of load-deflection response, reinforcement strains, crack widths, failure mode, failure displacement, total energy dissipation, displacement ductility ratio, and post-peak vibrational characteristics.
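For readers unfamiliar with such procedures, the sketch below illustrates the generic incremental-iterative (Newton-Raphson) strategy that nonlinear frame analysis programs of this kind follow. It is a minimal illustration only: the internal_force and tangent_stiffness callbacks are hypothetical placeholders, not VecTor5's MCFT-based sectional formulation.

    import numpy as np

    def solve_load_step(u, f_ext, internal_force, tangent_stiffness, tol=1e-6, max_iter=50):
        """One Newton-Raphson load step of a nonlinear frame analysis.
        internal_force(u) and tangent_stiffness(u) stand in for the element
        state determination (in a program such as VecTor5 this would be the
        MCFT-based sectional analysis)."""
        for _ in range(max_iter):
            residual = f_ext - internal_force(u)   # out-of-balance nodal forces
            if np.linalg.norm(residual) < tol:
                return u                           # equilibrium reached
            du = np.linalg.solve(tangent_stiffness(u), residual)
            u = u + du                             # displacement update
        raise RuntimeError("load step did not converge")

    def monotonic_analysis(u0, f_ref, internal_force, tangent_stiffness, n_steps=100):
        """Apply the reference load pattern f_ref in n_steps increments
        (monotonic loading) and return the displacement history."""
        u, history = np.asarray(u0, dtype=float), []
        for k in range(1, n_steps + 1):
            u = solve_load_step(u, (k / n_steps) * np.asarray(f_ref, dtype=float),
                                internal_force, tangent_stiffness)
            history.append(u.copy())
        return history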
542

Effective Scheduling Algorithms for I/O Blocking with a Multi-Frame Task Model

TAKADA, Hiroaki, TOMIYAMA, Hiroyuki, DING, Shan 01 July 2009
No description available.
543

The Physical Underpinning of Security Proofs for Quantum Key Distribution

Boileau, Jean Christian 25 September 2007
The dawn of quantum technology unveils a plethora of new possibilities and challenges in the world of information technology, one of which is the quest for secure information transmission. A breakthrough in classical algorithms or the development of a quantum computer could threaten the security of messages encoded using public-key cryptosystems based on one-way functions, such as RSA. Quantum key distribution (QKD) offers an unconditionally secure alternative to such schemes, even with the advent of a quantum computer, as it does not rely on mathematical or technological assumptions, but rather on the universality of the laws of quantum mechanics. Physical concepts associated with quantum mechanics, like the uncertainty principle or entanglement, paved the way to the first successful security proof for QKD. Since then, progress in security proofs for QKD has been remarkable, but the connection between entanglement distillation and the uncertainty principle has remained hidden under a heavy mathematical formalism. Our main goal is to dig the physics out of the new advances in security proofs for QKD. By introducing an alternative definition of a private state, which elaborates on the ideas of Mayers and Koashi, we explain how the security of all QKD protocols follows from an entropic uncertainty principle. We show explicitly how a privacy amplification protocol can be reduced to a private state distillation protocol constructed from our observations about the uncertainty principle. We also derive a generic security proof for one-way permutation-invariant QKD protocols. Considering collective attacks, we achieve the same secret key generation rate as the Devetak-Winter bound. Generalizing an observation by Kraus, Branciard and Renner, we provide improved secret key generation rates by considering a different symmetrization. In certain situations, we argue that Azuma's inequality can simplify the security proof considerably, and we explain the implications, at the level of security, of reducing a QKD protocol to an entanglement distillation or a more general private state distillation protocol. In a different direction, we introduce a QKD protocol with multiple-photon encoding that can be implemented without a shared reference frame. We prove the unconditional security of this protocol, and discuss some features of the efficiency of multiple-photon QKD schemes in general.
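For orientation, two textbook results the abstract refers to can be stated compactly; these are the standard forms, not the thesis's own notation or derivation. Under collective attacks the asymptotic secret key rate satisfies the Devetak-Winter bound, and the role of the uncertainty principle is captured by entropic relations of the Maassen-Uffink type for two measurement bases X and Z with maximum overlap c:

    r \;\geq\; I(A{:}B) \;-\; \chi(A{:}E)   % Devetak-Winter rate: Alice-Bob mutual
                                            % information minus Eve's Holevo information
    H(X) + H(Z) \;\geq\; \log_2 \frac{1}{c},
    \qquad c = \max_{j,k} \lvert \langle x_j \vert z_k \rangle \rvert^{2}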
544

An adaptive solution for power efficiency and QoS optimization in WLAN 802.11n

Gomony, Manil Dev January 2010
The widespread use of IEEE 802.11 Wireless LAN in battery-operated mobile devices has introduced the need to optimize power consumption while meeting the Quality-of-Service (QoS) requirements of applications connected through the wireless network. The IEEE 802.11 standard specifies a baseline power-saving mechanism, hereafter referred to as standard Power Save Mode (PSM), and the IEEE 802.11e standard specifies the Automatic Power Save Delivery (APSD) enhancement, which provides support for real-time applications with QoS requirements. The latest amendment to the WLAN 802.11 standard, IEEE 802.11n, enables much higher data rates through enhancements in the Physical and MAC layers. In this thesis, different 802.11n MAC power-saving and QoS optimization possibilities are analyzed and compared against existing power-saving mechanisms. Initially, the performance of the existing power-saving mechanisms PSM and Unscheduled APSD (UAPSD) is evaluated using the 802.11n process model in the OPNET simulator, and the impact of the frame aggregation feature introduced in the 802.11n MAC layer on these mechanisms is analyzed. The performance analysis shows that frame aggregation is efficient under congested network conditions. As the network congestion level increases, the signaling load in UAPSD saturates the channel capacity and hence results in poorer performance than PSM. Since PSM cannot guarantee the minimum QoS requirements of delay-sensitive applications, a mechanism is proposed to enhance the performance of UAPSD under dynamic network conditions. The functionality and performance of the proposed algorithm are evaluated under different network conditions and with different contention settings. The results show that the proposed algorithm dynamically reduces the congestion level in the network and, by utilizing the frame aggregation feature efficiently, provides better power saving and QoS.
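The abstract does not detail the proposed algorithm, so the following is only a generic illustration, with made-up names and thresholds, of the kind of congestion-driven adaptation it describes: observe an indicator of channel congestion and trade per-frame UAPSD triggering for heavier frame aggregation when the channel saturates.

    def adapt_power_save(retry_rate, delay_budget_ms=50, congestion_threshold=0.3):
        """Hypothetical adaptation rule (illustration only; not the thesis's
        actual algorithm): choose how many frames to aggregate per UAPSD
        service period and how often to trigger one, based on an observed
        congestion indicator such as the MAC retry rate."""
        if retry_rate > congestion_threshold:
            # Saturated channel: batch more frames per trigger to cut
            # per-frame signalling overhead, accepting extra buffering delay.
            aggregated_frames = 8
            service_interval_ms = min(4 * delay_budget_ms, 200)
        else:
            # Lightly loaded channel: trigger often for low latency and
            # return to doze quickly to save power.
            aggregated_frames = 1
            service_interval_ms = delay_budget_ms
        return aggregated_frames, service_interval_ms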
545

Jämförelse Mellan Lätt och Tung Stomme på ett Kontorshus / Comparison between Light and Heavy Frame of an Office Building

Aljija, Elnes January 2012
At the start of every new project, one is faced with the various frame and floor-structure alternatives available, and the question is often which alternative is optimal for the project at hand. A universally optimal solution does not exist, because of the different factors and conditions that govern each project, such as economy, building type and terrain type. The question addressed in this report is whether glulam (glue-laminated timber) or concrete is the more suitable frame material for the project in question. The comparison was made, on the basis of the given preconditions, by designing parts of the project in both materials, with the focus on differences in building height, wind stability and the foundation. The results indicate that for an office building or a multi-storey building, concrete is the more suitable alternative compared with glulam. There is a difference in total building height, but it is surprisingly small between the two frame materials; the difference in floor-structure depth, however, is considerable, owing to the deflection and springiness requirements imposed on glulam floors. Both frame materials handle wind stability without plinths, which is quite interesting, especially considering glulam's low weight. For the foundation, the difference between glulam and concrete was significantly greater. The design was carried out according to the Eurocodes, using hand calculations as well as the Strusoft software.
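The floor-depth difference reported above is governed by serviceability criteria. As a purely illustrative example (a generic textbook check with a typical limit value, not the thesis's actual calculations), the mid-span deflection of a simply supported floor beam of span L under a uniformly distributed load q must stay below a limit such as L/300, which pushes a low-stiffness glulam section toward greater depth than an equivalent concrete one:

    \delta_{\max} \;=\; \frac{5\,q\,L^{4}}{384\,E I} \;\leq\; \frac{L}{300}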
547

Från 2D till BIM i ett trähusföretag : Transition from 2D-CAD to BIM in a timber frame home company

Fält, Pernilla January 2009
A prefabricated timber frame house previously built by Villafabriken AB has been modeled in Autodesk Revit Architecture, a 3D program based on BIM technology. This has been done to see whether it is possible to produce the documents that Villafabriken requires from the design engineer's work, and to examine which additional values may arise compared with traditional 2D CAD drafting. BIM is short for Building Information Modeling. Everything is stored in a single database, and a change in the project file is automatically updated across the project. BIM provides more than just drawings, since information from the model can be retrieved in various ways, such as lists and quantity schedules. It was possible to produce the documents that Villafabriken demanded using Revit, but BIM does not only mean a new way of drawing; it also requires a change in the company's processes where the information from the model is used.
548

Mathematical Formulation of Tools for Assessment of Fragility and Vulnerability of Damaged Buildings

Li, Quanwang 11 April 2006
Performance-Based Engineering (PBE) and Consequence-Based Engineering (CBE) are new approaches to seismic design, evaluation and risk assessment, in which design criteria are devised to achieve stated performance objectives, and regional losses to civil infrastructure are mitigated through selective interventions for its critical components. These new approaches give engineers more flexibility in achieving performance goals but require substantial additional computational resources. As a step toward making such approaches feasible, this dissertation develops a number of computationally efficient methods for performing finite-element-based structural system dynamic response analysis and reliability assessment. The Enhanced Uncoupled Modal Response History Analysis (EUMRHA) procedure developed herein makes the analysis of nonlinear dynamic structural response to earthquakes less time-consuming. This technique is used to investigate the potential for aftershocks to cause additional damage to steel moment-frame buildings, utilizing a technique designed to enhance the efficiency of Monte Carlo simulation in estimating low-probability events. Relatively simple probabilistic tools are proposed for rapid structural evaluation and condition assessment of damaged buildings. Finally, an analysis-based inspection scheme built on an associated probability model of connection damage is proposed for assessing the safety condition of existing buildings, and a procedure is developed to assess the likely performance of an un-repaired building during a future earthquake.
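As a generic illustration of the kind of simulation-based fragility estimate involved (not the dissertation's EUMRHA procedure; the demand and capacity models and parameters below are hypothetical), the conditional probability of exceeding a damage state at a given ground-motion intensity can be estimated by Monte Carlo sampling of uncertain demand and capacity. Plain sampling becomes expensive for low-probability events, which is exactly why efficiency-enhancing techniques such as the one mentioned above are needed.

    import numpy as np

    rng = np.random.default_rng(0)

    def fragility_point(im, n_samples=200_000):
        """Estimate P(demand > capacity | intensity measure = im) by plain
        Monte Carlo. The lognormal demand/capacity models are hypothetical
        placeholders; in practice the demand would come from nonlinear
        response-history analyses, and rare-event estimates would use
        variance reduction rather than plain sampling."""
        demand = rng.lognormal(mean=np.log(0.01 * im), sigma=0.35, size=n_samples)
        capacity = rng.lognormal(mean=np.log(0.02), sigma=0.30, size=n_samples)
        return float(np.mean(demand > capacity))

    # Fragility curve sampled at a few spectral-acceleration levels (g).
    curve = {im: fragility_point(im) for im in (0.5, 1.0, 1.5, 2.0, 3.0)}
    print(curve)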
549

CSTN LCD Frame Rate Controller For Image Quality Enhancement

Lee, Chien-te 20 July 2010
This thesis focuses on the FRC (Frame Rate Control) method used in LCD panels, and proposes a new algorithm to reduce the flicker problem. The proposed algorithm can be implemented with simple digital circuits and low power consumption, and the design can be applied to both mono and color STN panels. It can generate 32,768 colors on a panel that originally allows only 8 colors, without flicker or motion-line problems. The major contribution of this thesis is to add a location number to each pixel of the panel. Notably, the numbers assigned to the pixels cannot follow a regular pattern; otherwise the flicker problem is resolved at the expense of a serious motion-line issue, and the consequence is poor display quality. To resolve both the flicker and motion-line problems, we propose to employ a PRSG (Pseudo Random Sequence Generator), which generates a non-regular number sequence for all the pixels. As a result, the ON pixels are dispersed across the panel in all frames.
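The dithering idea described above can be sketched as follows; this is a simplified software illustration under assumed parameters, not the thesis's circuit. Each pixel receives a pseudo-random location number from an LFSR-based PRSG, and a pixel that should show a fractional grey level k/N is driven ON only in frames where its phase-shifted frame counter falls below k, so neighbouring pixels do not switch in lockstep.

    def lfsr_sequence(length, seed=0xACE1):
        """16-bit Fibonacci LFSR (taps 16, 14, 13, 11) used as a simple
        pseudo-random sequence generator (PRSG) for per-pixel offsets."""
        state, out = seed, []
        for _ in range(length):
            bit = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
            state = (state >> 1) | (bit << 15)
            out.append(state)
        return out

    def pixel_on(level, frame, offset, n_frames=16):
        """Temporal dithering: a pixel with target duty level/n_frames is ON
        in exactly `level` out of every `n_frames` frames, with its schedule
        phase-shifted by a per-pixel pseudo-random offset so that neighbours
        do not toggle together (the cause of flicker and motion lines)."""
        return (frame + offset) % n_frames < level

    # Example: an 8x8 tile where every pixel wants grey level 5/16.
    offsets = lfsr_sequence(64)
    frame0 = [pixel_on(5, 0, off) for off in offsets]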
550

Dewatering of Biological Sludges by an Electrokinetics-Assisted Filter Press System

Chen, Min-Cong 03 March 2012
The objective of this research was to evaluate the technical and economic feasibility of employing an electric field to enhance the dewatering performance of two types of biological sludge in a pilot-scale plate-and-frame filter press. A biological industrial sludge and a biological municipal sludge were collected and tested. Jar testing showed that a low or medium molecular weight cationic polymer at a dose of 0.008 wt% would yield satisfactory flocculation for the biological industrial sludge, whereas an iron-based coagulant at a dose of 0.08 wt% would meet the conditioning needs of the biological municipal sludge. To find the optimal dewatering conditions for the sludges concerned, experimental designs based on the Taguchi method were adopted; specifically, L8(2^7) and L18(2^1 × 3^7) orthogonal arrays were selected for the biological industrial sludge and the biological municipal sludge, respectively. Applied mechanical pressure and time, electrode array, and electrodewatering time were among the operating parameters of concern. Test results showed that a 10-15% increase in dewatering efficiency was obtained for both sludges with the parallel circuit and the parallel-series circuit. However, the filtrate quality deteriorated, particularly in pH, turbidity, and chemical oxygen demand. In addition, owing to ohmic heating, the filtrate temperature could rise to 80 °C or even higher depending on the operating conditions employed; the filtrate should therefore be recirculated back to the wastewater treatment system for proper treatment. To identify the significant controlling factors and optimal operating conditions for electrodewatering in a more systematic manner, the final sludge cake moisture and energy consumption of each test were subjected to formal analysis and analysis of variance. For the biological industrial sludge, the flocculant type and applied filtration pressure were found to be the most significant controlling factors for the final sludge cake moisture, whereas the applied electric field strength was the most significant factor for the power consumption. In the case of the biological municipal sludge, the electrode array was the most significant controlling factor for both the final sludge cake moisture and the power consumption. Finally, the theoretically optimal operating conditions for electrodewatering were subjected to verification tests for both sludges. These yielded a final sludge cake moisture of 67.1 ± 3.9% and an energy consumption of 72.6 kWh/ton dry solids for the biological industrial sludge, and 68.1 ± 3.4% and 18.6 kWh/ton dry solids for the biological municipal sludge, validating the predictions made by the Taguchi method. It may therefore be concluded that electrodewatering with the electrokinetics-assisted filter press system employed in this work is technically and economically feasible for treating both biological industrial and biological municipal sludge.
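As a generic illustration of the Taguchi analysis used (the standard smaller-the-better signal-to-noise ratio applied to hypothetical numbers, not the measured data), the S/N ratio is computed for each run of the orthogonal array and averaged per factor level to identify which factors most influence the cake moisture.

    import numpy as np

    def sn_smaller_is_better(y):
        """Taguchi 'smaller-the-better' signal-to-noise ratio,
        S/N = -10 * log10(mean(y^2)), for the replicate responses y of one
        experimental run (e.g. final cake moisture in %)."""
        y = np.asarray(y, dtype=float)
        return -10.0 * np.log10(np.mean(y ** 2))

    # Hypothetical replicates for two runs differing only in the level of one
    # factor (say, applied filtration pressure); the level with the higher
    # S/N ratio (lower moisture) is preferred, and the gap between level
    # averages indicates how influential the factor is.
    print(sn_smaller_is_better([72.0, 73.5, 71.8]),
          sn_smaller_is_better([67.9, 68.4, 69.1]))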
