161

Implementering av BIM i kalkyleringsskedet / Implementation of BIM in the calculation stage

Wallin, Dan, Andersson, Benny January 2015
Purpose: The purpose of this thesis is to analyse to what extent BIM is used in calculation (cost estimation) work, and how the obstacles can be overcome in order to increase the implementation of BIM in that work. Method: We used a literature review and a number of qualitative interviews. Findings: There are several obstacles to overcome in order to increase the implementation of BIM in the calculation work. One obstacle is that BIM models are missing from the clients' tender documents, and when they are included their quality is insufficient. Further obstacles are legal and technical problems, together with the industry's resistance to new technology, insufficient training in BIM tools, the lack of a common industry standard, and the need for the actors involved to understand that BIM is a tool from which everyone can benefit.
Implications: BIM has the potential to make calculations more precise, more time-saving and more reliable. Our analysis is that Peab is on the right path. They do not wait for clients to deliver a usable BIM model with the tender documentation; instead, management has decided to build geometrically correct models in-house that are useful to the calculation engineer, and has recognised the importance of giving calculation engineers the digital tools and the training needed to benefit from BIM in the calculation work. To drive the implementation of BIM forward, national rules on how standards should be designed may be needed; such standards can ease the implementation of BIM and overcome obstacles in the information exchange between the actors. To increase the use of BIM, it is recommended that a BIM coordinator is appointed who, early in each project, sets requirements for how files and documents are to be designed so that the information can be used not only in the calculation stage but also later in the project. The legal obstacle also needs a solution: the standard contracts currently lack rules for including digital information in the contract documents, and changes are needed to tie the supplier's and the recipient's responsibility and right of use to the content of the delivered digital information. Limitations: We limited ourselves to analysing how BIM is used today and how it may come to be used in calculation work in the future.
162

Density functional calculation of simple molecules

Olaoye, Olufemi Opeyemi. 2012
Thesis (MSc)--Stellenbosch University, 2012. Density functional theory (DFT) is a useful computational tool for understanding molecular dynamics on potential energy surfaces. Starting with the prototype molecule formaldimine, the photochromic molecule dithizonatophenylmercury(II) (DPM) and a set of its photochromic derivatives (involving substitution of electron-donating and electron-withdrawing groups at the ortho, meta and para positions of the dithizonato phenyl rings) are studied through density functional calculations, in comparison with steady-state absorption spectra obtained from UV-Visible and femtosecond spectroscopy experiments. In polar aprotic, polar protic and non-polar solvents these molecules isomerise around the C=N double-bond chromophore, from an orange electronic ground state to a blue electronic ground state upon photo-excitation. We investigate the structural optimisations, the absorption spectra, the solvent dependence and the potential energy surfaces (PES) of these molecules. The strong (weak) interactions exhibited by the polar protic (aprotic) solvents used are revealed through high (low) absorbance in the secondary bands of these molecules. The absorption spectra of DPM are found to be bathochromic in solvents with high dielectric constants. For the ground-state PES calculation we make use of rigid and relaxed methods, the latter obtained through broken-symmetry calculations. Of all the methods used, B3LYP/CEP-31G gives the best approximation to the experimental data.
All calculations are done using two well-known software packages, Amsterdam Density Functional (ADF) and Gaussian, making use of their different density functional methods.
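Illustrative sketch: the single-point B3LYP calculation below uses the open-source PySCF package on formaldimine, the prototype molecule named in the abstract. PySCF, the 6-31G basis and the hand-written geometry are stand-ins chosen only to make the example self-contained and runnable; the thesis itself used ADF and Gaussian with the CEP-31G basis.

```python
from pyscf import gto, dft

# Formaldimine (CH2=NH). Approximate, hand-specified geometry in Angstrom,
# for illustration only (not taken from the thesis).
mol = gto.M(
    atom="""
    C  0.000  0.000  0.000
    N  0.000  0.000  1.270
    H  0.940  0.000 -0.530
    H -0.940  0.000 -0.530
    H  0.870  0.000  1.800
    """,
    basis="6-31g",   # standard basis; the thesis used CEP-31G
    charge=0,
    spin=0,          # closed-shell singlet
)

mf = dft.RKS(mol)
mf.xc = "b3lyp"      # same exchange-correlation functional as in the thesis
energy = mf.kernel() # ground-state SCF energy in Hartree
print(f"B3LYP ground-state energy: {energy:.6f} Eh")
```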
163

Neues zu Mathcad / Mathcad News

Büttner, René 22 July 2016
The presentation gives an overview of the software PTC Mathcad Prime 3.1, the Software Development Kit (SDK) and the new calculation server Mathcad Gateway. New functionalities and possible applications of Mathcad are presented.
164

Klimatanalys av avloppsreningsverk : Analyser av två av Laholmsbuktens VA:s avloppsreningsverk med förslag på förbättringsåtgärder / Climate analysis of wastewater treatment plants: Analyses of two of Laholmsbuktens VA's wastewater treatment plants with proposed improvement measures

Adriansson, Emma, Turesson, Linnéa January 2016
The intensified greenhouse effect is the main driver of negative climate change. Global warming, a result of the enhanced greenhouse effect, poses a potential threat to humans and their environment and can have disastrous consequences. The rise in temperature is driven by increased human activity, and the leading cause of greenhouse gas emissions is the combustion of fossil fuels, with the transport, industrial and energy sectors as the most prominent sources. Reducing greenhouse gas emissions is a global priority, and although progress has been made, every potential source of emissions has to be investigated in order to reduce the total climate impact. Scientific research shows that wastewater treatment plants also contribute to the greenhouse effect through gas emissions. This bachelor's thesis investigates two wastewater treatment plants and identifies which processes contribute to their climate impact. The climate impact is calculated with an Excel-based analysis tool, and the results give the carbon footprint of each treatment plant. The purpose of this paper is to present the results obtained for the treatment plants and to identify the processes with the largest climate impact. Based on the analysis of the results, solutions are then suggested to improve and support the treatment plants' climate work. In conclusion, both treatment plants have net emissions of greenhouse gases; the biggest contributors are the wastewater treatment itself and the use of biogas. The analysis also shows that some of the assumptions about emissions built into the tool make the initial results uncertain, so further research is needed in order to produce more reliable figures.
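Illustrative sketch: the abstract does not describe the internals of the Excel-based tool, so the snippet below only shows the generic shape of a carbon-footprint calculation, multiplying activity data by emission factors and subtracting credits. All names and numbers are hypothetical placeholders, not values from the thesis or its tool.

```python
# Minimal carbon-footprint sketch for a wastewater treatment plant.
# Emission factors and activity data are hypothetical placeholders.

EMISSION_FACTORS = {          # kg CO2-equivalents per unit of activity
    "electricity_kwh": 0.09,  # placeholder grid factor
    "n2o_kg":          265.0, # GWP-100 for nitrous oxide (IPCC AR5)
    "ch4_kg":          28.0,  # GWP-100 for methane (IPCC AR5)
}

def carbon_footprint(activity: dict[str, float],
                     credits_kg_co2e: float = 0.0) -> float:
    """Net footprint in kg CO2e: sum of activity * factor, minus credits
    (e.g. fossil fuel displaced by upgraded biogas)."""
    gross = sum(EMISSION_FACTORS[k] * v for k, v in activity.items())
    return gross - credits_kg_co2e

# Example with invented annual figures:
plant_year = {"electricity_kwh": 1_200_000, "n2o_kg": 350, "ch4_kg": 900}
print(f"Net footprint: {carbon_footprint(plant_year, credits_kg_co2e=40_000):,.0f} kg CO2e")
```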
165

Automated coverage calculation and test case generation

Morrison, George Campbell. 2012
Thesis (MScEng)--Stellenbosch University, 2012. This research combines symbolic execution, a formal method of static analysis, with various test adequacy criteria to explore the effectiveness of using symbolic execution for calculating code coverage on a program's existing JUnit test suites. Code coverage is measured with a number of test adequacy criteria, including statement coverage, branch coverage, condition coverage, method coverage, class coverage, and loop coverage. The results of the coverage calculation are then used to automatically generate JUnit test cases for areas of a program that are not sufficiently covered. The level of redundancy of each test case is also calculated during coverage calculation, thereby identifying fully redundant and partially redundant test cases. The combination of symbolic execution and coverage calculation is extended to perform coverage calculation during a manual execution of a program, allowing testers to measure the effectiveness of manual testing. This is implemented as an Eclipse plug-in, named ATCO, which attempts to take advantage of the Eclipse workspace and extensible user interface environment to improve the usability of the tool by minimising the user interaction required to use it. The coverage calculation process uses constraint solving to determine method parameter values that reach specific areas in the program. Constraint solving is an expensive computation, so the tool was parallelised using Java's Concurrency package to reduce its overall execution time.
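Illustrative sketch: the snippet below shows, in Python rather than the thesis's Java/Eclipse setting, how statement coverage and test redundancy can be computed once each test is reduced to the set of statements it executes. It is a generic illustration of the bookkeeping described above, not ATCO's implementation.

```python
# Each test is represented by the set of statement identifiers it executed.

def statement_coverage(per_test: dict[str, set[str]], all_statements: set[str]) -> float:
    """Fraction of statements executed by at least one test."""
    covered = set().union(*per_test.values())
    return len(covered & all_statements) / len(all_statements)

def redundancy(per_test: dict[str, set[str]]) -> dict[str, str]:
    """Classify each test: fully redundant if everything it covers is also
    covered by other tests, partially redundant if only some of it is."""
    verdicts = {}
    for name, covered in per_test.items():
        others = set().union(*(s for n, s in per_test.items() if n != name))
        unique = covered - others
        if not unique:
            verdicts[name] = "fully redundant"
        elif unique != covered:
            verdicts[name] = "partially redundant"
        else:
            verdicts[name] = "not redundant"
    return verdicts

tests = {"t1": {"s1", "s2"}, "t2": {"s2"}, "t3": {"s3", "s4"}}
print(statement_coverage(tests, {"s1", "s2", "s3", "s4", "s5"}))  # 0.8
print(redundancy(tests))  # t2 fully redundant, t1 partially redundant, t3 not redundant
```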
166

Path Bandwidth Calculation for QoS Support in Wireless Multihop Networks / 支援無線多跳接網路服務品質之路徑頻寬計算

劉姿吟, Liu, Tzu-Yin. Unknown date
The idea of mobile computing services is to provide a ubiquitous information environment in which users can access available resources anywhere and at any time over wired or wireless networks. However, present mobile ad hoc networks still cannot support real-time multimedia transmission effectively, so the capability to guarantee quality of service (QoS) has become a very important issue, and bandwidth calculation is its most critical element. IEEE 802.11 PCF adopts a polling scheme to provide time-bounded traffic services, which is not suitable in multihop networks; moreover, due to mobility and traffic dynamics, network resource management is difficult, making QoS support in such an environment a challenge. The bandwidth routing papers we referenced all assume TDMA, are restricted to TDMA systems, and involve rather complicated path bandwidth calculations. We therefore propose a simple path bandwidth calculation method for estimating the available bandwidth of a network, to be used in bandwidth routing algorithms for QoS support, regardless of which MAC protocol is used. It is also easy to apply for call admission control and to combine with existing bandwidth routing algorithms. The simulation results show that the statistical error rates of our path bandwidth calculation are within an acceptable range. Based on this path bandwidth calculation, a bandwidth routing algorithm is also developed to achieve the objective of effectively supporting QoS in wireless multihop networks.
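Illustrative sketch: the thesis's own estimation method is not detailed in the abstract, so the snippet below shows a common textbook approximation instead, in which links within carrier-sense range share the channel and the path is limited by its most contended link. The two-hop interference range and the example link rates are assumptions.

```python
# Rough path-bandwidth approximation for a chain of wireless links.
# Not the estimation method proposed in the thesis above.

def path_bandwidth(link_rates_mbps: list[float], interference_range_hops: int = 2) -> float:
    """Estimate end-to-end bandwidth of a multihop path.

    Each link contends with neighbours up to `interference_range_hops` away on
    either side, so its share of the channel shrinks accordingly; the path is
    limited by the worst (most contended) link.
    """
    n = len(link_rates_mbps)
    estimates = []
    for i, rate in enumerate(link_rates_mbps):
        lo = max(0, i - interference_range_hops)
        hi = min(n, i + interference_range_hops + 1)
        contenders = hi - lo            # links (including this one) sharing the medium
        estimates.append(rate / contenders)
    return min(estimates)

# Example: a 4-hop path where every link nominally offers 11 Mb/s.
print(path_bandwidth([11.0, 11.0, 11.0, 11.0]))  # ~2.75 Mb/s, not 11 Mb/s
```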
167

Beam Modelling for Treatment Planning of Scanned Proton Beams / Strålmodellering i dosplaneringssyfte för svepta protonstrålar

Kimstrand, Peter January 2008
Scanned proton beams offer the possibility to take full advantage of the dose deposition properties of proton beams, i.e. the limited range and the sharp peak at the end of the range, the Bragg peak. By actively scanning the proton beam, laterally by scanning magnets and longitudinally by shifting the energy, the position of the Bragg peak can be controlled in all three dimensions, thereby enabling high dose delivery to the target volume only. A typical scanned proton beam line consists of a pair of scanning magnets to perform the lateral beam scanning and possibly a range shifter and a multi-leaf collimator (MLC). Part of this thesis deals with the development of control, supervision and verification methods for the scanned proton beam line at The Svedberg Laboratory in Uppsala, Sweden.

Radiotherapy is preceded by treatment planning, where one of the main objectives is predicting the dose to the patient. The dose is calculated by a dose calculation engine, and the accuracy of the results depends on the accuracy and sophistication of the transport and interaction models of the dose engine itself. But for the dose distribution calculation to have any bearing on reality, it needs to be started from input that matches the beam actually emitted from the treatment machine. This input is provided by the beam model. As such, the beam model is the link between reality (the treatment machine) and the treatment planning system. The beam model contains methods to characterise the treatment machine and provides the dose calculation with the reconstructed beam phase space, in some convenient representation. In order for a beam model to be applicable in a treatment planning system, its methods have to be general.

In this thesis, a beam model for a scanned proton beam is developed. The beam model contains models and descriptions of the beam modifying elements of a scanned proton beam line. Based on a well-defined set of generally applicable characterisation measurements, ten beam model parameters are extracted, describing the basic properties of the beam, i.e. the energy spectrum, the radial and angular distributions and the nominal direction. Optional beam modifying elements such as a range shifter and an MLC are modelled by dedicated Monte Carlo calculation algorithms. The algorithm that describes the MLC contains a parameterisation of collimator scatter, in which the rather complex phase space of collimator-scattered protons has been parameterised by a set of analytical functions.

Dose calculations based on the phase space reconstructed by the beam model are in good agreement with experimental data. This holds both for the dose distribution of the elementary pencil beam, reflecting the modelling of the basic properties of the scanned beam, and for complete calculations of collimated scanned fields.
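Illustrative sketch: the snippet below samples an elementary proton pencil beam from a simple parameterised phase space (Gaussian energy spectrum, Gaussian radial spot size and angular divergence). The parameter names and values are hypothetical; the thesis's beam model uses its own ten parameters extracted from characterisation measurements.

```python
# Sample per-proton energy, lateral position and direction from a toy phase space.
import numpy as np

rng = np.random.default_rng(0)

def sample_pencil_beam(n, mean_energy_mev=180.0, energy_sigma_mev=1.0,
                       spot_sigma_mm=3.0, angular_sigma_mrad=3.0):
    """Return energies [MeV], lateral positions [mm] and angles [mrad]."""
    energy = rng.normal(mean_energy_mev, energy_sigma_mev, n)
    x, y = rng.normal(0.0, spot_sigma_mm, (2, n))          # lateral spot profile
    xp, yp = rng.normal(0.0, angular_sigma_mrad, (2, n))   # angular divergence
    return energy, x, y, xp, yp

energy, x, y, xp, yp = sample_pencil_beam(100_000)
print(f"mean energy {energy.mean():.1f} MeV, spot sigma {x.std():.2f} mm")
```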
168

Implicit restart schemes for Krylov subspace model reduction methods

Ahmed, Nisar January 1999
No description available.
169

Artificial neural network methods in high energy physics and their application to the identification of quark and gluon jets in electron-proton collisions

Vorvolakos, Angelos January 1999
No description available.
170

A standard neutron spectrum source for application to fast reactor physics

Emmett, John Carter Alfred January 2000
No description available.
