21

Automatic Volume Estimation of Timber from Multi-View Stereo 3D Reconstruction

Rundgren, Emil January 2017 (has links)
The ability to automatically estimate the volume of timber is becoming increasingly important within the timber industry. The large number of timber trucks arriving each day at Swedish timber terminals reinforces the need for volume estimation performed in real time, on the go, as the trucks arrive. This thesis investigates whether volumetric integration of disparity maps acquired from a Multi-View Stereo (MVS) system is a suitable approach for automatic volume estimation of timber loads. As real-time execution is preferred, efforts were made to provide a scalable method. The proposed method was quantitatively evaluated on datasets containing two geometric objects of known volume. A qualitative comparison against manual volume estimates of timber loads was also made on datasets recorded at a Swedish timber terminal. The proposed method is shown to be both accurate and precise under specific circumstances; however, its robustness to varying weather conditions is poor, although a more thorough evaluation of this aspect remains to be performed. The method is also parallelizable, which means that future efforts can significantly decrease execution time.
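As an illustration of the final integration step: once fused MVS disparity maps have been converted into a height field over the truck bed, the load volume reduces to a discrete integral. The grid, units, and function name below are hypothetical, not the thesis's implementation:

```python
import numpy as np

def volume_from_height_map(height_map, cell_area):
    """Estimate load volume by integrating a height field over a regular grid.

    height_map: 2D array of surface heights (metres) above the truck bed,
                e.g. derived from fused MVS disparity maps.
    cell_area:  ground-plane area of one grid cell (square metres).
    """
    heights = np.clip(np.asarray(height_map, dtype=float), 0.0, None)  # ignore negative noise
    return float(heights.sum() * cell_area)

# A 2 x 2 grid of 1 m columns over 0.25 m^2 cells gives 1 cubic metre.
v = volume_from_height_map([[1.0, 1.0], [1.0, 1.0]], 0.25)
```

Because each cell is integrated independently, the sum parallelizes trivially, which matches the scalability argument made in the abstract.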
22

Evaluation and Decomposition of Efficiency Using the Malmquist Productivity Index

Skočdopol, Petr January 2010 (has links)
This thesis first presents the basics of microeconomics from the perspective of firms, the concept of efficiency and methods for measuring it, and the most important facts about distance functions. It also covers the development of the Malmquist productivity index. The aim of the work is to describe this index and its components, to show how these values are calculated, and to explain what they express. Secondary objectives are to introduce different variants of the Malmquist index and their uses. Four models are used to calculate the individual components of the Malmquist productivity index: DEA models, the Aigner-Chu model, stochastic production frontiers, and stochastic activity analysis; the first three are described in detail. The thesis concludes with an illustrative example of computing the Malmquist productivity index using DEA models. The calculations were carried out in the program Lingo.
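The index and its standard decomposition into efficiency change and technical change can be sketched numerically. The distance-function values below are made-up inputs; in practice they come from the DEA or frontier models the thesis describes:

```python
import math

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """Output-oriented Malmquist productivity index and its decomposition.

    d_a_b: output distance function of period-b data measured against the
    period-a frontier (e.g. d_t_t1 = D^t evaluated at the period t+1 data).
    Returns (index, efficiency change, technical change).
    """
    # Efficiency change: catching up to the frontier between the two periods.
    ec = d_t1_t1 / d_t_t
    # Technical change: geometric-mean shift of the frontier itself.
    tc = math.sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))
    return ec * tc, ec, tc

m, ec, tc = malmquist(d_t_t=0.8, d_t_t1=1.1, d_t1_t=0.7, d_t1_t1=0.9)
```

By construction the index equals the geometric mean of the two period-specific ratios, and multiplying the two components recovers it exactly, which is a useful sanity check on any implementation.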
23

Characterization of functions with zero traces via the distance function

Turčinová, Hana January 2019 (has links)
Consider a domain Ω ⊂ ℝ^N with Lipschitz boundary and let d(x) = dist(x, ∂Ω). It is well known for p ∈ (1, ∞) that u ∈ W^{1,p}_0(Ω) if and only if u/d ∈ L^p(Ω) and ∇u ∈ L^p(Ω). Recently a new characterization appeared: it was proved that u ∈ W^{1,p}_0(Ω) if and only if u/d ∈ L^1(Ω) and ∇u ∈ L^p(Ω). In the author's bachelor thesis the condition u/d ∈ L^1(Ω) was weakened to the condition u/d ∈ L^{1,p}(Ω), but only in the case N = 1. In this master thesis we prove that for N ≥ 1, p ∈ (1, ∞) and q ∈ [1, ∞) we have u ∈ W^{1,p}_0(Ω) if and only if u/d ∈ L^{1,q}(Ω) and ∇u ∈ L^p(Ω). Moreover, we present a counterexample to this equivalence in the case q = ∞.
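The main result stated in the abstract can be written compactly (restating the abstract's own notation, with L^{1,q} the Lorentz space):

```latex
u \in W^{1,p}_{0}(\Omega)
\quad\Longleftrightarrow\quad
\frac{u}{d} \in L^{1,q}(\Omega)
\ \text{ and } \
\nabla u \in L^{p}(\Omega),
\qquad N \ge 1,\ p \in (1,\infty),\ q \in [1,\infty),
```

where d(x) = dist(x, ∂Ω) and Ω ⊂ ℝ^N has Lipschitz boundary; by the abstract, the equivalence fails for q = ∞.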
24

Spaces with fuzzy distances and application in image processing

Karaklić Danijela 13 September 2019 (has links)
Measuring image quality using a given image-quality index does not necessarily reflect the practical quality of the image; that is, such indices are not based on the HVS (Human Visual System) model. The functions used in the filtering algorithm to determine distances between pixels can be formed in different ways, as the literature on image filtering shows, which offers a wide range of possibilities to examine the effect that fuzzy distances, for example the fuzzy T-metric or the fuzzy S-metric, can have on the image filtering process itself. The goal is to improve image quality relative to the vector median filter. The theoretical study of spaces with fuzzy distances also yields results from fixed point theory that open the way to further applications of these spaces in engineering.
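As a rough sketch of how a fuzzy distance can replace the usual norm inside a vector median filter: the George-Veeramani-style standard fuzzy metric below is a common textbook choice, not necessarily the metric used in the thesis, and the window data are invented:

```python
import numpy as np

def fuzzy_similarity(a, b, t=1.0):
    """Standard fuzzy metric M(a, b, t) = t / (t + |a - b|): values in (0, 1],
    equal to 1 iff the pixel vectors coincide."""
    return t / (t + np.linalg.norm(np.asarray(a, float) - np.asarray(b, float)))

def vector_median(window, t=1.0):
    """One vector-median-filter step: return the window pixel with the
    largest accumulated fuzzy similarity to all other pixels, i.e. the
    smallest aggregate 'fuzzy distance'."""
    pixels = [np.asarray(p, float) for p in window]
    scores = [sum(fuzzy_similarity(p, q, t) for q in pixels) for p in pixels]
    return pixels[int(np.argmax(scores))]

# 3-pixel window: the impulse-noise outlier (255, 0, 0) is rejected
# in favour of one of the two mutually close pixels.
out = vector_median([(10, 10, 10), (12, 11, 10), (255, 0, 0)])
```

Swapping in a different fuzzy metric only changes `fuzzy_similarity`, which is exactly the degree of freedom the abstract says is under investigation.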
25

Fuzzer Test Log Analysis Using Machine Learning : Framework to analyze logs and provide feedback to guide the fuzzer

Yadav, Jyoti January 2018 (has links)
In the modern world, machine learning and deep learning have become popular choices for analyzing and identifying patterns in large volumes of data. The focus of this thesis has been the design of alternative strategies that use machine learning to guide the fuzzer in selecting the most promising test cases, and the work centers on analyzing the data with machine learning techniques. A detailed analysis study is carried out in multiple phases. The first phase converts the data into a suitable format (pre-processing) so that the necessary features can be extracted and fed as input to unsupervised machine learning algorithms, which accept input data as matrices whose dimensionality is that of the extracted features. Several experiments and run-time benchmarks were conducted to choose the most efficient algorithm based on execution time and accuracy of results, and the best choice was implemented to obtain the desired result. The second phase applies supervised learning over the clustering results. The final phase describes how an incremental learning model is built to score the test-case logs and return their scores in near real time, which can act as feedback to guide the fuzzer.
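A minimal sketch of the final phase described above: an incremental model that scores log lines in near real time. The hashing featurizer and the online logistic update are illustrative stand-ins for the thesis's actual pipeline, and the log lines and labels are invented:

```python
import numpy as np

def featurize(log_line, dim=64):
    """Hash token counts of a raw fuzzer log line into a fixed-size vector:
    the pre-processing step that turns text logs into the matrices the
    learning algorithms consume."""
    vec = np.zeros(dim)
    for token in log_line.lower().split():
        vec[hash(token) % dim] += 1.0
    return vec

class IncrementalScorer:
    """Online logistic model updated one labelled log at a time, so newly
    arriving fuzzer output can be scored without retraining from scratch."""
    def __init__(self, dim=64, lr=0.1):
        self.w = np.zeros(dim)
        self.lr = lr

    def score(self, x):
        # Probability that the log line is 'interesting' feedback for the fuzzer.
        return 1.0 / (1.0 + np.exp(-self.w @ x))

    def update(self, x, label):
        # One stochastic-gradient step on the logistic loss.
        self.w += self.lr * (label - self.score(x)) * x

scorer = IncrementalScorer()
for line, label in [("crash at 0xdead", 1), ("run ok exit 0", 0)] * 50:
    scorer.update(featurize(line), label)
```

In a real deployment the labels would come from the supervised phase built on top of the clustering results, and each `score` call provides the near-real-time feedback mentioned in the abstract.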
26

COVID-19 crisis and the efficiency of Indian banks: Have they weathered the storm?

Gulati, R., Vincent, Charles, Hassan, M.K., Kumar, S. 22 June 2023 (has links)
The purpose of this study is to determine whether Indian banks were able to weather the COVID-19 storm. We estimate banks' deposit-generating and operating efficiencies using a two-stage directional distance function-based network data envelopment analysis (DDF-NDEA) approach and seek to capture the immediate impact of COVID-19 on these efficiency measures by comparing their magnitudes in the pre-pandemic period (2014/15-2019/20), the year just prior to the pandemic (2019/20), and the pandemic year (2020/21). The study examines whether the impact of the COVID-19 pandemic was uniform across ownership types and size classes. The empirical findings suggest that the Indian banking system was resilient and withstood the immediate impact of the pandemic. During the study period, however, the large and medium-sized banks experienced some efficiency losses. By and large, regardless of bank group, banks showed resilience to the shock of the global health pandemic and improvements in efficiency. / The full-text of this article will be released for public view at the end of the publisher embargo on 28 Dec 2024.
27

Measuring Airport Efficiency with Fixed Asset Utilization to Minimize Airport Delays

Widener, Scott D. 22 October 2010 (has links)
Deregulation of the airlines in the United States spawned a free-for-all in which the various agents within the aviation system each sought to optimize their own piece of it; the net result was that the system itself was not optimized in aggregate, frequently resulting in delays. Research on the efficiency of the system has likewise focused on the individual agents, primarily the operating municipalities in an economic context, and has largely ignored the consumer. This paper develops the case for a systemic efficiency measurement that incorporates the interests of the airlines and the consumers with those of the airport-operating municipalities in three different Data Envelopment Analysis (DEA) models: the traditional Charnes-Cooper-Rhodes and Banker-Charnes-Cooper models, and a Directional Output Distance Function model, devised and interpreted using quality management principles. These models were combined so that the efficiencies of an airport's operating configurations predict the efficiency of the airport as a whole. Based upon regression models, these efficiency measurements can be used as a diagnostic for improving the efficiency of the entire United States airspace, on a systemic basis, at the level of individual airport configurations. An example analysis using this diagnostic is derived in the course of its development, and two additional case studies are presented.
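For intuition about what the DEA models above compute: in the special case of one input and one output, the Charnes-Cooper-Rhodes (constant-returns) efficiency score reduces to comparing each unit's productivity ratio against the best observed ratio, with no linear programme needed. The airport figures are invented, and the general multi-input, multi-output case requires an LP solver:

```python
import numpy as np

def ccr_efficiency_1in_1out(inputs, outputs):
    """CCR efficiency scores for the single-input, single-output case.

    Under constant returns to scale the DEA linear programme collapses to
    each unit's output/input ratio divided by the best ratio in the sample,
    so the frontier unit scores exactly 1."""
    x = np.asarray(inputs, float)
    y = np.asarray(outputs, float)
    ratio = y / x
    return ratio / ratio.max()

# Three airports: runway-hours used (input) vs. flights handled (output).
eff = ccr_efficiency_1in_1out([10, 20, 30], [100, 150, 240])
```

The second and third airports score below 1, meaning they could proportionally shrink their input (or expand output) to reach the frontier set by the first.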
28

Bioenergy, pollution, and economic growth

Ankarhem, Mattias January 2005 (has links)
This thesis consists of four papers: two deal with the effects on the forest sector of an increase in the demand for forest fuels, and two concern the relation between economic growth and pollution. Paper [I] is a first, preliminary study of the potential effects on the Swedish forest sector of a continuing rise in the use of forest resources as a fuel in energy generation. Sweden has made a commitment that the energy system should be sustainable, i.e., based on renewable resources. However, increasing use of forest resources as an energy input could have effects outside the energy sector. We consider this in a static model by estimating a system of demand and supply equations for the four main actors on the Swedish roundwood market: forestry, sawmills, pulpmills, and the energy sector. We then calculate the industries' short-run supply and demand elasticities. Paper [II] is a development of the former. In this paper, we estimate the dynamic effects on the forest sector of an increased demand for forest fuels by developing a partial adjustment model of the forest sector that enables short-, intermediate-, and long-run price elasticities to be estimated. Studying the effects of increased demand for forest fuels is relevant because the Swedish government has committed to an energy policy that is likely to further increase the use of renewable resources in the Swedish energy system. Four subsectors are included in the model: forestry, sawmills, pulpmills, and the energy industry. The results show that the short-run elasticities are fairly consistent with earlier studies and that sluggish adjustment of the capital stock is important in determining the intermediate- and long-run responses. Simulation shows that an increase in the demand for forest fuels has a positive effect on the equilibrium price of all three types of wood, and a negative effect on the equilibrium quantities of sawtimber and pulpwood.
In paper [III] a Shephard distance function approach is used to estimate time series of shadow prices for Swedish emissions of CO2, SO2, and VOC for the period 1918-1994. The shadow prices are then regressed on GDP per capita. The objective of the study is closely linked to the environmental Kuznets curve (EKC) hypothesis. We conclude that the time series of shadow prices from this approach cannot be used to explain the EKCs found for Swedish emissions. In paper [IV], we calculate time series of shadow prices for Swedish emissions of CO2, SO2, and VOC for the period 1918-1994 and then relate them to income, in order to explain the EKCs previously found for Swedish data on the three emissions. Newly constructed historical emission time series make it possible to study a single country's emission paths through increasing levels of economic activity. A directional distance function approach is used to estimate the industry's production process in order to calculate the opportunity cost of a reduction in emissions. The time series of shadow prices show support for EKCs for Swedish industry.
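The shadow-pricing step behind papers [III] and [IV] typically rests on a standard duality result for the output distance function D_o(x, y, b): with p_y the market price of the good output y and b the emission, the shadow price of the emission is obtained from the ratio of partial derivatives (the exact normalization varies across studies, so this is the generic textbook form rather than the thesis's own specification):

```latex
q_{b} \;=\; p_{y}\,
\frac{\partial D_{o}(x, y, b)/\partial b}
     {\partial D_{o}(x, y, b)/\partial y}
```

Intuitively, q_b is the revenue forgone per unit of emission abated, i.e. the opportunity cost of reduction that the abstract refers to.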
29

Calibrated estimators of the finite population ratio of two totals

Radišauskaitė, Simona 05 August 2013 (has links)
This work analyzes calibrated estimators of the ratio of two totals that use more than one auxiliary variable for each study variable. Using different calibration equations and distance functions, six new estimators of the ratio are constructed. Estimators of the variance of some of the ratio estimators are constructed using the Taylor linearization, jackknife, and random groups methods. A simulation study compares the new ratio estimators with the standard estimator and the estimators introduced by A. Plikusas. It analyzes how the accuracy of the estimators depends on the sample size and on freely chosen additional weights when the auxiliary variables are well correlated with the study variables. The simulation results show that for some populations the new estimators are more accurate than those introduced by A. Plikusas. We also observed that the variance estimators constructed using the jackknife method are more accurate than those constructed using the Taylor linearization or random groups methods. The simulation results were obtained using a program written in MATLAB 7.10.0.
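Calibration with the chi-square distance, one of the classic distance functions such estimators are built from, admits a closed-form solution for the adjusted weights. The sketch below shows the weight adjustment for a single auxiliary variable with invented data; it illustrates the mechanism, not the thesis's specific estimators:

```python
import numpy as np

def calibrate_chisq(d, X, totals):
    """Calibrated weights minimising the chi-square distance
    sum_k (w_k - d_k)^2 / d_k subject to sum_k w_k x_k = known totals.

    d:      design weights, length n.
    X:      n x p matrix of auxiliary variables.
    totals: length-p vector of known population totals of the auxiliaries.
    Closed form: w = d * (1 + X @ lam), with lam from a p x p linear system.
    """
    d = np.asarray(d, float)
    X = np.asarray(X, float)
    t = np.asarray(totals, float)
    A = X.T @ (d[:, None] * X)
    lam = np.linalg.solve(A, t - X.T @ d)
    return d * (1.0 + X @ lam)

d = np.array([2.0, 2.0, 2.0])            # design weights
X = np.array([[1.0], [2.0], [3.0]])      # one auxiliary variable
w = calibrate_chisq(d, X, totals=[15.0]) # known population total of x
```

The calibrated weights reproduce the known auxiliary total exactly while staying as close as possible to the design weights; a ratio estimator then plugs `w` into both the numerator and denominator totals.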
30

A ROBUST RGB-D SLAM SYSTEM FOR 3D ENVIRONMENT WITH PLANAR SURFACES

Su, Po-Chang 01 January 2013 (has links)
Simultaneous localization and mapping (SLAM) is the technique of constructing a 3D map of an unknown environment. With the increasing popularity of RGB-depth (RGB-D) sensors such as the Microsoft Kinect, there has been much research on capturing and reconstructing 3D environments using a movable RGB-D sensor. The key process behind these SLAM systems is the iterative closest point (ICP) algorithm, an iterative method that estimates the rigid movement of the camera from the captured 3D point clouds. While ICP is a well-studied algorithm, it is problematic when used to scan large planar regions such as the wall surfaces of a room: the lack of depth variation on planar surfaces makes the global alignment an ill-conditioned problem. In this thesis, we present a novel approach for registering 3D point clouds that combines both color and depth information. Instead of directly searching for point correspondences among the 3D data, the proposed method first extracts features from the RGB images and then back-projects them into 3D space to identify more reliable correspondences. These color correspondences form the initial input to the ICP procedure, which then refines the alignment. Experimental results show that the proposed approach achieves better accuracy than existing SLAM systems in reconstructing indoor environments with large planar surfaces.
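Once color features have been back-projected into 3D, the matched point pairs yield an initial rigid transform in closed form via the standard SVD (Kabsch) solution, which can then seed the ICP refinement. This is a generic sketch of that step with invented points, not the thesis's code:

```python
import numpy as np

def rigid_align(P, Q):
    """Least-squares rigid transform (R, t) mapping 3D points P onto Q,
    computed with the SVD (Kabsch) method from matched correspondences."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection: force det(R) = +1.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Recover a known camera translation from four matched feature points.
P = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1.0]])
Q = P + np.array([0.5, -0.2, 0.1])
R, t = rigid_align(P, Q)
```

On planar scenes raw ICP correspondences make this estimate ill-conditioned, which is exactly why the thesis substitutes more reliable color-feature correspondences as the input to this step.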
