211 |
Separation and Extraction of Valuable Information From Digital Receipts Using Google Cloud Vision OCR. Johansson, Elias January 2019 (has links)
Automation is desirable in many business areas. Manually extracting information from a physical object such as a receipt is a task that can be automated to save resources for a company or a private person. This paper describes the process of combining an existing OCR engine with a Python script developed to extract valuable information from a digital image of a receipt. Values such as VAT, VAT%, date, and total, gross, and net cost are considered valuable information. Similar features have already been implemented in existing applications; however, the company for which this project was carried out is interested in creating its own version. The project is an experiment to see whether such an application can be implemented with restricted resources, by developing a program that extracts the information mentioned above. The paper guides the reader through the development of the program, as well as the mindset, the findings, and the steps taken to overcome the problems encountered along the way. The program achieved a success rate of 86.6% in extracting the most valuable information (total cost, VAT%, and date) from a set of 53 receipts originating from 34 separate establishments.
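The pipeline described above (OCR on the receipt image followed by pattern matching over the recognised text) can be sketched as follows. This is a minimal illustration assuming the google-cloud-vision client library; the regular expressions and field names are hypothetical examples, not the thesis' actual program.

```python
# A minimal sketch: OCR a receipt image with Google Cloud Vision, then pull
# fields out of the raw text with illustrative (hypothetical) patterns.
import re
from google.cloud import vision

def ocr_receipt(path: str) -> str:
    """Run Cloud Vision text detection on a receipt image and return the raw text."""
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.text_detection(image=image)
    return response.full_text_annotation.text

def extract_fields(text: str) -> dict:
    """Extract total cost, VAT percentage and date with simple patterns."""
    patterns = {
        "total": r"(?:total|summa)\D*(\d+[.,]\d{2})",
        "vat_percent": r"(\d{1,2})\s*%",
        "date": r"(\d{4}-\d{2}-\d{2})",
    }
    return {
        field: (m.group(1) if (m := re.search(pattern, text, re.IGNORECASE)) else None)
        for field, pattern in patterns.items()
    }

if __name__ == "__main__":
    raw_text = ocr_receipt("receipt.jpg")
    print(extract_fields(raw_text))
```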
|
212 |
Unified treatment of physical properties in an integrated analysis environment. Giurgea, Stefan 17 December 2003 (has links) (PDF)
SALOME is a generic open-source pre/post-processing platform intended to be specialised by integrating existing computation codes. It combines several modules in a distributed-component architecture. Our task within the SALOME project was the design and technological implementation of the DATA module, dedicated to the description of physical properties. To this end, we created a new language dedicated to the description of physical data models: SPML (SALOME Physics Modelling Language). A metamodel dedicated to the description of physical properties gives the SPML language its semantic basis. To provide a common communication layer between models representing different domains of physics, a Common Data Model was developed, materialised as a reusable SPML library. The implementation of the GUI, in particular the automatic adaptation of the graphical interface to the physical models described in SPML, makes the DATA module an effective tool that allows the platform to be adapted easily to any domain of physics. We carried out a first connection of the Flux solver within the platform for magnetostatic analyses of problems described in SALOME.
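The abstract does not reproduce SPML syntax; the following is a purely hypothetical Python sketch of what a metamodel for physical properties shared across physics domains might look like. Class names, attributes and values are assumptions for illustration, not the actual SPML or Common Data Model.

```python
# Hypothetical illustration only: a tiny metamodel for physical properties,
# loosely inspired by the idea of a common data model reusable by several
# physics domains. Names and units are invented examples.
from dataclasses import dataclass, field

@dataclass
class PhysicalProperty:
    name: str          # e.g. "relative_permeability"
    unit: str          # e.g. "-" or "H/m"
    value: float

@dataclass
class Region:
    name: str
    properties: list[PhysicalProperty] = field(default_factory=list)

@dataclass
class PhysicsModel:
    domain: str                          # e.g. "magnetostatics"
    regions: list[Region] = field(default_factory=list)

# A magnetostatic model using the same generic structure another domain could reuse.
iron = Region("rotor_core", [PhysicalProperty("relative_permeability", "-", 4000.0)])
model = PhysicsModel("magnetostatics", [iron])
print(model)
```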
|
213 |
Heapy: A Memory Profiler and Debugger for Python. Nilsson, Sverker January 2006 (has links)
Excessive memory use may cause severe performance problems and system crashes. Without appropriate tools, it may be difficult or impossible to determine why a program is using too much memory. This applies even though Python provides automatic memory management: garbage collection can help avoid many memory allocation bugs, but only to a certain extent due to the lack of information during program execution. There is still a need for tools helping the programmer to understand the memory behaviour of programs, especially in complicated situations. The primary motivation for Heapy is that there has been a lack of such tools for Python.

The main questions addressed by Heapy are how much memory is used by objects, what are the objects of most interest for optimization purposes, and why are objects kept in memory. Memory leaks are often of special interest and may be found by comparing snapshots of the heap population taken at different times. Memory profiles, using different kinds of classifiers that may include retainer information, can provide quick overviews revealing optimization possibilities not thought of beforehand. Reference patterns and shortest reference paths provide different perspectives of object access patterns to help explain why objects are kept in memory.
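As an illustration of the kinds of questions Heapy answers, the following is a minimal usage sketch assuming the guppy package that ships Heapy; the allocation being inspected is an arbitrary example.

```python
# A minimal usage sketch of Heapy via guppy's documented entry point hpy();
# the list allocation is just something to inspect.
from guppy import hpy

h = hpy()
h.setrelheap()          # use the current heap as the reference point

data = [str(i) * 10 for i in range(100_000)]   # allocate something measurable

heap = h.heap()         # snapshot of objects allocated since the reference point
print(heap)             # profile partitioned by type: counts and total sizes
print(heap.byrcs)       # re-partition by referrer classification ("why kept in memory")
```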
|
214 |
Walk-A-Way : A Maya Plug-in for Walk Cycle Automation. Christiansson, Kajsa January 2009 (has links)
In 3D and 2D animation, walk cycles of characters appear very frequently and are an important way of expressing various aspects of the story told. However, walk cycles are tedious and time-consuming to animate. In this work, an Autodesk Maya plug-in has been developed that aims at automating this process. The walk cycle plug-in can be highly beneficial for animators when creating convincing walk cycles in a fast and simple way: it calculates the right values for each phase of the walk cycle, and its GUI makes it easy to provide the required input parameters. In addition, the plug-in allows a character to be animated walking along a chosen path.
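A hypothetical sketch of the keyframing approach such a plug-in could take through Maya's maya.cmds Python API is shown below; the joint names, amplitude and phase values are assumptions, not the actual plug-in's parameters.

```python
# Hypothetical sketch: key a sinusoidal leg swing over one walk cycle using
# Maya's maya.cmds API. Joint names and values are invented examples.
import math
import maya.cmds as cmds

def key_walk_cycle(joint, cycle_frames=24, amplitude=35.0, phase=0.0):
    """Key a sinusoidal swing on a joint's rotateX over one walk cycle."""
    for frame in range(cycle_frames + 1):
        t = frame / cycle_frames
        value = amplitude * math.sin(2.0 * math.pi * t + phase)
        cmds.setKeyframe(joint, attribute="rotateX", time=frame, value=value)

# Opposite legs swing half a cycle out of phase.
key_walk_cycle("L_hip_jnt", phase=0.0)
key_walk_cycle("R_hip_jnt", phase=math.pi)
```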
|
215 |
Introducing Lantmäteriet’s gravity data in ArcGIS with implementation of customized GIS functions. Ryttberg, Mattias January 2013 (has links)
Lantmäteriet measures gravity and uses it to calculate a model of the geoid, which provides accurate reference heights for positioning. Lantmäteriet continuously measures new gravity and height data across Sweden to complement, replace, and add data points, mainly through field measurements at benchmark points. One of the major reasons for continued measurements at, for example, benchmark points is that measurement techniques keep improving, making the observations more accurate. More accurate gravity values lead to a more accurate calculation of the geoid, and a more accurate geoid enables more precise positioning across Sweden thanks to more precise height values. Lantmäteriet is in the process of updating its entire database of gravity data and is also measuring at locations where data are missing or sparse. As a stage in renewing its database and other systems, the Geodesy department wishes to get an introduction to the ArcGIS environment. Customizing several ArcGIS functions will make Lantmäteriet's work with the extensive data easier and perhaps faster. Customized tools will, for example, make adding and removing data points easier, and put cross-validation and several other functions only a click of a button away.
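As an illustration of what such customized functions might look like, the following is a hedged sketch using the arcpy site package that ships with ArcGIS; the feature class path and field names are hypothetical, not Lantmäteriet's actual schema.

```python
# Hedged sketch of customized add/remove functions for gravity points,
# assuming arcpy; the geodatabase path and fields are invented examples.
import arcpy

GRAVITY_FC = r"C:\data\geodesy.gdb\gravity_points"   # assumed feature class

def add_gravity_point(point_id, x, y, gravity_mgal):
    """Insert a new gravity observation as a point feature."""
    fields = ["POINT_ID", "GRAVITY_MGAL", "SHAPE@XY"]
    with arcpy.da.InsertCursor(GRAVITY_FC, fields) as cursor:
        cursor.insertRow([point_id, gravity_mgal, (x, y)])

def remove_gravity_point(point_id):
    """Delete a gravity observation by its identifier."""
    where = f"POINT_ID = '{point_id}'"
    with arcpy.da.UpdateCursor(GRAVITY_FC, ["POINT_ID"], where) as cursor:
        for _ in cursor:
            cursor.deleteRow()
```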
|
216 |
PIC/FLIP Fluid Simulation Using Block-Optimized Grid Data Structure. Salomonsson, Fredrik January 2011 (has links)
This thesis examines and presents how to implement a Particle-In-Cell and Fluid-Implicit-Particle (PIC/FLIP) fluid solver that takes advantage of the inherent parallelism of Digital Domain's sparse block-optimized data structure, DB-Grid. The method is a hybrid between particle- and grid-based simulation. The thesis also discusses different approaches for storing and accessing the data associated with each particle. Disney's open-source API Partio is used to dynamically create and remove particle attributes, as well as to save the particles to disk. Finally, the thesis shows how to expose the C++ classes to Python by wrapping everything into a Python module using the Boost.Python API, and discusses the benefits of having a scripting language.
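As a rough illustration of the PIC/FLIP idea itself (not of DB-Grid or the thesis code), the sketch below blends the two velocity-update schemes for particles on a one-dimensional grid, assuming simple linear interpolation and a standard blend factor.

```python
# Simplified 1D illustration of the PIC/FLIP velocity update: PIC interpolates
# the new grid velocity directly, FLIP adds the interpolated grid velocity
# change to the old particle velocity; the two are blended by flip_ratio.
import numpy as np

def interpolate(grid_values, positions, dx):
    """Linearly interpolate grid values at particle positions."""
    idx = np.clip((positions / dx).astype(int), 0, len(grid_values) - 2)
    frac = positions / dx - idx
    return (1.0 - frac) * grid_values[idx] + frac * grid_values[idx + 1]

def pic_flip_update(particle_vel, positions, grid_old, grid_new, dx, flip_ratio=0.95):
    v_pic = interpolate(grid_new, positions, dx)
    v_flip = particle_vel + interpolate(grid_new - grid_old, positions, dx)
    return flip_ratio * v_flip + (1.0 - flip_ratio) * v_pic

# Example: a few particles on a 10-cell grid after one grid solve.
rng = np.random.default_rng(0)
dx = 0.1
positions = rng.uniform(0.0, 0.9, 8)
particle_vel = np.zeros(8)
grid_old = np.linspace(0.0, 1.0, 10)
grid_new = grid_old - 0.05            # pretend the solve damped the field
print(pic_flip_update(particle_vel, positions, grid_old, grid_new, dx))
```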
|
217 |
Wind Power and Its Impact on the Moldovan Electrical System. Eriksson, Joel, Gozdz Englund, Simon January 2012 (has links)
This master's thesis project was carried out in cooperation with Borlänge Energi, with the aim of reducing Moldova's high dependency on electric energy from Ukraine, Transnistria and Russia. The project examines what reduction would be possible through wind power installations on the existing Moldovan electrical grid, under the constraint that the installations must not exceed the capacity of the transmission lines or violate regulated voltage levels. The southern regions of Moldova proved to have the best wind conditions, and the locations Besarabeasca, Zarnesti, Leovo, Ciadyr and Cimislia in the south were chosen for wind power installations. For the analysis, a model of the Moldovan electrical system was constructed in the power flow software PSS/E, with each of the five chosen locations modelled as a generator representing the wind power installation. Different scenarios were created to examine possible wind power installations, with the 110 kV system of the southern regions as the focus area. All scenarios were analysed with a contingency analysis in which transmission lines in the focus region were tripped. The contingency analysis and the scenarios were automated using the programming language Python. An economic analysis shows payback periods for wind power investments in Moldova, as well as the sensitivity to the electricity price and discount rates. The project concludes that wind power installations are possible on the Moldovan electric grid as it looks today and would reduce the high dependency on imported electrical energy.
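A hedged sketch of how such a contingency loop could be automated with PSS/E's psspy module is shown below; psseinit, case and fnsl are standard psspy calls, while the branch-tripping helper is a placeholder for the version-specific branch status call, and the bus numbers are invented examples rather than the actual Moldovan network data.

```python
# Hedged sketch of an automated contingency loop around PSS/E's psspy module.
# trip_branch is deliberately a placeholder: the exact call to set a branch
# out of service differs between PSS/E versions.
import psspy

CONTINGENCIES = [
    (101, 102, "1"),   # hypothetical from-bus, to-bus, circuit id
    (102, 105, "1"),
]

def trip_branch(from_bus, to_bus, circuit):
    """Placeholder: set the branch out of service with the psspy call that
    matches the installed PSS/E version (e.g. a branch change API)."""
    raise NotImplementedError

def run_contingency_analysis(case_file):
    psspy.psseinit(10000)              # initialise with a bus-size estimate
    for from_bus, to_bus, circuit in CONTINGENCIES:
        psspy.case(case_file)          # reload the unmodified base case
        trip_branch(from_bus, to_bus, circuit)
        psspy.fnsl()                   # full Newton-Raphson power flow solution
        # ...check line loadings and bus voltages against limits here...

# Usage (requires a PSS/E installation and a saved case):
# run_contingency_analysis("moldova_wind_scenario.sav")
```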
|
218 |
Impact of error : Implementation and evaluation of a spatial model for analysing landscape configuration. Wennbom, Marika January 2012 (has links)
Quality and error assessment is an essential part of spatial analysis, and with the increasing number of applications resulting from today's extensive access to spatial data, such as satellite imagery, and to computing power, it is especially important to address. This study evaluates the impact of input errors associated with satellite sensor noise for a spatial method aimed at characterising aspects of landscapes associated with the historical village structure, the Hybrid Characterisation Model (HCM), which was developed as a tool to monitor sub-goals of the Swedish environmental objective “A varied agricultural landscape”. The method, together with the error simulation procedure used to generate random errors in the input data, is implemented and automated as a Python script, enabling easy iteration of the procedure. The HCM is evaluated qualitatively (by visual analysis) and quantitatively by comparing kappa index values between the error-affected outputs. Comparing the results of the qualitative and quantitative evaluations shows that the kappa index is an applicable quality measure for the HCM. The qualitative analysis compares the impact of error at two different scales, the village scale and the landscape scale, and shows that the HCM performs well at the landscape scale for up to 30% error and at the village scale for up to 10% error, and that the impact of error differs depending on the shape of the analysed feature. The Python script produced in this study could be further developed and modified to evaluate the HCM for other aspects of input error, such as classification errors, although for such studies to be motivated, the potential errors associated with the model and its parameters must first be further evaluated.
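A minimal sketch of the error-simulation-and-comparison idea is given below, assuming NumPy and scikit-learn; the noise model (randomly reassigning a fraction of raster cells) and the class values are assumptions for illustration, not the thesis' actual HCM script.

```python
# Minimal sketch: inject random errors into a classified raster and measure
# agreement with the original using Cohen's kappa. Classes and raster size
# are invented examples.
import numpy as np
from sklearn.metrics import cohen_kappa_score

def add_random_error(raster, error_fraction, classes, rng):
    """Return a copy of the raster with a fraction of cells set to random classes."""
    noisy = raster.copy()
    n_errors = int(error_fraction * raster.size)
    flat_idx = rng.choice(raster.size, size=n_errors, replace=False)
    noisy.flat[flat_idx] = rng.choice(classes, size=n_errors)
    return noisy

rng = np.random.default_rng(seed=1)
classes = np.array([0, 1, 2])                      # hypothetical landscape classes
reference = rng.choice(classes, size=(200, 200))   # stand-in for a model output

for error_fraction in (0.1, 0.3):
    degraded = add_random_error(reference, error_fraction, classes, rng)
    kappa = cohen_kappa_score(reference.ravel(), degraded.ravel())
    print(f"{error_fraction:.0%} input error -> kappa {kappa:.3f}")
```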
|
219 |
User-oriented management of material and immaterial information objects. Hübsch, Chris 12 December 2001 (has links) (PDF)
Creation of a stable, extensible and scalable infrastructure for providing services in the context of libraries and similar knowledge-providing institutions, using XML-RPC and Python.
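A minimal sketch of exposing a library service over XML-RPC with Python's standard library is shown below (using the modern module names; the method and catalogue data are invented examples, not the thesis' actual services).

```python
# Minimal sketch of an XML-RPC service for a library catalogue lookup,
# using Python's standard library; data and method names are invented.
from xmlrpc.server import SimpleXMLRPCServer

CATALOGUE = {"0-13-110362-8": "The C Programming Language"}   # assumed sample data

def lookup_title(isbn: str) -> str:
    """Return the title registered for an ISBN, or an empty string."""
    return CATALOGUE.get(isbn, "")

server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
server.register_function(lookup_title)
server.serve_forever()

# A client could then call the service with:
#   from xmlrpc.client import ServerProxy
#   print(ServerProxy("http://localhost:8000").lookup_title("0-13-110362-8"))
```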
|
220 |
wxWindows / wxPython. Wegener, Jens 17 May 2002 (has links)
Joint workshop of the University Computing Centre and the Chair of Computer Networks and Distributed Systems of the Faculty of Computer Science at TU Chemnitz.
The talk presents wxWindows and wxPython as a solution for developing platform-independent software with a graphical user interface.
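A minimal sketch of a cross-platform wxPython GUI, assuming a current wxPython release; the window title, size and contents are arbitrary examples.

```python
# Minimal wxPython sketch: the same code runs on Windows, Linux and macOS.
import wx

class MainFrame(wx.Frame):
    def __init__(self):
        super().__init__(None, title="wxPython example", size=(300, 200))
        panel = wx.Panel(self)
        wx.StaticText(panel, label="Hello from wxPython", pos=(20, 20))

if __name__ == "__main__":
    app = wx.App()        # one application object per process
    MainFrame().Show()
    app.MainLoop()
```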
|