11

Panel uživatelského rozhraní s dotykovým grafickým LCD / User Interface Panel with graphic LCD with touchscreen

Plass, Petr January 2011 (has links)
The Master's thesis presents the basic facts about the realisation of an HMI that measures and displays the electrical quantities of the electric traction of the Marabu experimental aircraft. The HMI can also be used in other demanding applications. The main part of the thesis is an overview of the graphic function library for the LCD.
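
As an illustration of what a "graphic function library" for such an LCD typically contains, here is a minimal sketch of a framebuffer pixel primitive with a rectangle routine layered on top. The resolution, 1-bit-per-pixel layout, and function names are assumptions for illustration, not details taken from the thesis.

```cpp
#include <cstdint>

// Assumed monochrome framebuffer for a 320x240 graphic LCD
// (resolution and 1-bpp layout are illustrative, not from the thesis).
constexpr int LCD_W = 320;
constexpr int LCD_H = 240;
static std::uint8_t framebuffer[LCD_W * LCD_H / 8];

// Set or clear a single pixel; out-of-range coordinates are ignored.
void lcd_put_pixel(int x, int y, bool on) {
    if (x < 0 || x >= LCD_W || y < 0 || y >= LCD_H) return;
    const int idx = y * LCD_W + x;
    if (on) framebuffer[idx / 8] |= static_cast<std::uint8_t>(1u << (idx % 8));
    else    framebuffer[idx / 8] &= static_cast<std::uint8_t>(~(1u << (idx % 8)));
}

// Filled rectangle built on the pixel primitive -- the usual layering
// in small LCD graphic libraries.
void lcd_fill_rect(int x, int y, int w, int h, bool on) {
    for (int j = y; j < y + h; ++j)
        for (int i = x; i < x + w; ++i)
            lcd_put_pixel(i, j, on);
}
```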
12

Vícekamerový snímač biometrických vlastností lidského prstu / Multi-Camera Scanner of Biometric Features of Human Finger

Trhoň, Adam January 2015 (has links)
This thesis describes the conceptual design of a touchless fingerprint sensor and the design, implementation, and testing of its firmware, which is a composition of hardware implemented in VHDL and a program implemented in C. The result of this thesis can be used as a first step toward building an industrial solution.
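
The abstract does not say how the VHDL logic and the C program communicate; a common pattern in such hardware/software compositions is a memory-mapped register interface. The sketch below illustrates that pattern only; the register block, its address, and the handshake bits are hypothetical.

```cpp
#include <cstdint>

// Hypothetical register block exposed by the VHDL capture logic; the
// base address, layout, and bit meanings are illustrative assumptions.
struct CaptureRegs {
    volatile std::uint32_t control;    // bit 0: start capture
    volatile std::uint32_t status;     // bit 0: frame ready
    volatile std::uint32_t frame_addr; // destination of the captured frame
};

// Assumed address at which the VHDL design maps its registers.
static CaptureRegs* const capture =
    reinterpret_cast<CaptureRegs*>(0x40000000u);

// Trigger one frame capture and busy-wait until the hardware reports
// completion -- the simplest possible hardware/software handshake.
void capture_frame(std::uint32_t dest) {
    capture->frame_addr = dest;
    capture->control = capture->control | 1u;   // set the start bit
    while ((capture->status & 1u) == 0u) {
        // spin; a production driver would add a timeout here
    }
}
```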
13

Simulação computacional de parâmetros importantes dos sistemas radiológicos / Computer simulation of important parameters of radiological systems

Marques, Márcio Alexandre 10 March 1998 (has links)
This research is part of a project aimed at developing a computer simulation method that evaluates radiographic image quality quickly and completely, without the problems of the traditional methods available and in use today. The method developed in this work takes into account and simulates the important parameters of radiological systems, such as the focal spot size and intensity distribution, the geometric exposure conditions, the field heterogeneity, the angular distribution of the X-rays, the antiscatter grid (Bucky), and the Compton effect. The developed computer simulation method performs image quality control for an object placed at any position in the radiation field. The simulation was done by implementing the developed algorithms in the C language, and the results were presented graphically to ease the users' comprehension.
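
As a hedged illustration of one listed parameter, the sketch below samples a focal spot modelled as a two-dimensional Gaussian intensity distribution, a common simplification. The abstract does not state which model the thesis uses, so the distribution, its width, and the grid are assumptions.

```cpp
#include <cmath>
#include <cstdio>

// Illustrative model of a focal spot as a 2-D Gaussian intensity
// distribution sampled on a small grid. The Gaussian shape and the
// 0.3 mm sigma are assumptions, not values from the thesis.
int main() {
    const int    N     = 21;    // grid points per axis
    const double half  = 1.0;   // grid spans [-1, 1] mm
    const double sigma = 0.3;   // assumed focal spot spread (mm)

    for (int j = 0; j < N; ++j) {
        for (int i = 0; i < N; ++i) {
            double x = -half + 2.0 * half * i / (N - 1);
            double y = -half + 2.0 * half * j / (N - 1);
            double intensity = std::exp(-(x * x + y * y) / (2 * sigma * sigma));
            std::printf("%5.3f ", intensity);
        }
        std::printf("\n");
    }
    return 0;
}
```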
14

A Concurrency and Time Centered Framework for Certification of Autonomous Space Systems

Dechev, Damian December 2009 (has links)
Future space missions, such as Mars Science Laboratory, suggest the engineering of some of the most complex man-rated autonomous software systems. The present process-oriented certification methodologies are becoming prohibitively expensive and do not reach the level of detail of providing guidelines for the development and validation of concurrent software. Time and concurrency are the most critical notions in an autonomous space system. In this work we present the design and implementation of the first concurrency- and time-centered framework for product-oriented software certification of autonomous space systems. To achieve fast and reliable concurrent interactions, we define and apply the notion of Semantically Enhanced Containers (SECs). SECs are data structures designed to provide the flexibility and usability of the popular ISO C++ STL containers, while at the same time being hand-crafted to guarantee domain-specific policies, such as conformance to a given concurrency model. The application of nonblocking programming techniques is critical to the implementation of our SEC containers. Lock-free algorithms help avoid the hazards of deadlock, livelock, and priority inversion, and at the same time deliver fast and scalable performance. Practical lock-free algorithms are notoriously difficult to design and implement and pose a number of hard problems, such as ABA avoidance, high complexity, portability, and meeting the linearizability correctness requirements. This dissertation presents the design of the first lock-free dynamically resizable array. Our approach offers a set of practical, portable, lock-free, and linearizable STL vector operations and a fast and space-efficient implementation compared to the alternative lock- and STM-based techniques. Currently, the literature does not offer an explicit analysis of the ABA problem, its relation to the most commonly applied nonblocking programming techniques, and the possibilities for its detection and avoidance. Eliminating the hazards of ABA is left to the ingenuity of the software designer. We present a generic and practical solution to the fundamental ABA problem for lock-free descriptor-based designs. To enable our SEC containers to validate domain-specific invariants, we present Basic Query, our expression-template-based library for statically extracting semantic information from C++ source code. The use of static analysis allows for a far more efficient implementation of our nonblocking containers than would otherwise have been possible when relying on traditional run-time techniques. Shared data in a real-time cyber-physical system can often be polymorphic (as is the case with a number of components of the Mission Data System's Data Management Services). The use of dynamic cast is important in the design of autonomous real-time systems, since the operation allows a direct representation of the management and behavior of polymorphic data. To allow the application of dynamic cast in mission-critical code, we validate and improve a methodology for constant-time dynamic cast that shifts the complexity of the operation to the compiler's static checker. In a case study that demonstrates the applicability of the programming and validation techniques of our certification framework, we show the process of verification and semantic parallelization of the Mission Data System's (MDS) Goal Networks. MDS provides an experimental platform for testing and development of autonomous real-time flight applications.
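
The abstract names the ABA problem without illustrating it. Below is a minimal sketch of the classic mitigation — pairing the pointer with a version counter so a compare-and-swap cannot mistake a recycled pointer for an unchanged one. This shows the generic technique only, not the dissertation's descriptor-based solution.

```cpp
#include <atomic>
#include <cstdint>

// Version-tagged pointer: the tag changes on every successful update,
// so a CAS fails if the pointer value was freed and recycled (ABA).
struct Node { int value; Node* next; };

struct TaggedPtr {
    Node*         ptr;
    std::uint64_t tag;   // incremented on every successful update
};

std::atomic<TaggedPtr> head{TaggedPtr{nullptr, 0}};

void push(Node* n) {
    TaggedPtr old = head.load();
    TaggedPtr desired{};
    do {
        n->next = old.ptr;
        desired = {n, old.tag + 1};
        // compare_exchange_weak reloads `old` on failure, so the loop
        // retries with fresh state. Note: std::atomic over a 16-byte
        // struct may fall back to a lock internally; truly lock-free
        // code relies on a platform double-width CAS (e.g. cmpxchg16b).
    } while (!head.compare_exchange_weak(old, desired));
}
```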
16

Verification techniques in the context of event-trigged soft real-time systems / Verifikationstekniker för event-triggade mjuka realtidssystem

Norberg, Johan January 2007 (has links)
When exploring a verification approach for Komatsu Forest's control system for their forest machines (Valmet), the context of soft real-time systems is illuminated. Given the nature of such a context, the verification process is based on empirical corroboration of requirements fulfillment rather than on a formal proving process. After an analysis of the software testing literature, two paradigms have been defined to highlight important concepts for soft real-time systems. The paradigms are based on an abstract stimuli/response model, which conceptualizes a system by its inputs and outputs. Since the system is perceived as a black box, its internal details are hidden, and focus is thus placed on a more abstract level. The first paradigm, the "input data paradigm", is concerned with what data to input to the system. The second paradigm, the "input data mechanism paradigm", is concerned with how the data is sent, i.e. the actual input mechanism. By specifying different dimensions associated with each paradigm, it is possible to define their unique characteristics. The advantage of this kind of theoretical construction is that each paradigm creates a unique sub-field with its own problems and techniques. The problems defined for this thesis are primarily focused on the input data mechanism paradigm, where the devised dimensions are applied. New verification techniques are deduced and analyzed based on general software testing principles. Based on the constructed theory, a test system architecture for the control system is developed. Finally, an implementation is constructed based on the architecture and a practical scenario, and its automation capability is assessed. The practical context for the thesis is a new simulator under development. It is based upon LabVIEW and PXI technology and handles over 200 I/O channels. Real machine components are connected to the environment, together with artificial components that simulate the engine, the hydraulic systems, and a forest. Additionally, physical control sticks and buttons are connected to the simulator to enable user testing of the machine being simulated. The first result of the thesis is a set of usable verification techniques. Generally speaking, some of these techniques are scalable and can be applied to an entire system, while others may be appropriate for selected subsets that need extra attention. Secondly, an architecture for an automated test system based on a selection of these techniques has been constructed for the control system. Last but not least, as a result of this, a general test system has been implemented successfully, based on both C# and LabVIEW. What remains regarding the implementation is primarily to extend the system to include the full scope of features described in the architecture and to enable result analysis.
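
The two paradigms translate directly into test code. The hypothetical sketch below sends the same input data through two different input mechanisms (back-to-back versus paced delivery), which is the distinction the thesis draws; the system interface and message type are invented for illustration, and the actual test system was built in C# and LabVIEW.

```cpp
#include <chrono>
#include <thread>
#include <vector>

// Hypothetical injection point into the system under test; the real
// control system API is not described at this level in the thesis.
struct Message { int id; double value; };
void send_to_system(const Message&) { /* would write to the CAN/PXI bus */ }

// Input data paradigm: WHICH messages are sent.
std::vector<Message> make_test_data() {
    return {{1, 0.0}, {2, 99.5}, {1, -3.2}};
}

// Input data mechanism paradigm: HOW the same messages are sent.
// Different delivery timing can exercise different behavior in a
// soft real-time system even when the data is identical.
void send_burst(const std::vector<Message>& data) {
    for (const auto& m : data) send_to_system(m);      // back-to-back
}

void send_paced(const std::vector<Message>& data,
                std::chrono::milliseconds gap) {
    for (const auto& m : data) {
        send_to_system(m);
        std::this_thread::sleep_for(gap);              // fixed inter-arrival gap
    }
}
```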
17

Linear Programming for Scheduling Waste Rock Dumping from Surface Mines

Zhang, Nan Unknown Date (has links)
The removal of overlying waste rock in open pit mines to dumps is conventionally undertaken by draglines, by trucks and shovels, or by a combination of these. Waste rock dumps are the largest remnant structures of open cut mining operations and can absorb a large proportion of the mine operating costs. If the dumps are not properly developed they can be excessively expensive and can become a major safety risk and environmental hazard. There are many examples worldwide where poor design and construction of waste rock dumps have resulted in failures causing considerable loss of life and widespread damage, or have resulted in erosion and seepage that have led to severe environmental pollution. The proper design and scheduling of waste rock dumps and haul routes can significantly reduce costs, minimise the possibility of failures, and avoid harming the environment. This thesis is limited to the consideration of trucks and shovels for waste rock haulage in open cut mining operations. It describes the development and application of a waste rock dump scheduling model using the Operations Research technique of Mixed-Integer Linear Programming, implemented in the mathematical modelling language AMPL. The model focuses on minimising the haulage cost for each block of waste rock taken from the open pit and placed in the dump. Allowance is made for the selective placement of benign and reactive waste rock, based on an open pit block model that delineates benign and reactive waste rock. The formulation requires input data including the xyz-coordinates of the block model for the open pit, information on whether the waste rock blocks are benign or reactive, the proposed time scheduling of waste rock haulage from the open pit, unit haulage costs, and the geometry of the waste rock dump, including the delineation of the zones that are benign and those that are reactive. The model was successfully tested using both simple test data and actual mine site data. The application of the model to a simple case confirmed that it optimises the objective function, producing an optimal haulage time and cost, and satisfies the various constraints imposed. This model for scheduling the removal of waste rock from open cut mining operations with trucks and shovels will require further research and testing and, because the results are generated in a numerical format, there will also be a need to convert them to a graphical format to facilitate their interpretation. Ultimately, it has the potential to provide a relatively low-cost scheduling tool that meets operators' economic, safety and environmental goals.
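
The abstract does not reproduce the model's variables or constraints, but a generic block-to-dump assignment MILP of this kind, written purely as an illustrative sketch, might read:

```latex
\begin{align*}
\min \quad & \sum_{b \in B} \sum_{d \in D} c_{bd}\, x_{bd}
    && \text{(total haulage cost)} \\
\text{s.t.} \quad & \sum_{d \in D} x_{bd} = 1 \quad \forall b \in B
    && \text{(each pit block is placed exactly once)} \\
& \sum_{b \in B} v_b\, x_{bd} \le V_d \quad \forall d \in D
    && \text{(dump cell capacity)} \\
& x_{bd} = 0 \quad \forall b \in R,\; d \notin Z_R
    && \text{(reactive rock only in reactive zones)} \\
& x_{bd} \in \{0, 1\}
\end{align*}
```

Here \(x_{bd}\) assigns block \(b\) to dump cell \(d\), \(c_{bd}\) is the unit haulage cost, \(v_b\) the block volume, \(V_d\) the cell capacity, \(R\) the set of reactive blocks, and \(Z_R\) the dump zones delineated for reactive rock; all symbols are assumptions, not the thesis' own notation.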
20

Translation of CAN Bus XML Messages to C Source Code

Andersson, Gustav January 2020 (has links)
The concept of translating source code into other target programming languages is used extensively in a wide range of applications. Danfoss Power Solutions AB, a company located in Älmhult, strives to streamline its software development for microcontrollers by applying this idea. Its proprietary software tool PLUS+1 GUIDE is based on the CAN bus communication network, which allows electronic control units to share data represented in the XML format. Due to compatibility problems, the application in the electronic control units requires this data to be translated into source code in the low-level C programming language. This thesis project proposes an approach for facilitating this task by implementing a source-to-source compiler that performs the translation with a reduced level of manual user involvement. A literature review was conducted to find existing solutions relevant to the task. An analysis of the provided XML input files was then performed to arrive at a software design suitable for the problem. Using a general XML parser, a solution was then constructed. The implementation resulted in a fully functional source-to-source compiler, producing the generated C code within 73–85 milliseconds for input test files of typical size. The feedback received from the domain experts at Danfoss confirms the usability of the proposed solution.
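
The abstract describes the pipeline — parse the XML message definitions, then emit C source — without showing it. The sketch below illustrates only the emission half, generating a C struct from an already-parsed message description; the in-memory types, field names, and type-selection rule are hypothetical, and the real PLUS+1 GUIDE XML schema is not reproduced here.

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Hypothetical in-memory form of one CAN message after XML parsing.
struct Signal { std::string name; int bits; };
struct CanMsg { std::string name; unsigned id; std::vector<Signal> signals; };

// Emit a C struct and message-ID macro as C source text -- the
// "translation" step of the source-to-source compiler, reduced to
// its simplest possible form.
void emit_c(const CanMsg& m) {
    std::printf("#define %s_ID 0x%Xu\n", m.name.c_str(), m.id);
    std::printf("typedef struct {\n");
    for (const Signal& s : m.signals) {
        // Pick the smallest standard type that holds the signal.
        const char* type = s.bits <= 8  ? "uint8_t"
                         : s.bits <= 16 ? "uint16_t" : "uint32_t";
        std::printf("    %s %s; /* %d bits on the bus */\n",
                    type, s.name.c_str(), s.bits);
    }
    std::printf("} %s_t;\n\n", m.name.c_str());
}

int main() {
    CanMsg msg{"EngineStatus", 0x18F, {{"rpm", 16}, {"temp", 8}}};
    emit_c(msg);  // prints the generated C declarations to stdout
}
```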
