11

Optimalizace NC programu pomocí CAD/CAM software / Optimization of NC program using CAD/CAM software

Paseka, Jan January 2014 (has links)
The aim of this master's thesis is to propose savings in the technological preparation of production in a manufacturing company. The first part presents a general theoretical study of the current components and NC programs. Based on this study and the completed analysis, optimization proposals were defined and are described in the second part. Implementing these proposals yields the time and cost savings needed to bring a prototype project into serial production.
12

Radiating Macroscopic Dark Matter: Searching for Effects in Cosmic Microwave Background and Recombination History

Kumar, Saurabh 26 January 2021 (has links)
No description available.
13

Using Hard Macros to Accelerate FPGA Compilation for Xilinx FPGAs

Lavin, Christopher Michael 22 January 2012 (has links) (PDF)
Field programmable gate arrays (FPGAs) offer an attractive compute platform because of their highly parallel and customizable nature, in addition to the potential of being reconfigured to almost any desired circuit. However, compilation time (the time it takes to convert user design input into a functional implementation on the FPGA) has been a growing problem and is stifling designer productivity. This dissertation presents a new approach to FPGA compilation that more closely follows the software compilation model than the traditional application-specific integrated circuit (ASIC) flow. Instead of re-compiling every module in the design for each invocation of the compilation flow, pre-compiled modules that can be "linked" in the final stage of compilation are used. These pre-compiled modules are called hard macros and contain the physical information necessary to implement a module or building block of a design. By assembling hard macros together, a complete and fully functional implementation can be created within seconds. This dissertation describes the process of creating a rapid compilation flow based on hard macros for Xilinx FPGAs. First, RapidSmith, an open source framework that enabled the creation of custom CAD tools for this work, is presented. Second, HMFlow, the hard macro-based rapid compilation flow, is described and tuned to compile Xilinx FPGA designs as fast as possible. Finally, several modifications to HMFlow are made such that it produces circuits with clock rates above 75% of those of Xilinx-produced implementations while compiling more than 30X faster than the Xilinx tools.
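
The core idea of the flow described above, caching pre-compiled physical implementations and only stitching them together at the end, can be sketched as follows. This is an illustrative Python sketch: the names (HardMacro, compile_module, assemble) and the caching scheme are assumptions for explanation, not the RapidSmith/HMFlow API, which is Java-based and operates on Xilinx design files.

```python
# Sketch of the hard-macro idea: modules are compiled once into reusable
# blocks and later only placed and stitched together, instead of being
# re-synthesized on every compilation run. All names here are illustrative.
import hashlib
from dataclasses import dataclass


@dataclass
class HardMacro:
    name: str
    placement: tuple  # pre-computed relative placement of the block
    netlist: str      # pre-computed physical implementation (opaque here)


_macro_cache: dict[str, HardMacro] = {}


def compile_module(name: str, source: str) -> HardMacro:
    """Return a pre-compiled hard macro, compiling only on a cache miss."""
    key = hashlib.sha256(source.encode()).hexdigest()
    if key not in _macro_cache:
        # The expensive step (synthesis/place/route) happens only once per module.
        _macro_cache[key] = HardMacro(name, placement=(0, 0), netlist=f"<impl of {name}>")
    return _macro_cache[key]


def assemble(macros: list[HardMacro]) -> str:
    """'Link' step: place each hard macro and stitch the top-level connections."""
    return "\n".join(f"{m.name} @ {m.placement}: {m.netlist}" for m in macros)


if __name__ == "__main__":
    design = [compile_module("fifo", "fifo-src"), compile_module("uart", "uart-src")]
    print(assemble(design))  # fast final assembly from cached blocks
```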
14

Optimalizace koaxiálních filtrů metodou Tuning-Space Mapping v CST / Optimizing coaxial filters by Tuning-Space Mapping in CST

Wolanský, David January 2010 (has links)
This thesis deals with the optimization of coaxial cavity filters. It thoroughly describes the Tuning-Space Mapping (TSM) optimization method in CST Studio, focusing on the fine and coarse models and the link between them. The whole algorithm is illustrated and tested on a third-order coaxial cavity filter operating in the 880–960 MHz band. To verify the function of TSM, the algorithm is also applied to a fourth-order filter, a triplet filter and a quadruplet filter. Another part of the thesis focuses on automating the whole optimization procedure: macros for the automatic determination of calibration constants and for the automatic calibration between the coarse and the fine model are proposed and programmed in CST. The complete optimization procedure is then applied to the optimization of a seventh-order filter with two cross-couplings.
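
The coarse/fine-model interplay at the heart of space-mapping methods can be illustrated with a small numeric sketch. The toy fine() and coarse() functions and the first-order output correction below are assumptions made for illustration; the actual TSM procedure in the thesis calibrates a CST Design Studio circuit model against a CST Microwave Studio full-wave model.

```python
# Simplified sketch of the surrogate (coarse/fine model) loop behind
# space-mapping style optimization: calibrate the cheap coarse model against
# the expensive fine model, then optimize the calibrated surrogate.
from scipy.optimize import minimize_scalar


def fine(x):    # stands in for one expensive full-wave (CST MWS) simulation
    return (x - 2.1) ** 2 + 0.3


def coarse(x):  # stands in for the cheap circuit-level (CST DS) coarse model
    return (x - 1.8) ** 2


def d(f, x, h=1e-6):
    """Central finite difference, used for the first-order calibration."""
    return (f(x + h) - f(x - h)) / (2 * h)


def surrogate_loop(x, iterations=5):
    for _ in range(iterations):
        # Calibration step: align value and slope of the coarse model with
        # the fine model at the current design point x.
        a = fine(x) - coarse(x)
        b = d(fine, x) - d(coarse, x)
        surrogate = lambda t, x=x: coarse(t) + a + b * (t - x)
        # Optimize the cheap calibrated surrogate instead of the fine model.
        x = minimize_scalar(surrogate, bounds=(0.0, 5.0), method="bounded").x
    return x


print(surrogate_loop(x=1.0))  # ends near the fine model's optimum (x = 2.1)
```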
15

Language and tool support for multilingual programs

Lee, Byeongcheol 12 October 2011 (has links)
Programmers compose programs in multiple languages to combine the advantages of innovations in new high-level programming languages with decades of engineering effort in legacy libraries and systems. For language inter-operation, language designers provide two classes of multilingual programming interfaces: (1) foreign function interfaces and (2) code generation interfaces. These interfaces embody the semantic mismatch for developers and multilingual systems builders. Their programming rules are difficult or impossible to verify. As a direct consequence, multilingual programs are full of bugs at interface boundaries, and debuggers cannot assist developers across these lines. This dissertation shows how to use composition of single language systems and interposition to improve the safety of multilingual programs. Our compositional approach is scalable by construction because it does not require any changes to single-language systems, and it leverages their engineering efforts. We show it is effective by composing a variety of multilingual tools that help programmers eliminate bugs. We present the first concise taxonomy and formal description of multilingual programming interfaces and their programming rules. We next compose three classes of multilingual tools: (1) Dynamic bug checkers for foreign function interfaces. We demonstrate a new approach for automatically generating a dynamic bug checker by interposing on foreign function interfaces, and we show that it finds bugs in real-world applications including Eclipse, Subversion, and Java Gnome. (2) Multilingual debuggers for foreign function interfaces. We introduce an intermediate agent that wraps all the methods and functions at language boundaries. This intermediate agent is sufficient to build all the essential debugging features used in single-language debuggers. (3) Safe macros for code generation interfaces. We design a safe macro language, called Marco, that generates programs in any language and demonstrate it by implementing checkers for SQL and C++ generators. To check the correctness of the generated programs, Marco queries single-language compilers and interpreters through code generation interfaces. Using their error messages, Marco points out the errors in program generators. In summary, this dissertation presents the first concise taxonomy and formal specification of multilingual interfaces and, based on this taxonomy, shows how to compose multilingual tools to improve safety in multilingual programs. Our results show that our compositional approach is scalable and effective for improving safety in real-world multilingual programs.
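
The interposition idea behind the dynamic bug checkers can be illustrated with a small sketch at a Python/C boundary. The dissertation targets interfaces such as JNI; the wrapper and rule below are simplified assumptions for illustration, not the tools described in the thesis.

```python
# Sketch of FFI interposition: wrap each call that crosses a foreign function
# interface and check its programming rules before handing control to the
# other language, so violations are reported at the boundary instead of
# causing undefined behavior inside native code.
import ctypes
import ctypes.util

# Load the C library (POSIX systems); C's strlen is the foreign function here.
libc = ctypes.CDLL(ctypes.util.find_library("c"))
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t


def interpose(foreign_fn, rule, name):
    """Return a wrapper that enforces an FFI rule before the foreign call."""
    def wrapper(*args):
        ok, message = rule(*args)
        if not ok:
            raise ValueError(f"FFI rule violated in {name}: {message}")
        return foreign_fn(*args)
    return wrapper


# Rule: C strlen must never receive a NULL pointer.
checked_strlen = interpose(
    libc.strlen,
    lambda s: (s is not None, "NULL pointer passed where a C string is required"),
    "strlen",
)

print(checked_strlen(b"hello"))   # 5, rule satisfied
# checked_strlen(None)            # would raise here instead of crashing in C
```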
16

Knihovna stavebních prvků koaxiálního filtru pro CST Microwave Studio / Library of building elements of coaxial filters for CST Microwave Studio

Vorek, Jiří January 2010 (has links)
This thesis deals with the development of a tool, a set of macros that assist in creating coaxial cavity filters. It describes the difficulties of designing complex structures in CST Studio Suite 2009. For this purpose, macros supporting the coarse and fine models of the Tuning-Space Mapping (TSM) method were created, with the fine model built in CST MWS and the coarse model in CST DS.
17

Well-Formed and Scalable Invasive Software Composition / Wohlgeformte und Skalierbare Invasive Softwarekomposition

Karol, Sven 26 June 2015 (has links) (PDF)
Software components provide essential means to structure and organize software effectively. However, frequently, required component abstractions are not available in a programming language or system, or are not adequately combinable with each other. Invasive software composition (ISC) is a general approach to software composition that unifies component-like abstractions such as templates, aspects and macros. ISC is based on fragment composition, and composes programs and other software artifacts at the level of syntax trees. Therefore, a unifying fragment component model is related to the context-free grammar of a language to identify extension and variation points in syntax trees as well as valid component types. By doing so, fragment components can be composed by transformations at respective extension and variation points so that always valid composition results regarding the underlying context-free grammar are yielded. However, given a language’s context-free grammar, the composition result may still be incorrect. Context-sensitive constraints such as type constraints may be violated so that the program cannot be compiled and/or interpreted correctly. While a compiler can detect such errors after composition, it is difficult to relate them back to the original transformation step in the composition system, especially in the case of complex compositions with several hundreds of such steps. To tackle this problem, this thesis proposes well-formed ISC—an extension to ISC that uses reference attribute grammars (RAGs) to specify fragment component models and fragment contracts to guard compositions with context-sensitive constraints. Additionally, well-formed ISC provides composition strategies as a means to configure composition algorithms and handle interferences between composition steps. Developing ISC systems for complex languages such as programming languages is a complex undertaking. Composition-system developers need to supply or develop adequate language and parser specifications that can be processed by an ISC composition engine. Moreover, the specifications may need to be extended with rules for the intended composition abstractions. Current approaches to ISC require complete grammars to be able to compose fragments in the respective languages. Hence, the specifications need to be developed exhaustively before any component model can be supplied. To tackle this problem, this thesis introduces scalable ISC—a variant of ISC that uses island component models as a means to define component models for partially specified languages while still the whole language is supported. Additionally, a scalable workflow for agile composition-system development is proposed which supports a development of ISC systems in small increments using modular extensions. All theoretical concepts introduced in this thesis are implemented in the Skeletons and Application Templates framework SkAT. It supports “classic”, well-formed and scalable ISC by leveraging RAGs as its main specification and implementation language. Moreover, several composition systems based on SkAT are discussed, e.g., a well-formed composition system for Java and a C preprocessor-like macro language. In turn, those composition systems are used as composers in several example applications such as a library of parallel algorithmic skeletons.
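
The fragment-composition idea, splicing pre-parsed fragments into extension points of a syntax tree and then checking context-sensitive constraints, can be sketched as follows. Python's ast module stands in for the fragment component model, and a simple compile-and-run check stands in for the thesis's RAG-based fragment contracts; none of this is the SkAT API.

```python
# Sketch of invasive fragment composition: fragments are composed as syntax
# trees (so the result is always syntactically valid), and a separate
# context-sensitive check guards the composition result.
import ast

# A fragment component with a marked extension point ("hook").
template_src = """
def greet(name):
    __HOOK__
    return message
"""

fragment_src = 'message = "Hello, " + name'


def _is_hook(stmt: ast.stmt) -> bool:
    return (isinstance(stmt, ast.Expr)
            and isinstance(stmt.value, ast.Name)
            and stmt.value.id == "__HOOK__")


def compose(template: str, fragment: str) -> ast.Module:
    tree = ast.parse(template)
    frag_stmts = ast.parse(fragment).body
    # Invasively replace the hook statement with the fragment's statements.
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            node.body = [
                stmt for s in node.body
                for stmt in (frag_stmts if _is_hook(s) else [s])
            ]
    return ast.fix_missing_locations(tree)


composed = compose(template_src, fragment_src)
# Stand-in "fragment contract": the composed tree must compile and execute.
code = compile(composed, "<composed>", "exec")
scope = {}
exec(code, scope)
print(scope["greet"]("ISC"))  # Hello, ISC
```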
18

Well-Formed and Scalable Invasive Software Composition

Karol, Sven 18 May 2015 (has links)
Software components provide essential means to structure and organize software effectively. However, frequently, required component abstractions are not available in a programming language or system, or are not adequately combinable with each other. Invasive software composition (ISC) is a general approach to software composition that unifies component-like abstractions such as templates, aspects and macros. ISC is based on fragment composition, and composes programs and other software artifacts at the level of syntax trees. Therefore, a unifying fragment component model is related to the context-free grammar of a language to identify extension and variation points in syntax trees as well as valid component types. By doing so, fragment components can be composed by transformations at respective extension and variation points so that always valid composition results regarding the underlying context-free grammar are yielded. However, given a language’s context-free grammar, the composition result may still be incorrect. Context-sensitive constraints such as type constraints may be violated so that the program cannot be compiled and/or interpreted correctly. While a compiler can detect such errors after composition, it is difficult to relate them back to the original transformation step in the composition system, especially in the case of complex compositions with several hundreds of such steps. To tackle this problem, this thesis proposes well-formed ISC—an extension to ISC that uses reference attribute grammars (RAGs) to specify fragment component models and fragment contracts to guard compositions with context-sensitive constraints. Additionally, well-formed ISC provides composition strategies as a means to configure composition algorithms and handle interferences between composition steps. Developing ISC systems for complex languages such as programming languages is a complex undertaking. Composition-system developers need to supply or develop adequate language and parser specifications that can be processed by an ISC composition engine. Moreover, the specifications may need to be extended with rules for the intended composition abstractions. Current approaches to ISC require complete grammars to be able to compose fragments in the respective languages. Hence, the specifications need to be developed exhaustively before any component model can be supplied. To tackle this problem, this thesis introduces scalable ISC—a variant of ISC that uses island component models as a means to define component models for partially specified languages while still the whole language is supported. Additionally, a scalable workflow for agile composition-system development is proposed which supports a development of ISC systems in small increments using modular extensions. All theoretical concepts introduced in this thesis are implemented in the Skeletons and Application Templates framework SkAT. It supports “classic”, well-formed and scalable ISC by leveraging RAGs as its main specification and implementation language. Moreover, several composition systems based on SkAT are discussed, e.g., a well-formed composition system for Java and a C preprocessor-like macro language. In turn, those composition systems are used as composers in several example applications such as a library of parallel algorithmic skeletons.
