11

Week 03, Video 09: Render Settings

Marlow, Gregory 01 January 2020 (has links)
12

Dynamic allocation of servers for large scale rendering application

Andersson, Samuel January 2021 (has links)
Cloud computing has been widely used for some time now, and its area of use grows year by year. It is convenient for companies to build products on cloud platforms, but that convenience comes at a considerable cost. This thesis evaluates whether the expenses of a product can be optimized regardless of the platform used, and whether it is possible to anticipate how many resources a product will need and allocate those machines in a dynamic fashion. The work evaluates predicting the need for rendering machines from the response times of user requests, and dynamically allocating rendering machines to a product based on that need. The solution is based on machine learning: different types of regression models try to predict future response times and evaluate whether or not they are acceptable. Both a simulation and a replica of the real architecture were implemented; the replica was built using AWS cloud services. The regression model that performed best was also the simplest: a linear regression with response time as the independent variable and queue size per rendering machine as the dependent variable. The model performed very well in the region of realistic response times, but not necessarily at very high or very low response times. This is not considered a problem, since response times in those regions are not of concern for the purpose of the regression model. Using the regression model appears to work better than a purely reactive scaling method, although the effect is not entirely clear, since no user data was available. For the effect to be evaluated fairly, user patterns describing daily usage of the product would be needed.
Because the requests in the simulation are purely random, there is no correlation between what happened 10 minutes back in the simulation and what will happen 10 minutes in the future. As a result, it is very hard to estimate how the dependent variable will change over time, and without such an estimate the results obtained with the regression model cannot be tested in a realistic scenario either.
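The scaling idea the abstract describes can be sketched briefly: fit a line relating per-machine queue length to response time, then invert it to pick a machine count that keeps the predicted response time under a target. This is only a minimal illustration; all data, names, and the threshold logic here are hypothetical, not the thesis's implementation.

```python
import numpy as np

# Hypothetical training data: queue length per rendering machine vs. observed
# response time (seconds). In the thesis this would come from the simulation.
queue_per_machine = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])
response_time = np.array([0.8, 1.1, 1.9, 2.8, 3.6, 4.5])

# Fit response_time = a * queue_per_machine + b with ordinary least squares.
a, b = np.polyfit(queue_per_machine, response_time, deg=1)

def machines_needed(total_queue, max_response):
    """Smallest machine count keeping predicted response under max_response."""
    # Invert the fitted line: total_queue / n <= (max_response - b) / a
    per_machine_budget = (max_response - b) / a
    return max(1, int(np.ceil(total_queue / per_machine_budget)))

print(machines_needed(total_queue=40, max_response=3.0))
```

With the toy data above, a queue of 40 requests and a 3-second target yields 13 machines; a reactive scaler would only react after the response time had already degraded.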
13

SCALABILITY OF JAVASCRIPT LIBRARIES FOR DATA VISUALIZATION

Persson, Jonna January 2021 (has links)
Visualization is an important tool for making data understandable. Visualization can be used for many different purposes, such as charts on the web used to visualize a dataset. Render time is important for websites since slow response times can cause users to leave the site. When creating a website with fast render times in mind, the selection of JavaScript library may be crucial. This work aims to determine if dataset size and/or chart type affects the render time of charts created by different JavaScript libraries. The comparison is done by a literature search to identify suitable libraries, followed by an experiment in which 40 websites are created to compare the performance of selected JavaScript libraries for rendering selected chart types. The results show that while both dataset size and chart type affect the render time in most cases, the libraries scale differently depending on the dataset size.
14

Creating photorealistic V-Ray materials and their application in 3Ds Max / Tvorba fotorealistických V-Ray materiálů a jejich aplikace v programu 3Ds Max

Kramárik, Jakub January 2017 (has links)
The thesis introduces the properties of V-Ray materials and their production. With these materials it is possible to create physically accurate, photorealistic object surfaces in visualisations in the Autodesk 3Ds Max program. The first part describes the characteristics of different surfaces and their optical properties depending on incident light. It also analyses all V-Ray materials and the properties relevant specifically to interior visualisations, and covers the issues of mapping onto objects and HDRI mapping. The practical part pursues the creation of complex photorealistic V-Ray materials and their application to objects, and includes a tutorial for creating one's own textures in the grid editor. Together with the created materials, these textures are used in a sample interior visualisation.
15

Influence of the combination of Roman cement and lime as the binder phase in render mortars for restoration

Starinieri, V., Hughes, David C., Wilk, D. January 2013 (has links)
It is known that lime was added to historic Roman cement render mortars. The focus of this work is the influence of the combination of NHL5 and CL90 with Roman cement in mortars for restoration; however, the results indicate a wider potential for render applications in general. It is shown that simply adding lime to Roman cement does not retard its hydration and yields mortars where the binding action of the cement is compromised by the mixing process. If the cement is retarded by means of a pre-hydration process, hybrid mortars can be produced with improved workability and workable life as well as permitting the fine control of strength and moisture transport.
16

Light emitting diode color rendition properties

Hood, Sean January 1900 (has links)
Master of Science / Department of Architectural Engineering and Construction Science / Fred Hasler / This paper discusses the color rendition capabilities of light-emitting diodes (LEDs) and their relationship with the current standard for color rendition quality. The current standard for judging the color rendering properties of light sources, known as the color rendering index (CRI), has come under heavy scrutiny in recent years with the introduction of LEDs in commercial lighting applications. Depending on construction type, LEDs have highly structured spectral distributions that do not scale well under the color rendering index; moreover, CRI for LEDs has become disconnected from subjective measurements of human color preference. Unfortunately, given the multidimensional nature of color, an all-encompassing scale with a single rated value for the color rendition capabilities of a light source has proven difficult to establish. An analysis of the human visual system is discussed first, establishing how the visual system detects color in the eye and subsequently encodes that color information through a color-opponent process to form conscious color appearance. The formation of color appearance leads into a discussion of human color vision and the construction of a three-dimensional color space, which is subsequently used for the measurement of color fidelity (CRI) of consumer light sources. An overview of how LED lamps create light and color is then given, showing that the highly structured spectral distribution of LED lamps is often the cause of discrepancies within the CRI system. Existing alternatives to the CRI system are then compared and contrasted with each other and with the existing CRI system. Finally, a color preference study was conducted in which four LED lamps were compared to a reference lamp of equal correlated color temperature.
Observers were asked to rate the various test lamps against the reference lamp in terms of vividness, naturalness, overall preference, and individual color preference. No significant difference was found between the first three dimensions measured, but significant trends existed for the preference of individual colors when illuminated by either the LED lamps or the reference source. Recommendations are then made for how the lighting industry could move forward in terms of color metrics.
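The CRI computation the abstract critiques reduces to a simple formula once the color differences are known: each special index is R_i = 100 − 4.6·ΔE_i (CIE 13.3, with ΔE measured in the CIE 1964 U*V*W* space), and the general index Ra averages the first eight test-color samples. The sketch below assumes the ΔE values are already available; the listed numbers are made up for illustration, since real values require full spectral measurements.

```python
# Hypothetical color differences (Delta E in CIE 1964 U*V*W* space) between
# the 8 CIE test-color samples rendered under a test LED and under the
# reference illuminant of matching correlated color temperature.
delta_e = [2.1, 3.4, 1.8, 5.0, 4.2, 2.7, 3.9, 6.3]

def special_index(de):
    # CIE 13.3: R_i = 100 - 4.6 * Delta E_i
    return 100 - 4.6 * de

def general_cri(delta_es):
    # Ra is the arithmetic mean of the 8 special indices.
    ris = [special_index(de) for de in delta_es]
    return sum(ris) / len(ris)

print(round(general_cri(delta_e), 1))
```

Note how a single large ΔE on one saturated sample (here 6.3) is diluted by the average; this averaging is one reason a spiky LED spectrum can score a respectable Ra while rendering some colors poorly.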
17

Inspeção de fachadas históricas: levantamento de materiais e danos de argamassas

Teles, Carlos Dion de Melo 20 October 2010 (has links)
Façade renders protect the building and its contents and reveal its identity; these roles are even more important in cultural heritage buildings. Renders decay under weathering, damp, vandalism, unsuccessful interventions and manufacturing defects. This thesis presents and applies an assessment methodology for the materials, construction methods and damage of historical façade renders. The architectural preservation literature is reviewed, highlighting the importance of such an assessment, and render decay mechanisms are compiled with a focus on historical mortars.
Practical field information is described, including vertical access to the façade, definition of assessment objectives, representation of historical evolution, design of field forms, and costs. A proposal for training inspectors is outlined. The literature on materials, construction methods and damage assessment is reviewed and a method is proposed. Damage assessment relies on visual inspection and non-destructive testing by percussion with an ABS-faced hammer. Sampling criteria and laboratory protocols are reviewed and proposed. The proposed protocol applies macroscopic analysis, X-ray diffraction (XRD), petrography and acid-digestion chemical analysis. Suggestions are made concerning the synthesis of results and their presentation to the preservation team. The methodology, designed for simplicity and low cost, was developed during three case studies: two at São Cristóvão Palace (the Main Court and the main façade) and one on the exterior façades of the future Casa da Moeda do Brasil Museum, both in Rio de Janeiro, together accounting for roughly 4,400 m² of assessed rendered surface. The assessment of construction methods and materials contributed to the buildings' historical studies and to understanding their pathology. Percussion assessment indicated damage on 69% of the Main Court rendered area, although visual inspection detected only 5% of missing render. Careful demolition of the damaged render showed only a 4% deviation from the percussion assessment forecast. The laboratory protocol grouped samples based on macroscopic evaluation and/or XRD, and only group representatives were submitted to further analysis. XRD was used for mineralogical identification. Petrography proved very versatile, yielding data on mineralogical composition, granulometry, porosimetry, approximate mix proportions and geological history. Mortar composition was evaluated by chemical analysis, validated for sand, lime and/or Portland cement mortars, though impaired when carbonate aggregate and clay particles are present.
Case studies showed various mortar formulations comprising lime, sand, clay, cement and carbonate aggregate. Seashell and coral fragments were found in several samples. Pathologies were related to the presence of clay lumps, lack of binder, and incompatibility between plaster layers. Thematic maps were successfully used to present results to multidisciplinary teams. The proposed methodology is viable and useful in supporting decisions in the architectural preservation of historical façade renders, and contemporary façades can benefit from it as well.
18

TYLER KLINE’S <em>RENDER</em>: A FORMAL ANALYSIS AND PERFORMANCE GUIDE

Handshoe, John Douglas 01 January 2018 (has links)
Since the 1950s, composers worldwide have explored the use of the trombone in new and exciting ways, from expanding the functional range of the instrument to creating unique timbres through the use of mutes and extended techniques. Many standard works in the literature have been born from this envelope-pushing by composers such as John Cage, Luciano Berio, Iannis Xenakis, and Daniel Schnyder. At the forefront of the newest crop of composers expanding the voice of the trombone is Tyler Kline (b. 1991). This project functions as a formal analysis of, and performer's guide to, his 2015 work render for bass or tenor trombone and fixed electronics. Through examination of the music, discussion with the composer, and performances of the work, the performer will gain insight into the inspirations behind the piece, Kline's compositions as a whole, and performance considerations. In addition to the performance guide, a recording of render, along with several other works of Kline's, will be produced and released as an album through New Branch Records in Lexington, KY.
19

A Depth of Field Algorithm for Realtime 3D Graphics in OpenGL / Algoritm i OpenGL för att rendera realtids 3D grafik med fokus

Henriksson, Ola January 2002 (has links)
<p>The company where this thesis was formulated constructs VR applications for the medical environment. The hardware used is ordinary desktops with consumer-level graphics cards and haptic devices. In medicine, some operations require microscopes or cameras. In order to simulate these in a virtual reality environment for educational purposes, the effect of depth of field, or focus, has to be considered. </p><p>A working algorithm that generates this optical phenomenon in realtime, stereo-rendered computer graphics is presented in this thesis. The algorithm is implemented in OpenGL and C++, to later be combined with a VR application simulating eye surgery built with OpenGL Optimizer. </p><p>Several different approaches are described in this report. The call for realtime stereo rendering (~60 fps) means taking advantage of the graphics hardware to a great extent. In OpenGL this means using the extensions of a specific graphics chip for better performance; in this case the algorithm is implemented for a GeForce3 card. </p><p>To increase the speed of the algorithm, much of the workload is moved from the CPU to the GPU (Graphics Processing Unit). By re-defining parts of the ordinary OpenGL pipeline via vertex programs, a distance-from-focus map can be stored in the alpha channel of the final image with little time loss. </p><p>This map can effectively be used to blend a previously blurred version of the scene with a normal render. Different techniques to quickly blur a rendered image are discussed; to keep the speed up, solutions that require moving data off the graphics card are not an option.</p>
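The core of the blend step described above is simple per-pixel arithmetic: a distance-from-focus value drives a linear interpolation between the sharp render and a pre-blurred copy. The sketch below is an illustrative CPU/NumPy version, not the thesis's GPU vertex-program implementation; all parameter names are hypothetical.

```python
import numpy as np

def depth_of_field_blend(sharp, blurred, depth, focus_dist, focus_range):
    """Blend a sharp and a pre-blurred render using per-pixel depth.

    sharp, blurred: float arrays of shape (H, W, 3)
    depth:          float array of shape (H, W), eye-space distance
    The blend factor plays the role of the distance-from-focus map that
    the thesis stores in the alpha channel.
    """
    # 0 at the focal plane, 1 once |depth - focus_dist| exceeds focus_range.
    alpha = np.clip(np.abs(depth - focus_dist) / focus_range, 0.0, 1.0)
    alpha = alpha[..., np.newaxis]                # broadcast over RGB
    return (1.0 - alpha) * sharp + alpha * blurred
```

Pixels at the focal distance come entirely from the sharp image, pixels far outside the focus range entirely from the blurred one, with a linear transition between; on the GPU this corresponds to a single alpha-blended pass.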
20

A memory profiler for 3D graphics applications using binary instrumentation

Deo, Mrinal 25 July 2011 (has links)
This report describes the architecture and implementation of a memory profiler for 3D graphics applications. Memory profiling is done for the parts of the program that run on the graphics processor and are responsible for rendering the image. The shaders are parsed, and every memory instruction is instrumented with additional instructions for profiling. The results are then transferred from video memory to CPU memory. Profiling is done for one frame and completes in less than three minutes. The report also describes various analyses that can be done using the results obtained from this profiler, and discusses the design of an analytical cache model that can be used to identify, among all the buffers used by an application, candidate memory buffers suitable for caching. The profiler can segregate results for reads and writes separately, and can handle all formats of texture-access instructions as well as predicated instructions.
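The idea of identifying caching candidates from a memory trace can be illustrated with a toy model: replay the instrumented accesses through a small LRU cache and rank buffers by hit rate. This is a simplified simulation for illustration only; the report's analytical cache model and the trace format are not specified here, and all names are hypothetical.

```python
from collections import OrderedDict, defaultdict

def per_buffer_hit_rates(trace, cache_lines=4, line_bytes=64):
    """Replay a (buffer_name, byte_address) trace through a tiny fully
    associative LRU cache and report the hit rate per buffer.

    Buffers with high hit rates are the kind of caching candidates the
    profiler's analysis is meant to surface.
    """
    cache = OrderedDict()                  # line address -> None, in LRU order
    hits = defaultdict(int)
    refs = defaultdict(int)
    for buf, addr in trace:
        line = addr // line_bytes          # map byte address to cache line
        refs[buf] += 1
        if line in cache:
            hits[buf] += 1
            cache.move_to_end(line)        # mark as most recently used
        else:
            cache[line] = None
            if len(cache) > cache_lines:
                cache.popitem(last=False)  # evict least recently used line
    return {b: hits[b] / refs[b] for b in refs}
```

A small lookup table that is re-read every pixel would score a high hit rate and be flagged as cache-friendly, while a texture streamed once per frame would score near zero.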
