1 |
A Praxis on Parametric Design: An Exploration of CityEngine as a Tool in the Development of Urban Design Scenarios. B. Trevor B., Grafton. 25 January 2016 (has links)
Landscape architects should not create final products, but rather frameworks through which existing natural phenomena can coexist, evolve, and adapt while coming together into an evolving final state. The world is not static; the image we design on paper will never exist exactly as we imagined it, and there are aspects of the natural world that we cannot account for or control. Given this belief, this practicum is not intended as a final design but is constructed as praxis: an exploration of parametric modeling software and its use as an integral part of the design thinking process, a simulation tool rather than a representational conclusion. / February 2016
2 |
Robust Parameterization Schema for CAx Master Models. Berglund, Courtney L. 19 March 2008 (has links) (PDF)
Today's engineering companies rely heavily on an engineer's ability to use computers to analyze and optimize designs. With this use of computers in the design process, products undergo multiple design iterations between preliminary concept and final form, which in turn results in Computer-Aided Design (CAD) models being passed from one discipline to the next. In an attempt to keep the design process consistent, an industry-wide shift toward the use of CAD master models is taking place. With this change, manufacturing and engineering development companies are attempting to employ parametrics more fully in their initial CAD models, in the hope that the initial models handed downstream are robust enough to be used throughout the entire design loop. Unfortunately, current parameter definitions are often not robust enough to incorporate all the design changes arising from the various analyses and manufacturing operations. To address this problem, we present a more robust parametric methodology that broadens the definition of parametrics as currently employed on CAx master models within CAD packages.
3 |
Análisis de la eficiencia del gasto municipal y de sus determinantes. Herrera Catalán, Pedro; Francke, Pedro. 10 April 2018 (has links)
In this study we analyzed the efficiency of spending in 1,686 Peruvian municipalities for the year 2003, interpreting local public activities as a production process that transforms inputs into outputs (Bradford et al. 1969; Fisher 1996). We established several «best-practice» production frontiers, built from the best-performing units within groups of municipalities, and estimated relative efficiency as the distance to those frontiers. Five methodologies were used to estimate the production frontiers: (i) three non-parametric (Free Disposal Hull, FDH, and Data Envelopment Analysis, DEA-CRS and DEA-VRS) and (ii) two parametric (one deterministic, one stochastic), estimated across the ten categories of municipalities (four provincial and six district) defined through a cluster methodology. Finally, using TOBIT regression models, we analyzed the fiscal, socioeconomic, and demographic determinants of the efficiency levels found. The results vary according to the category of the municipality analyzed; although some good municipal practices were identified, the national-level results are a matter of concern, since they indicate that the same provision of municipal goods and services could be achieved with 57.6% fewer resources. Among the main determinants of this inefficient spending were the FONCOMUN and canon transfers, above all at the district level, while one of the factors that allowed a more efficient provision of local public services was citizen participation, proxied by the presence of the Local Coordination Councils (Consejos de Coordinación Local).
The results highlight the need to concentrate greater efforts on improving the efficiency of local government spending, all the more so after the promulgation of Supreme Decree 068-2006-PCM in October 2006, which brought the completion of the transfer of competences and functions, initially programmed for 2006-2010, forward to the end of 2007.
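Of the frontier methods listed, FDH has the simplest computational core: a municipality's input efficiency is the largest proportional input contraction that still leaves it dominated by some observed peer producing at least as much of every output. A minimal sketch with toy data (function and variable names are illustrative, not the study's code):

```python
import numpy as np

def fdh_input_efficiency(inputs, outputs):
    """Free Disposal Hull (FDH) input-oriented efficiency scores.

    inputs:  (n_units, n_inputs) array of resources used (e.g. spending).
    outputs: (n_units, n_outputs) array of services provided.
    A unit scores 1.0 if no peer produces at least as much of every
    output with proportionally less input; a score below 1.0 is the
    fraction of input that would suffice to match a dominating peer.
    """
    n = inputs.shape[0]
    scores = np.ones(n)
    for i in range(n):
        # peers producing at least as much of every output (incl. unit i)
        dominating = np.all(outputs >= outputs[i], axis=1)
        # radial contraction needed to reach each dominating peer
        ratios = np.max(inputs[dominating] / inputs[i], axis=1)
        scores[i] = ratios.min()  # unit i itself contributes ratio 1.0
    return scores

# toy example: unit 1 matches unit 0's output with half the input
x = np.array([[10.0], [5.0], [8.0]])
y = np.array([[100.0], [100.0], [80.0]])
print(fdh_input_efficiency(x, y))  # [0.5   1.    0.625]
```

The DEA variants replace this pointwise dominance check with a linear program over convex combinations of peers, which is why FDH frontiers envelop the data more tightly and yield higher efficiency scores.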
4 |
Associative Design for Building Envelopes' Sun Control and Shading Devices. January 2012 (has links)
abstract: In geographic locations with hot-arid climates, sun control is a primary problem for building envelope design. Today's technological advances in building science bring the opportunity to design dynamic façade systems for solar radiation control and daylighting. Although dynamic systems can be an attractive visual element, they can be costly and challenging for building owners to maintain. Alternatively, fixed solar-shading systems can be designed to create dynamism in the façade of the building while providing similar sun-control functionality. The work presented in this project uses Grasshopper, a visual scripting editor for modeling software, to develop a Solar Control Visual Script that evaluates building envelope surfaces with planar and non-uniform rational basis-spline (NURBS) forms and generates projections for fixed sun-control systems. The Grasshopper design platform allows individuals with no prior computer coding experience to build up programming-like capabilities; this lets users discover new design possibilities within flexible frames that contribute to the overall design being pursued while also responding to the environment. The Solar Control Visual Script provides the minimum shading geometries that achieve shading of openings at a particular date and time of year. The model derives the appropriate values for the projections of shading geometries from three components: typical meteorological year (TMY) data, isotropic irradiation equations, and shading profile angle equations for vertical and tilted surfaces. Automatically visualizing the generated geometries makes it possible to test several model forms and repeat the analysis as control parameters are modified.
By employing building science as a set of environmental parameters, the design outcome takes on a dynamic form that responds to natural forces. The optimized results promote an efficient environmental design solution that integrates sun control into the building envelope. / Dissertation/Thesis / M.S. Built Environment 2012
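The shading-profile-angle relation used for sizing fixed projections can be written in a few lines. This is the textbook geometry for a horizontal overhang over a vertical opening (names and example numbers are illustrative, not taken from the thesis's script):

```python
import math

def overhang_depth(window_height, solar_altitude_deg, surface_solar_azimuth_deg):
    """Minimum horizontal overhang projection that fully shades a vertical
    window at a given sun position. The profile angle O satisfies
    tan(O) = tan(altitude) / cos(azimuth from the facade normal), and
    the required projection is window_height / tan(O). A textbook
    relation, not the thesis's Grasshopper implementation.
    """
    beta = math.radians(solar_altitude_deg)
    gamma = math.radians(surface_solar_azimuth_deg)
    tan_profile = math.tan(beta) / math.cos(gamma)
    if tan_profile <= 0:
        raise ValueError("sun is behind or below the facade")
    return window_height / tan_profile

# sun 45 deg high, facing the facade head-on: depth equals window height
print(overhang_depth(1.5, 45.0, 0.0))  # ~1.5 (tan 45 deg = 1)
```

Running this over TMY sun positions for the design dates, as the script does with its input data, yields the governing (largest) projection for each opening.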
5 |
Non-parametric edge detection in speckled imagery. Giovanny Giron Amaya, Edwin. 31 January 2008 (has links)
Previous issue date: 2008 / Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / This work proposes a non-parametric technique for edge detection in speckled images. SAR (Synthetic Aperture Radar), sonar, B-ultrasound, and laser images are corrupted by a non-additive noise called speckle. Several statistical models have been proposed to describe this noise, leading to the development of special techniques for image enhancement and analysis. The G0 distribution is a statistical model capable of describing a wide range of areas; in SAR data, for example, pastures (smooth), forests (rough), and urban areas (very rough). The goal of this work is to study alternative techniques for edge detection in speckled images, taking Gambini et al. (2006, 2008) as a starting point. A new edge detector based on the Kruskal-Wallis test is proposed. Our numerical results show that this detector is an attractive alternative to Gambini's detector, which is based on the likelihood function. We provide evidence that Gambini's technique can be successfully replaced by the Kruskal-Wallis method; the gain is an algorithm 1000 times faster, without compromising the quality of the results.
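The core of the proposed detector is easy to sketch: slide a candidate split point along a strip of pixels and keep the position where the Kruskal-Wallis statistic between the two sides is largest. A toy illustration on simulated multiplicative (gamma) speckle, with made-up parameters rather than the thesis's actual setup:

```python
import numpy as np
from scipy.stats import kruskal

def kw_edge_position(strip, margin=5):
    """Locate an edge along a 1-D strip of speckled intensities by
    maximizing the Kruskal-Wallis H statistic between the segments on
    either side of each candidate split. Being rank-based, the test
    needs no parametric model of the speckle (a sketch of the idea,
    not the thesis's implementation).
    """
    best_j, best_h = None, -np.inf
    for j in range(margin, len(strip) - margin):
        h, _ = kruskal(strip[:j], strip[j:])
        if h > best_h:
            best_h, best_j = h, j
    return best_j

# two-region strip (mean 1 vs mean 4) under multiplicative gamma speckle
rng = np.random.default_rng(0)
strip = np.r_[np.full(100, 1.0), np.full(100, 4.0)] * rng.gamma(4, 0.25, 200)
print(kw_edge_position(strip))  # near the true edge at index 100
```

The likelihood-based detector it replaces must fit a G0 model on each side of every candidate split, which is where the reported speed gap comes from.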
6 |
Semi-parametric Survival Analysis via Dirichlet Process Mixtures of the First Hitting Time Model. Race, Jonathan Andrew. January 2019 (has links)
No description available.
7 |
The Storage of Parametric Data in Product Lifecycle Management Systems. Lund, Jonathan Gary. 23 March 2006 (has links) (PDF)
Product development companies are continually seeking methods to increase efficiency while maintaining quality. Distributed development is also more important than ever as industries globalize. These forces have driven firms to adopt formal data management practices that allow groups and individuals to work from a single, centralized data source that is secure, reliable, and supports collaboration. This thesis proposes a methodology to leverage globalized infrastructures for the efficient storage of product variations. The methodology is demonstrated through a working prototype using the market leader in Product Lifecycle Management (PLM) systems, Teamcenter Engineering. First, paradigms are set forth for storing various types of engineering documents in PLM systems in parametric form. Then the use of these paradigms is exemplified by programs that retrieve and store document variations as PLM metadata. Finally, the results show that this methodology produces drastic increases in system performance as well as enabling PLM-compatible automation and optimization. These findings have significant implications for industry and have generated interest from several global engineering firms and academic journals.
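The storage paradigm described above, keeping design variations as lightweight parameter metadata against a single master model rather than as duplicate CAD files, can be sketched abstractly (class and method names are hypothetical, not Teamcenter Engineering's API):

```python
from dataclasses import dataclass, field

@dataclass
class MasterModel:
    """A parametric master model plus lightweight variant records.
    Instead of saving a full CAD file per design variation, each
    variant stores only a small dict of parameter overrides (mimicking
    PLM metadata); full geometry is regenerated from the master on
    demand. Illustrative sketch, not a real PLM system's data model.
    """
    name: str
    parameters: dict
    variants: dict = field(default_factory=dict)

    def save_variant(self, variant_id, overrides):
        """Record a variant as overrides of existing master parameters."""
        unknown = set(overrides) - set(self.parameters)
        if unknown:
            raise KeyError(f"not master parameters: {unknown}")
        self.variants[variant_id] = dict(overrides)  # metadata only

    def resolve(self, variant_id):
        """Full parameter set for a variant: master values plus overrides."""
        return {**self.parameters, **self.variants[variant_id]}

bracket = MasterModel("bracket", {"length_mm": 120.0, "hole_dia_mm": 8.0})
bracket.save_variant("heavy_duty", {"hole_dia_mm": 10.0})
print(bracket.resolve("heavy_duty"))  # {'length_mm': 120.0, 'hole_dia_mm': 10.0}
```

The performance gains the thesis reports come from exactly this asymmetry: a variant record is a few key-value pairs of metadata, while a duplicated CAD model is orders of magnitude larger to store, transfer, and keep synchronized.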
8 |
Automated Tool Design for Complex Free-Form Components. Foster, Kevin G. 08 December 2010 (links) (PDF)
In today's competitive manufacturing industries, companies strive to reduce development costs and lead times in hopes of capturing more market share through the early release of new or redesigned products. Tooling lead time constraints are among the more significant challenges facing product development of advanced free-form components, especially for complex designs that require large dies, molds, or other large forming tools. The lead time for tooling generally consists of three main components: material acquisition, tool design and engineering, and tool manufacturing. Lead times for material acquisition and tool manufacture are normally a function of vendor and outsourcing constraints, manufacturing techniques, and the complexity of the tooling being produced. The tool design and engineering component is a function of available manpower, engineering expertise, the type of design problem (initial design or redesign of tooling), and the complexity of the design problem. To reduce tool design and engineering lead time, many engineering groups have adopted Computer-Aided Design, Engineering, and Manufacturing (CAD/CAE/CAM, or CAx) tools as their standard practice for the design and analysis of their products. Although the predictive capabilities are efficient, using CAx tools to expedite advanced die design is time consuming because of the free-form nature and complexity of the desired part geometry. Design iterations can consume large quantities of time and money, driving profit margins down or even becoming infeasible from a cost and schedule standpoint. Any time savings are desirable so long as quality is not sacrificed. This thesis presents an automated tool design methodology that integrates state-of-the-art numerical surface fitting methods with commercially available CAD/CAE/CAM technologies and optimization software.
The intent is to create tooling virtually, with work-piece geometries optimized so that the resulting products capture accurate design intent. Results show a significant reduction in design and engineering tool development time, owing to the integration and automation of associative tooling surfaces derived automatically from the known final design-intent geometry. Because this approach extends commercially available CAx tools, this thesis can serve as a blueprint for any automotive or aerospace tooling need, eliminating significant time and cost from the manufacture of complex free-form components.
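The numerical surface-fitting step can be illustrated with SciPy's FITPACK-based B-spline routines, fitting a smooth surface to sampled free-form geometry; a generic stand-in under mock data, not the thesis's CAx-integrated pipeline:

```python
import numpy as np
from scipy.interpolate import bisplrep, bisplev

# Sample a mock free-form part surface on a 20x20 grid (stand-in for
# scanned or design-intent geometry).
grid = np.linspace(0.0, 1.0, 20)
x, y = np.meshgrid(grid, grid)
z = 0.2 * np.sin(2 * np.pi * x) * np.cos(np.pi * y)

# Fit a smoothing bicubic B-spline surface to the scattered points.
# The smoothing factor s trades fidelity against fairness of the fit.
tck = bisplrep(x.ravel(), y.ravel(), z.ravel(), s=1e-3)

# Evaluate the fitted "tooling" surface on a finer 50x50 grid, as one
# would before deriving associative offset/tooling geometry from it.
fine = np.linspace(0.0, 1.0, 50)
z_fit = bisplev(fine, fine, tck)
print(z_fit.shape)  # (50, 50)
```

In the automated workflow the thesis describes, a fitted surface like this would be fed back into the CAx system as an associative face, so downstream tooling features update when the design-intent geometry changes.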
9 |
Bayesian Non-parametric Models for Time Series Decomposition. Granados-Garcia, Guillermo. 05 January 2023 (has links)
The standard approach to analyzing brain electrical activity is to examine the spectral density function (SDF) and identify frequency bands, defined a priori, that have the most substantial relative contributions to the overall variance of the signal. A limitation of this approach, however, is that the precise frequency and bandwidth of oscillations are not uniform across cognitive demands, so these bands should not be set arbitrarily in any analysis. To overcome this limitation, we propose three Bayesian non-parametric models for time series decomposition: data-driven approaches that identify (i) the number of prominent spectral peaks, (ii) the frequency peak locations, and (iii) their corresponding bandwidths (the spread of power around the peaks). The standardized SDF is represented as a Dirichlet process mixture based on a kernel derived from second-order auto-regressive processes, which completely characterizes the location (peak) and scale (bandwidth) parameters. A Metropolis-Hastings-within-Gibbs algorithm is developed for sampling from the posterior distribution of the mixture parameters. Simulation studies demonstrate the robustness and performance of the proposed methods. The methods were applied to local field potential (LFP) activity from the hippocampus of laboratory rats across different conditions in a non-spatial sequence memory experiment, to identify the most prominent frequency bands and examine the link between specific patterns of brain oscillatory activity and trial-specific cognitive demands. The second application studies 61 EEG channels from two subjects performing a visual recognition task, to discover frequency-specific oscillations present across brain zones. The third application extends the model to characterize data from 10 alcoholic subjects and 10 controls across three experimental conditions and 30 trials. The proposed models provide a framework to condense the oscillatory behavior of populations across different tasks, isolating the target fundamental components and giving the practitioner different perspectives for analysis.
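The AR(2) kernel's location-scale parameterization is concrete enough to sketch: writing the AR(2) characteristic roots as rho * exp(+-i * psi), the coefficients follow directly from (psi, rho), the spectral peak sits near psi, and the peak sharpens (bandwidth shrinks) as rho approaches 1. A small illustration of the kernel alone, not the thesis's Dirichlet process mixture code:

```python
import numpy as np

def ar2_spectrum(omega, peak_freq, modulus, sigma2=1.0):
    """Spectral density of an AR(2) process parameterized by peak
    location and bandwidth: with complex roots rho*exp(+-i*psi),
    phi1 = 2*rho*cos(psi) and phi2 = -rho**2. The density peaks near
    psi, more sharply as rho -> 1 (a sketch of the kernel idea).
    """
    phi1 = 2.0 * modulus * np.cos(peak_freq)
    phi2 = -modulus**2
    z = np.exp(-1j * omega)
    denom = np.abs(1.0 - phi1 * z - phi2 * z**2) ** 2
    return sigma2 / (2.0 * np.pi * denom)

omega = np.linspace(0.01, np.pi, 1000)
dens = ar2_spectrum(omega, peak_freq=np.pi / 4, modulus=0.95)
print(omega[np.argmax(dens)])  # close to pi/4 (about 0.785)
```

A mixture of such kernels, with a Dirichlet process prior on the mixing measure, lets the number of peaks and their (psi, rho) values be inferred from the data rather than fixed by a priori band definitions.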
10 |
Probabilistic modelling of morphologically rich languages. Botha, Jan Abraham. January 2014 (has links)
This thesis investigates how the sub-structure of words can be accounted for in probabilistic models of language. Such models play an important role in natural language processing tasks such as translation or speech recognition, but often rely on the simplistic assumption that words are opaque symbols. This assumption does not fit morphologically complex language well, where words can have rich internal structure and sub-word elements are shared across distinct word forms. Our approach is to encode basic notions of morphology into the assumptions of three different types of language models, with the intention that leveraging shared sub-word structure can improve model performance and help overcome data sparsity that arises from morphological processes. In the context of n-gram language modelling, we formulate a new Bayesian model that relies on the decomposition of compound words to attain better smoothing, and we develop a new distributed language model that learns vector representations of morphemes and leverages them to link together morphologically related words. In both cases, we show that accounting for word sub-structure improves the models' intrinsic performance and provides benefits when applied to other tasks, including machine translation. We then shift the focus beyond the modelling of word sequences and consider models that automatically learn what the sub-word elements of a given language are, given an unannotated list of words. We formulate a novel model that can learn discontiguous morphemes in addition to the more conventional contiguous morphemes that most previous models are limited to. This approach is demonstrated on Semitic languages, and we find that modelling discontiguous sub-word structures leads to improvements in the task of segmenting words into their contiguous morphemes.
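The additive scheme behind the distributed model, in which morphologically related words share morpheme-level parameters, can be sketched with toy vectors (random embeddings and hand-picked segmentations, not the thesis's trained model):

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 8
# toy morpheme embeddings; the real model learns these from corpora
morphemes = {m: rng.normal(size=dim) for m in ["un", "happy", "ness", "kind"]}

def word_vec(segmentation):
    """Compose a word vector as the sum of its morpheme vectors, so
    morphologically related forms (happiness, unhappiness) share
    parameters and rare words inherit structure from their parts."""
    return np.sum([morphemes[m] for m in segmentation], axis=0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

happiness = word_vec(["happy", "ness"])
unhappiness = word_vec(["un", "happy", "ness"])
print(round(cosine(unhappiness, happiness), 3))  # shared morphemes drive similarity
```

This is the sense in which sub-word sharing combats sparsity: even if "unhappiness" is unseen in training, its representation is fully determined by morphemes observed in other words.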