141

Classification of Genotype and Age of Eyes Using RPE Cell Size and Shape

Yu, Jie 18 December 2012 (has links)
The retinal pigment epithelium (RPE) is a principal site of pathogenesis in age-related macular degeneration (AMD). AMD is a leading cause of vision loss, and even blindness, in the elderly, and there is currently no effective treatment. Our aim is to describe the relationship between the morphology of RPE cells and the age and genotype of the eye. We use principal component analysis (PCA) or the functional principal component method (FPCA), together with support vector machine (SVM) and random forest (RF) methods, to analyze morphological data of RPE cells in mouse eyes and classify their age and genotype. Our analyses show that, amongst all morphometric measures of RPE cells, cell shape measurements (eccentricity and solidity) are good features for classification, but the combination of cell shape and size (perimeter) provides the best classification.
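A minimal sketch of this kind of pipeline, assuming a feature table with hypothetical morphometric columns (eccentricity, solidity, perimeter) and a genotype label; the stand-in data and parameters below are not taken from the thesis:

```python
# Hypothetical illustration: PCA followed by SVM and random-forest
# classification of RPE cell morphology features (stand-in data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Rows are cells, columns are morphometric measures
# (eccentricity, solidity, perimeter); labels are genotype/age classes.
X = rng.normal(size=(200, 3))
y = rng.integers(0, 2, size=200)

svm_pipe = make_pipeline(StandardScaler(), PCA(n_components=2), SVC(kernel="rbf"))
rf_pipe = make_pipeline(StandardScaler(), PCA(n_components=2),
                        RandomForestClassifier(n_estimators=200, random_state=0))

print("SVM accuracy:", cross_val_score(svm_pipe, X, y, cv=5).mean())
print("RF accuracy: ", cross_val_score(rf_pipe, X, y, cv=5).mean())
```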
142

A web store based on reusable .NET components

Baig, Aftab, Ahmad, Iftikhar January 2011 (has links)
The thesis project describes the analysis, process and major factors in the development of a simple component-oriented web shop in ASP.NET. It addresses the concepts used in the application as well as the derivation of the technical design and development from concepts acquired by studying existing approaches. The report gives a brief summary of existing approaches and related technologies. It also lays the foundation of a goal-oriented approach by providing an overview of component-based software engineering. The basic concepts for modularization were borrowed from entity identification, object models and component models. The application's architecture follows a layered approach, combining the software layered-architecture approach with the multi-tier architecture of web applications. Class models explaining the inner structure of each component are provided, and an overview of the user-interface pages is given to explain the application's outer flow. The application sets out to demonstrate the significance of the component-oriented approach as well as the support provided for it by ASP.NET. The resulting package proves to have scalable components that can be scaled or reused in another application or in a later version of the same application. / First and Final Version of our Thesis Report / SoftIn - Introducing methods and tools for software development in small and medium-sized enterprises
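As a language-agnostic sketch of the layered separation described above (the thesis itself implements this in ASP.NET; all class names here are invented):

```python
# Invented sketch of a layered web-shop skeleton: data access,
# business logic and presentation kept in separate components.
from dataclasses import dataclass


@dataclass
class Product:            # shared domain entity
    sku: str
    name: str
    price: float


class ProductRepository:  # data-access layer component
    def __init__(self):
        self._rows = {"A1": Product("A1", "Keyboard", 29.0)}

    def find(self, sku: str) -> Product:
        return self._rows[sku]


class OrderService:       # business-logic layer component
    def __init__(self, repo: ProductRepository):
        self._repo = repo

    def quote(self, sku: str, qty: int) -> float:
        return self._repo.find(sku).price * qty


class StoreFront:         # presentation layer (web tier stand-in)
    def __init__(self, orders: OrderService):
        self._orders = orders

    def render_quote(self, sku: str, qty: int) -> str:
        return f"{qty} x {sku}: {self._orders.quote(sku, qty):.2f} EUR"


print(StoreFront(OrderService(ProductRepository())).render_quote("A1", 2))
```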
143

Comparison of Classification Effects of Principal Component and Sparse Principal Component Analysis for Cardiology Ultrasound in Left Ventricle

Yang, Hsiao-ying 05 July 2012 (has links)
Due to the association between heart diseases and the patterns of the diastoles and systoles of the heart in the left ventricle, we analyze and classify data gathered from Kaohsiung Veterans General Hospital using cardiology ultrasound images. We make use of the differences between the gray-scale values of diastoles and systoles in the left ventricle to evaluate heart function. Following Chen (2011) and Kao (2011), we modify the procedure for the reduction and alignment of the image data, and we add more subjects to the study. We process the images in two ways, retaining the regions of concern. Since an ultrasound image transformed to data form is expressed as a high-dimensional matrix, principal component analysis is adopted to retain the important factors and reduce the dimensionality. In this work, we compare the loadings calculated by ordinary principal component analysis and by sparse principal component analysis; the factor scores are then used to carry out discriminant analysis, and the classification accuracy is discussed. With the statistical methods used in this work, the accuracy, sensitivity and specificity of the original classifications are over 80%, and those of the cross-validations are over 60%.
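A minimal sketch of the comparison described above, with ordinary PCA and sparse PCA as dimension reduction before a discriminant analysis; the data, component counts and labels below are stand-ins, not the clinical data:

```python
# Hypothetical sketch: PCA vs. sparse PCA followed by linear discriminant
# analysis, compared by cross-validated classification accuracy.
import numpy as np
from sklearn.decomposition import PCA, SparsePCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Rows: subjects; columns: flattened gray-scale differences between
# diastole and systole frames (random stand-in values here).
X = rng.normal(size=(60, 100))
y = rng.integers(0, 2, size=60)

for name, reducer in [("PCA", PCA(n_components=5)),
                      ("SparsePCA", SparsePCA(n_components=5, random_state=1))]:
    pipe = make_pipeline(StandardScaler(), reducer, LinearDiscriminantAnalysis())
    acc = cross_val_score(pipe, X, y, cv=5).mean()
    print(f"{name}: cross-validated accuracy = {acc:.2f}")
```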
144

The Classification of In Vivo MR Spectra on Brain Abscesses Patients Using Independent Component Analysis

Liu, Cheng-Chih 04 September 2012 (has links)
Magnetic Resonance Imaging (MRI) can image tissues in vivo non-invasively. Proton MR spectroscopy uses the resonance principle to collect proton signals and transform them into spectra. It provides information about the metabolites in a patient's brain, allowing doctors to observe pathological changes. Observing the metabolites of brain abscess patients is the most important step in clinical diagnosis and treatment, and doctors use spectra acquired at different echo times (TE) to enhance diagnostic accuracy. In our study, we use independent component analysis (ICA) to analyze MR spectroscopy data. After the analysis, the independent components represent the elements that compose the input data. We then use the projection described in Ssu-Ying Lu's thesis to help us observe the relationship between the independent components and the patients' spectra. We also compare the spectra obtained with ICA and PCA, and we discuss several questions raised by the experiments: whether scale normalization is needed before the data are input, why the result of scale normalization is not as expected, and why the peaks of some independent components appear in unexpected locations.
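A small sketch of the general approach, separating mixed spectra into independent components with FastICA and projecting one spectrum onto them; the synthetic "metabolite" peaks and mixing weights below are invented, not clinical MRS data:

```python
# Hypothetical sketch: ICA decomposition of mixed spectra and projection
# of a single spectrum onto the estimated independent components.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
ppm = np.linspace(0.5, 4.0, 512)

def peak(center, width):
    return np.exp(-((ppm - center) ** 2) / (2 * width ** 2))

# Two synthetic source spectra and random mixtures of them.
sources = np.vstack([peak(1.3, 0.05), peak(3.0, 0.08)])
mixing = rng.uniform(0.2, 1.0, size=(20, 2))
spectra = mixing @ sources + 0.01 * rng.normal(size=(20, 512))

ica = FastICA(n_components=2, random_state=2)
scores = ica.fit_transform(spectra)       # per-spectrum weights
components = ica.components_              # estimated independent components

# Projection of one "patient" spectrum onto the estimated components.
weights = ica.transform(spectra[:1])
print("component weights for spectrum 0:", np.round(weights, 3))
```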
145

An Object-Process Methodology for Implementing a Distributed Information System

Lu, Liang-Yu 16 July 2001 (has links)
Component-based software development is the most important technological revolution in the software industry of the past few years. It has gradually pushed the industry from work done largely by hand towards an automated industry supported by automated assisting tools. Component-based software development also makes business information systems easy to assemble flexibly: system developers may assemble software components according to user requirements, and system components can be added or removed at any time to adjust part of the system's capability without affecting the whole system, only the components concerned. This thesis proposes an object-process methodology for developing a distributed business information system. Using the object-process methodology, business objects are identified from business processes. System analysis is divided into two parts and eight steps: the user requirements are analyzed and the information system is then designed, yielding stable software objects and a system framework. The object-process view of the business system helps us establish a model of the complex business system, mapping real-world activities or abstract concepts into the system model, so that distributed objects can be analyzed and designed efficiently for the needs of a distributed operating environment. In the next step, the software model is transformed and encapsulated into Distributed Component Object Model (DCOM) components, which are placed in the application layer of the system, making the business information system flexible and well suited to user requirements.
146

Modeling and Timing Analysis of Industrial Component-Based Distributed Real-time Embedded Systems

Mubeen, Saad January 2012 (has links)
The model- and component-based development approach has emerged as an attractive option for the development of Distributed Real-time Embedded (DRE) systems. In this thesis we target several issues such as the modeling of legacy communication, the extraction of end-to-end timing models, and support for holistic response-time analysis of industrial component-based DRE systems. We introduce a new approach for modeling legacy network communication in component-based DRE systems. By introducing special-purpose components to encapsulate and abstract the communication protocols in DRE systems, we allow the use of legacy nodes and legacy protocols in a component- and model-based software engineering environment. The proposed approach also supports the state-of-the-practice development of component-based DRE systems. The Controller Area Network (CAN) is one of the most widely used real-time networks in DRE systems, especially in the automotive domain. We identify that the existing analysis of CAN does not support common message transmission patterns which are implemented by some high-level protocols used in the industry. Consequently, we extend the existing analysis to facilitate the worst-case response-time computation of these transmission patterns. The extended analysis is generally applicable to any high-level protocol for CAN that uses periodic, sporadic, or both periodic and sporadic transmission of messages. Because an end-to-end timing model must be available to perform the holistic response-time analysis, we present a method to extract end-to-end timing models from component-based DRE systems. To show the applicability of our modeling techniques and extended analysis, we provide a proof of concept by extending the existing industrial component model (Rubus Component Model), implementing the holistic response-time analysis along with the extended analysis of CAN in the industrial tool suite (Rubus-ICE), and conducting an automotive case study. / EEMDEF
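For context, a simplified sketch of the classical worst-case response-time analysis for CAN (fixed-point iteration over the queueing delay, assuming deadlines no larger than periods); this is the well-known baseline analysis, not the thesis's extended analysis, and the frame parameters below are invented:

```python
# Classical CAN response-time analysis sketch: R = J + w + C, where the
# queueing delay w is the smallest fixed point of
#   w = B + sum over higher-priority frames k of ceil((w + J_k + tau_bit)/T_k) * C_k
import math

def can_response_time(msg, higher_prio, blocking, tau_bit):
    """msg and higher_prio entries: dicts with C (transmission time),
    T (period) and J (queueing jitter), all in the same time unit."""
    w = blocking
    while True:
        w_next = blocking + sum(
            math.ceil((w + k["J"] + tau_bit) / k["T"]) * k["C"]
            for k in higher_prio)
        if w_next == w:                      # fixed point reached
            return msg["J"] + w + msg["C"]
        if w_next + msg["C"] > msg["T"]:     # bound exceeded (deadline = period assumed)
            return float("inf")
        w = w_next

# Invented example frame set (times in milliseconds on a 250 kbit/s bus).
tau_bit = 0.004
m1 = {"C": 0.5, "T": 10.0, "J": 0.1}   # higher priority
m2 = {"C": 0.5, "T": 20.0, "J": 0.1}   # higher priority
m3 = {"C": 0.6, "T": 50.0, "J": 0.2}   # message under analysis
print("R(m3) =", can_response_time(m3, [m1, m2], blocking=0.5, tau_bit=tau_bit))
```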
147

Komponentavskrivning – ett skenproblem? : En studie av de redovisningsmässiga effekterna / Component depreciation – an illusory problem? : A study of the accounting effects

Kåberg, Linda, Pettersson, Cecilia January 2014 (has links)
Problem: From January 2014 it is obligatory for companies reporting under K3 to apply component depreciation to their fixed assets. The arguments surrounding component depreciation differ greatly, and the question is whether the lack of guidance in the transition leads to differences in application and accounting. Purpose: The purpose of this thesis has a descriptive and a normative form; the descriptive purpose aims to create a picture of actual accounting situations and then set them against the normative theory. Framework: The framework consists of a general frame of reference in which we describe the basic concepts related to depreciation and the economic effects of the depreciation methods. In the theoretical framework we first present general accounting theory and then the normative theory in more detail. Method: The study uses a qualitative approach with semi-structured interviews. Simulations are then used to make comparisons between a real company and two fictitious companies. Empirical data: The empirical data are drawn from interviews with two representatives of a real-estate company that owns investment properties, together with material in the form of financial statements and the company's depreciation schedules for its components. Conclusion: Different useful lives have been shown to lead to significant differences in the accounts. However, under the influence of the normative theory, the freedom of choice, and thereby the disparities, should be reduced.
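A small worked example of the effect studied here, with invented figures: the same building depreciated straight-line as one asset versus split into components with different useful lives.

```python
# Invented illustration: annual straight-line depreciation charge for a
# building treated as one asset versus split into components.
building_cost = 10_000_000          # SEK, invented figure

# Whole-asset approach: one useful life for the entire building.
whole_asset_life = 50
whole_asset_charge = building_cost / whole_asset_life

# Component approach: the same cost split over components
# (cost share and useful life in years, all invented).
components = {
    "frame":   (0.60, 80),
    "roof":    (0.15, 30),
    "facade":  (0.15, 40),
    "heating": (0.10, 20),
}
component_charge = sum(building_cost * share / life
                       for share, life in components.values())

print(f"annual charge, whole asset: {whole_asset_charge:,.0f} SEK")
print(f"annual charge, components:  {component_charge:,.0f} SEK")
```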
148

Visual Composition In Component Oriented Development

Ozturk, Murat Mutlu 01 August 2005 (has links) (PDF)
This thesis introduces a visual composition approach for JavaBeans components, in compliance with the Component Oriented Software Engineering (COSE) process. The graphical modeling tool COSECASE is enhanced with the ability to build a system by integrating domain-specific components. Such integration is implemented by defining connection points and interaction details between components. The event model of the JavaBeans architecture is also added to the tool's capabilities.
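The tool itself composes JavaBeans; as a language-agnostic sketch of the underlying idea of wiring components through connection points and events (all names below are invented):

```python
# Invented sketch of event-based component wiring in the spirit of the
# JavaBeans event model: a source fires events, a sink subscribes to them.
from typing import Callable, List


class TemperatureSensor:                  # event source component
    def __init__(self):
        self._listeners: List[Callable[[float], None]] = []

    def add_listener(self, listener: Callable[[float], None]) -> None:
        self._listeners.append(listener)  # a "connection point"

    def read(self, value: float) -> None:
        for listener in self._listeners:  # fire the event to all listeners
            listener(value)


class Display:                            # event sink component
    def on_temperature(self, value: float) -> None:
        print(f"temperature: {value:.1f} C")


sensor, display = TemperatureSensor(), Display()
sensor.add_listener(display.on_temperature)   # composition = wiring events
sensor.read(21.5)
```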
149

Variabilita běhových prostředí komponentových systémů / Variability of Execution Environments for Component-based Systems

Malohlava, Michal January 2012 (has links)
Reuse is considered one of the most crucial software engineering concerns, as it allows software systems to be delivered faster and with less effort. The thesis therefore explores the limits of reuse in the context of component systems. It analyzes contemporary component systems in depth, identifies their commonalities and variation points, and introduces a meta-component system -- a software product line which allows a tailored component system to be produced from a set of requirements. The thesis addresses the definition of the meta-component system and focuses on two crucial aspects which play the key role in preparing component systems: (1) a configurable execution environment and (2) generation of implementation artifacts. To address the first aspect, the thesis proposes a model-driven method for creating configurable execution environments. Motivated by the creation of execution environments, the thesis contributes to (2) by introducing a notion of domain-specific language interoperability in the context of code generation. Furthermore, the thesis elaborates the proposed notion into a family of interoperable domain-specific languages parametrized by a general-purpose language.
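A toy sketch of the product-line idea of deriving a tailored execution environment from a feature selection; the feature names and resolution step below are invented and stand in for the model-driven generation the thesis describes:

```python
# Invented sketch: resolve a feature selection into an execution-environment
# configuration (a real generator would emit implementation artifacts).
AVAILABLE_FEATURES = {
    "lifecycle":    "core component lifecycle management",
    "remote-calls": "inter-node communication support",
    "monitoring":   "runtime monitoring of component bindings",
    "hot-update":   "runtime component replacement",
}

def build_execution_environment(selected):
    unknown = set(selected) - AVAILABLE_FEATURES.keys()
    if unknown:
        raise ValueError(f"unknown features: {sorted(unknown)}")
    return {name: AVAILABLE_FEATURES[name] for name in selected}

env = build_execution_environment(["lifecycle", "monitoring"])
for name, description in env.items():
    print(f"{name}: {description}")
```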
150

Development of a toolkit for component-based automation systems

McLeod, Charles S. January 2013 (has links)
From the earliest days of mass production in the automotive industry there has been a progressive move towards the use of flexible manufacturing systems that cater for product variants that meet market demands. In recent years this market has become more demanding, with pressures from legislation, globalisation and increased customer expectations. This has led to the current trend of mass customisation in production. To support this, manufacturing systems are not only becoming more flexible, to cope with the increased product variants, but also more agile, such that they may respond more rapidly to market changes. Modularisation is widely used to increase the agility of automation systems, such that they may be more readily reconfigured. Also, with globalisation into India and Asia, semi-automatic machines (machines that interact with human operators) are more frequently used to reduce capital outlay and increase flexibility. There is an increasing need for tools and methodologies that support this in order to improve design robustness, reduce design time and gain a competitive edge in the market. The research presented in this thesis is built upon the work from COMPAG/COMPANION (COMponent-based Paradigm for AGile automation, and COmmon Model for PArtNers in automatION) and was carried out as part of the BDA (Business Driven Automation), SOCRADES (Service Oriented Cross-layer infrastructure for Distributed smart Embedded deviceS) and IMC-AESOP (ArchitecturE for Service-Oriented Process – monitoring and control) projects conducted at Loughborough University, UK. This research details the design and implementation of a toolkit for building and simulating automation systems comprising components whose behaviour is described using Finite State Machines (FSM). The research focus is the development of an engineering toolkit that can support the automation system lifecycle from initial design through commissioning to maintenance and reconfiguration, as well as the integration of a virtual human. This is achieved using a novel data structure that supports component definitions for control, simulation and maintenance, and the novel integration of a virtual human into the automation system operation.
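A minimal sketch of an automation component whose behaviour is described by a finite state machine; the states, events and transitions below are invented and only illustrate the general technique, not the toolkit's actual data structure:

```python
# Invented sketch: a component behaviour defined as a finite state machine.
from enum import Enum, auto


class State(Enum):
    IDLE = auto()
    EXTENDING = auto()
    EXTENDED = auto()
    RETRACTING = auto()


class ClampComponent:
    """A component definition: the same FSM could drive the real device,
    a simulation model, or a maintenance view."""

    TRANSITIONS = {
        (State.IDLE, "extend"): State.EXTENDING,
        (State.EXTENDING, "end_reached"): State.EXTENDED,
        (State.EXTENDED, "retract"): State.RETRACTING,
        (State.RETRACTING, "home_reached"): State.IDLE,
    }

    def __init__(self):
        self.state = State.IDLE

    def handle(self, event: str) -> State:
        next_state = self.TRANSITIONS.get((self.state, event))
        if next_state is None:
            raise ValueError(f"event '{event}' not allowed in {self.state.name}")
        self.state = next_state
        return self.state


clamp = ClampComponent()
for event in ["extend", "end_reached", "retract", "home_reached"]:
    print(event, "->", clamp.handle(event).name)
```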
