31 |
Arranging simple neural networks to solve complex classification problems. Ghaderi, Reza. January 2000.
In "decomposition/reconstruction" strategy, we can solve a complex problem by 1) decomposing the problem into simpler sub-problems, 2) solving sub-problems with simpler systems (sub-systems) and 3) combining the results of sub-systems to solve the original problem. In a classification task we may have "label complexity" which is due to high number of possible classes, "function complexity" which means the existence of complex input-output relationship, and "input complexity" which is due to requirement of a huge feature set to represent patterns. Error Correcting Output Code (ECOC) is a technique to reduce the label complexity in which a multi-class problem will be decomposed into a set of binary sub-problems, based oil the sequence of "0"s and "1"s of the columns of a decomposition (code) matrix. Then a given pattern can be assigned to the class having minimum distance to the results of sub-problems. The lack of knowledge about the relationship between distance measurement and class score (like posterior probabilities) has caused some essential shortcomings to answering questions about "source of effectiveness", "error analysis", " code selecting ", and " alternative reconstruction methods" in previous works. Proposing a theoretical framework in this thesis to specify this relationship, our main contributions in this subject are to: 1) explain the theoretical reasons for code selection conditions 2) suggest new conditions for code generation (equidistance code)which minimise reconstruction error and address a search technique for code selection 3) provide an analysis to show the effect of different kinds of error on final performance 4) suggest a novel combining method to reduce the effect of code word selection in non-optimum codes 5) suggest novel reconstruction frameworks to combine the component outputs. Some experiments on artificial and real benchmarks demonstrate significant improvement achieved in multi-class problems when simple feed forward neural networks are arranged based on suggested framework To solve the problem of function complexity we considered AdaBoost, as a technique which can be fused with ECOC to overcome its shortcoming for binary problems. And to handle the problems of huge feature sets, we have suggested a multi-net structure with local back propagation. To demonstrate these improvements on realistic problems a face recognition application is considered. Key words: decomposition/ reconstruction, reconstruction error, error correcting output codes, bias-variance decomposition.
|
32 |
Evaluating loss minimization in multi-label classification via stochastic simulation using beta distribution. MELLO, L. H. S. 20 May 2016.
The objective of this work is to present the effectiveness and efficiency of algorithms for solving the loss minimization problem in Multi-Label Classification (MLC). We first prove that a specific case of loss minimization in MLC is NP-complete for the loss functions Coverage and Search Length, and therefore no efficient algorithm for solving such problems exists unless P = NP. Furthermore, we show a novel approach for evaluating multi-label algorithms that has the advantage of not being limited to some chosen base learners, such as K-Nearest Neighbor and Support Vector Machine, by simulating the distribution of labels according to multiple Beta Distributions.
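As a rough sketch of the simulation idea (labels drawn from Beta-distributed marginal probabilities rather than produced by a specific base learner), the following Python snippet compares two trivial prediction rules under Hamming loss; the Beta parameters, the loss function, and both rules are illustrative assumptions, not the settings evaluated in the dissertation.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_label_matrix(n_instances, n_labels, a=2.0, b=5.0):
    """Draw a per-instance, per-label probability from Beta(a, b),
    then sample binary relevance labels from those probabilities."""
    probs = rng.beta(a, b, size=(n_instances, n_labels))
    labels = rng.binomial(1, probs)
    return probs, labels

def hamming_loss(y_true, y_pred):
    """Fraction of label assignments that differ (one simple MLC loss)."""
    return np.mean(y_true != y_pred)

# Compare two trivial prediction rules against the simulated ground truth.
probs, y_true = simulate_label_matrix(n_instances=1000, n_labels=10)
pred_threshold = (probs >= 0.5).astype(int)   # predict from the true marginals
pred_all_zero = np.zeros_like(y_true)         # predict "no label" everywhere

print("Hamming loss, threshold rule:", hamming_loss(y_true, pred_threshold))
print("Hamming loss, all-zero rule :", hamming_loss(y_true, pred_all_zero))
```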
|
33 |
Meta-aprendizado para análise de desempenho de métodos de classificação multi-label / Meta-learning for performance analysis of multi-label classification methods. PINTO, Eduardo Ribas. 31 January 2009.
In recent years, many applications have emerged that use supervised machine learning algorithms to solve classification problems across a variety of domains. However, many of these applications are restricted to single-label algorithms, that is, algorithms that assign only one class to a given instance. Such applications become inadequate when, in the real world, the same instance belongs to more than one class simultaneously. This problem is known in the literature as the Multi-Label Classification Problem. Currently, there is a variety of strategies aimed at solving multi-label problems. Some of them belong to a group known as Problem Transformation Methods, so named because this type of strategy divides a multi-label classification problem into several single-label problems in order to reduce its complexity. Others handle multi-label datasets directly and are known as Algorithm Adaptation Methods. Given the large number of existing multi-label methods, it is quite difficult to choose the most appropriate one for a given domain. In view of this, this dissertation pursued two objectives: carrying out a comparative study of problem transformation methods that are widely used today, and applying two Meta-Learning strategies to multi-label classification in order to predict, based on the descriptive characteristics of a dataset, which algorithm is most likely to perform better than the others. The Meta-Learning approaches used in our work were derived from correlation analysis and rule mining techniques. The use of Meta-Learning in the context of multi-label classification is original and produced satisfactory results in our experiments, which indicates that it can serve as an initial guide for the development of related future research.
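To make the problem-transformation idea concrete, here is a minimal binary-relevance sketch in Python: the multi-label problem is split into one single-label (binary) problem per label and the per-label predictions are recombined; the decision-tree base learner and the random data are illustrative assumptions, not the methods compared in the dissertation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fit_binary_relevance(X, Y, base=DecisionTreeClassifier):
    """Binary relevance: decompose a multi-label problem into one
    single-label (binary) problem per label column of Y."""
    return [base().fit(X, Y[:, j]) for j in range(Y.shape[1])]

def predict_binary_relevance(models, X):
    """Recombine the per-label predictions into a multi-label output."""
    return np.column_stack([m.predict(X) for m in models])

# Toy multi-label data: 100 instances, 5 features, 3 labels.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
Y = rng.binomial(1, 0.3, size=(100, 3))

models = fit_binary_relevance(X, Y)
print(predict_binary_relevance(models, X[:5]))
```

Meta-features of a dataset (e.g., label cardinality or attribute correlations) would then be used to predict which of several such transformation methods is likely to perform best.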
|
34 |
Marcas próprias de supermercado: um estudo com consumidoras na cidade de São Paulo / Supermarket private labels: a study with consumers in São Paulo city, Brazil. Marcelo Felippe Figueira Junior. 27 August 2008.
The objective of this work is to investigate the consumption of private label products in supermarkets, focusing on consumers' choices between private label and leading brand products. It is intended to capture consumers' perceptions regarding the prices and quality of products sold under retailers' brands. The Focus Group research method is used to analyze how frequently the products are consumed and to investigate the price gap that consumers are willing to pay for private labels relative to leading brands. The decision to study the choice of private labels by supermarket consumers is due to the need of knowing, through this research, this specific type of consumption, identifying the perception, behavior, and decision process of the supermarket clientele in order to bring theoretical contributions to the Retail Management body of knowledge.
|
35 |
Interferometric imaging for high sensitivity multiplexed molecular measurements. Marn, Allison M. 25 September 2021.
The diagnostic and pharmaceutical industries rely on tools for characterizing, discovering, and developing bio-molecular interactions. Diagnostic assays require high affinity capture probes and binding specificity for accurate detection of biomarkers. Selection of drug candidates depends on the drug residency time and duration of drug action. Further, biologic drugs can induce anti-drug antibodies, which require characterization to determine the impact on drug safety and efficacy. Label-free biosensors are an attractive solution for analyzing these and other bio-molecular interactions because they provide information based on the characteristics of the molecules themselves, without disturbing the native biological systems by labeling. While label-free biosensors can analyze a broad range of analytes, small molecular weight analytes (molecular weight < 1 kDa) are the most challenging. Affinity measurements for small molecular weight targets require high sensitivity and long-term signal stability. Additional difficulties occur with different liquid refractive indices that result from temperature, composition, or matrix effects of sensor surfaces. Some solutions utilize strong solvents to increase the solubility of small molecules, which also alter the refractive index. Moreover, diagnostics require affinity measurements in relevant solutions of various refractive indices. When a refractive index difference exists between the analyte solution and the wash buffer, a background signal is generated, referred to as the bulk effect; it obscures the small signal due to surface binding amid large fluctuations caused by variations in the optical refractive index of the solutions.
The signal generated by low molecular weight analytes is small, and conventional wisdom tends toward signal amplification or resonance for detection of these small signals. With this approach, Surface Plasmon Resonance (SPR) has become the gold standard in affinity measurement technologies. SPR is an expensive and complex technology that is highly susceptible to the bulk effect. SPR uses a reference channel to correct for the bulk effect in post-processing, which requires high precision and sophisticated temperature control, further increasing the cost and complexity. Additionally, multiplexing is desirable as it allows for simultaneous measurements of multiple ligands; however, multiplexing is only possible in the imaging modality of SPR, which has lower sensitivity and difficulty with referencing. The Interferometric Reflectance Imaging Sensor (IRIS) is a low-cost, optical label-free bio-molecular interaction analysis technology capable of providing precise binding affinity measurements; however, limitations in sensitivity and usability have previously prevented its widespread adoption. Overcoming these limitations requires the implementation of automation, compact and easy-to-use instrumentation, and increased sensitivity. Here, we explore methods for improved sensitivity and usability. We achieve noise reduction and elimination of solution artifacts (bulk effect) through engineered illumination uniformity and temporal and spatial image processing. To validate these methods, we experimentally analyze small-molecule interactions to demonstrate highly sensitive kinetic binding measurements, independent of solution refractive index.
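As a rough, hypothetical illustration of the reference-channel correction for the bulk effect described above (not the IRIS or SPR processing pipeline itself), the following Python sketch subtracts the signal of a reference region, which sees only the bulk refractive-index change, from the signal of an active spot, which sees bulk plus surface binding; all signal shapes and magnitudes are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 600, 601)                       # time in seconds (assumed)

# Synthetic signals: a slow binding curve plus a step-like bulk-effect jump.
binding = 0.5 * (1 - np.exp(-t / 150))             # surface binding (arbitrary units)
bulk = np.where((t > 100) & (t < 400), 2.0, 0.0)   # bulk refractive-index offset
noise = lambda: rng.normal(0, 0.05, size=t.shape)  # measurement noise

active_spot = binding + bulk + noise()             # functionalized spot: binding + bulk
reference_spot = bulk + noise()                    # bare region: bulk only

corrected = active_spot - reference_spot           # bulk effect cancels, binding remains
print("max corrected signal:", corrected.max())
```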
|
36 |
Virtuální privátní sítě na bázi technologie MPLS / MPLS based virtual private networks. Daněk, Michal. January 2017.
This master's thesis deals with the architecture of networks based on Multiprotocol Label Switching (MPLS) technology. The work also describes the use of this technology for point-to-point and multipoint connections at the network or data link layer. The practical part focuses on the design of a laboratory task aimed at the configuration of a Virtual Private LAN Service (VPLS). This technology emulates a multipoint connection at the data link layer.
|
37 |
Off-Label and Unlicensed Medication Use and Associated Adverse Drug Events in a Pediatric Emergency Department. Phan, Hanna; Leder, Marc; Fishley, Matthew; Moeller, Matthew; Nahata, Milap. 01 June 2010.
Objectives: The study objectives were to (1) determine the types and frequency of off-label (OL) or unlicensed (UL) medications used in a pediatric emergency department (PED) and before admission, (2) describe OL/UL-associated adverse drug events (ADEs) resulting in admission to the PED and those occurring during patient care in the PED, and (3) determine the outcomes of these ADEs. Methods: Medical records of patients 18 years or younger admitted to the PED over a 5-month period were reviewed. Off-label/UL use of medications was determined based on Food and Drug Administration-approved labeling. The Adverse Drug Reaction Probability Scale was used to determine ADE causality. Data were analyzed using descriptive statistics. Results: A total of 2191 patients with 6675 medication orders were evaluated. About 26.2% (n = 1712) of medication orders were considered OL/UL use; 70.5% (n = 1208) of these medications were ordered as part of treatment in the PED, and the remaining 29.5% (n = 504) were home medications taken before the PED evaluation. Inhaled bronchodilators (30.4%), antimicrobials (14.8%), and antihistamines/antiemetics (9.1%) were the most common OL/UL medication classes. The frequency of ADEs with licensed medication use was 2-fold greater than with OL/UL use. The reported overall rate of ADEs was 0.6% (n = 40). Of these 40 ADEs, 5 resulted from the use of an OL/UL medication, 3 from home medication use, and 2 from PED-prescribed medications. Conclusions: The frequency of reported ADEs associated with OL/UL medications was less than the frequency of ADEs from licensed medication use, with an overall ADE frequency of less than 1%.
|
38 |
Effects of the Diagnostic Label "ADHD" on Peer Judgment. Toone, Jared. 01 May 2006.
Diagnostic labels are frequently used with children exhibiting symptoms of learning and behavioral disorders. The effect that such labels have on the labeled children as well as their peers is not completely understood. In the present study, the effects of the label "ADHD" on peer acceptance were examined. Fourth- and fifth-grade boys and girls viewed a video of a peer listening to teacher instruction and working on a worksheet. For half of the participants, the child in the video was labeled as having ADHD, while the other participants were told nothing about the child. After viewing the video, the children responded to a questionnaire assessing the likelihood that they would befriend the peer in the video. An analysis of variance revealed that the label resulted in significantly lower friendship ratings. Gender of the participant was not found to impact peer ratings. These results indicate that parents, professionals, and children need to be educated about the effects that labels may have and that labels need to be used with caution. Labeled children may also benefit from counseling about how others may respond to their label.
|
39 |
Optimal Semantic Labeling of Social Network Clusters. Peng, Shuyue. 13 October 2014.
No description available.
|
40 |
Label-Free Microfluidic Devices for Single-Cell Analysis and Liquid Biopsies. Ghassemi, Parham. 05 January 2023.
Mortality due to cancer is a global health issue that can be improved through further development of diagnostic and prognostic tools. Recent advancements in technologies aiding cancer research have made significant strides; however, a demand for non-invasive, clinically relevant point-of-care tools remains. To accomplish this feat, the desired instrument needs to be low-cost, easy to operate, and efficient, with rapid processing and analysis. Microfluidic platforms in cancer research have proven advantageous due to their operation at the microscale, which offers low cost, favorable physics, high precision, short experimentation times, and minimal reagent and sample requirements. Label-free technologies rely on cell biophysical characteristics to identify, evaluate, and study biological samples. Biomechanical probing of cells through deformability assays provides a label-free method of assessing cell health and monitoring response to physical and chemical stimuli. Bioimpedance analysis is an alternative, versatile label-free method of evaluating cell characteristics by measuring cell response to electrical signals. Microfluidic technologies can facilitate biomechanical and bioelectrical analysis through deformability assays and impedance spectroscopy. This dissertation demonstrates scientific contributions towards single-cell analysis and liquid biopsy devices focused on cancer research. First, cell deformability assays were improved through the introduction of multi-constriction channels, which revealed that cells have a non-linear response to deformation. Combining impedance analysis with microfluidic deformability assays provided a large dataset of mechano-electrical information, which improved cell characterization and greatly decreased post-processing times. Next, two unique biosensors demonstrated improved throughput while maintaining the sensitivity of single-cell analysis assays, through parallelization and the incorporation of machine learning for data processing. Liquid biopsies involve studying cancer cells circulating in the patient's vascular system, called circulating tumor cells (CTCs), through blood samples. CTC tests reveal valuable information on patient prognosis and diagnosis and can aid therapy selection in a minimally invasive manner. This body of work presents two liquid biopsy devices that enrich murine and human blood samples and isolate CTCs to ease detection and analysis. Additionally, a microfluidic CTC detection biosensor is introduced to reliably count and identify cancer cells in murine blood, and an extremely low-cost version of the assay is also validated. Thus, the assays presented in this dissertation show the promise of microfluidic technologies towards point-of-care systems for cancer research. / Doctor of Philosophy
Cancer is the second leading cause of death worldwide, with approximately 2 million new cases each year in the United States alone. Significant research on diagnostic and prognostic tools has been conducted; however, existing tools can be expensive, invasive, time-consuming, unreliable, and not always easily accessible. Thus, a tool that is cheap, minimally invasive, easy to use, and robust needs to be developed to combat these issues. Typical cancer studies have primarily focused on biological and biochemical methods for evaluation; however, researchers have begun to leverage small-scale biosensors that utilize biophysical attributes. Recent studies have proven that lab-on-a-chip technologies can produce meaningful results by exploiting these biophysical characteristics. Microfluidics relies on sub-millimeter-sized channels, which show a great deal of promise because they require minimal materials and can quickly and efficiently analyze biological samples. Label-free methods of studying cells rely on their physical properties, such as size, deformability, density, and electrical properties. These biophysical characteristics can be easily obtained at the single-cell level through microfluidic-based assays. Measuring and monitoring these attributes can provide valuable information to help understand cancer cell response to stimuli such as chemotherapeutic drugs or other therapies. A liquid biopsy is a non-invasive method of evaluating cancer patients by studying circulating tumor cells (CTCs) that exist in their blood. This dissertation reports a wide range of label-free microfluidic assays that evaluate and study biological samples at the single-cell level and for liquid biopsies. These assays consist of microfluidic channels with sensors that can rapidly obtain biophysical characteristics and process blood samples for liquid biopsy applications. Uniquely modifying microfluidic channel geometries and sensor configurations improved upon previously developed single-cell and CTC-based tools. The resulting devices are low-cost, easy-to-use, efficient, and reliable methods that alleviate current issues in cancer research while showing clinical utility.
|