351

Avaliação da efetividade de cartas de controle multivariadas na detecção de suspeitas de fraude financeira / Evaluating the effectiveness of multivariate control charts in detecting suspected financial fraud

Souza, Davenilcio Luiz de 13 March 2017 (has links)
Money laundering crimes cause major losses to countries and their financial systems, and the volume of data in digital transactions makes this type of offense hard to detect. Audits of financial data are limited in identifying fraud: they are still largely performed on sampled data and cannot flag criminal activity in real time. To help address this gap, this work proposes a statistical monitoring method based on multivariate control charts and Benford's Law for detecting suspected fraud in financial records, including those arising from money laundering. A conceptual model was first defined with a probability distribution representing financial-entry data, under the assumption that such data adhere to the Benford's Law distribution. An empirical distribution, estimated from the data themselves, was then considered, and two procedures were tested for flagging suspected money-laundering fraud through evaluation of the first significant digits: the multivariate χ² control chart and Hotelling's multivariate T² control chart. Data were simulated with the R-Project software until the occurrence of the 50,000th signal, and both simulated and real cases were evaluated to illustrate the operation of the method. From the simulation, the two control charts were compared in terms of ARL, that is, the average number of observations until the chart signals that the series has moved to an out-of-control state, indicating suspected fraudulent entries. A retrospective analysis based on the Benford first-digit proportions was then applied to financial records of the 2016 mayoral campaigns in five Brazilian capitals; no suspected fraud was evidenced in the data obtained from the website of the Tribunal Superior Eleitoral (TSE). In a data set from a financial institution, divergences between the observed and expected first-digit frequencies were detected; the points beyond the control limits fell within a nearby period in all three analyses performed, concentrating the investigation effort for the financial audit. The academic contribution is a model that applies multivariate control charts together with Benford's Law, an innovative use of statistical process control in the financial area, relying on accessible, fast, reliable, and precise computational resources that remain open to refinement through new academic approaches. The contribution to society lies in the model's use by entities that handle financial transactions and by the community at large, on data from civil and state organizations published through information channels, enabling citizens to analyze and verify the soundness of facts and data.
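The Benford first-digit screening described above starts from the law's expected proportions, P(d) = log10(1 + 1/d) for leading digit d = 1..9. As a rough illustration of the idea (a plain χ² goodness-of-fit statistic over first digits, not the thesis's multivariate control-chart procedure; function names are illustrative):

```python
import math
from collections import Counter

def benford_expected():
    # Benford's Law: P(d) = log10(1 + 1/d) for first digit d = 1..9
    return {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(x):
    # First significant digit of a positive number (strips leading "0." parts)
    return int(str(abs(x)).lstrip("0."))  # e.g. 0.0042 -> "42" -> 4 after int(...[0])

def chi_square_stat(values):
    """Chi-square statistic comparing observed first-digit counts
    with Benford's expected proportions (8 degrees of freedom)."""
    n = len(values)
    obs = Counter(int(str(abs(v)).lstrip("0.")[0]) for v in values)
    return sum((obs.get(d, 0) - n * p) ** 2 / (n * p)
               for d, p in benford_expected().items())
```

A large χ² value relative to the 8-df critical value flags a set of entries whose leading digits diverge from Benford's distribution, which is the same kind of suspicion signal the control charts above monitor over time.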
352

Gráficos de controle fuzzy para o monitoramento da média e amplitude de processos univariados / Fuzzy control charts for monitoring the mean and range of univariate processes

Mendes, Amanda dos Santos January 2019 (has links)
Advisor: Marcela Aparecida Guerreiro Machado Freitas / Abstract: Quality control, mainly through the use of control charts, is essential in industry to ensure a process free from special causes of variability. Because sample data may include uncertainties arising from human subjectivity and measurement systems, fuzzy set theory can be applied to control charts when precise data are not available. To this end, the quality-characteristic values are fuzzified by inserting uncertainties and transformed into representative values for comparison with the traditional control chart. This work proposes applying fuzzy logic to control charts that monitor the mean and range of univariate processes, as well as two fuzzy control charts based on special run rules: Synthetic and Side Sensitive Synthetic. Chart performance is measured by the average run length (ARL). The fuzzy control charts proved more efficient than the traditional control charts for smaller values of the α-cut, that is, greater uncertainty inserted into the process, and in scenarios with a larger difference between the uncertainty bounds of the fuzzy numbers. / Master's
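The fuzzification and representative-value steps described above are commonly built on triangular fuzzy numbers and α-cuts. As a hedged sketch (the thesis's exact fuzzification and transformation rules may differ; names are illustrative), an α-cut midrange transformation feeding Shewhart-style limits might look like:

```python
import statistics

def alpha_cut(tfn, alpha):
    """α-cut of a triangular fuzzy number (a, b, c): the interval
    [a + alpha*(b - a), c - alpha*(c - b)]; alpha = 1 collapses to the core b."""
    a, b, c = tfn
    return (a + alpha * (b - a), c - alpha * (c - b))

def midrange(interval):
    # One common representative value: the midpoint of the α-cut interval
    lo, hi = interval
    return (lo + hi) / 2

def fuzzy_xbar_limits(samples, alpha, k=3.0):
    """Shewhart-style limits on the midrange transformation of the α-cuts
    of fuzzy observations (illustrative, not the thesis's exact charts)."""
    reps = [midrange(alpha_cut(s, alpha)) for s in samples]
    mean = statistics.mean(reps)
    sd = statistics.stdev(reps)
    return mean - k * sd, mean + k * sd, reps
```

Lower α widens the cut intervals, which is the sense in which a smaller α-cut corresponds to more uncertainty inserted into the process.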
353

Vizualizace vícerozměrných statistických dat / Visualization of Multivariate Statistical Data

Maroušek, Vít January 2011 (has links)
The thesis deals with the possibilities of visualizing multivariate statistical data. Since this is a very broad area, the thesis is divided into four sections, two theoretical and two practical. The first section is devoted to theoretical aspects of data visualization: the building blocks of graphs, and how the brain processes graphs at various stages of perception. The second section surveys the available chart types; selected types of graphs for continuous and discontinuous multidimensional data are described in detail. The third section focuses on software tools for creating graphs, describing several programs with a focus on STATISTICA, R, and MS Excel. The knowledge gained in the previous sections provided a sufficient basis for the last section, which performs a graphical analysis of multidimensional continuous and discrete data using advanced analytical methods. This analysis is carried out separately on a data file with continuous variables and on a data file with discontinuous (categorical) variables.
354

A Benign Paroxysmal Positional Vertigo Triage Clinic

Riska, Kristal M., Akin, Faith W., Williams, Laura, Rouse, Stephanie B., Murnane, Owen D. 12 December 2017 (has links)
Purpose: The purpose of this study was to evaluate the effectiveness of triaging patients with motion-provoked dizziness into a benign paroxysmal positional vertigo (BPPV) clinic. Method: A retrospective chart review was performed of veterans who were tested and treated for BPPV in a triaged BPPV clinic and veterans who were tested and treated for BPPV in a traditional vestibular clinic. Results: The BPPV triage clinic had a hit rate of 39%. On average, the triaged BPPV clinic reduced patient wait times by 23 days relative to the wait times for the traditional vestibular clinic while also reducing patient costs. Conclusion: Triaging patients with BPPV is one method to improve access to evaluation and treatment and a mechanism for the effective use of clinic time and resources.
355

時間序列在品質管制上的應用 / Apply time series to quality control

陳繼書, Chen, Gi Sue Unknown Date (has links)
When a Shewhart control chart or a cumulative-sum chart (CUSUM chart) is used to monitor a process, the observations are usually assumed to be drawn independently from a normal distribution with mean μ and standard deviation σ. When the product's quality characteristic is autocorrelated, however, such charts can give misleading results. This thesis applies time-series models to the control-chart problem for correlated observations, and uses nonlinear time-series models together with the special-cause control chart to examine whether Taiwan's business-cycle monitoring indicators are in a state of control. The run-length distribution of the special-cause control chart is also discussed. In the concluding case study, the concept of automatic control is introduced and applied to explain this nonlinear process.
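The misleading effect of autocorrelation noted above can be shown numerically: estimating σ from the moving range of a positively autocorrelated AR(1) series understates the marginal spread, so an individuals chart on the raw data alarms far too often, while a chart on the one-step residuals (the time-series model's innovations) behaves as expected. A small illustrative sketch, not taken from the thesis:

```python
import random

def ar1_series(n, phi, sigma=1.0, seed=42):
    """Simulate a stationary AR(1) process x_t = phi*x_{t-1} + e_t."""
    random.seed(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + random.gauss(0.0, sigma)
        out.append(x)
    return out

def moving_range_limits(xs, k=3.0):
    """Individuals-chart limits with sigma estimated as MR-bar / d2 (d2 = 1.128);
    under positive autocorrelation this estimate is biased low."""
    mrbar = sum(abs(b - a) for a, b in zip(xs, xs[1:])) / (len(xs) - 1)
    sigma_hat = mrbar / 1.128
    mean = sum(xs) / len(xs)
    return mean - k * sigma_hat, mean + k * sigma_hat

def alarm_rate(xs, lcl, ucl):
    # Fraction of points beyond the control limits (false alarms when in control)
    return sum(1 for x in xs if x < lcl or x > ucl) / len(xs)
```

Applying `moving_range_limits` to the raw AR(1) series gives an alarm rate far above the nominal 0.27%, while applying it to the residuals x_t − φ·x_{t−1} restores the expected behavior, which is the essence of the residual-chart remedy for correlated data.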
356

System Availability Maximization and Residual Life Prediction under Partial Observations

Jiang, Rui 10 January 2012 (has links)
Many real-world systems experience deterioration with usage and age, which often leads to low product quality, high production cost, and low system availability. Most previous maintenance and reliability models in the literature do not incorporate condition monitoring information for decision making, which often results in poor failure prediction for partially observable deteriorating systems. For that reason, the development of fault prediction and control schemes using condition-based maintenance techniques has received considerable attention in recent years. This research presents a new framework for predicting failures of a partially observable deteriorating system using Bayesian control techniques. A time series model is fitted to a vector observation process representing partial information about the system state. Residuals are then calculated using the fitted model, which are indicative of system deterioration. The deterioration process is modeled as a 3-state continuous-time homogeneous Markov process. States 0 and 1 are not observable, representing healthy (good) and unhealthy (warning) system operational conditions, respectively. Only the failure state 2 is assumed to be observable. Preventive maintenance can be carried out at any sampling epoch, and corrective maintenance is carried out upon system failure. The form of the optimal control policy that maximizes the long-run expected average availability per unit time has been investigated, and it has been proved that a control limit policy is optimal for decision making. The model parameters have been estimated using the Expectation Maximization (EM) algorithm. The optimal Bayesian fault prediction and control scheme, considering long-run average availability maximization along with a practical statistical constraint, has been proposed and compared with the age-based replacement policy. The optimal control limit and sampling interval are calculated in the semi-Markov decision process (SMDP) framework.
Another Bayesian fault prediction and control scheme has been developed based on the average run length (ARL) criterion. Comparisons with traditional control charts are provided. Formulae for the mean residual life and the distribution function of system residual life have been derived in explicit forms as functions of a posterior probability statistic. The advantage of the Bayesian model over the well-known 2-parameter Weibull model in system residual life prediction is shown. The methodologies are illustrated using simulated data, real data obtained from the spectrometric analysis of oil samples collected from transmission units of heavy hauler trucks in the mining industry, and vibration data from a planetary gearbox machinery application.
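The ARL criterion used above for comparison with traditional control charts can be illustrated with a plain Monte Carlo estimate for a Shewhart individuals chart (a simplified stand-in for the Bayesian scheme; names and parameters are illustrative): with 3σ limits the in-control ARL is about 370 observations, and it drops sharply once the process mean shifts.

```python
import random

def simulate_run_length(shift=0.0, k=3.0, seed=None, max_n=10_000_000):
    """Observations until the first point falls outside +/- k sigma limits,
    for i.i.d. N(shift, 1) data (in control when shift = 0)."""
    rng = random.Random(seed)
    n = 0
    while n < max_n:
        n += 1
        if abs(rng.gauss(shift, 1.0)) > k:
            return n
    return n

def estimate_arl(shift, reps=1000, k=3.0, seed=1):
    """Monte Carlo estimate of the average run length for a given mean shift."""
    rng = random.Random(seed)
    total = 0
    for _ in range(reps):
        total += simulate_run_length(shift, k, seed=rng.randrange(10 ** 9))
    return total / reps
```

A scheme is preferred when its in-control ARL is long (few false alarms) while its out-of-control ARL is short (quick detection), the same trade-off the Bayesian scheme is evaluated on.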
358

Verteilt agierendes System zur Bereitstellung von geometrie- und bild-basierten Approximationen für das Multiresolution Rendering / A Distributed System for Providing Geometry- and Image-Based Approximations for Multiresolution Rendering

Hilbert, Karsten 07 May 2010 (has links) (PDF)
This thesis presents an application-independent reduction system that automatically and efficiently generates, for the scene parts handed to it and for every viewing situation, the most suitable approximation from a spectrum of geometry-based and image-based approximation forms, such that the approximation's complexity is as low as possible and the image error it causes stays below user-specified bounds. The system uses image-based and geometry-based approximations for different regions of the view volume. Nailboards serve as the image-based approximations; new nailboard variants are introduced that efficiently approximate semi-transparent objects and dynamically lit objects, and the presented generation and rendering methods make intensive use of current graphics hardware to keep nailboards usable in a real-time context. Textured view-dependent geometry-based approximations are extracted from a textured view-dependent progressive mesh (VDPM). An efficient method for constructing VDPMs is presented that yields approximations with optimally adapted texture coordinates, without requiring a separate optimization step over the texture coordinates of all approximations encoded in the VDPM after its construction. The required textures are generated using a fast parameterization method (after Yoshizawa) and hardware-accelerated construction of densely packed texture atlases. By combining selective access methods on multi-chart geometry images (MCGIM) with efficient boundary-adaptation methods, efficient and high-quality multiresolution rendering via MCGIM becomes possible for the first time; textured view-independent approximations are extracted from the MCGIM. Finally, a reduction system is proposed that generates all three approximation forms in a distributed, fully automatic, near-real-time manner. Suitable compression, caching, and state-differencing mechanisms support efficient communication within the system, and load-balancing mechanisms ensure efficient use of the available resources.
359

應用資料包絡法降低電源轉換器溫升之研究 / Applying Data Envelopment Analysis to Reduce the Temperature Rise of a Power Converter

廖 合, Liao,Ho Unknown Date (has links)
From a performance standpoint, quality (fitness of kind) and cost (fitness of amount) are conceptually consistent, so performance management should use quality and cost as the criteria for judging whether its goals are met. This study takes a performance perspective on a company's quality-versus-cost dilemma. As electronic products gain functionality, heat generation has become a pressing problem: with heat density rising steadily, thermal design receives ever more attention. The subject is a power converter that has completed its design and passed UL safety certification but exhibits a large temperature rise with large variation; reducing both is an urgent problem. The goal is to find the optimal combination of external components that is robust against noise factors, minimizes product variation, and keeps the temperature rise and losses of each component as low as possible. Experiments were planned and conducted, and data collected, using Taguchi methods and design of experiments. The weighted (multi-response) S/N ratio was adopted, with the weights determined by (1) a control-chart method and (2) the CCR assurance-region model (CCR-AR) of data envelopment analysis, to select the control factors and their levels. The Mahalanobis-Taguchi System (MTS) was applied to the matrix-experiment data to screen the more important characteristic variables, and the data for those variables were then analyzed with (1) a back-propagation neural network combined with data envelopment analysis and (2) data envelopment analysis combined with principal component analysis, yielding the optimal factor combination for the case drilling pattern and silicone-pad size. The confirmation experiment showed that although the mean temperature rise decreased only slightly, the standard deviation of the temperature rise at most measurement points became significantly smaller; the proposed approach therefore markedly reduced the converter's temperature-rise variation.
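The weighted multi-response S/N ratio mentioned above combines per-response Taguchi S/N ratios through a weighted sum, with the weights chosen by the control-chart or DEA CCR-AR procedures. A minimal sketch using the smaller-the-better form, which suits temperature rise (the weighting shown is a plain weighted sum; the thesis's DEA-based weight determination is not reproduced here):

```python
import math

def sn_smaller_the_better(ys):
    """Taguchi smaller-the-better S/N ratio: -10 * log10(mean of y^2).
    Larger S/N means lower and less variable response values."""
    return -10 * math.log10(sum(y * y for y in ys) / len(ys))

def weighted_sn(sn_by_response, weights):
    """Weighted multi-response S/N: sum of w_j * SN_j over responses j
    (weights are assumed to sum to 1)."""
    return sum(weights[name] * sn for name, sn in sn_by_response.items())
```

For each factor-level combination of the matrix experiment, the per-response S/N ratios are computed and combined; the level with the highest weighted S/N is preferred.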
360

工業電腦技術支援服務流程之研究---以研華科技公司為例 / A Study of the Technical Support Service Process for Industrial PCs: The Case of Advantech Co.

潘錫生, Pan, Shi-Sheng Unknown Date (has links)
This case study examines Advantech, a leading firm in the industrial PC industry, analyzing its technical support service process to identify potential problems and propose improvements. The research framework adopts the five-stage business process re-engineering model of Davenport and Short (1990), chosen because it combines a complete re-engineering concept with some attention to implementation detail. The study finds the following problems in the current technical support process: (1) sales engineers spend so much time on technical support that they lack time to develop and create new business opportunities; (2) support requests made directly to sales staff leave no record; (3) sales engineers and application engineers may forget customer requests; (4) sales staff cannot track at any time whether customer needs have been met; and (5) application engineers passively wait for customers to follow up. The study therefore recommends: (1) adopting live-support software so that customers can easily resolve their own problems while service requests are logged; (2) strengthening the completeness, richness, and update speed of the company knowledge base; (3) adopting a new technical support service process to raise service efficiency and responsiveness; (4) building an automated daily report of unmet customer requests; and (5) letting customers query the handling status of their requests in real time. When carrying out the re-engineering, a process owner should oversee the operation of the whole new process, with full authorization and support from the chairman and general manager; the relevant departments should each send participants at manager level or above, so that opinions are integrated, consensus is built, and feasibility is taken into account, otherwise the plan may prove difficult to execute. A matching incentive mechanism is also needed so that the people involved can cooperate with and adapt to the new process: people tend to resist change, so a new process needs a corresponding incentive design to motivate everyone to take part in the transformation. Keywords: industrial PC, technical support, business process re-engineering, flow chart, service management
