51

Catch the fraudster : The development of a machine learning based fraud filter

Andrée, Anton January 2020 (has links)
E-commerce has grown rapidly over the last two decades, making it easy for customers to shop wherever they are. This growth has also led to new kinds of fraudulent activity affecting customers. To make customers feel safe while shopping online, companies like Resurs Bank implement fraud filters that freeze transactions suspected of being fraudulent. The latest type of fraud filter is based on machine learning. While this is a promising technology, the data and algorithms need to be tuned properly to the task at hand. This thesis project gives a proof of concept of a machine learning based fraud filter for Resurs Bank. Based on a literature study, the available data and explainability requirements, the work opts for a supervised learning approach using Random Forests with a sliding window to overcome concept drift. The inherent class imbalance of the setting makes the area under the receiver operating characteristic curve (ROC AUC) a suitable metric. The results suggest that a machine learning based fraud filter can add value to companies like Resurs Bank. An alternative approach for incorporating non-numerical features using recurrent neural networks (RNNs) was also implemented and compared: a pre-trained RNN model transformed the non-numerical feature into a numerical representation reflecting its suspiciousness. Including this new numerical feature in the Random Forest model demonstrated that the approach can add valuable insight to the fraud detection field.
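As an illustration of the pipeline this abstract describes, here is a minimal sketch of a sliding-window Random Forest fraud filter evaluated with ROC AUC. It assumes scikit-learn; the synthetic data, feature count, window sizes and fraud rate are placeholders, not values from the thesis.

```python
# Minimal sketch: sliding-window Random Forest fraud filter scored by ROC AUC.
# Synthetic data, feature count and window sizes are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 10_000
X = rng.normal(size=(n, 8))                 # stand-in for transaction features
y = (rng.random(n) < 0.02).astype(int)      # ~2% fraud: heavy class imbalance

window, step = 3_000, 1_000                 # train on a recent window, test on the next block
aucs = []
for start in range(0, n - window - step, step):
    train = slice(start, start + window)
    test = slice(start + window, start + window + step)
    if y[train].sum() == 0 or y[test].sum() == 0:
        continue                            # ROC AUC needs both classes present
    clf = RandomForestClassifier(n_estimators=200, class_weight="balanced", random_state=0)
    clf.fit(X[train], y[train])
    scores = clf.predict_proba(X[test])[:, 1]   # fraud probability per transaction
    aucs.append(roc_auc_score(y[test], scores))

print(f"mean ROC AUC over {len(aucs)} windows: {np.mean(aucs):.3f}")
```

The sliding window mirrors the concept-drift handling the abstract mentions: each model sees only recent transactions and is re-fitted as the window advances.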
52

Federated Learning with FEDn for Financial Market Surveillance

Voltaire Edoh, Isak January 2022 (has links)
Machine Learning (ML) is the current trend that most industries adopt to improve their business and operations. ML has also been adopted in the financial markets, where well-funded financial institutions employ the latest ML algorithms to gain an advantage on the market. The darker side of ML is the potential emergence of complex algorithmic trading schemes that are abusive and manipulative. Because of this, it is inevitable that ML will be applied to financial market surveillance in order to detect these abusive and manipulative trading strategies. Ideally, an accurate ML detection model would be developed with data from many financial institutions or trading venues. However, such ML models require vast quantities of data, which poses a problem in market surveillance, where data are sensitive or limited. Data sharing between companies or countries is typically accompanied by legal and privacy concerns. Federated Learning (FL) overcomes these issues by training ML models on distributed datasets, eliminating the need to centralise sensitive data. This thesis aimed to address these ML-related issues in market surveillance by implementing and evaluating an FL model. FL enables a group of independent data-holding clients with a common goal to collaboratively build a shared ML model without compromising private data. In this work, an ML model is first deployed in a centralised data setting and trained to detect the manipulative trading scheme known as spoofing. An LSTM-Autoencoder was the model chosen for this task. The same model is also implemented in a federated setting with decentralised data, using the FL framework FEDn. Another FL framework, Flower, is also employed to evaluate the performance of FEDn. Experiments were conducted comparing the FL models to the conventional centralised learning model, as well as comparing the two frameworks to each other. The results showed that, under certain circumstances, the FL models performed better than the centralised model in detecting spoofing. FEDn was equivalent to Flower in terms of detection performance. In addition, the results indicated that Flower was marginally faster than FEDn. Variations in the experimental setup and stochasticity are assumed to account for the performance disparity.
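Below is a minimal sketch of the centralised baseline the abstract describes: an LSTM-Autoencoder trained on "normal" order-flow sequences, with high reconstruction error flagging spoofing candidates. PyTorch, the layer sizes, the synthetic sequences and the thresholding rule are assumptions for illustration; in the federated setting each client would run this local training step and exchange model weights through FEDn or Flower (their APIs are not shown here).

```python
# Minimal sketch of an LSTM-Autoencoder anomaly detector for spoofing:
# sequences with high reconstruction error are flagged for review.
# Architecture sizes, data and the threshold rule are illustrative, not the thesis settings.
import torch
import torch.nn as nn

class LSTMAutoencoder(nn.Module):
    def __init__(self, n_features: int, latent_dim: int = 16):
        super().__init__()
        self.encoder = nn.LSTM(n_features, latent_dim, batch_first=True)
        self.decoder = nn.LSTM(latent_dim, n_features, batch_first=True)

    def forward(self, x):
        # Encode the order-flow sequence, then try to reconstruct it.
        z, _ = self.encoder(x)
        recon, _ = self.decoder(z)
        return recon

model = LSTMAutoencoder(n_features=5)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

normal_batches = [torch.randn(32, 50, 5) for _ in range(20)]  # stand-in "normal" sequences
for epoch in range(5):
    for x in normal_batches:
        opt.zero_grad()
        loss = loss_fn(model(x), x)
        loss.backward()
        opt.step()

# Score new sequences: anything that reconstructs poorly is a spoofing candidate.
with torch.no_grad():
    x_new = torch.randn(8, 50, 5)
    errors = ((model(x_new) - x_new) ** 2).mean(dim=(1, 2))
    threshold = errors.mean() + 2 * errors.std()   # illustrative threshold
    print("flagged sequences:", (errors > threshold).nonzero().flatten().tolist())
```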
53

Some Advances in Classifying and Modeling Complex Data

Zhang, Angang 16 December 2015 (has links)
In the statistical methodology of analyzing data, two of the most commonly used techniques are classification and regression modeling. As scientific technology progresses rapidly, complex data arise more often and require novel classification and regression modeling methodologies suited to the data structure. In this dissertation, I focus on developing several approaches for analyzing data with complex structures. Classification problems commonly occur in many areas such as biomedicine, marketing, sociology and image recognition. Among various classification methods, linear classifiers have been widely used because of their computational advantages and ease of implementation and interpretation compared with non-linear classifiers. Specifically, linear discriminant analysis (LDA) is one of the most important methods in the family of linear classifiers. High dimensional data, in which the number of variables p is larger than the number of observations n, occur more and more frequently and call for advanced classification techniques. In Chapter 2, I propose a novel sparse LDA method which generalizes LDA through a regularized approach for the two-class classification problem. The proposed method achieves accurate classification with attractive computational cost, making it suitable for high dimensional data with p>n. In Chapter 3, I deal with classification when the data complexity lies in non-random missing responses in the training data set, for which an appropriate classification method needs to be developed. Specifically, I consider the "reject inference problem" in the application of fraud detection for online business: to prevent fraudulent transactions, suspicious transactions are rejected with unknown fraud status, yielding training data with selectively missing responses. A two-stage modeling approach using logistic regression is proposed to enhance the efficiency and accuracy of fraud detection. Besides classification problems, data from designed experiments in scientific areas often have complex structures. Many experiments are conducted with multiple variance sources, and to increase the accuracy of the statistical modeling, the model needs to accommodate more than one error term. In Chapter 4, I propose a variance component mixed model for nano material experiment data that incorporates the between-group, within-group and within-subject variance components into a single model. To adjust for possible systematic error introduced during the experiment, adjustment terms can be added; specifically, a group adaptive forward and backward selection (GFoBa) procedure is designed to select the significant adjustment terms. / Ph. D.
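The reject-inference idea in Chapter 3 can be illustrated with a generic two-stage logistic-regression sketch: a first model is fitted on transactions whose fraud outcome was observed, and a second model is refitted after assigning inferred labels to the rejected transactions. This is a common variant shown for illustration only, not the dissertation's exact procedure; scikit-learn, the synthetic data and the 0.5 labelling threshold are assumptions.

```python
# Hedged sketch of a two-stage logistic-regression reject-inference scheme:
# stage 1 fits on transactions with observed fraud outcomes, stage 2 refits after
# assigning inferred labels to the rejected (unlabelled) transactions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
X_accepted = rng.normal(size=(500, 4))
y_accepted = (rng.random(500) < 0.1).astype(int)
X_rejected = rng.normal(loc=0.5, size=(200, 4))   # rejected: fraud status unknown

# Stage 1: model trained only on cases whose outcome was observed.
stage1 = LogisticRegression(max_iter=1000).fit(X_accepted, y_accepted)

# Stage 2: infer labels for the rejects, then refit on the augmented sample.
y_rejected_inferred = (stage1.predict_proba(X_rejected)[:, 1] > 0.5).astype(int)
X_all = np.vstack([X_accepted, X_rejected])
y_all = np.concatenate([y_accepted, y_rejected_inferred])
stage2 = LogisticRegression(max_iter=1000).fit(X_all, y_all)
print("stage-2 coefficients:", np.round(stage2.coef_, 3))
```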
54

Evaluating the effectiveness of Benford's law as an investigative tool for forensic accountants / Lizan Kellerman

Kellerman, Lizan January 2014 (has links)
“Some numbers really are more popular than others.” Mark J. Nigrini (1998a:15). The above idea appears to defy common sense. In a random sequence of numbers drawn from a company's financial books, every digit from 1 to 9 seems to have a one-in-nine chance of being the leading digit. But, according to a mathematical formula over 60 years old that is making its way into the field of accounting, certain numbers really are more popular than others (Nigrini, 1998a:15). Accounting numbers usually follow this mathematical law, known as Benford's Law, and its result is so unexpected that fraudsters and manipulators, as a rule, do not succeed in conforming to it. With this knowledge, the forensic accountant is empowered to detect irregularities, anomalies, errors or fraud that may be present in a financial data set. The main objective of this study was to evaluate the effectiveness of Benford's Law as a tool for forensic accountants. The empirical research used data from Company X to test the hypothesis that, in the context of financial fraud investigations, a significant difference between the actual digit frequencies and those expected under Benford's Law could indicate an error, fraud or irregularity. The effectiveness of Benford's Law was evaluated according to the findings of the literature review and the empirical study. The results indicated that a Benford's Law analysis was effective in identifying the groups in the data set that needed further investigation because their numbers did not match Benford's Law. / MCom (Forensic Accountancy), North-West University, Potchefstroom Campus, 2014
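As a concrete illustration of the test the study evaluates, here is a minimal first-digit Benford analysis: observed leading-digit proportions are compared with the expected log10(1 + 1/d), and large gaps mark groups worth a closer look. The synthetic amounts, the 0.02 flagging gap and the chi-square summary are illustrative assumptions, not values from the study.

```python
# Minimal sketch of a Benford's Law first-digit test: compare observed leading-digit
# frequencies against log10(1 + 1/d) and flag large gaps for investigation.
import numpy as np

def leading_digit(x: float) -> int:
    s = f"{abs(x):.15e}"       # scientific notation puts the leading digit first
    return int(s[0])

amounts = np.random.default_rng(2).lognormal(mean=8, sigma=1.5, size=5000)  # stand-in ledger amounts
digits = np.array([leading_digit(a) for a in amounts if a > 0])

observed = np.array([(digits == d).mean() for d in range(1, 10)])
expected = np.log10(1 + 1 / np.arange(1, 10))          # Benford's expected proportions

chi2 = len(digits) * np.sum((observed - expected) ** 2 / expected)
for d, (o, e) in enumerate(zip(observed, expected), start=1):
    flag = "  <-- investigate" if abs(o - e) > 0.02 else ""
    print(f"digit {d}: observed {o:.3f}  expected {e:.3f}{flag}")
print(f"chi-square statistic: {chi2:.1f} (df = 8)")
```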
56

The best practices applied by forensic investigators in conducting lifestyle audits on white collar crime suspects

Gillespie, Roy Tamejen 05 1900 (has links)
This research looks at the best practices applied by forensic investigators in conducting lifestyle audits on white collar crime suspects. The researcher explored, firstly, how a lifestyle audit relates to white collar crime investigations; secondly, the best practices in performing lifestyle audits on white collar crime suspects; and lastly, the various sources of information available to forensic investigators when conducting a lifestyle audit of a white collar crime suspect. As lifestyle audits will serve as an investigative tool in future forensic investigations into white collar crime, the study aimed to understand and create awareness of the current best practices applied by forensic investigators in private sector forensic investigation practices when conducting lifestyle audits during white collar crime investigations. It also makes available research data regarding the concept of lifestyle audits in white collar crime investigations, their implementation, their benefits, and their best practices. The general purpose of this study was to provide practical recommendations on best practices in lifestyle audits for forensic investigators within the private sector. / Criminology / M. Tech. (Forensic Investigation)
57

Data Mining Meets HCI: Making Sense of Large Graphs

Chau, Duen Horng 01 July 2012 (has links)
We have entered the age of big data. Massive datasets are now common in science, government and enterprises. Yet making sense of these data remains a fundamental challenge. Where do we start our analysis? Where do we go next? How do we visualize our findings? We answer these questions by bridging Data Mining and Human-Computer Interaction (HCI) to create tools for making sense of graphs with billions of nodes and edges, focusing on: (1) Attention Routing: we introduce this idea, based on anomaly detection, that automatically draws people's attention to interesting areas of the graph to start their analyses. We present three examples: Polonium unearths malware from 37 billion machine-file relationships; NetProbe fingers bad guys who commit auction fraud. (2) Mixed-Initiative Sensemaking: we present two examples that combine machine inference and visualization to help users locate the next areas of interest: Apolo guides users in exploring large graphs by learning from a few examples of user interest; Graphite finds interesting subgraphs based on only fuzzy descriptions drawn graphically. (3) Scaling Up: we show how to enable interactive analytics of large graphs by leveraging Hadoop, staging of operations, and approximate computation. This thesis contributes to data mining, HCI, and importantly their intersection, including: interactive systems and algorithms that scale; theories that unify graph mining approaches; and paradigms that overcome fundamental challenges in visual analytics. Our work is making an impact on academia and society: Polonium protects 120 million people worldwide from malware; NetProbe made headlines on CNN, WSJ and USA Today; Pegasus won an open-source software award; Apolo helps DARPA detect insider threats and prevent exfiltration. We hope our Big Data Mantra "Machine for Attention Routing, Human for Interaction" will inspire more innovations at the crossroad of data mining and HCI.
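A toy sketch of the attention-routing idea, using networkx: score every node by how unusual its local structure looks and hand the analyst a short list of places to start. The scoring rule (degree z-score discounted by clustering) and the synthetic graph are simplifications for illustration; they are not the Polonium or NetProbe algorithms.

```python
# Toy illustration of "attention routing": rank nodes by how unusual their local
# structure is, so an analyst knows where to begin exploring a large graph.
import networkx as nx
import numpy as np

G = nx.barabasi_albert_graph(n=1000, m=3, seed=0)       # stand-in for a large graph
deg = np.array([d for _, d in G.degree()])
clus = np.array(list(nx.clustering(G).values()))

# Nodes with unusually high degree and little clustering bubble to the top
# of the analyst's worklist.
score = (deg - deg.mean()) / deg.std() - clus
top = np.argsort(-score)[:5]
print("start the analysis at nodes:", top.tolist())
```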
59

Detecting financial reporting fraud : the impact and implications of management motivations for external auditors : evidence from the Egyptian context

Kassem, Rasha January 2016 (has links)
Financial reporting fraud is a concern for investors, regulators, external auditors, and the public. Although the responsibility for fraud detection lies with management and those charged with governance, external auditors are likely to come under scrutiny when fraud scandals come to light. Despite audit regulators' efforts to fight fraud, evidence from prior literature reveals that external auditors still need guidance in assessing and responding to fraud risks. Hence, the current study aims to help external auditors properly assess and respond to the risk of financial reporting fraud, in an effort to increase the likelihood of detecting it. To achieve this, the study explored the significance of various fraud factors in assessing the risks of financial reporting fraud and examined how external auditors could assess these factors. It also explored the likely motivations behind management fraud, the impact of management motivations on the financial statements, and how external auditors could assess that impact. The data were collected from external auditors working at various audit firms in Egypt using mixed research methods, namely an online questionnaire and semi-structured interviews. The findings revealed that management motives are the most significant factor in assessing the risk of financial reporting fraud; hence the study suggests that the external audit should be viewed in terms of management motivations rather than just the audit of financial statement figures and disclosures, and it offers detailed guidance to external auditors in this area. The findings also revealed that management integrity is a significant factor in assessing the risk of financial reporting fraud and that rationalisation of fraud should be assessed as part of management integrity rather than as a separate fraud risk factor. The study found that fraud perpetrators' capabilities are as significant as the opportunity to commit fraud, yet this factor is currently ignored by the audit standards and should therefore be assessed as part of the opportunity to commit fraud. This was the first study to explore financial reporting fraud and the extent to which external auditors comply with ISA 240 in the Egyptian context. It offers recommendations to external auditors, audit firms, audit regulators, and the Egyptian government on how to combat financial reporting fraud, and identifies potential areas for future research.
60

Možnosti počítačové detekce defraudací a anomálií v účetních datech / Methods of computer detection of fraud and anomalies in financial data

Spitz, Igor January 2012 (has links)
This thesis analyzes techniques for manipulating accounting data for the purpose of fraud. It further looks for methods capable of detecting these manipulations and verifies the effectiveness of the procedures already in use. The theoretical part studies methods of financial analysis, statistical methods, Benford's tests, fuzzy matching and machine learning technologies. The practical part verifies the methods of financial analysis, Benford's tests, fuzzy matching algorithms and neural networks.
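As a small illustration of the fuzzy-matching idea the thesis evaluates, the sketch below flags pairs of payee names that are suspiciously similar but not identical, a common symptom of duplicate or fabricated vendors. It uses difflib from the Python standard library; the vendor names and the 0.8 similarity threshold are made up for illustration, and the thesis's own matching algorithms may differ.

```python
# Minimal fuzzy-matching sketch: flag near-duplicate vendor names in a ledger.
from difflib import SequenceMatcher
from itertools import combinations

vendors = ["Acme Supplies Ltd", "ACME Supplies Limited", "Nordic Print AB",
           "Nordic Prints AB", "Blue Ocean Consulting"]

def similarity(a: str, b: str) -> float:
    # Case-insensitive sequence similarity between two names.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

for a, b in combinations(vendors, 2):
    s = similarity(a, b)
    if 0.8 <= s < 1.0:                      # near-duplicates warrant a closer look
        print(f"{a!r} ~ {b!r}  (similarity {s:.2f})")
```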
