51

The ability to account for Internet-based sales transactions according to GAAP / Blanché Steyn

Steyn, Blanché January 2007 (has links)
The incorporation of a new technology, such as the Internet, into business processes can have an unexpected influence on those business processes. The study focused on the hypothesis that four entities (case studies) can account for their South African Internet-based sales transactions in a manner that complies with the requirements of GAAP. To address this hypothesis, the study was divided into the following research questions: • How do the four entities capture and record their South African Internet-based sales transactions? • Are these sales transactions accounted for in a manner that complies with the requirements of GAAP? To answer the first research question, data were collected from four entities. To answer the second research question, the data collected were compared with the requirements of GAAP to enable the study to conclude positively on the hypothesis. / Thesis (M.Com. (Accounting))--North-West University, Potchefstroom Campus, 2007.
52

A generic privacy ontology and its applications to different domains

Hecker, Michael January 2009 (has links)
Privacy is becoming increasingly important due to the advent of e-commerce, but is equally important in other application domains. Domain applications frequently require customers to divulge many personal details about themselves that must be protected carefully in accordance with privacy principles and regulations. Here, we define a privacy ontology to support the provision of privacy and help derive the level of privacy associated with transactions and applications. The privacy ontology provides a framework for developers and service providers to guide and benchmark their applications and systems with regards to the concepts of privacy and the levels and dimensions experienced. Furthermore, it supports users or data subjects with the ability to describe their own privacy requirements and measure them when dealing with other parties that process personal information. The ontology developed captures the knowledge of the domain of privacy and its quality aspects, dimensions and assessment criteria. It is composed of a core ontology, which we call generic privacy ontology and application domain specific extensions, which commit to some of application domain concepts, properties and relationships as well as all of the generic privacy ontology ones. This allows for an evaluation of privacy dimensions in different application domains and we present case studies for two different application domains, namely a restricted B2C e-commerce scenario as well as a restricted hospital scenario from the medical domain.
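The layered structure described in the abstract (a generic core privacy ontology plus domain-specific extensions that commit to its concepts) can be sketched minimally as follows. All class and concept names here are hypothetical illustrations, not taken from the thesis:

```python
from dataclasses import dataclass, field

@dataclass
class PrivacyConcept:
    """A node in the generic (core) privacy ontology."""
    name: str
    dimensions: dict = field(default_factory=dict)  # e.g. {"retention": "30d"}

@dataclass
class DomainExtension:
    """A domain-specific extension that commits to all core concepts
    and adds its own domain-only concepts."""
    domain: str
    core: list                                   # inherited generic concepts
    extra: list = field(default_factory=list)    # domain-only concepts

    def all_concepts(self):
        return self.core + self.extra

# Generic core shared by every application domain
core = [PrivacyConcept("Consent"), PrivacyConcept("Retention"),
        PrivacyConcept("Disclosure")]

# Extensions mirroring the two case studies: B2C e-commerce and a hospital
ecommerce = DomainExtension("B2C e-commerce", core, [PrivacyConcept("PaymentData")])
hospital = DomainExtension("hospital", core, [PrivacyConcept("MedicalRecord")])
```

The point of the layering is that both extensions can be evaluated against the same core privacy dimensions while each adds its own domain vocabulary.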
53

Lipophilic M(α,α′-OC5H11)8phthalocyanines (M = H2 and Ni(II)): synthesis, electronic structure, and their utility for highly efficient carbonyl reductions

Jiang, Yu, Li, Minzhi, Liang, Xu, Mack, John, Wildervanck, Martijn, Nyokong, Tebello, Qin, Mingfeng, Zhu, Weihua 07 October 2015 (has links)
A lipophilic and electron-rich phthalocyanine (α,α′-n-OC5H11)8-H2Pc and its nickel(II) complex (α,α′-n-OC5H11)8-Ni(II)Pc have been synthesized and characterized. Detailed analyses of the electronic structure were carried out by spectroscopy, electrochemistry, spectroelectrochemistry, and TD-DFT calculations. A series of experiments demonstrate that the (α,α′-n-OC5H11)8-Ni(II)Pc complex can be used as a catalyst for highly efficient carbonyl reductions. / Original publication is available at http://dx.doi.org/10.1039/C5DT03256C
54

A Model Framework to Estimate the Fraud Probability of Acquiring Merchants

January 2015 (has links)
abstract: Using historical data from the third-party payment acquiring industry, I develop a statistical model to predict the probability of fraudulent transactions by the merchants. The model consists of two levels of analysis – the first focuses on fraud detection at the store level, and the second focuses on fraud detection at the merchant level by aggregating store level data to the merchant level for merchants with multiple stores. My purpose is to put the model into business operations, helping to identify fraudulent merchants at the time of transactions and thus mitigate the risk exposure of the payment acquiring businesses. The model developed in this study is distinct from existing fraud detection models in three important aspects. First, it predicts the probability of fraud at the merchant level, as opposed to at the transaction or cardholder level. Second, it is developed by applying machine learning algorithms and logistic regressions to all the transaction level and merchant level variables collected from real business operations, rather than relying on the experiences and analytical abilities of business experts as in the development of traditional expert systems. Third, instead of using a small sample, I develop and test the model using a huge sample that consists of over 600,000 merchants and 10 million transactions per month. I conclude this study with a discussion of the model’s possible applications in practice as well as its implications for future research. / Dissertation/Thesis / Doctoral Dissertation Business Administration 2015
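The two-level structure the abstract describes (store-level fraud scores aggregated up to a merchant-level probability) can be sketched minimally. The features, weights, and volume-weighted aggregation rule below are illustrative assumptions, not the dissertation's actual model:

```python
import math

def store_score(features, weights, bias=0.0):
    """Logistic score for a single store: P(fraud | store features)."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

def merchant_score(store_scores, volumes):
    """Aggregate store-level scores to the merchant level, weighting
    each store by its transaction volume (an assumed aggregation rule)."""
    total = sum(volumes)
    return sum(s * v for s, v in zip(store_scores, volumes)) / total

# Two stores belonging to one merchant (illustrative numbers only)
s1 = store_score([0.8, 0.1], [2.0, -1.0])   # riskier store
s2 = store_score([0.1, 0.9], [2.0, -1.0])   # less risky store
risk = merchant_score([s1, s2], volumes=[900, 100])
```

Because the high-volume store dominates the weighted average, the merchant-level score sits close to that store's score, which is the behavior a multi-store aggregation is meant to capture.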
55

Interface for displaying transactions in PEPPOL

Emelie, Åslund January 2020 (has links)
The purpose of this study is to show how transaction data from PEPPOL can be displayed in an effective way. PEPPOL is used to exchange e-documents between public and private entities worldwide. When many transactions come through every day, it is important to highlight the ones that have failed and to be able to filter the transactions. This is done with Angular 9, and because Angular is upgraded frequently, this study also focuses on writing maintainable and readable code that will help future developers. Angular Material's table is used to display the transactions, as it provides a simple but modern table, and above the table a search bar helps users find specific transactions. Comments are added to the code so it is easy to tell what each code block does, and the MVC pattern is used to split up the components, models and views.
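The display logic the thesis describes (a free-text search over the table plus emphasis on failed transactions) is independent of Angular itself. A minimal sketch of that filter-and-sort logic, with hypothetical field names:

```python
def matches(tx, query):
    """Free-text search across every field shown in the table row."""
    q = query.lower()
    return any(q in str(v).lower() for v in tx.values())

def visible_rows(transactions, query="", failed_first=True):
    """Filter rows by the search string and sort failed transactions
    to the top so they are easy to spot."""
    rows = [t for t in transactions if matches(t, query)]
    if failed_first:
        rows.sort(key=lambda t: t["status"] != "failed")  # False sorts first
    return rows

# Illustrative transactions (field names are assumptions)
txs = [
    {"id": "a1", "status": "ok", "receiver": "Acme"},
    {"id": "b2", "status": "failed", "receiver": "Globex"},
]
```

In the actual thesis this role is played by Angular Material's table and a search bar; the sketch only shows the underlying predicate and ordering.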
56

Integrity issues of information created by book entries

Van der Poll, Huibrecht Margaretha 03 March 2004 (has links)
Book entries are vehicles used in accounting to accommodate non-cash transactions, timing differences and provisions. The use of book entries is a normal activity in accounting and may have its origin in accrual accounting. The management of a company may apply creative accounting techniques in the form of earnings management, in particular, adopting the practices of income smoothing and taking the so-called ‘big bath’. These practices may result in the financial manager or accountant misusing book entries. This could then lead to information of a different integrity from that which would have resulted had these creative accounting practices not been performed in the company. The question addressed in this dissertation, and for which an answer is sought, is whether there is any notable difference in the integrity of information supplied through the accounting process and created by real transactions (real events) as opposed to information created by book entries (artificial events). The hypothesis underlying this dissertation is: the integrity of information created by book entries is based on subjective opinions because it rests on future events; it is therefore not the same as the integrity of information created by real transactions, which is based on historical events. The new science is concerned with, amongst other things, new guidelines regarding reality, observation, objectivity, predictions and relationships among events. These new guidelines could be seen as explaining certain aspects which are relevant to the field of accounting. The attributes of a book entry are not based on reality, but on subjective predictions of future transactions. Similarly, a book entry is often not objective but is based on subjective observation. Notable differences were observed in the integrity of the information emerging from a real, historical event and a future event.
These differences were established through the application of two research methods, namely, the use of a questionnaire and the analysis of the financial statements of 30 companies listed on The JSE Securities Exchange South Africa (JSE). The influence of book entries on certain ratios was considered, and the ratios influenced by two major book entries, namely, depreciation provision and deferred taxation, differ substantially in interpretation when the two book entries are reclassified. The results of the questionnaire also indicate that a large proportion of the financial managers in practice believe that book entries substantially influence the integrity of information. / Dissertation (MCom Financial Management Sciences)--University of Pretoria, 2003. / Financial Management / unrestricted
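How a single book entry can move a ratio is easy to illustrate with hypothetical figures (the numbers below are not from the dissertation): reclassifying a 30-unit depreciation provision changes the return on assets from 12% to 15%.

```python
def return_on_assets(profit, assets):
    """ROA as a simple profit-to-assets ratio."""
    return profit / assets

# Illustrative figures only, not taken from the dissertation
profit_after_entry = 120.0   # profit reported after a 30-unit depreciation book entry
depreciation = 30.0          # the non-cash book entry under consideration
assets = 1000.0

with_entry = return_on_assets(profit_after_entry, assets)                  # 0.12
without_entry = return_on_assets(profit_after_entry + depreciation, assets)  # 0.15
```

The three-percentage-point swing comes entirely from a non-cash entry, which is the kind of interpretation difference the questionnaire respondents and the ratio analysis in the study point to.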
57

Measuring and improving the performance of the bitcoin network

Imtiaz, Muhammad Anas 26 January 2022 (has links)
The blockchain technology promises innovation by moving away from conventional centralized architectures, where trust is placed in a small number of actors, to a decentralized environment where a collection of actors must work together to maintain consensus in the overall system. Blockchain offers security and pseudo-anonymity to its adopters, through the use of various cryptographic methods. While much attention has focused on creating new applications that make use of this technology, equal importance must be given to studying naturally occurring phenomena in existing blockchain ecosystems and mitigating their effects where harmful. In this dissertation, we develop a novel open-source log-to-file system that provides the ability to record information relevant to events as they take place in live blockchain networks. Specifically, our open-source software facilitates in-situ measurements on full nodes in the live Bitcoin and Bitcoin Cash blockchain networks. This measurement framework sheds new light on many phenomena that were previously unknown or scarcely studied. First, we examine the presence and impact of churn, namely nodes joining and leaving, on the behavior of the Bitcoin network. Our data analysis over a two-month period shows that a large number of Bitcoin nodes churn at least once. We perform statistical distribution fitting to this churn and emulate it in our measurement nodes to evaluate the impact of churn on the performance of the Bitcoin protocol. From our experiments, we find that blocks received by churning nodes experience as much as five times larger propagation delay than those received by non-churning nodes. We introduce and evaluate a novel synchronization scheme to mitigate such effects on the performance of the protocol. Our empirical evaluation shows that blocks received by churning nodes that synchronize their mempools with peers have roughly half the delay in propagation experienced by those that do not synchronize their mempools. 
We next evaluate and compare the performance of three block relay protocols, namely the default protocol, and the more recent compact block and Graphene protocols. This evaluation is conducted over full nodes running the Bitcoin Unlimited client (which is used in conjunction with the Bitcoin Cash network). We find that in most scenarios, the Graphene block relay protocol outperforms the other two in terms of the block propagation delay and the amount of total communication associated with block relay. An exception is when nodes churn frequently and spend a significant fraction of time off the network, in which case the compact block relay protocol performs best. In-depth analyses reveal subtle inefficiencies of the protocols. Thus, in the case of frequent churns, the Graphene block relay protocol performs as many as two extra round-trips of communication to recover information necessary to reconstruct blocks. Likewise, an inspection of the compact block relay protocol indicates that the full transactions included in the initial block message are either unnecessary or insufficient for the successful reconstruction of blocks. Finally, we investigate the occurrence of orphan transactions, which are those whose parent transactions (the sources of their inputs) are missing at the time that they are processed. These transactions typically languish in a local buffer until they are evicted or all their parents are discovered, at which point they may be propagated further. Our data reveals that slightly less than half of orphan transactions end up being included in the blockchain. Surprisingly, orphan transactions tend to have fewer parents on average than non-orphan transactions, and their missing parents have a lower fee, a larger size, and a lower transaction fee per byte than all other received transactions.
Moreover, the network overhead incurred by these orphan transactions can be significant when using the default orphan memory pool size (i.e., 100 transactions), although this overhead can be made negligible if the pool size is simply increased to 1,000 transactions. In summary, this dissertation demonstrates the importance of characterizing the inner behavior of the peer-to-peer network underlying a blockchain. While our results primarily focus on the Bitcoin network and its variants, this work provides foundations that should prove useful for studying and characterizing other blockchains.
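The orphan-pool behavior described above (transactions buffered until their parents arrive or they are evicted, with a default pool size of 100) can be sketched as follows. The oldest-first eviction here is a deliberate simplification of the eviction policy real clients use:

```python
from collections import OrderedDict

class OrphanPool:
    """Bounded buffer for transactions whose parents have not yet arrived.
    When the pool is full, the oldest orphan is evicted (a simplification
    of real client behavior)."""
    def __init__(self, max_size=100):
        self.max_size = max_size
        self.pool = OrderedDict()   # txid -> set of still-missing parent txids

    def add(self, txid, missing_parents):
        if len(self.pool) >= self.max_size:
            self.pool.popitem(last=False)      # evict the oldest orphan
        self.pool[txid] = set(missing_parents)

    def parent_arrived(self, parent_txid):
        """Mark a parent as seen; return orphans whose parents are now
        all present, so they can be accepted and propagated further."""
        ready = []
        for txid, missing in list(self.pool.items()):
            missing.discard(parent_txid)
            if not missing:
                ready.append(txid)
                del self.pool[txid]
        return ready
```

The dissertation's observation that raising `max_size` from 100 to 1,000 makes the overhead negligible corresponds to fewer premature evictions in a structure like this one.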
58

Three Essays in Entrepreneurial Finance and Innovation:

Zhang, Jingxuan January 2023 (has links)
Thesis advisor: Thomas Chemmanur / My doctoral dissertation consists of three chapters focused on topics in entrepreneurial finance and corporate innovation. In the first chapter, I analyze secondary market patent transactions from public assignors (seller firms) to assignees (buyer firms). I show that firms with higher innovation productivity (more able to innovate) but with lower production efficiency (less able to commercialize) are more likely to sell patents distant from their operations. Using a linked assignor-assignee dataset, I find that patents technologically closer to buyer than to seller firms are more likely to be sold in a patent transaction, implying gains from trading patents. I document that, in the three years following patent transactions, seller firms experience a positive and statistically significant improvement in their ROA and operating profitability. I find that the improvement in ROA and operating profitability is concentrated in seller firms which increase their R&D focus after patent transactions, suggesting that an increase in innovation focus is one of the channels driving these results. Consistent with this channel, I find that inventors who are either newly hired by or remaining in assignor firms over the three years subsequent to patent transactions have technological expertise more similar to those of assignor firms. In the second chapter, co-authored with Xi Chen, we study how venture capitalists (VCs) create value in the product market for the entrepreneurial firms backed by them. By constructing a novel dataset based on Nielsen Retail Scanner and VentureXpert, we document that, compared to non-VC-backed firms, VC-backed startups have more than doubled their sales and seized more nationwide market share in the five years following the first VC investment. A further decomposition indicates that VC-backed firms achieve the growth in sales and market share by lowering their product prices. 
In addition, subsequent to the first VC investment, VC-backed firms enlarge their product portfolios by introducing new products and establishing new product lines, and they expand their products to more stores and geographic locations. Using the limited partner return as an instrument for the supply of VC financing, we show that the above effects are causal. We document heterogeneous value creation effects of VC financing for firms with different market share and for firms with different geographic proximity to the lead VC investors. This suggests that, apart from providing capital, VCs also add value to startups by directing their marketing strategy and monitoring their operations. In the third chapter, co-authored with Thomas Chemmanur, Jiajie Xu, and Xiang Zheng, we analyze the effect of the composition of venture capital (VC) syndicates on value creation to the entrepreneurial firms they invest in. We hypothesize that VCs may learn about each other’s skills at value creation when they co-invest together in entrepreneurial firms, allowing for more efficient value creation when they co-invest in subsequent syndicates. Further, if VCs view syndication as a repeated game, this may generate incentives to co-operate to a greater extent with each other when investing together in a syndicate, reducing the probability of conflicts among VCs. We empirically analyze the implications of these hypotheses and find the following. First, prior collaboration between a lead VC and any of the VCs in a syndicate leads to greater short-term value creation, as evidenced by greater sales growth, employment growth, probability of patented innovation, and the quality of innovations generated during the three years subsequent to VC syndicate investment. Second, prior collaboration between the lead VC and at least one of the syndicate members leads to greater long-term value creation, as evidenced by the higher probability of a successful exit (IPO or acquisition). 
Third, if the prior collaboration is very successful (leading to an IPO exit resulting from the previous collaboration), then there is even greater value creation by the VC syndicate compared to the case where the prior collaboration was less successful. Finally, consistent with prior collaboration allowing VCs to learn about each other’s value creation skills and reducing potential conflicts among the VCs forming a syndicate, syndicates with prior collaboration between the lead VC and at least one syndicate member are characterized by more uniform syndicate compositions across financing rounds. / Thesis (PhD) — Boston College, 2023. / Submitted to: Boston College. Carroll School of Management. / Discipline: Finance.
59

Blockchain database; technical background and a reconnaissance on an implementation within the banking industry / Blockchain-databas; teknisk bakgrund och en översikt över genomförandet inom banksektorn

Johansson, Tom, Charpentier, Viktor January 2017 (has links)
All human interaction can be depicted as exchanges. We exchange trivial information, feelings, assets and more. Valuable exchanges have one thing in common; they all require some degree of trust. In today’s society we rely on institutionalized trust when commencing an exchange of value. Typically, this role is filled by a vast ecosystem consisting of commercial banks, clearinghouses and other third parties. The recent rise of Bitcoin, Ethereum and the consequent attention on the underlying technology, blockchain, questions the future of the current ecosystem. This report aims at uncovering what blockchain is, what different implementations are currently available and how it would affect today’s ecosystem. It does so through semi-structured interviews with actors within the current ecosystem as well as by weighing in the views of blockchain evangelists. It highlights five key aspects that are crucial when implementing blockchain technology within the existing banking paradigm. Today’s organized societies require law and order, which, to a large extent, is limited within the realm of public blockchain technology. With the insight of society’s infrastructural limitations, this paper argues that the current transaction system of our society favors a permissioned implementation with trusted nodes. Such a system would result in more efficient financial markets and lower costs of transacting. However, this paper acknowledges the virtues and reasons behind the rise of public blockchains. Given recent developments within the field and interesting concepts, the report does not exclude a future of banking relying on public blockchain technology as the underlying database.
60

Optimizing Distributed Transactions: Speculative Client Execution, Certified Serializability, and High Performance Run-Time

Pandey, Utkarsh 01 September 2016 (has links)
On-line services already form an important part of modern life with an immense potential for growth. Most of these services are supported by transactional systems, which are backed by database management systems (DBMS) in many cases. Many on-line services use replication to ensure high-availability, fault tolerance and scalability. Replicated systems typically consist of different nodes running the service co-ordinated by a distributed algorithm which aims to drive all the nodes along the same sequence of states by providing a total order to their operations. Thus optimization of both local DBMS operations through concurrency control and the distributed algorithm driving replicated services can lead to enhancing the performance of the on-line services. Deferred Update Replication (DUR) is a well-known approach to design scalable replicated systems. In this method, the database is fully replicated on each distributed node. User threads perform transactions locally and optimistically before a total order is reached. DUR based systems find their best usage when remote transactions rarely conflict. Even in such scenarios, transactions may abort due to local contention on nodes. A generally adopted method to alleviate the local contention is to invoke a local certification phase to check if a transaction conflicts with other local transactions already completed. If so, the given transaction is aborted locally without burdening the ordering layer. However, this approach still results in many local aborts which significantly degrades the performance. The first main contribution of this thesis is PXDUR, a DUR based transactional system, which enhances the performance of DUR based systems by alleviating local contention and increasing the transaction commit rate. PXDUR alleviates local contention by allowing speculative forwarding of shared objects from locally committed transactions awaiting total order to running transactions. 
PXDUR allows transactions running in parallel to use speculative forwarding, thereby enabling the system to utilize the highly parallel multi-core platforms. PXDUR also enhances the performance by optimizing the transaction commit process. It allows the committing transactions to skip read-set validation when it is safe to do so. PXDUR achieves performance gains of an order of magnitude over its closest competitors under favorable conditions. Transactions also form an important part of centralized DBMS, which tend to support multi-threaded access to utilize the highly parallel hardware platforms. The applications can be wrapped in transactions which can then access the DBMS as per the rules of concurrency control. This allows users to develop applications that can run on DBMSs without worrying about synchronization. Serializability is the de facto standard form of isolation required by transactions for many applications. The existing methods employed by DBMSs to enforce serializability employ explicit fine-grained locking. The eager-locking based approach is pessimistic and can be too conservative for many applications. The locking approach can severely limit the performance of DBMSs especially for scenarios with moderate to high contention. This leads to the second major contribution of this thesis: TSAsR, an adaptive transaction processing framework, which can be applied to DBMSs to improve performance. TSAsR allows the DBMS's internal synchronization to be more relaxed and enforces serializability through the processing of external meta-data in an optimistic manner. It does not require any changes in the application code and achieves orders of magnitude performance improvements for high and moderate contention cases. The replicated transaction processing systems require a distributed algorithm to keep the system consistent by ensuring that each node executes the same sequence of deterministic commands.
These algorithms generally employ State Machine Replication (SMR). Enhancing the performance of such algorithms is a potential way to increase the performance of distributed systems. However, developing new SMR algorithms is limited in production settings because of the huge verification cost involved in proving their correctness. There are frameworks that allow easy specification of SMR algorithms and subsequent verification. However, algorithms implemented in such frameworks give poor performance. This leads to the third major contribution of this thesis: Verified JPaxos, a JPaxos-based runtime system which can be integrated with an easy-to-verify I/O automaton based on the Multipaxos protocol. Multipaxos is specified in Higher Order Logic (HOL) for ease of verification, which is used to generate executable code representing the Multipaxos state changes (I/O automaton). The runtime drives the HOL-generated code and interacts with the service and network to create a fully functional replicated Multipaxos system. The runtime inherits its design from JPaxos along with some optimizations. It achieves significant improvement over a state-of-the-art SMR verification framework while still being comparable to the performance of non-verified systems. / Master of Science
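The local certification step described above for DUR systems (abort a finishing transaction if its read-set conflicts with objects written by transactions that already committed locally, so the ordering layer is not burdened) reduces to a set-intersection check. The representation below is an illustrative assumption, not the thesis's actual data structures:

```python
def locally_certify(read_set, committed_write_sets):
    """Return False (local abort) if any already locally committed
    transaction wrote an object this transaction read; otherwise the
    transaction may proceed to the ordering layer."""
    for write_set in committed_write_sets:
        if read_set & write_set:   # non-empty intersection -> conflict
            return False
    return True

# A transaction that read {x, y} conflicts with a commit that wrote {y};
# a transaction that only read {x} does not.
conflict = locally_certify({"x", "y"}, [{"y"}])
no_conflict = locally_certify({"x"}, [{"y"}, {"z"}])
```

PXDUR's speculative forwarding goes one step further than this check: instead of aborting, it forwards the speculatively written values to running transactions, which is what raises the commit rate.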
