1 |
Providing resilient quality of service connections in provider-based virtual private networks. Rosenbaum, Gustav Filip, Computer Science & Engineering, Faculty of Engineering, UNSW, January 2005 (has links)
This thesis focuses on efficient provisioning of resilient Virtual Private Network (VPN) services. It first confirms the intuition that network resources can be more efficiently utilized when resilience mechanisms are implemented by a network provider in the physical network than by its VPN customers in their VPNs. Next, a Multiprotocol Label Switching-based programmable VPN architecture is presented that delivers virtual links as resilient quality of service (QoS) connections and virtual sites. Virtual sites allow customers to implement functionality like customized routing and content adaptation "in the cloud", as opposed to the current network model where all functionality is implemented at the network edge. To provision a resilient QoS connection, two paths need to be computed from the ingress to the egress nodes, such that both paths meet the given QoS constraints. Two different frameworks have been proposed in the literature to compute resilient QoS connections when the QoS constraints are bandwidth and end-to-end delay. Both use a preprocessing step whereby either all links with less residual capacity than the given bandwidth constraint are pruned, or the given end-to-end delay is converted to an effective bandwidth. The frameworks thus reduce the problem to one with only a single constraint. We argue in this thesis that these frameworks individually lead to poor network utilization and propose a new framework where both constraints are considered simultaneously. Our framework exploits the dependency between end-to-end delay, provisioned bandwidth and chosen path by using the provisioned bandwidth as a variable. Here, two link-disjoint paths are computed together with their respective minimum bandwidths such that both the bandwidth and end-to-end delay constraints are satisfied. Given our framework, we first propose a new generic algorithm that decomposes the problem into subproblems where known algorithms can be applied. Then we propose two new linear programming (LP) formulations that return the two paths and their respective bandwidths such that they have the minimum combined cost. To make our framework applicable in a production environment, we develop two new algorithms with low run times that achieve even higher network performance than their LP formulation counterparts. These algorithms systematically use an algorithm that computes non-resilient QoS connections. As no algorithm for computing non-resilient QoS connections with sufficiently low run time has been proposed in the current literature, we develop two new algorithms and their respective heuristics with run times comparable to Dijkstra's shortest-path algorithm. Our simulations show that exploiting the dependency between end-to-end delay, provisioned bandwidth and chosen path can significantly improve network performance.
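To make the prune-then-route idea in the abstract concrete, the following Python sketch illustrates the conventional single-constraint framework that the thesis critiques: links with insufficient residual capacity are pruned, a minimum-delay path is found with Dijkstra's algorithm, its links are removed, and a second link-disjoint path is sought. The graph representation, the attribute names cap and delay, and the function names are illustrative assumptions, not the thesis's own algorithms; the greedy second-path step can also miss disjoint pairs that an optimal method (e.g. Suurballe's algorithm) or the thesis's joint-constraint LP formulations would find.

```python
import heapq

def prune(graph, bandwidth):
    """Drop every link whose residual capacity is below the bandwidth demand."""
    return {u: {v: a for v, a in nbrs.items() if a["cap"] >= bandwidth}
            for u, nbrs in graph.items()}

def shortest_delay_path(graph, src, dst):
    """Plain Dijkstra on link delay; returns (total_delay, path) or (inf, None)."""
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:                        # reached the egress: rebuild the path
            path = [dst]
            while path[-1] != src:
                path.append(prev[path[-1]])
            return d, path[::-1]
        if d > dist.get(u, float("inf")):   # stale queue entry
            continue
        for v, a in graph.get(u, {}).items():
            nd = d + a["delay"]
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return float("inf"), None

def resilient_connection(graph, src, dst, bandwidth, delay_bound):
    """Greedy prune-then-route sketch: two link-disjoint paths on a directed
    graph, each required to meet the end-to-end delay bound."""
    g = prune(graph, bandwidth)
    d1, p1 = shortest_delay_path(g, src, dst)
    if p1 is None:
        return None
    for u, v in zip(p1, p1[1:]):            # remove the first path's links
        g[u].pop(v, None)                   # so the second path is link-disjoint
    d2, p2 = shortest_delay_path(g, src, dst)
    if p2 is None or max(d1, d2) > delay_bound:
        return None
    return (p1, d1), (p2, d2)

# Toy example on a hypothetical topology; links carry capacity and delay attributes.
net = {
    "in":  {"a": {"cap": 20, "delay": 1}, "b": {"cap": 15, "delay": 2}},
    "a":   {"out": {"cap": 20, "delay": 1}},
    "b":   {"out": {"cap": 15, "delay": 2}},
    "out": {},
}
print(resilient_connection(net, "in", "out", bandwidth=10, delay_bound=5))
```

By contrast, the thesis's framework treats the provisioned bandwidth as a variable and computes the two paths and their bandwidths jointly, which is what its LP formulations and low-run-time algorithms address.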
|
2 |
A model for best practice-driven information security governance. Lessing, Martha Maria, 04 June 2008 (has links)
To improve the likelihood of success of an organisation’s Information Security Governance, discipline leaders recommend that organisations follow the guidelines set out in Information Security Governance best practice documents. Best practices and related documents from the Information Security Governance discipline, as well as from the Corporate Governance and Information Technology Governance disciplines, all include sections pertaining to Information Security, Information Security Governance and Information Technology assets. This study draws these sections together and constructs an Information Security Governance model that combines all aspects of Information Security Governance. In theory, this model should guide an organisation to the ultimate level of Information Security Governance. / Prof. S. H. von Solms
|
3 |
The internet: a business tool for organisations in the nineties. 14 August 2012 (has links)
M.Comm. / The main aim of this study is to provide insight into how businesses can use the Internet to enhance their existing business operations. Towards this aim, there are the following secondary aims: explaining what the Internet can and cannot do, and the tools required; identifying the current risks involved with the Internet and their impact on business; and providing a possible strategy to follow when planning to use the Internet for business. The output of this study should provide clear guidelines to any South African organisation on why it should utilise, or alternatively avoid, the Internet.
|
4 |
Is "best practice" really the best?: examining the effects of ERP adoption on core competency. / CUHK electronic theses & dissertations collection / Digital dissertation consortiumJanuary 2010 (has links)
Organizations become more homogeneous when they adapt to the external environment for survival and competitiveness. Institutional theorists call this phenomenon "isomorphism," a constraining process that forces organizations, through coercive, mimetic, or normative pressures, to resemble each other when facing the same set of environmental conditions. In recent years, concerns about cost-efficiency and standardization of information technology (IT) have led organizations to rely more heavily on IT to enhance their business operations. Enterprise resource planning (ERP) systems enable the tight integration of all necessary business functions into a single system. Typically, a database, designed to standardize organizational IT platforms and business processes, is shared throughout an organization. The high adoption rate of ERP systems among the biggest corporations has pressured other organizations to adopt ERP systems. Information system (IS) researchers call this phenomenon "technical isomorphism". / This study examines the effects of ERP implementation on organizational homogeneity from the viewpoint of institutional theory. Through mediating factors, such as the extent of ERP implementation and software adaptation, this study also investigates the effects of organizational homogeneity on the core competencies of user-organizations. It addresses four important issues: (a) whether institutional pressures lead to organizational homogenization; (b) whether institutional pressures affect the extent of ERP implementation in organizations; (c) whether the extent of ERP implementation affects software adaptation and, subsequently, homogenization; and (d) whether the core competencies of organizations are ultimately affected by the adoption of technology. / This study's findings contribute to our understanding of the effects of ERP implementation in organizations, particularly on IT and business activities. They open a whole new arena of research into the impact of technology on organizational abilities, providing a new set of constructs, relationships, antecedents, and dependent variables. Moreover, this study provides the necessary evidence on the occurrence of homogenization, its origins, and its consequences. It also provides valuable guidelines for finding a balance between conformity and retaining the uniqueness of companies, which is regarded as a source of core competencies. Thus, the research findings can help organizations redirect their focus and efforts in ERP implementation, saving millions of dollars in the process. / Liu, Kar Wai Connie. / Adviser: Vincent S. Lai. / Source: Dissertation Abstracts International, Volume: 72-04, Section: A, page: . / Thesis (Ph.D.)--Chinese University of Hong Kong, 2010. / Includes bibliographical references (leaves 141-152). / Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web. / Electronic reproduction. Ann Arbor, MI : ProQuest Information and Learning Company, [200-] System requirements: Adobe Acrobat Reader. Available via World Wide Web. / Abstract also in Chinese; appendix 8.3 and 8.4 in Chinese.
|
5 |
Control and assurance services for electronic commerce / Wang, Wenli, January 2000 (has links)
Thesis (Ph. D.)--University of Texas at Austin, 2000. / Vita. Includes bibliographical references (leaves 104-106). Available also in a digital version from Dissertation Abstracts.
|
6 |
Understanding the organization of managed service providers: an analysis of customer satisfaction and contracting in markets for hosted IT services. Susarla, Anjana, 28 August 2008 (has links)
Not available / text
|
7 |
Essays in business-to-business commerce on the internet. Mishra, Abhay Nath, 1971-, 13 July 2011 (has links)
Not available / text
|
8 |
E-commerce information systems (ECIS) success: A South African study. Pather, Shaun, January 2006 (has links)
Thesis (DTech (Information Technology))--Cape Peninsula University of Technology, 2006 / As a phenomenon of the 1990s, e-Commerce is relatively new. Its advent offered the promise of new opportunities to businesses and entrepreneurs around the world. The hyperbole associated with the Internet and the Web resulted in a mindset that e-Commerce was an easy road to success. It was believed that this new technology-based approach would revolutionise business in a number of ways, including changing the relationships between the stakeholders and allowing small organisations to play on the global stage.

However, the road to business enhancement through e-Commerce has not been easy. Many organisations have not survived their attempts to engage in e-Commerce, and others have radically changed their approach since the e-Bubble burst. There were many reasons for the failure of these e-Commerce initiatives. They included poor business ideas, no control of expenditure, lack of general business experience and immaturity, as well as little understanding of the crucial importance of managing the technology through which the Internet and the Web deliver e-Commerce opportunities.

This thesis explores the intricacies of IS within the South African B2C e-Commerce environment and argues that without a coherent understanding of the factors affecting IS success, the implementation of traditional IS evaluation mechanisms may be problematic. A comparative analysis of studies in this field between the pre- and post-e-Commerce eras ascertained a paucity of theoretical frameworks and a fragmented body of knowledge in the extant literature, with a narrow focus on web-interface issues.
|
9 |
The role of service-oriented architecture as an enabler for enterprise architecture. Kistasamy, Christopher, January 2011 (has links)
Thesis (MTech (Information Technology))--Cape Peninsula University of Technology, 2011. / The adoption of Enterprise Architecture (EA) methodologies within organisations is causing an interest in the methodologies and supporting technologies available. Service Oriented Architecture (SOA) supports EA in many facets. However, there is much uncertainty with regard to the relationship between EA and SOA within organisations, as well as the guidelines that organisations should follow in order for SOA to enable EA. Potential problems may arise if this relationship between SOA and EA is not agreed upon at the outset of implementing an EA. The purpose of this research is to investigate the guidelines that are needed for SOA to enable EA, in order to provide practical steps that organisations can use to begin aligning SOA and EA, ensuring that these initiatives are driven from a business perspective. A qualitative approach using a case study was used as the methodology for this research. The data collection was conducted using semi-structured interviews, and the guidelines that were derived were validated through a survey distributed to industry architecture practitioners. The contribution of this research was a set of guidelines that can be used for SOA to enable EA. Further research areas were highlighted, including investigating the mapping of the guidelines derived from this research into existing EA frameworks such as TOGAF and ZACHMAN.
|