71

CryptoNET : Generic Security Framework for Cloud Computing Environments

Abbasi, Abdul Ghafoor January 2011 (has links)
The area of this research is security in distributed environments such as cloud computing and network applications. The specific focus was the design and implementation of a high-assurance network environment, comprising various secure and security-enhanced applications. “High Assurance” means that:

- our system is guaranteed to be secure,
- it is verifiable to provide the complete set of security services,
- we prove that it always functions correctly, and
- we justify our claim that it cannot be compromised without user neglect and/or consent.

We do not know of any equivalent research results or even commercial security systems with such properties. Based on that, we claim several significant research and development contributions to the state of the art of computer network security. In the last two decades there have been many activities and contributions to protect data, messages and other resources in computer networks, to provide privacy of users, reliability, availability and integrity of resources, and to provide other security properties for network environments and applications. Governments, international organizations, private companies and individuals invest a great deal of time, effort and budget to install and use various security products and solutions. However, in spite of all these needs, activities, ongoing efforts, and all current solutions, it is the general belief that security in today's networks and applications is not adequate. At the moment there are two general approaches to the security of network applications. One approach is to enforce isolation of users, network resources, and applications. In this category we have solutions like firewalls, intrusion-detection systems, port scanners, spam filters, and virus detection and elimination tools. The goal is to protect resources and applications by isolation after their installation in the operational environment. The second approach is to apply methodology, tools and security solutions already in the process of creating network applications. This approach includes methodologies for secure software design, ready-made security modules and libraries, rules for the software development process, and formal and strict testing procedures. The goal is to create secure applications even before their operational deployment. Current experience clearly shows that both approaches have failed to provide an adequate level of security, where users would be guaranteed to deploy and use secure, reliable and trusted network applications. In the current situation, it is therefore obvious that a new approach and new thinking towards creating strongly protected and guaranteed secure network environments and applications are needed. In our research we have taken an approach completely different from the two mentioned above. Our first principle is to use cryptographic protection of all application resources. Based on this principle, in our system data in local files and database tables are encrypted, messages and control parameters are encrypted, and even software modules are encrypted. The principle is that if all resources of an application are always encrypted, i.e. “enveloped in a cryptographic shield”, then:

- its software modules are not vulnerable to malware and viruses,
- its data are not vulnerable to illegal reading and theft,
- all messages exchanged in a networking environment are strongly protected, and
- all other resources of the application are also strongly protected.

Thus, we strongly protect applications and their resources before they are installed, after they are deployed, and all the time during their use. Furthermore, our methodology to create such systems and to apply total cryptographic protection is based on the design of security components in the form of generic security objects. First, each of those objects, whether a data object or a functional object, is itself encrypted. If an object is a data object, representing a file, database table, communication message, etc., its encryption means that its data are protected all the time. If an object is a functional object, like a cryptographic mechanism or an encapsulation module, this principle means that its code cannot be damaged by malware. Protected functional objects are decrypted only on the fly, before being loaded into main memory for execution. Each of our objects is complete in terms of its content (data objects) and its functionality (functional objects); each supports multiple functional alternatives; they all provide transparent handling of security credentials and management of security attributes; and they are easy to integrate with individual applications. In addition, each object is designed and implemented using well-established security standards and technologies, so the complete system, created as a combination of those objects, is itself compliant with security standards and therefore interoperable with existing security systems. By applying our methodology, we first designed the enabling components of our security system. They are collections of simple and composite objects that also mutually interact in order to provide various security services. The enabling components of our system are: Security Provider, Security Protocols, Generic Security Server, Security SDKs, and Secure Execution Environment. They are the engine components of our security system, and they provide the same set of cryptographic and network security services to all other security-enhanced applications. Furthermore, for our individual security objects and also for larger security systems, in order to prove their structural and functional correctness, we applied a deductive scheme for the verification and validation of security systems. We used the following principle: “if individual objects are verified and proven to be secure, if their instantiation, combination and operations are secure, and if the protocols between them are secure, then the complete system, created from such objects, is also verifiably secure”. Data and attributes of each object are protected and secure, and they can only be accessed by authenticated and authorized users in a secure way. This means that the structural security properties of objects can be verified upon their installation. In addition, each object is maintained and manipulated within our secure environment, so each object is protected and secure in all its states, even after its closing state, because the original objects are encrypted and their data and states stored in a database or in files are also protected. Formal validation of our approach and our methodology is performed using a threat model.
We analyzed our generic security objects individually and identified various potential threats to their data, attributes, actions, and states. We also evaluated the behavior of each object against potential threats and established that our approach provides better protection than some alternative solutions against the threats mentioned. In addition, we applied the threat model to our composite generic security objects and secure network applications, and we proved that the deductive approach provides a better methodology for designing and developing secure network applications. We also quantitatively evaluated the performance of our generic security objects and found that a system developed using our methodology performs cryptographic functions efficiently. We have also solved some additional important aspects required for the full scope of security services for network applications and cloud environments: manipulation and management of cryptographic keys, execution of encrypted software, and even secure and controlled collaboration of our encrypted applications in cloud computing environments. During our research we created a set of development tools and a development methodology which can be used to create cryptographically protected applications. The same resources and tools are also used as a run-time supporting environment for the execution of our secure applications. We call this total cryptographic protection system for the design, development and run-time of secure network applications the CryptoNET system. The CryptoNET security system is structured in the form of components categorized into three groups: Integrated Secure Workstation, Secure Application Servers, and Security Management Infrastructure Servers. Furthermore, our enabling components provide the same set of security services to all components of the CryptoNET system. The Integrated Secure Workstation is designed and implemented in the form of a collaborative secure environment for users. It protects local IT resources, messages and operations for multiple applications. It comprises the four most commonly used PC applications as client components: Secure Station Manager (equivalent to Windows Explorer), Secure E-Mail Client, Secure Web Browser, and Secure Documents Manager. For their security extensions, these four client components use functions and credentials of the enabling components in order to provide standard security services (authentication, confidentiality, integrity and access control) as well as additional, extended security services, such as transparent handling of certificates, use of smart cards, a Strong Authentication protocol, a Security Assertion Markup Language (SAML) based Single Sign-On protocol, secure sessions, and other security functions. Secure Application Servers are the components of our secure network applications: Secure E-Mail Server, Secure Web Server, Secure Library Server, and Secure Software Distribution Server. These servers provide application-specific services to client components. Some of the common security services provided by Secure Application Servers to client components are the Single Sign-On protocol, secure communication, and user authorization. In our system, application servers are installed in a domain, but they can also be installed in a cloud environment as services. Secure Application Servers are designed and implemented using the concept and implementation of the Generic Security Server, which provides extended security functions using our engine components.
By adopting this approach, the same set of security services is available to each application server. Security Management Infrastructure Servers provide domain-level and infrastructure-level services to the components of the CryptoNET architecture. They are standard security servers, known as cloud security infrastructure, deployed as services in our domain-level cloud environment. The CryptoNET system is complete in terms of the functions and security services that it provides. It is internally integrated, so that the same cryptographic engines are used by all applications. And finally, it is completely transparent to users: it applies its security services without expecting any special interventions by users. In this thesis, we developed and evaluated the secure network applications of our CryptoNET system and applied the threat model to their validation and analysis. We found that the deductive scheme of using our generic security objects is effective for the verification and testing of secure, protected and verifiably secure network applications. Based on all these theoretical research and practical development results, we believe that our CryptoNET system is completely and verifiably secure and therefore represents a significant contribution to the current state of the art of computer network security. / QC 20110427
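As a rough illustration of the "generic security object" idea described in this abstract (payload kept encrypted at rest, decrypted transiently and only for authorized callers), here is a minimal Python sketch using the cryptography package's Fernet recipe. The class and method names are illustrative assumptions, not taken from the CryptoNET implementation.

```python
# Minimal sketch of a "generic security object": the payload exists only in
# encrypted form and is decrypted on the fly for authorized access.
# Names are illustrative, not from CryptoNET.
from cryptography.fernet import Fernet


class GenericSecurityObject:
    """Holds its payload encrypted at rest; decrypts only on access."""

    def __init__(self, payload: bytes, key: bytes):
        self._fernet = Fernet(key)
        self._ciphertext = self._fernet.encrypt(payload)  # plaintext never stored

    def read(self, authorized: bool) -> bytes:
        # Stand-in for the authentication/authorization checks the thesis
        # attaches to every object.
        if not authorized:
            raise PermissionError("caller is not authorized for this object")
        return self._fernet.decrypt(self._ciphertext)  # decrypted "on the fly"


key = Fernet.generate_key()
obj = GenericSecurityObject(b"database row, message, or code module", key)
print(obj.read(authorized=True))
```

The same wrapper applies equally to data objects (files, rows, messages) and, conceptually, to functional objects whose encrypted code is decrypted just before loading for execution.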
72

Generic Wind Turbine Generator Model Comparison Based on Optimal Parameter Fitting

Dai, Zhen 18 March 2014 (has links)
Parameter fitting will facilitate model validation of the generic dynamic model for type-3 WTGs. In this thesis, a test system including a single 1.5 MW DFIG has been built and tested in the PSCAD/EMTDC environment for dynamic responses. The data generated during these tests have been used as measurements for the parameter fitting, which is carried out using the unscented Kalman filter. Two variations of the generic type-3 WTG model (the detailed model and the simplified model) have been compared and used for parameter estimation. The detailed model is able to capture the dynamics caused by the converter and thus has been used for parameter fitting when inputs are from a fault scenario. On the other hand, the simplified model works well for parameter fitting when a wind speed disturbance is of interest. Given measurements from PSCAD, the estimated parameters using both models are indeed improvements compared to the original belief of the parameters in terms of prediction error.
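As a hedged illustration of the estimation technique named above, the sketch below fits a single unknown parameter by appending it to the state of an unscented Kalman filter (the filterpy package is assumed available). A toy first-order system stands in for the type-3 WTG model, and synthetic noisy measurements stand in for the PSCAD/EMTDC data; all numbers are illustrative.

```python
# Parameter fitting by state augmentation with a UKF: the unknown parameter
# "a" is appended to the state and estimated from noisy measurements.
# The plant model and all constants are illustrative stand-ins.
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

dt = 0.01
a_true = 2.0  # "unknown" parameter to be recovered


def fx(state, dt):
    # Augmented state [x, a]: x follows dx/dt = -a*x, a is a random constant.
    x, a = state
    return np.array([x - a * x * dt, a])


def hx(state):
    return state[:1]  # only x is measured


points = MerweScaledSigmaPoints(n=2, alpha=1e-3, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=2, dim_z=1, dt=dt, hx=hx, fx=fx, points=points)
ukf.x = np.array([1.0, 0.5])        # initial belief: poor guess for a
ukf.P = np.diag([0.1, 1.0])
ukf.R = np.array([[1e-6]])          # measurement noise variance
ukf.Q = np.diag([1e-8, 1e-8])       # small process noise keeps a adaptable

rng = np.random.default_rng(0)
x = 1.0
for _ in range(300):
    x = x - a_true * x * dt         # simulate the "plant"
    z = x + rng.normal(0.0, 1e-3)   # noisy measurement of x
    ukf.predict()
    ukf.update(np.array([z]))

print(f"estimated parameter a: {ukf.x[1]:.3f} (true {a_true})")
```

The improvement over the "original belief" reported in the abstract corresponds here to the estimate moving from the initial guess (0.5) toward the true value as measurements are assimilated.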
74

Automatic Multi-word Term Extraction and its Application to Web-page Summarization

Huo, Weiwei 20 December 2012 (has links)
In this thesis we propose three new word association measures for multi-word term extraction. We combine these association measures with the LocalMaxs algorithm in our extraction model and compare the results of different multi-word term extraction methods. Our approach is language and domain independent and requires no training data. It can be applied to such tasks as text summarization, information retrieval, and document classification. We further explore the potential of using multi-word terms as an effective representation for general web-page summarization. We extract multi-word terms from human-written summaries in a large collection of web-pages, and generate the summaries by aligning document words with these multi-word terms. Our system applies machine translation technology to learn the alignment process from a training set and focuses on selecting high-quality multi-word terms from human-written summaries to generate suitable results for web-page summarization.
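The abstract does not spell out its three new association measures, so the sketch below uses the classic SCP "glue" measure to illustrate the general recipe it builds on: score every n-gram with an association measure, then keep those whose glue is a local maximum relative to their sub- and super-n-grams (a LocalMaxs-style rule). The corpus and frequency threshold are illustrative.

```python
# LocalMaxs-style multi-word term extraction with an SCP-style "glue".
# The thesis's three new measures are not reproduced; this shows the
# local-maximum selection rule on a toy corpus.
from collections import Counter

corpus = ("the new york times reported that new york city officials met "
          "with new york state regulators in new york").split()

MAX_N = 4
counts = Counter()
for n in range(1, MAX_N + 1):
    for i in range(len(corpus) - n + 1):
        counts[tuple(corpus[i:i + n])] += 1
total = len(corpus)


def prob(gram):
    return counts[gram] / total


def glue(gram):
    """SCP-style glue: p(gram)^2 over the average split-point product."""
    if len(gram) < 2:
        return 0.0
    splits = [prob(gram[:i]) * prob(gram[i:]) for i in range(1, len(gram))]
    denom = sum(splits) / len(splits)
    return prob(gram) ** 2 / denom if denom else 0.0


def is_local_max(gram):
    # Glue must not be exceeded by a contained (n-1)-gram and must strictly
    # exceed every observed (n+1)-gram that contains this n-gram.
    g = glue(gram)
    if len(gram) > 2 and any(glue(s) > g for s in (gram[:-1], gram[1:])):
        return False
    supers = [s for s in counts if len(s) == len(gram) + 1
              and (s[:-1] == gram or s[1:] == gram)]
    return all(g > glue(s) for s in supers)


terms = [g for g in counts if 2 <= len(g) <= MAX_N
         and counts[g] > 1 and is_local_max(g)]
for t in sorted(terms, key=glue, reverse=True):
    print(" ".join(t), round(glue(t), 4))
```

On this toy corpus the rule surfaces "new york" as a term: its glue peaks relative to every longer n-gram that contains it.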
75

Lifting the Abstraction Level of Compiler Transformations

Tang, Xiaolong 16 December 2013 (has links)
Production compilers implement optimizing transformation rules for built-in types. What justifies applying these optimizing rules are the axioms that hold for built-in types and the built-in operations supported by these types. Similar axioms also hold for user-defined types and the operations defined on them, and therefore justify a set of optimization rules that may apply to user-defined types. Production compilers, however, do not attempt to construct and apply these optimization rules to user-defined types. Built-in types, together with the axioms that apply to them, are instances of more general algebraic structures. So are user-defined types and their associated axioms. We use the technique of generic programming, a programming paradigm for designing efficient, reusable software libraries, to identify the commonality of classes of types, whether built-in or user-defined, convey the semantics of these classes of types to compilers, design scalable and effective program analyses for them, and eventually apply optimizing rules to the operations on them. In generic programming, algorithms and data structures are defined in terms of such algebraic structures. The same definitions are reused for many types, both built-in and user-defined. This dissertation applies generic programming to compiler analyses and transformations. Analyses and transformations are specified for general algebraic structures, and they apply to all types, both built-in and user-defined.
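As a rough illustration of the core idea, the sketch below encodes one algebraic fact (a monoid identity law) once and uses it to justify the same simplification for a built-in type and a user-defined one. A production compiler would apply such rules to an intermediate representation rather than to runtime values; the Monoid and simplify names are illustrative assumptions, not from the dissertation.

```python
# One algebraic structure (a monoid) licenses one rewrite rule,
# op(x, identity) -> x, for every type that declares an instance.
from dataclasses import dataclass
from typing import Callable, Generic, TypeVar

T = TypeVar("T")


@dataclass(frozen=True)
class Monoid(Generic[T]):
    op: Callable[[T, T], T]
    identity: T


def simplify(m: Monoid[T], x: T, y: T) -> T:
    # The rewrite is justified purely by the monoid axioms, so it applies
    # uniformly to built-in and user-defined types.
    if y == m.identity:
        return x          # operation eliminated: no call to m.op
    if x == m.identity:
        return y
    return m.op(x, y)


# Built-in type: integers under addition.
int_add = Monoid(op=lambda a, b: a + b, identity=0)


# User-defined type: 2D vectors under component-wise addition.
@dataclass(frozen=True)
class Vec2:
    x: float
    y: float


vec_add = Monoid(op=lambda a, b: Vec2(a.x + b.x, a.y + b.y),
                 identity=Vec2(0.0, 0.0))

print(simplify(int_add, 7, 0))                    # 7, addition eliminated
print(simplify(vec_add, Vec2(1, 2), Vec2(0, 0)))  # Vec2(1, 2), same rule
```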
76

The development and merchandising of generic food products : implications of pricing and quality

Bitton, Joseph January 1985 (has links)
No description available.
77

Controlling South Africa's private health care expenditures : the perceptions and experiences of private health care providers about generic medicines in the Mafikeng district, North West Province, South Africa / Patience Elizabeth Kerotse Seodi

Seodi, Patience Elizabeth Kerotse January 2004 (has links)
This was a study which sought to investigate the perceptions and experiences of private health care providers in Mafikeng, North West Province regarding generic medicines. The escalating cost of medicine in South Africa and elsewhere in the world has necessitated government intervention to come up with strategies to make health care accessible and affordable to the majority of the people. In South Africa, the Medicines and Related Substances Control Amendment Act (Act 101 of 1965) was implemented in May 2003. The Act makes it compulsory for pharmacists to offer patients generic medicines, apart from exceptions listed by the Medicines Control Council, and, if substitution takes place, to inform the doctor. The study was a prospective, cross-sectional survey of private health care providers in the greater Mafikeng area using a self-administered structured questionnaire. Participants received the structured questionnaire by hand and were all given the same amount of time to complete it. The questionnaires were then collected from their respective rooms. The main outcome measures were age, level of education, current occupation/profession, and perceptions and experiences regarding generic medicines. The total number of respondents was thirty-two (32) out of the forty (40) private health care providers who received copies of the questionnaire. One questionnaire was incompletely answered and was therefore excluded from the final analysis. Seven questionnaires were returned unanswered. Ages ranged from 26 to 51, and all respondents had one or two university degrees. On average, private health care providers in Mafikeng perceived generic medicines and patent medicines to be identical and bioequivalent. The majority of the respondents prescribed generic medicines as their first line of treatment and were aware of the mandatory generic substitution law. According to the respondents, the majority of patients were not well informed about generic medicines. The majority of respondents were satisfied with the safety, quality, performance characteristics, intended use and route of administration of generic medicines. There is a need for a common essential drug list to be used by all medical aid schemes in South Africa, wider generic prescribing in both the public and private health sectors, speeding up the process of manufacturing generics, full compliance by health care providers with the mandatory generic substitution law, parallel importation of generic medicines when the need arises, and widespread promotional campaigns targeting mainly consumers and health professionals. / Mini dissertation (M.B.A. (Financial Man.)) North-West University, Mafikeng Campus, 2004
78

Identifying internet marketing principles relevant to generic marketers / Ayesha Lian Bevan-Dye

Bevan-Dye, Ayesha Lian January 2005 (has links)
To deliver the type of marketing graduate that meets industry demand necessitates that marketing curricula content be continuously updated to keep pace with the dynamic marketing environment. One of the major trends influencing the twenty-first century marketing environment is the advent of the Internet and the substantial growth in Internet usage and Internet-based commerce. Not only is the Internet driving major marketing environmental change, it is also emerging as a new marketing tool of significant potential. The widespread implications of the Internet for marketing are making it increasingly necessary for general marketing practitioners, even those not actively engaged in Internet-based commerce, to be equipped with an understanding of Internet marketing principles. For marketing education to remain relevant in the twenty-first century, it is essential that Internet marketing content elements be included in undergraduate generic marketing curricula. The first step in this process, and the one addressed by this study, is to identify and reach consensus on which Internet marketing content elements are relevant to generic undergraduate marketing students. The primary purpose of this study was to develop an empirically derived inventory of Internet marketing content elements relevant for inclusion in generic undergraduate marketing programmes, based upon both marketing academic and marketing practitioner perspectives. Five focal questions were asked and answered by the study. Which Internet-driven marketing environmental changes do marketing academics consider relevant to generic undergraduate marketing students? Which principles guiding the use of the Internet as a marketing tool do marketing academics consider relevant to generic undergraduate marketing students? What do marketing academics consider to be the most suitable approach to implementing Internet marketing principles within higher education undergraduate business programmes? What do marketing academics consider to be the relevant Internet marketing learning outcomes for generic marketing students at undergraduate level? Do marketing practitioners hold the same opinions as marketing academics regarding research questions one, two, three and four? For the purpose of this study, research was undertaken amongst two groups of respondents. Firstly, a census of the marketing faculties/departments of each of South Africa's public higher education institutions was taken at the end of 2004. Secondly, a non-probability, judgment sample was taken, at the start of 2005, of marketing practitioners employed in those companies listed on the Johannesburg Stock Exchange (JSE) that engage in marketing activities and are operational in the South African market. The questionnaire requested respondents in both samples to indicate the relevance of five identified Internet-driven marketing environmental changes and twenty-four identified principles guiding the use of the Internet as a marketing tool to generic undergraduate marketing students. Further, both samples were requested to select the approach they judged to be the most suitable for implementing Internet marketing principles within undergraduate business programmes. Respondents in both samples were also requested to indicate which Internet marketing learning outcomes they believed to be relevant to generic undergraduate marketing students. In addition, both samples were asked to provide certain demographical data.
The findings indicate that both the Internet-driven marketing environmental changes construct and the principles guiding the use of the Internet as a marketing tool construct are relevant to generic undergraduate marketing students. The findings further suggest that Internet marketing content elements should be integrated into existing marketing subject offerings. Regarding the learning outcomes, the findings indicate descriptive Internet marketing principles to be the overriding learning outcome. / Thesis (Ph.D. (Business Management))--North-West University, Vaal Triangle Campus, 2005.
79

The trade-related aspects of intellectual property rights (TRIPS) agreement and access to patented medicines in developing countries - Canada's Bill C-9

Weitsman, Faina 05 October 2006 (has links)
TRIPS strengthened international patent protection, particularly in relation to pharmaceutical patents. A compulsory licensing mechanism is one of the exceptions to patent protection available under TRIPS. This mechanism applies mainly to supplying the domestic market. Underdeveloped countries with insufficient pharmaceutical manufacturing capacity are unable to use this exception to import medicines in public health emergencies. To resolve this problem, the WTO General Council's decision allows the export of generic versions of patented drugs under certain conditions. Canada's Bill C-9 was the first statute to implement the decision. Bill C-9 contains both humanitarian and TRIPS-like provisions. The role of the government is unjustifiably limited to participation in administrative and legislative processes, while the main operators in the scheme are the generic manufacturer and, partly, the patent holder. This thesis proposes several different models to transform the Bill into a workable system for the export of drugs to underdeveloped countries afflicted with pandemics.
80

Reconstruction of Complete Head Models with Consistent Parameterization

Aghayan, Niloofar 16 April 2014 (has links)
This thesis introduces an efficient and robust approach for 3D reconstruction of complete head models with consistent parameterization and personalized shapes from several possible inputs. The system input consists of Cyberware laser-scanned data, for which we perform the scanning task ourselves, as well as publicly available face data where (i) facial expression may or may not exist and (ii) only partial information of the head may exist, for instance only the front face part without the back part of the head. Our method starts with a surface reconstruction approach to either convert point clouds to a mesh structure or fill missing points on a triangular mesh. It is followed by a registration process which unifies the representation of all meshes. Afterward, a photo-cloning method is used to extract an adequate set of features in a semi-automatic way on snapshots taken from the front and left views of the provided range data. We modify Radial Basis Functions (RBFs) deformation so that it is based not only on distance, but also on regional information. Using feature point sets and the modified RBFs deformation, a generic mesh can be manipulated in a way that closed eyes and mouth movements, like separating the upper lip and lower lip, can be properly handled. In other words, such a mesh modification method makes the construction of various facial expressions possible. Moreover, new functions are added whereby a generic model can be manipulated based on feature point sets to recover missing parts such as the ears, the back of the head, and the neck in the input face. After feature-based deformation using modified radial basis functions, a fine mesh modification method based on model points follows to extract the fine details from the available range data. Then, some post-refinement procedures employing RBFs deformation and averaging of neighboring points are carried out to make the surface of the reconstructed 3D head smoother and more uniform. Due to the possible existence of flaws and defects on the mesh surface, such as flipped triangles, self-intersections or degenerate faces, an automatic repairing approach is leveraged to clean up the entire surface of the mesh. The experiments performed on various models show that our method is robust and efficient in terms of accurate full head reconstruction from input data and execution time, respectively. Our method also aims to use as little user interaction as possible.
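As a hedged illustration of the plain, distance-only RBF deformation that the thesis modifies, the sketch below computes kernel weights from feature-point correspondences and applies the interpolated displacement to every mesh vertex (NumPy only). The regional weighting described above is not reproduced, and the kernel choice and all points are illustrative assumptions.

```python
# Feature-driven RBF mesh deformation: solve for per-feature weights so the
# source feature points move exactly to their targets, then interpolate a
# smooth displacement for every vertex. Distance-only kernel; the thesis's
# regional-information variant is not shown.
import numpy as np


def rbf_deform(vertices, src_feats, dst_feats, eps=0.05):
    """Deform vertices so src feature points land on dst positions."""
    phi = lambda r: np.sqrt(r * r + eps)        # multiquadric kernel
    d = src_feats[:, None, :] - src_feats[None, :, :]
    A = phi(np.linalg.norm(d, axis=2))          # k x k kernel matrix
    disp = dst_feats - src_feats                # desired feature displacements
    w = np.linalg.solve(A, disp)                # k x 3 weights (one row/feature)
    dv = vertices[:, None, :] - src_feats[None, :, :]
    B = phi(np.linalg.norm(dv, axis=2))         # n x k evaluation matrix
    return vertices + B @ w                     # smooth interpolated motion


verts = np.random.default_rng(1).uniform(-1, 1, size=(500, 3))
src = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0], [0.0, 0.5, 0.0]])
dst = src + np.array([[0.0, 0.0, 0.2], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]])
deformed = rbf_deform(verts, src, dst)
print(deformed.shape)  # (500, 3): every vertex receives a displacement
```

Because the weights solve the feature-point system exactly, the deformation interpolates the correspondences while moving nearby vertices smoothly, which is the behavior the feature-based manipulation above relies on.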
