341

Data-Intensive Biocomputing in the Cloud

Meeramohideen Mohamed, Nabeel 25 September 2013
Next-generation sequencing (NGS) technologies have made it possible to rapidly sequence the human genome, heralding a new era of health-care innovations based on personalized genetic information. However, these NGS technologies generate data at a rate that far outstrips Moore's Law. As a consequence, analyzing this exponentially increasing data deluge requires enormous computational and storage resources that many life-science institutions do not have access to. As such, cloud computing has emerged as an obvious, but still nascent, solution. This thesis investigates and designs an efficient framework for running and managing large-scale data-intensive scientific applications in the cloud. Based on the lessons from our parallel implementation of a genome-analysis pipeline in the cloud, we provide a framework for users to run such data-intensive scientific workflows using a hybrid setup of client and cloud resources. We first present SeqInCloud, our highly scalable parallel implementation of a popular variant-calling pipeline, the Genome Analysis Toolkit (GATK), on the Windows Azure HDInsight cloud platform. Together with a parallel implementation of GATK on Hadoop, we evaluate the potential of cloud computing for large-scale DNA analysis and present a detailed study on efficiently utilizing cloud resources for running data-intensive life-science applications. Based on our experience running SeqInCloud on Azure, we present CloudFlow, a feature-rich workflow manager for running MapReduce-based bioinformatics pipelines that utilizes both client and cloud resources. CloudFlow, built on top of an existing MapReduce-based workflow manager called Cloudgene, provides features not offered by existing MapReduce-based workflow managers, such as simultaneous use of client and cloud resources, automatic data-dependency handling between client and cloud resources, and the flexibility to implement user-defined plugins for data transformations. In general, we believe this work helps increase the adoption of cloud resources for running data-intensive scientific workloads. / Master of Science
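The abstract names the real building blocks (GATK, Hadoop, Cloudgene) but shows none of their APIs. As a minimal, self-contained sketch of the MapReduce pattern such pipelines are built on, the following toy map and reduce steps count aligned reads per chromosome from SAM-format lines; the two records are fabricated for illustration.

```python
# Minimal sketch of the MapReduce pattern underlying pipelines like
# SeqInCloud/CloudFlow (illustrative only; the real pipeline runs GATK
# stages on Hadoop/HDInsight, not this toy logic).
from collections import defaultdict

def map_read(sam_line):
    """Map step: emit (chromosome, 1) for each aligned read."""
    fields = sam_line.split("\t")
    chrom = fields[2]              # RNAME column in SAM format
    if chrom != "*":               # '*' marks unmapped reads
        yield chrom, 1

def reduce_counts(pairs):
    """Reduce step: sum counts per chromosome."""
    totals = defaultdict(int)
    for chrom, n in pairs:
        totals[chrom] += n
    return dict(totals)

# Toy usage with two fake SAM records (real input would come from HDFS):
sam = ["r1\t0\tchr1\t100\t60\t50M\t*\t0\t0\tACGT\tIIII",
       "r2\t0\tchr2\t200\t60\t50M\t*\t0\t0\tACGT\tIIII"]
pairs = (kv for line in sam for kv in map_read(line))
print(reduce_counts(pairs))        # {'chr1': 1, 'chr2': 1}
```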
342

Optimizing, Testing, and Securing Mobile Cloud Computing Systems For Data Aggregation and Processing

Turner, Hamilton Allen 22 January 2015
Seamless interconnection of smart mobile devices and cloud services is a key goal in modern mobile computing. Mobile Cloud Computing is the holistic integration of contextually rich mobile devices with computationally powerful cloud services to create high-value products for end users, such as Apple's Siri and Google's Google Now. This coupling has enabled new paradigms and fields of research, such as crowdsourced data collection, and has helped spur substantial changes in research fields such as vehicular ad hoc networking. However, the growth of Mobile Cloud Computing has introduced new challenges, such as testing large-scale Mobile Cloud Computing systems, and has increased the importance of established challenges, such as ensuring that a user's privacy is not compromised when interacting with a location-aware service. Moreover, the concurrent development of the Infrastructure-as-a-Service paradigm has created inefficiency in how Mobile Cloud Computing systems are executed on cloud platforms. To address these gaps in the existing research, this dissertation presents a number of software and algorithmic solutions to 1) preserve user locational privacy, 2) improve the speed and effectiveness of deploying and executing Mobile Cloud Computing systems on modern cloud infrastructure, and 3) enable large-scale research on Mobile Cloud Computing systems without requiring substantial domain expertise. / Ph. D.
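The dissertation does not detail its locational-privacy mechanism in this abstract, so the sketch below shows a generic baseline technique, spatial cloaking by grid snapping, rather than the author's own solution; the 0.01-degree cell size (roughly 1 km) is an assumption.

```python
# Illustrative spatial cloaking: snap a GPS fix to the centre of a
# coarse grid cell so a location-aware service sees only the cell,
# never the exact position. Not the dissertation's actual method.
import math

def cloak(lat: float, lon: float, cell_deg: float = 0.01):
    snap = lambda x: (math.floor(x / cell_deg) + 0.5) * cell_deg
    return snap(lat), snap(lon)

print(cloak(37.42310, -122.08420))  # -> approximately (37.425, -122.085)
```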
343

Distributed Architectures for Enhancing Artificial Intelligence of Things Systems. A Cloud Collaborative Model

Elouali, Aya 23 November 2023
In today's world, IoT systems are increasingly pervasive: all electronic devices are becoming connected, from lamps and refrigerators in smart homes, smoke detectors and cameras in monitoring systems, and scales and thermometers in healthcare systems, to phones, cars and watches in smart cities. All these connected devices generate a huge amount of data collected from the environment. To take advantage of these data, a processing phase is needed to extract useful information and allow the best possible management of the system. Since most objects in IoT systems are resource-limited, the processing step, usually performed by an artificial intelligence model, is offloaded to a more powerful machine such as a cloud server in order to benefit from its high storage and processing capacities. However, the cloud server is geographically remote from the connected device, which leads to long communication delays and harms the effectiveness of the system. Moreover, due to the rapidly increasing number of IoT devices, and therefore of offloading operations, the load on the network has grown significantly. To retain the advantages of cloud-based AIoT systems, we seek to minimize their shortcomings. In this thesis, we design a distributed architecture that combines the three domains (IoT, artificial intelligence and the cloud) while reducing latency and bandwidth consumption as well as the IoT device's energy and resource consumption. Experiments conducted on different cloud-based AIoT systems showed that the designed architecture can reduce the transmitted data by up to 80%.
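The abstract quantifies the outcome (up to 80% less transmitted data) but not the mechanism. One simple edge-side technique in this spirit is dead-band filtering, sketched below; the threshold and sensor stream are illustrative assumptions, not the thesis's architecture.

```python
# Sketch of edge-side dead-band filtering, one simple way a device can
# cut the data it offloads to the cloud. The 5% relative threshold and
# the readings are fabricated for illustration.
def deadband_filter(readings, threshold=0.05):
    """Yield only readings that differ from the last transmitted value
    by more than `threshold` (relative); everything else stays local."""
    last = None
    for r in readings:
        if last is None or abs(r - last) > threshold * abs(last):
            last = r
            yield r                 # transmit to the cloud

stream = [20.0, 20.1, 20.2, 22.5, 22.6, 25.0]
sent = list(deadband_filter(stream))
print(sent, f"-> {1 - len(sent)/len(stream):.0%} reduction")
# [20.0, 22.5, 25.0] -> 50% reduction
```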
344

Service-Oriented Architecture based Cloud Computing Framework For Renewable Energy Forecasting

Sehgal, Rakesh 10 March 2014
Forecasting has applications in various domains, as it provides decision-makers with a more predictable and reliable estimate of events that are yet to occur. Typically, a user would invest in licensed software or subscribe to a monthly or yearly plan in order to make such forecasts. The framework presented here differs from conventional forecasting software in that it allows any interested party to use the proposed services on a pay-per-use basis, avoiding heavy investment in the required infrastructure. The Framework-as-a-Service (FaaS) presented here uses Windows Communication Foundation (WCF) to implement a Service-Oriented Architecture (SOA). Traditionally, the responsibilities of collecting data, analysing it and generating forecasts lie with users, who have to put together other tools or software in order to produce a forecast. FaaS offers each of these responsibilities as a service: the External Data Collection Framework (EDCF), the Internal Data Retrieval Framework (IDRF) and the Forecast Generation Framework (FGF). The FaaS Controller, a composite service based on these three, is responsible for coordinating activities between them. The services are accessible through an Economic Endpoint (EE) or a Technical Endpoint (TE), which a remote client can use to obtain the cost of a forecast or to perform one, respectively. The use of cloud computing makes these services available over the network as software for forecasting energy from solar or wind resources. They can also serve as a platform for creating new services by merging existing functionality with new service features, eventually leading to faster development of new services where a user chooses which services to use and pay for, and presenting FaaS as a Platform-as-a-Service (PaaS) for forecasting. / Master of Science
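The actual framework is implemented with WCF in .NET; the language-neutral sketch below only illustrates the described composition, a controller coordinating EDCF, IDRF and FGF behind technical and economic endpoints. All class and method names, the toy forecast model and the pricing are assumptions for illustration.

```python
# Sketch of the FaaS composition described above: a controller
# coordinating data collection (EDCF), retrieval (IDRF) and forecast
# generation (FGF). The real services are WCF endpoints, not these.
class EDCF:
    def collect(self, site):            # pull external weather data
        return {"site": site, "wind_mps": [5.2, 6.1, 5.8]}

class IDRF:
    def retrieve(self, site):           # fetch stored historical data
        return {"site": site, "history": [4.9, 5.5, 6.0]}

class FGF:
    def forecast(self, external, internal):
        series = external["wind_mps"] + internal["history"]
        return sum(series) / len(series)    # toy model: mean wind speed

class FaaSController:
    """Composite service: the Technical Endpoint runs a forecast,
    the Economic Endpoint quotes its pay-per-use cost."""
    def __init__(self):
        self.edcf, self.idrf, self.fgf = EDCF(), IDRF(), FGF()

    def technical_endpoint(self, site):
        return self.fgf.forecast(self.edcf.collect(site),
                                 self.idrf.retrieve(site))

    def economic_endpoint(self, site, rate_per_call=0.05):
        return rate_per_call * 3        # one call to each sub-service

ctl = FaaSController()
print(ctl.technical_endpoint("farm-A"), ctl.economic_endpoint("farm-A"))
```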
345

Smart monitoring and controlling of government policies using social media and cloud computing

Singh, P., Dwivedi, Y.K., Kahlon, K.S., Sawhney, R.S., Alalwan, A.A., Rana, Nripendra P. 25 October 2019
Governments throughout the world are nowadays increasingly dependent on public opinion when framing and implementing policies for the welfare of the general public, and the role of social media is vital to this emerging trend. Traditionally, a lack of public participation in policy-making decisions was a major cause of concern, particularly when formulating and evaluating such policies. However, the exponential rise in the use of social media platforms by the general public has given governments wider insight with which to overcome this long-standing dilemma. Cloud-based e-governance is currently being realized thanks to the availability of IT infrastructure, along with a change of mindset among government advisors towards realizing policies in the best possible manner. This paper presents a pragmatic approach that combines the capabilities of cloud computing and social media analytics for the efficient monitoring and controlling of government policies through public involvement. The proposed system produced encouraging results when tested on the Goods and Services Tax (GST) implementation by the Indian government, establishing that it can be successfully applied to efficient policy making and implementation.
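The paper does not specify its analytics pipeline in this abstract. As a hedged illustration of the social-media-analytics half of the approach, the sketch below tallies lexicon-based sentiment over policy-related posts; the lexicon and sample posts are fabricated.

```python
# Minimal lexicon-based sentiment tally over policy-related posts
# (illustrative only; the paper's cloud-hosted pipeline is not shown).
POSITIVE = {"good", "great", "support", "benefit", "simple"}
NEGATIVE = {"bad", "confusing", "burden", "oppose", "costly"}

def sentiment(post: str) -> int:
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = ["GST is a great simplification, I support it",
         "GST filing is confusing and a burden for small shops"]
scores = [sentiment(p) for p in posts]
print(scores, "net:", sum(scores))   # [2, -2] net: 0
```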
346

Cloud-based augmented reality as a disruptive technology for Higher Education

Mohamad, A.M., Kamaruddin, S., Hamin, Z., Wan Rosli, Wan R., Omar, M.F., Mohd Saufi, N.N. 25 September 2023
Augmented reality (AR) in higher education is an approach to engaging students in experiential learning by utilising AR technology. This paper discusses the process undertaken by a higher-education teacher in designing and implementing a cloud-based AR lesson for students. The methodology employed was a case study at one institution of higher learning in Malaysia. The AR teaching process involves six stages: selection of the course, selection of the topic, design of the AR teaching plan, implementation of the AR lesson, reflection by the teacher and students on their experiences, and, finally, improvement of the AR teaching plan by the teacher. The study found that cloud-based AR has indeed disrupted higher education by providing richer learning experiences for students as well as enhanced teaching practices for teachers. It is hoped that this paper provides insight into AR teaching and learning practices for teachers in general, and within higher education in particular, and that the six-step process outlined here becomes a reference that teachers interested in designing and implementing AR lessons for their own courses can replicate.
347

An analysis of authentication models in cloud computing and on-premise Windows environments.

Viktorsson, Samuel January 2024
The increased usage of cloud computing has transformed modern information technology by providing organisations with a scalable, flexible, and cost-effective alternative to the traditional on-premise service model. Both service models have their own sets of advantages and disadvantages, but one key aspect they have in common is the importance of keeping private data secure, and there is an ongoing debate on whether cloud computing is safe enough to store such data. This thesis helps organisations understand the security considerations of the two service models through a case study researching their respective authentication models, complemented by an experiment to gain further insights. The case study and experiment conclude with a heuristic that organisations can use when picking an authentication model. The main conclusion of this thesis is that we consider the cloud computing service model less secure than the on-premise Windows service model. We also conclude that an LDAP on-premise Windows authentication model and the Azure authentication model have a higher chance of being less secure than the other authentication models researched in this thesis.
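As an illustration of the LDAP authentication model on the on-premise side, the sketch below performs a credential check (bind) against a Windows domain controller using the third-party ldap3 library. The host, domain and account are placeholders, and the thesis's own experimental setup is not described in the abstract.

```python
# Illustrative LDAP credential check against an on-premise Windows
# domain controller via the third-party `ldap3` library. Note the
# `ldaps://` scheme: a plain-LDAP simple bind sends the password
# unencrypted, one reason LDAP setups can rank poorly on security.
from ldap3 import Server, Connection, ALL

def authenticate(username: str, password: str) -> bool:
    server = Server("ldaps://dc01.example.local", get_info=ALL)
    conn = Connection(server,
                      user=f"EXAMPLE\\{username}",   # NetBIOS-style logon
                      password=password)
    ok = conn.bind()           # True if the DC accepted the credentials
    conn.unbind()
    return ok

print(authenticate("alice", "correct horse battery staple"))
```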
348

Cloud computing based adaptive traffic control and management

Jaworski, P. January 2013
Recent years have seen growing concern over increasing traffic volumes worldwide. Insufficient road capacity and the resulting congestion have become major problems in many urban areas. Congestion negatively impacts the economy, the environment, the health of the population and drivers' satisfaction. Current solutions to this topical and timely problem rely on Intelligent Transportation Systems (ITS) technologies. ITS urban traffic management involves collecting and processing a large amount of geographically distributed information to control distributed infrastructure and individual vehicles. The distributed nature of the problem prompted the development of a novel, scalable ITS-Cloud platform. The ITS-Cloud organises the processing and manages distributed data sources to provide traffic-management methods with more accurate information about the state of the traffic. A new approach to service allocation, derived from existing cloud and grid computing approaches, was created to address the unique needs of ITS traffic management. The ITS-Cloud hosts the collection of software services that form the Cloud-based Traffic Management System (CTMS). CTMS combines intersection-control algorithms with intersection-approach advice to vehicles and dynamic routing. It contains a novel Two-Step traffic-management method that relies on the ITS-Cloud to deliver a detailed traffic-simulation image and integrates an adaptive intersection-control algorithm with a microscopic prediction mechanism; it is the first method able to perform simultaneous adaptive intersection control and intersection-approach optimization. The Two-Step method builds on a novel pressure-based adaptive intersection-control algorithm as well as two new traffic-prediction schemes. The developed traffic-management system was evaluated using a new microscopic traffic-simulation tool tightly integrated with the ITS-Cloud. The novel traffic-management approaches were shown to outperform benchmark methods for a realistic range of traffic conditions and road-network configurations. Unique to the work was the investigation of interactions between ITS components.
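The abstract names a pressure-based adaptive intersection-control algorithm without giving its formulation. In the common max-pressure form, each signal phase's pressure is the number of queued vehicles on its inbound lanes minus those on the lanes they feed, and the highest-pressure phase receives the green; the sketch below (with fabricated queue counts) shows that selection rule, not the thesis's exact algorithm.

```python
# Max-pressure-style phase selection for one intersection
# (illustrative; queue counts and phase layout are fabricated).
def phase_pressure(phase, queues):
    """phase: list of (inbound_lane, outbound_lane) movements."""
    return sum(queues[i] - queues[o] for i, o in phase)

def pick_phase(phases, queues):
    return max(phases, key=lambda p: phase_pressure(p, queues))

queues = {"N_in": 12, "S_in": 9, "E_in": 4, "W_in": 3,
          "N_out": 2, "S_out": 1, "E_out": 5, "W_out": 6}
phases = {"NS": [("N_in", "S_out"), ("S_in", "N_out")],
          "EW": [("E_in", "W_out"), ("W_in", "E_out")]}
best = pick_phase(list(phases.values()), queues)
print("serve", [k for k, v in phases.items() if v == best][0])  # serve NS
```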
349

The right to privacy: how the proposed POPI Bill will impact data security in a cloud computing environment

Basson, Benhardus 2014
Thesis (MComm)--Stellenbosch University, 2014. / The growing popularity and continuing development of cloud computing services are slowly being integrated into our daily lives through our interactions with electronic devices. Cloud computing has been heralded as the solution for enterprises to reduce information-technology infrastructure costs by buying cloud services as a utility. While this premise is generally correct, in certain industries, for example banking, the sensitive nature of the information submitted to the cloud for storage or processing places information-security responsibilities on the party using the cloud services as well as on the party providing them. Problems associated with cloud computing include loss of control, lack of trust between the contracting parties in the cloud relationship (customer and cloud service provider), and securely segregating data in the virtual environment. The risks and responsibilities associated with data loss were previously mainly reputational in nature, but with the passing by the South African Parliament of the Protection of Personal Information (POPI) Bill in August 2013, these responsibilities to protect information are in the process of being legislated in South Africa. The impact of the new legislation on the cloud computing environment needs to be investigated, as the requirements imposed by the Bill might render the use of cloud computing for sensitive data nonviable, replacing some of the IT infrastructure cost benefits that cloud computing allows with increased data-security costs. To investigate the impact of the new POPI legislation on cloud computing, the components and characteristics of the cloud are studied and differentiated from other forms of computing: the characteristics of cloud computing are the unique identifiers that differentiate it from grid and cluster computing, while the component study focuses on the service and deployment models associated with cloud computing. The understanding obtained is used to compile a new definition of cloud computing. Using this definition of the components and processes that constitute cloud computing, the different types of data-security processes and technical security measures that can be implemented are studied, including information management and governance policies as well as technical security measures such as encryption and virtualisation security. The last part of the study focuses on the Bill, its legislated requirements, and how these can be complied with using the security processes identified in the rest of the study. The new legislation still has to be signed by the State President, after which businesses will have one year to comply; given this short grace period, businesses need to align their practices with the proposed requirements now. The impact is wide-ranging, from implementing technical information-security processes to the possible re-drafting of service-level agreements with business partners that share sensitive information. The study highlights the major areas where the Bill will impact businesses and identifies possible solutions that cloud computing users could implement when storing or processing data in the cloud.
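Among the "technical security measures such as encryption" that the study surveys, one concrete control is encrypting personal information client-side so the cloud provider only ever stores ciphertext. The sketch below uses the third-party cryptography package's Fernet primitive; the sample record is fabricated, and a real deployment would need proper key management.

```python
# Client-side encryption before cloud upload (illustrative). The key
# stays on-premise; the cloud sees only ciphertext. Key management,
# the hard part in practice, is reduced to one in-memory key here.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # keep on-premise, never in the cloud
fernet = Fernet(key)

record = b'{"id_number": "8001015009087", "name": "J. Soap"}'
ciphertext = fernet.encrypt(record)       # safe to upload
# ... later, after downloading from the cloud ...
assert fernet.decrypt(ciphertext) == record
```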
350

Risk-aware Business Process Modelling and Trusted Deployment in the Cloud

Goettelmann, Elio 21 October 2015
Nowadays, service ecosystems rely on dynamic software service chains that span multiple organisations and providers, providing agile support for business applications, governments and end users. This trend is reinforced by the cloud-based economy, which allows the sharing of costs and resources. However, the lack of trust in such cloud environments, which involve higher security requirements, is often seen as a brake on the development of such services. The objective of this thesis is to study the concepts of service orchestration and trust in the context of the cloud. It proposes an approach supporting a trust model that allows the orchestration of trusted business-process components on the cloud. The contribution is threefold and consists of a method, a model and a framework. The method categorizes techniques for transforming an existing business process into a risk-aware process model that takes into account the security risks related to cloud environments. The model formalizes the relations and responsibilities between the different actors of the cloud, which makes it possible to identify the information required to assess and quantify security risks in cloud environments. The framework is a comprehensive approach that decomposes a business process into fragments that can be automatically deployed on multiple clouds; it also integrates a selection algorithm that combines security information with other quality-of-service criteria to generate an optimized configuration.
Finally, the work is implemented in order to validate the approach: the framework is implemented in a tool, the security assessment model is also applied to an access-control model, and the last part presents the results of applying our work to a real-world use case.
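The framework's selection algorithm "combines security information with other quality of service criteria"; its actual risk model and optimization are defined in the thesis. The sketch below shows only the general shape of such a selection, a weighted score over candidate clouds, with fabricated offers and arbitrary weights.

```python
# Weighted multi-criteria selection of a cloud per process fragment
# (illustrative; the thesis defines its own risk model and optimizer).
OFFERS = [
    {"cloud": "A", "risk": 0.2, "cost": 8.0, "latency_ms": 40},
    {"cloud": "B", "risk": 0.6, "cost": 3.0, "latency_ms": 25},
    {"cloud": "C", "risk": 0.4, "cost": 5.0, "latency_ms": 60},
]

def score(offer, w_risk=0.5, w_cost=0.3, w_lat=0.2):
    # Normalize each criterion to [0, 1] (lower is better) and combine.
    return (w_risk * offer["risk"]
            + w_cost * offer["cost"] / 10.0
            + w_lat * offer["latency_ms"] / 100.0)

best = min(OFFERS, key=score)
print(best["cloud"], round(score(best), 3))   # A 0.42
```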
