101

Construction des normes, entre stratégie et communication: Un cas de négociation collective

Parent, Marc-Antoine January 2006 (has links)
We describe a negotiation process in order to identify the interplay between argumentation, based on principles, and bargaining, based on a balance of power. According to Bent Flyvbjerg, power permits systematic breaches of the communicative ethics that Jürgen Habermas advocates as a philosophical necessity. According to Joseph Heath, the exercise of strategic power is de facto constrained by social norms, which in practice require communicative ethics in order for sanctions to be applied. We conducted participant observation during the renegotiation of a collective agreement, and we analyze the use of arguments in our ethnographic data. Our analysis does show flaws in the argumentation, but ones that cannot be explained by power or by social norms alone. We propose superimposing the strategic, normative, dramaturgical and communicative levels of analysis. We also identify a collusion that maintains certain points of dissensus, interpreted, following Habermas, as resistance to the collective agreement as a steering medium.
102

Window-based stream data mining for classification of Internet traffic

Mumtaz, Ali January 2008 (has links)
Accurate classification of Internet applications is a fundamental requirement for network provisioning, network security, maintaining quality of service and network management. New applications are continually being introduced on the Internet. The traffic volume and patterns of some of them, such as Peer-to-Peer (P2P) file sharing, put pressure on service providers' networks in terms of congestion and delay, to the point that maintaining the Quality of Service (QoS) planned in the access network requires provisioning additional bandwidth sooner than planned. Peer-to-Peer applications enable users to communicate directly over the Internet, bypassing the central server control implemented by service providers; they thus pose threats in terms of network congestion and create an environment for malicious attacks on networks. One key challenge in this area is to adapt to the dynamic nature of Internet traffic. With the growth of Internet traffic, in terms of the number and type of applications, traditional classification techniques such as port matching, protocol decoding or packet payload analysis are no longer effective. For instance, P2P applications may use randomly selected non-standard ports to communicate, which makes them difficult to distinguish from other types of traffic by inspecting port numbers alone. The present research introduces two new techniques to classify stream (online) data using K-means clustering and a Fast Decision Tree (FDT). In the first technique, we first generate micro-clusters using K-means clustering with different values of k. The micro-clusters are then merged into two clusters based on the weighted averages of their P2P and NonP2P populations, yielding two merged clusters, each representing P2P or NonP2P traffic. The simulation results confirm that the two final clusters represent P2P and NonP2P traffic with good accuracy. The second technique employs a two-stage architecture for classification of P2P traffic: in the first stage, the traffic is filtered using standard port numbers and layer-4 port matching to label well-known P2P and NonP2P traffic, leaving the rest as "Unknown". The labeled traffic generated in the first stage is used to train a Fast Decision Tree (FDT) classifier with high accuracy. The Unknown traffic is then fed to the FDT model, which classifies it into P2P and NonP2P with high accuracy. The two-stage architecture therefore not only classifies well-known P2P applications, it also classifies applications that use random or private (non-standard) port numbers and cannot be classified otherwise. We performed various experiments in which we captured Internet traffic at a main gateway router, pre-processed the data and selected the three most significant attributes, namely Packet Length, Source IP address and Destination IP address. We then applied the proposed technique to three different windows of records, and calculated the Accuracy, Specificity and Sensitivity of the model. Our simulation results confirm that the predicted output represents P2P and NonP2P traffic with accuracy higher than 90%.
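The micro-cluster merging step can be sketched briefly. Below is a minimal illustration using scikit-learn; the feature matrix X and label array is_p2p are hypothetical stand-ins for the flow attributes described in the abstract, and the simple majority threshold is an assumption standing in for the thesis's weighted-average rule.

```python
import numpy as np
from sklearn.cluster import KMeans

def merge_micro_clusters(X, is_p2p, k=20):
    """Cluster flow records into k micro-clusters, then assign each
    micro-cluster to a P2P or NonP2P super-cluster according to the
    share of labelled P2P records it contains."""
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
    p2p, non_p2p = [], []
    for c in range(k):
        members = is_p2p[km.labels_ == c]
        if members.size == 0:
            continue
        # Majority of labelled members decides the merged cluster
        (p2p if members.mean() >= 0.5 else non_p2p).append(c)
    return km, p2p, non_p2p
```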
103

Architectures for online reputation systems

Zhang, Bo January 2008 (has links)
E-commerce provides a platform where complete strangers from all over the world can transact with each other. Reputation is a key factor in the success of such transactions, and reputation systems are therefore in widespread use to help participants build their online reputation. This thesis analyzes current reputation systems and studies how to design architectures for online reputation systems. Based on the shortcomings found in current systems, this thesis proposes and implements a reputation architecture called ORAS (Online Reputation Aggregation System) Plus, which provides a portable reputation mechanism for Web users. With ORAS Plus, users can search others' global reputation and obtain their own reputation cards, which can be displayed at any website. ORAS Plus helps users learn about someone's online behaviour across the Internet. More importantly, it allows users to take their existing reputation from one website to other websites where they are unknown.
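As a rough illustration of the portable-reputation idea, the sketch below aggregates per-site reputation scores into one global score; the volume-weighted average is an assumption for illustration, not ORAS Plus's actual aggregation formula.

```python
def global_reputation(site_scores):
    """site_scores: list of (score in [0, 1], transaction_count) pairs,
    one per website where the user has a reputation."""
    total = sum(n for _, n in site_scores)
    if total == 0:
        return 0.0
    # Weight each site's score by how many transactions back it up
    return sum(score * n for score, n in site_scores) / total

# 0.95 over 120 transactions on one site, 0.80 over 30 on another
print(global_reputation([(0.95, 120), (0.80, 30)]))  # -> 0.92
```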
104

Autonomic computer network defence using risk states and reinforcement learning

Beaudoin, Luc January 2009 (has links)
Autonomic Computer Network Defence aims to give IT networks a self-protection capability in order to limit the risk caused by malicious and accidental events. To achieve this, networks require an automated controller with a policy that selects the most appropriate action in any undesired network state. Due to the complexity and constant evolution of the Computer Network Defence environment, a priori design of an automated controller is not effective; a solution for generating and continuously improving decision policies is needed. A significant technical challenge in Autonomic Computer Network Defence is finding a strategy to efficiently generate, trial and compare different policies and retain the best-performing one. To address this challenge, we use Reinforcement Learning to explore the Computer Network Defence action and state spaces and to learn which policy optimally reduces risk. A simulated Computer Network Defence environment is implemented using Discrete Event Dynamic System simulation and a novel graph model. A network asset value assessment technique and a dynamic risk assessment algorithm are also implemented to provide evaluation metrics. This environment is used to train two Reinforcement Learning agents, one with a table policy and the other with a neural network policy. The resulting policies are then compared, in terms of risk performance, with three empirical policies that serve as an evaluation baseline: letting risk grow without any action; randomly selecting valid actions; and selecting the next action by the pre-computed asset value of the affected assets, highest value first. We found that in all test scenarios, both Reinforcement Learning policies improved overall risk compared to random selection of valid actions. In one simple scenario, both Reinforcement Learning policies converged to the same optimum policy, with better risk performance than the other assessed policies. We also found that, for the tested scenarios and training strategies, a simple policy addressing affected assets in the order of their asset values can yield superior results.
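The table-policy agent can be pictured as standard tabular Q-learning over the defence action and state spaces. The sketch below assumes a hypothetical environment interface (reset, valid_actions, step) and uses risk reduction as the reward; the actual simulator, state encoding and reward in the thesis differ.

```python
import random
from collections import defaultdict

def learn_defence_policy(env, episodes=500, alpha=0.1, gamma=0.95, epsilon=0.1):
    """Tabular Q-learning: env.reset() -> state, env.valid_actions(state)
    -> list of actions, env.step(action) -> (next_state, reward, done),
    where reward is the risk reduction achieved by the action."""
    Q = defaultdict(float)
    for _ in range(episodes):
        state, done = env.reset(), False
        while not done:
            actions = env.valid_actions(state)
            if random.random() < epsilon:                 # explore
                action = random.choice(actions)
            else:                                         # exploit best known action
                action = max(actions, key=lambda a: Q[(state, a)])
            next_state, reward, done = env.step(action)
            future = 0.0 if done else max(
                Q[(next_state, a)] for a in env.valid_actions(next_state))
            Q[(state, action)] += alpha * (reward + gamma * future - Q[(state, action)])
            state = next_state
    return Q
```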
105

A multiplexed feedback solution for isolated digital power supplies

Hu, Zhiyuan January 2010 (has links)
As digital power supplies became a significant market opportunity, manufacturers started looking at ways to control system cost by reducing components and their technical requirements. This thesis proposes a multiplexed digital feedback solution for isolated digital power supply systems. The several digital isolators used in existing applications can be reduced to one, and the required data transmission speed is greatly reduced. The communication protocol is DC-balanced, so pulse-transformer digital isolators can be used, further lowering cost. The proposed solution also improves output accuracy over existing solutions. The proposed system is presented in three parts: architecture, protocol, and the Piece-wise Linear Feedback method. Both theoretical analysis and practical implementation are discussed, and the simulation results included support the theoretical analysis. A prototype board was built and its performance is promising.
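The abstract does not spell out the line code, but Manchester coding, sketched below, is one standard way to obtain the DC balance that lets a pulse transformer carry the feedback data; it is given purely as an illustration of the property, not as the thesis's protocol.

```python
def manchester_encode(bits):
    """Each bit becomes a transition, so the line spends equal time high
    and low regardless of the data -- i.e. the signal is DC-balanced.
    Convention used here: 0 -> (0, 1), 1 -> (1, 0)."""
    out = []
    for b in bits:
        out.extend((1, 0) if b else (0, 1))
    return out

print(manchester_encode([1, 0, 1, 1]))  # [1, 0, 0, 1, 1, 0, 1, 0]
```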
106

Exploring How Model Oriented Programming Can Be Extended to the User Interface Level

Solano, Julian January 2010 (has links)
The purpose of our research is to explore alternatives for extending well-defined UML to the user interface level. For the novice software modeler there is a gap between how the model looks and how the final product should look, and the implications of some design decisions may not be easy to analyze without strategies like storyboards, prototyping, etc. A cornerstone of our work is the use of the text-based modeling language Umple (UML Programming Language) and its metamodel as input. Umple has a syntax similar to Java, but is enhanced with additional modeling constructs. Our target was the creation of a code generator capable of interpreting a subset of the Umple language to produce complete working applications, by translating it into existing object-oriented programming languages. Using this generator, software modelers can create working prototypes that help them validate the correctness of the designed model.
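As a toy illustration of the model-to-UI idea (not Umple's actual generator), the sketch below turns a class's attribute list into an HTML form skeleton; the class and attribute names are hypothetical.

```python
def generate_form(class_name, attributes):
    """Emit a labelled input field per model attribute."""
    rows = "\n".join(
        f'  <label>{a}: <input name="{a}"></label>' for a in attributes)
    return f'<form id="{class_name}">\n{rows}\n  <button>Save</button>\n</form>'

print(generate_form("Student", ["name", "studentNumber", "program"]))
```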
107

Modeling Internet Penetration in Canada

Algwaiz, Noura January 2010 (has links)
Internet penetration is an important measure for a knowledge-based economy, as it indicates how connected a society is to the Internet. Penetration does not spread evenly across regions and societies, which results in digital divides. Despite Canada being one of the most connected countries, penetration rates are uneven across the country. In this research, we study the socio-economic factors that influence Internet penetration in Canada. We found that the influence of rurality decreased between 2005 and 2007, which suggests that initiatives that took place in those two years were effective. We also analyze the differences among the regions of Canada, and found that the regions least influenced by the demographic variables are not necessarily the ones with the highest penetration rates. Therefore, a mere look at penetration rates across regions is not enough to assess a region's connectivity and the digital divide within it.
108

A Recommender System using Tag-based Collaborative User Model

Alkhaldi, Abdulmajeed January 2011 (has links)
Internet users are overwhelmed by the huge amount of media available online, so an automated way of making compelling recommendations to users according to their needs is required. There have been many research efforts to reduce that huge amount of content to what the user really needs or prefers. Recommender systems, which assist users in easily finding useful information, are a main research topic serving this area. According to the techniques they employ, recommender systems are mainly classified into three categories: collaborative filtering, content-based filtering, and hybrid filtering. Collaborative filtering relies on the collaboration of users by capturing their judgments on items, and then recommends those items to users with similar taste. Content-based filtering takes advantage of the content of a user's preferred items and recommends new items with similar content. Hybrid filtering combines collaborative and content-based filtering, and can do so in different ways. Whatever technique is used, recommender systems require an accurate user model that reflects a user's characteristics, preferences, and topics of interest. In addition, the systems should take into account users who have newly joined and have thus expressed few opinions, commonly referred to as the cold-start user problem. In our research, by leveraging user-generated tags, we propose the topic-driven enriched user model (EM), a new way of modeling a user's topics of interest in collaboration with other similar users, in order to improve recommendation quality and alleviate the cold-start user problem. We also show how the proposed model is applied to item recommendation using a locally weighted naive Bayes approach. To evaluate the performance of our model, we compare experimental results with a user model based on user-based collaborative filtering, a user model based on item-based collaborative filtering, and a vector space model. The experimental results show that EM outperforms the three algorithms in both recommendation quality and the cold-start situation.
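One plausible building block of such a tag-based user model is a frequency-weighted tag profile compared with cosine similarity, sketched below; the weighting scheme is an assumption for illustration, not the EM's actual formulation.

```python
import math
from collections import Counter

def tag_profile(tags):
    """Build a frequency-weighted profile from the tags a user applied."""
    return Counter(tags)

def cosine_similarity(p, q):
    dot = sum(w * q[t] for t, w in p.items() if t in q)
    norm = (math.sqrt(sum(w * w for w in p.values()))
            * math.sqrt(sum(w * w for w in q.values())))
    return dot / norm if norm else 0.0

alice = tag_profile(["jazz", "vinyl", "jazz", "blues"])
bob = tag_profile(["jazz", "blues", "rock"])
print(cosine_similarity(alice, bob))  # users with overlapping tags score higher
```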
109

A Mashup Based Framework for Multi Level Healthcare Interoperability

Sadeghi, Payam January 2011 (has links)
During the past few years, various healthcare models and e-Health 2.0 technologies have been developed to deliver the right information to the right process and thereby provide effective and efficient healthcare services. At the same time, healthcare delivery is evolving from disease-centered to patient-centered, with patients as active participants in their own care, so communication and collaboration among different healthcare actors are taking place on a much larger scale. There is also an increasing demand for personalized health systems that facilitate the effective management of information, simplify communication and collaboration, and support applications and services meeting different users' specific requirements and ongoing needs. To properly address these challenges, a framework is needed to advance information integration and interoperability of health applications and services in a controlled manner. In this thesis, we present a framework that allows patients and other healthcare actors to collaboratively develop personalized online health applications according to their specific and ongoing needs and requirements. For this purpose, we illustrate how Web 2.0 collaborative technologies, such as mashups, can serve as an adequate foundation for implementing such a framework. The value and capabilities of mashups in healthcare have already been studied and demonstrated, and this technology is able to provide an interoperable framework for communication and integration between healthcare processes and applications. We believe that integration and interoperability of health applications and services can be defined at three levels: the process level, the system level, and the data level. Interoperability and integration at the system and data levels have already been intensively researched; however, not enough consideration has been given to interoperability at the process level. Healthcare must have interoperable systems and interoperable people to use them, so a shift from a technology-driven implementation to a process-driven conceptual model is needed. Our aim in this thesis is to further research how Web 2.0 technologies and tools, such as mashups, can facilitate the exchange of processes between various healthcare entities and actors, and the role of mashup patterns in enhancing the interoperability and integration of healthcare services and applications.
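A data-level mashup of the kind discussed above can be as simple as joining two feeds on a shared patient identifier. The sketch below uses hypothetical in-memory records standing in for two web services' feeds; it illustrates the integration idea, not the thesis's framework.

```python
def mashup_patient_view(appointments, lab_results):
    """Join two hypothetical feeds on patient_id to build one combined view."""
    labs_by_patient = {}
    for lab in lab_results:
        labs_by_patient.setdefault(lab["patient_id"], []).append(lab["test"])
    return [
        {**appt, "labs": labs_by_patient.get(appt["patient_id"], [])}
        for appt in appointments
    ]

appointments = [{"patient_id": "p1", "clinic": "cardiology"}]
lab_results = [{"patient_id": "p1", "test": "lipid panel"}]
print(mashup_patient_view(appointments, lab_results))
```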
110

Experimental data acquisition and modeling of three-dimensional deformable objects using neural networks

Cretu, Ana-Maria January 2009 (has links)
Many technologies and design tools are now available to accurately capture and model the geometric shape and color of objects; however, these methods provide no information about the elasticity of the objects. This thesis presents a general-purpose scheme for measuring, constructing and representing the geometric and elastic behavior of deformable objects without a priori knowledge of the shape or the material the objects under study are made of. The proposed solution is based on an advantageous combination of neural network architectures and an original force-deformation measurement procedure. An innovative non-uniform selective data acquisition algorithm based on self-organizing neural architectures (namely, neural gas and growing neural gas) is developed to selectively and iteratively identify regions of interest and guide the acquisition of data only at those points that are relevant to both the geometric model and the mapping of elastic behavior, starting from a sparse point cloud of an object. Multi-resolution object models are obtained by taking the initial sparse model, or the neural gas (or growing neural gas) map if a more compressed model is desired, and augmenting it with the higher-resolution measurements selectively collected over the regions of interest. A feedforward neural network is then employed to capture the complex relationship between, on the one hand, an applied force (its magnitude, angle of application and point of interaction), the object pose and the deformation stage of the object, and, on the other, the object surface deformation for each region with similar geometric and elastic behavior. The proposed framework works directly on raw range data and produces compact point-based models. It can deal with different types of materials, distinguishes between the different stages of deformation of an object, and models both homogeneous and non-homogeneous objects. It also offers the desired degree of control to the user.
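The core neural-gas update behind such self-organizing maps can be sketched compactly (Martinetz and Schulten's rank-based rule); the annealing schedules and unit count below are illustrative choices, not the thesis's parameters.

```python
import numpy as np

def neural_gas(points, n_units=20, epochs=10,
               eps=(0.5, 0.01), lam=(10.0, 0.5)):
    """Fit n_units reference vectors to a point cloud. Every unit moves
    toward each sample, scaled by exp(-rank/lambda) of its distance rank."""
    rng = np.random.default_rng(0)
    w = points[rng.choice(len(points), n_units, replace=False)].astype(float)
    t, t_max = 0, epochs * len(points)
    for _ in range(epochs):
        for x in points[rng.permutation(len(points))]:
            frac = t / t_max
            e = eps[0] * (eps[1] / eps[0]) ** frac   # annealed step size
            l = lam[0] * (lam[1] / lam[0]) ** frac   # annealed neighbourhood
            ranks = np.argsort(np.argsort(np.linalg.norm(w - x, axis=1)))
            w += (e * np.exp(-ranks / l))[:, None] * (x - w)
            t += 1
    return w
```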
