151 |
A class of theory-decidable inference systems: toward a decision procedure for structured cryptographic protocols. Gagnon, François, 02 1900.
In the last two decades, the Internet has brought a new dimension to communications: it is now possible to communicate with anyone, anywhere, at any time, within a few seconds. While some distributed communication systems, such as e-mail and chat, are rather informal and require no security at all, others, such as the exchange of military or medical information and electronic commerce, are highly formal and require strong security.
Cryptographic protocols are commonly used to achieve these security goals. However, the design and analysis of such protocols are difficult and error-prone; some protocols were shown to be flawed many years after their conception. It is now well known that formal methods are the only hope of designing completely secure cryptographic protocols. This thesis contributes to the field of cryptographic protocol analysis in the following ways:
• A classification of the formal methods used in cryptographic protocol analysis.
• The use of inference systems to model cryptographic protocols.
• The definition of a class of theory-decidable inference systems.
• The proposal of a decision procedure for a wide class of cryptographic protocols. / Inscribed on the Honour Roll of the Faculté des études supérieures
|
152 |
RTL-Check: A practical static analysis framework to verify memory safety and more. Lacroix, Patrice, 09 1900.
Since computers are ubiquitous in our society and we depend more and more on programs to accomplish our everyday activities, bugs can sometimes have serious consequences. A large proportion of existing programs are written in C or C++, and the main source of errors in these languages is the absence of memory safety. Our long-term goal is to be able to verify whether a C or C++ program accesses memory correctly in spite of the deficiencies of these languages.
To that end, we have created a static analysis framework, which we present in this thesis. It allows analyses to be built from small reusable components that are automatically bound together by metaprogramming. It also incorporates the visitor design pattern and algorithms that are useful for the development of static analyses. Moreover, it provides an object model for RTL, the low-level intermediate representation used for all languages supported by GCC. This makes it possible to design analyses that are independent of the programming language.
We also describe the modules that make up the static analysis we developed with this framework, which aims to verify whether a program respects the rules of memory access. This analysis is not yet complete, but it is designed to be easily improved. Both the framework and the memory access analysis modules are distributed in RTL-Check, an open-source project.
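The following is a minimal sketch of the component-plus-visitor idea described above. RTL-Check itself is a C++ framework over GCC's RTL; the sketch uses Python and a hypothetical two-instruction IR purely to illustrate how a small, reusable analysis component can be written as a visitor and run over an instruction stream.

```python
# Illustration only: RTL-Check is written in C++ and operates on GCC's RTL;
# the tiny IR below (Load/Store) is a hypothetical stand-in.

class Load:
    def __init__(self, dest, addr):
        self.dest, self.addr = dest, addr

class Store:
    def __init__(self, addr, value):
        self.addr, self.value = addr, value

class Visitor:
    """Dispatches each instruction to a visit_<ClassName> method."""
    def visit(self, insn):
        method = getattr(self, "visit_" + type(insn).__name__, self.default)
        return method(insn)
    def default(self, insn):
        pass

class NullDereferenceCheck(Visitor):
    """A small reusable component: flags accesses through a literal null address."""
    def __init__(self):
        self.warnings = []
    def visit_Load(self, insn):
        if insn.addr == 0:
            self.warnings.append(f"load through null address into {insn.dest}")
    def visit_Store(self, insn):
        if insn.addr == 0:
            self.warnings.append("store through null address")

# Composing a component over a (hypothetical) instruction stream:
program = [Load("r1", 0), Store(0x1000, "r1")]
check = NullDereferenceCheck()
for insn in program:
    check.visit(insn)
print(check.warnings)
```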
|
153 |
The research on algorithm of image mosaic. Wang, Xin, January 2008.
Image-based rendering (IBR) has been one of the most important and most rapidly developing techniques in computer graphics and virtual reality in recent years. Image mosaicing, one of the hot topics in IBR, has also attracted many researchers in image processing and computer vision. Its applications cover virtual scene construction, remote sensing, medical imaging, military affairs, etc. However, some difficult issues still need further study, including new optimization methods for image registration and new acceleration methods for image stitching, which are the main topics of this thesis.
First, since the precision and degree of automation of image mosaicing depend on the image registration algorithm, a new stitching optimization method based on maximizing mutual information is presented in this thesis. The main idea is to combine the particle swarm optimization (PSO) algorithm with a wavelet multiresolution strategy, adapting the PSO parameters to the resolution of the images. The experiments show that this method keeps the registration process from getting stuck in local extrema during image interpolation, reaches the optimum within a limited number of iterations, and achieves subpixel registration accuracy during image stitching.
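The following is a simplified sketch of mutual-information registration driven by PSO, assuming a single resolution level and a pure translation; the wavelet multiresolution strategy and the resolution-adaptive PSO parameters described in the thesis are omitted.

```python
# Sketch: register a moving image to a fixed one by letting a basic
# global-best PSO search the translation (tx, ty) that maximises mutual
# information.  Single-resolution simplification; not the thesis's method.
import numpy as np
from scipy.ndimage import shift as nd_shift

def mutual_information(a, b, bins=32):
    """Mutual information of two equally sized grey-level images."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz]))

def pso_register(fixed, moving, n_particles=20, iters=40, bound=20.0):
    """Search a translation maximising MI with a basic global-best PSO."""
    rng = np.random.default_rng(0)
    pos = rng.uniform(-bound, bound, (n_particles, 2))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.full(n_particles, -np.inf)
    gbest, gbest_val = pos[0].copy(), -np.inf
    for _ in range(iters):
        for i, p in enumerate(pos):
            val = mutual_information(fixed, nd_shift(moving, p))
            if val > pbest_val[i]:
                pbest[i], pbest_val[i] = p.copy(), val
            if val > gbest_val:
                gbest, gbest_val = p.copy(), val
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, -bound, bound)
    return gbest, gbest_val
```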
Second, to address blurring in the stitched result when the geometric deformation and the change in scale factor between images are severe, a new method based on robust feature matching is proposed. It first searches for the overlap area between the two images using phase correlation, and then detects Harris corner points in the overlap areas of images reduced to different scales through a multiresolution pyramid, which compensates for the Harris corner detector's lack of robustness to scale changes. To improve the running performance of feature matching, a PCA-based method is used to reduce the dimension of the high-dimensional feature vectors. Finally, the global parameters are optimized by LMedS to realize the mosaic. The experiments show that the proposed methods reduce the computational cost while guaranteeing the quality of the mosaic.
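The following is a sketch of the phase-correlation step that can locate the translation, and hence the overlap area, between two images; the scale-space Harris detection, PCA reduction and LMedS estimation mentioned above are not shown.

```python
# Sketch: estimate the integer translation of img_b relative to img_a
# from the peak of the normalised cross-power spectrum.
import numpy as np

def phase_correlation(img_a, img_b):
    """Return (dy, dx), the estimated shift of img_b relative to img_a."""
    fa, fb = np.fft.fft2(img_a), np.fft.fft2(img_b)
    cross_power = fa * np.conj(fb)
    cross_power /= np.abs(cross_power) + 1e-12    # normalise the spectrum
    corr = np.abs(np.fft.ifft2(cross_power))      # correlation surface
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peaks in the upper half-space to negative shifts.
    if dy > img_a.shape[0] // 2:
        dy -= img_a.shape[0]
    if dx > img_a.shape[1] // 2:
        dx -= img_a.shape[1]
    return dy, dx
```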
|
154 |
Nonstandard reamer CAD/CAE/CAM integrated system. Zhu, Shijun, January 2006.
The Chinese economy has grown quickly and continuously in recent years. Manufacturing is an important part of Chinese industry, and the enhancement of modern manufacturing technology is becoming an important issue. The aim of the present thesis is to develop a CAD/CAE/CAPP/CAM integrated system for nonstandard (over-stiff) tools. This involves integrating modern information technology, manufacturing technology and management technology. The objective is to propose a completely new, integrated design, analysis and manufacturing system, which would give manufacturers the capability to integrate CAD/CAE/CAM systems.
Many enterprises and universities in China have developed CAD systems for standard tools in recent years, but with regard to CAD/CAM for nonstandard complex tools there is still much to be done. After more than one year of investigation and research, a great quantity of information and data was assembled, laying a solid foundation for completing this project successfully.
This thesis considers in detail the problems associated with nonstandard over-stiff end milling cutters to illustrate the design approach used and its implementation. It uses feature-based model construction technology and applies CAD and finite element analysis, together with the relevant theories and technologies for processing and manufacturing, to complete the development of a CAD/CAE/CAM integrated system for nonstandard complex tools.
The thesis consists of six chapters, dealing with the project's background, the system's design and structural design requirements, finite element analysis, CAM, and some concluding remarks based on what was learned during the thesis work. The project's kernel technology is the use of the finite element method to analyze the models of nonstandard complex tools with the large-scale finite element analysis software ANSYS.
|
155 |
The implementation of CRP (Capacity requirements planning) module. Bai, Hai, January 2009.
ERP (Enterprise Resource Planning) originated as a management concept, based on the supply chain, for balancing production planning within an enterprise. This thesis focuses on the CRP module in ERP and describes in detail the concept, content and modularization of CRP, as well as its usefulness and application in the production process. The function of the CRP module is to explode orders into the production process in order to calculate production times, and then to adjust the load or accumulated load of each work center or machine accordingly. The implementation described here first loads the calculated production time without capacity limits (infinite loading), and then uses automatic balancing or normal adjustment to determine the feasibility of the production order plan.
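The following toy example illustrates the core CRP calculation described above: orders are exploded through their routings into load hours per work center, which are then compared with the available capacity. The part names, routings, capacities and single-period simplification are hypothetical, not taken from the thesis.

```python
# Sketch: explode orders into load per work centre and compare with capacity.
from collections import defaultdict

routings = {  # part -> list of (work_centre, setup_hours, run_hours_per_unit)
    "gear": [("lathe", 0.5, 0.10), ("mill", 1.0, 0.20)],
    "shaft": [("lathe", 0.3, 0.05)],
}
capacity = {"lathe": 80.0, "mill": 40.0}   # available hours in the period
orders = [("gear", 150), ("shaft", 400)]   # (part, quantity)

load = defaultdict(float)
for part, qty in orders:
    for centre, setup, run in routings[part]:
        load[centre] += setup + run * qty   # load this order places on the centre

for centre, hours in load.items():
    status = "overloaded" if hours > capacity[centre] else "ok"
    print(f"{centre}: load {hours:.1f} h / capacity {capacity[centre]:.1f} h ({status})")
```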
|
156 |
Research on distributed data mining system and algorithm based on multi-agent. Jiang, Lingxia, January 2009.
Data mining means extracting hidden, previously unknown knowledge and rules of potential value for decision-making from the mass of data in a database. Association rule mining, which is widely used in practice, is a main research area within data mining. With the development of network technology and the growing level of IT application, distributed databases are now commonly used. Distributed data mining extracts global knowledge useful for management and decision-making from geographically distributed databases, and has become an important issue in data mining. Distributed data mining can carry out a mining task with computers at different sites on the Internet; it not only improves mining efficiency and reduces the amount of data transmitted over the network, but is also good for the security and privacy of the data. Based on the relevant theories and the current state of research on data mining and distributed data mining, this thesis focuses on the structure of a distributed mining system and on a distributed association rule mining algorithm.
This thesis first proposes a structure for a distributed data mining system based on multi-agent technology. It adopts a star network topology and uses multiple agents to mine mass data stored in a distributed way. Based on this system, the thesis then introduces a new distributed association rule mining algorithm, the RK-tree algorithm. The RK-tree algorithm rests on a two-stage combination of knowledge. Each sub-site first mines local frequent itemsets from its local database and sends them to the main site. The main site merges these local frequent itemsets into a set of global candidate frequent itemsets and sends the candidates back to each sub-site. Each sub-site counts the support of the global candidates and returns the counts to the main site. Finally, the main site combines the results sent by the sub-sites and obtains the global frequent itemsets and the global association rules. The algorithm needs only three rounds of communication between the main site and the sub-sites, which greatly reduces the amount and frequency of communication and improves mining efficiency. Moreover, each sub-site can make full use of existing, well-established centralized association rule mining algorithms for its local mining, obtaining better local efficiency and reducing the workload. The algorithm is simple and easy to implement. The last part of the thesis presents the conclusions of the analysis as well as directions for further research.
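The following single-process simulation illustrates the three-round communication pattern described above; the local miner is a naive enumeration standing in for the actual RK-tree procedure, and the two sample databases are hypothetical.

```python
# Sketch: sites mine local frequent itemsets, the main site unions them into
# global candidates, sites report local support counts, and the main site
# keeps the globally frequent itemsets.
from itertools import combinations

def local_frequent(transactions, min_support):
    """Naive local miner: counts itemsets of size 1 and 2."""
    counts = {}
    for t in transactions:
        for k in (1, 2):
            for items in combinations(sorted(t), k):
                counts[items] = counts.get(items, 0) + 1
    return {i for i, c in counts.items() if c >= min_support}

def local_support(transactions, candidates):
    return {c: sum(1 for t in transactions if set(c) <= t) for c in candidates}

sites = [  # hypothetical local databases at two sub-sites
    [{"a", "b"}, {"a", "c"}, {"a", "b", "c"}],
    [{"b", "c"}, {"a", "b"}, {"b"}],
]
min_sup_local, min_sup_global = 2, 3

# Round 1: sub-sites send local frequent itemsets to the main site.
candidates = set().union(*(local_frequent(db, min_sup_local) for db in sites))
# Round 2: main site broadcasts the candidates; sub-sites return counts.
counts = [local_support(db, candidates) for db in sites]
# Round 3: main site combines counts and keeps globally frequent itemsets.
global_frequent = {c for c in candidates
                   if sum(site[c] for site in counts) >= min_sup_global}
print(sorted(global_frequent))
```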
|
157 |
Research on detecting mechanism for Trojan horse based on PE file. Pan, Ming, January 2009.
As malicious programs, Trojan horses have become a huge threat to computer network security. Because they are usually disguised as something useful or desirable and are often activated by mistake by computer users, corporations and other organizations, Trojan horses can easily cause the loss, damage or even theft of data. It is therefore important to adopt an effective and efficient method to detect Trojan horses, and the exploration of new detection methods is of great significance.
Scientists and experts have tried many approaches to detecting Trojan horses since they realized the harm these programs cause. Up to now, these methods fall mainly into two categories [2]. The first is to detect Trojan horses by checking the ports of computers, since Trojan horses send out messages through computer ports [2]; however, such methods can only detect Trojan horses that are active at the moment of detection. The second is to detect Trojan horses by examining file signatures [2] [19], in the same way as computer viruses are handled. Since new Trojan horses may carry unknown signatures, methods in this category may not be effective enough when new and unknown Trojan horses appear continuously, emitting unknown signatures that escape detection.
For the above reasons, the existing methods are, without exception, limited when dormant or unknown Trojan horses are to be detected. This thesis proposes a new method that can detect dormant and unknown Trojan horses: detection based on a file's static characteristics. The thesis takes the PE file format as the object of research, because approximately 75% of personal computers worldwide run Microsoft Windows [4] and Trojan horses usually exist as Portable Executable (PE) files on the Windows platform. Based on the PE file format, the research extracts the static information of each part of a PE file, which characterizes the file. This static information is then analyzed with intelligent information processing techniques, and a detection model is established to estimate whether a PE file is a Trojan horse. The model can detect unknown Trojan horses by analyzing the static characteristics of a file. The information used to verify the detection model is new and unknown to it; in other words, it is not used during the training of the model.
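The following is a sketch of such a pipeline: static features are read from PE headers and sections (here with the pefile library) and fed to a classifier. The feature set, the classifier choice and the sample file paths are illustrative assumptions rather than the thesis's exact model.

```python
# Sketch: extract static PE features and train a simple classifier.
import pefile
from sklearn.tree import DecisionTreeClassifier

def pe_features(path):
    """Return a fixed-length vector of static PE characteristics."""
    pe = pefile.PE(path)
    entropies = [s.get_entropy() for s in pe.sections] or [0.0]
    n_imports = sum(len(e.imports)
                    for e in getattr(pe, "DIRECTORY_ENTRY_IMPORT", []))
    return [
        pe.FILE_HEADER.NumberOfSections,
        pe.OPTIONAL_HEADER.AddressOfEntryPoint,
        pe.OPTIONAL_HEADER.SizeOfImage,
        max(entropies),                          # packed sections score high
        sum(s.SizeOfRawData for s in pe.sections),
        n_imports,
    ]

# Hypothetical labelled corpus: 1 = Trojan horse, 0 = benign program.
train_paths = ["samples/trojan1.exe", "samples/benign1.exe"]
train_labels = [1, 0]

model = DecisionTreeClassifier().fit([pe_features(p) for p in train_paths],
                                     train_labels)
print(model.predict([pe_features("samples/unknown.exe")]))
```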
The thesis is organized as follows. First, it discusses the limitations of traditional detection techniques, related research, and the new method for detecting Trojan horses based on a file's static information. Second, it focuses on the Trojan horse detection model, covering the extraction of static information from PE files, the choice of intelligent information processing techniques, and the construction of the detection model. Lastly, it discusses directions for future research in this field.
|
158 |
Analysis of credit card data based on data mining technique. Zheng, Ying, January 2009.
In recent years, large amounts of data have accumulated with the spread of database systems. Meanwhile, the requirements of applications are no longer confined to simple operations such as search and retrieval, because these operations cannot uncover the valuable information hidden in the databases. Such hidden knowledge is hard to handle with present database techniques, so a great wealth of knowledge concealed in databases remains largely undeveloped and unused.
Data mining (DM) aims at finding essential, significant knowledge by automatically processing databases, and is one of the most challenging research topics in the database and decision-making fields. The range of data processed is considerably vast, from natural science, social science and business information to data produced by scientific processes and satellite observation. The present focus of DM has shifted from theory to practical application: wherever a database exists, there are DM projects to be studied.
This thesis concentrates on mining valuable knowledge from credit card data using DM theories, techniques and methods. First, the basic theories and key algorithms of DM are introduced, with emphasis on decision tree algorithms, neural networks, the X-means clustering algorithm and the Apriori association rule algorithm, against the background of the bank and the knowledge available in credit card data. A preliminary analysis of the credit card information of the Industry and Business Bank, Tianjin branch, was performed based on the conversion and integration of a data warehouse. Combined databases including customer information and consumption properties were established in accordance with the data warehouse approach. The data were clustered with the X-means algorithm to find valuable knowledge and frequent transaction intervals in credit card usage. Back-propagation neural networks were designed to classify the credit card information, which plays an important role in the evaluation and prediction of customers. In addition, the Apriori algorithm was applied to the above data to establish relations between customers' credit information and consumption properties and to find association rules among the credit items themselves, providing a solid foundation for further revision of information evaluation.
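The following small example illustrates two of the steps described above on synthetic stand-in data: clustering transaction profiles with K-means (standing in for the X-means variant mentioned above) and classifying customers with a back-propagation (multi-layer perceptron) network. The feature columns and labels are hypothetical; the thesis works on the bank's real warehouse data.

```python
# Sketch: cluster card-usage profiles and classify customers on toy data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
# Hypothetical features: [monthly spend, transaction count, repayment ratio]
X = np.vstack([rng.normal((2000, 30, 0.9), (300, 5, 0.05), (50, 3)),
               rng.normal((500, 8, 0.6), (150, 3, 0.10), (50, 3))])
y = np.array([1] * 50 + [0] * 50)   # 1 = good credit, 0 = poor credit

# Cluster transaction profiles to expose groups of similar card usage.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Back-propagation network for customer evaluation and prediction.
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                    random_state=0).fit(X, y)
print("cluster sizes:", np.bincount(clusters))
print("training accuracy:", clf.score(X, y))
```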
Our work shows that DM techniques are of great significance in analyzing credit card information, and it lays a firm foundation for further research on retrieving information from credit card data.
|
159 |
Design of uniform campus identity authentication system based on LDAP. Guo, Hongpei, January 2006.
With the development of campus networks, many kinds of applications based on them have developed rapidly in recent years. For management and expansion reasons, many of these applications need the functionality provided by a uniform identity authentication system. On the other hand, LDAP (Lightweight Directory Access Protocol) and related technologies, which take advantage of a distributed directory service architecture to organize and manage network resources effectively and to provide availability and security of system access, can be used to design such a uniform identity authentication system.
This thesis discusses the design of a uniform campus identity authentication system based on LDAP. It analyzes the common situation of campus network applications in China, describes what a uniform identity authentication system is and why one is necessary, briefly introduces LDAP and related technologies, and designs a framework for a uniform campus identity authentication system.
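The following is a minimal sketch of the central check such a system performs: an application accepts a user only if an LDAP bind with the supplied credentials succeeds. The server address and directory layout are hypothetical, and the ldap3 library is just one possible client implementation.

```python
# Sketch: delegate every application's login check to one LDAP bind.
from ldap3 import Server, Connection

LDAP_URL = "ldap://ldap.campus.edu.cn"          # hypothetical directory server
BASE_DN = "ou=people,dc=campus,dc=edu,dc=cn"    # hypothetical DIT layout

def authenticate(uid, password):
    """Return True if the directory accepts a simple bind as this user."""
    user_dn = f"uid={uid},{BASE_DN}"
    server = Server(LDAP_URL)
    conn = Connection(server, user=user_dn, password=password)
    ok = conn.bind()          # the directory itself verifies the password
    conn.unbind()
    return ok

# Every campus application delegates its login check to the same call:
if authenticate("student001", "secret"):
    print("access granted")
```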
|
160 |
The research of the error analysis algorithm with 3D scanning data. Dou, Yanmei, January 2006.
With the increasing level of automation in industrial production, three-dimensional laser scanning systems have been applied more and more widely, thanks to advantages such as a high identification rate and non-destructive operation. To increase product quality, product testing has become an important and indispensable step in industrial production. The aim of this thesis is to develop an automatic on-line testing system based on 3D laser scanner technology. Such a system can contribute to a fully automatic process, from measuring the product to calculating its form error and judging whether the product is acceptable or not.
Starting from the data points obtained from a 3D scanner, after filtering the noise points in the left and right CCDs, this thesis uses NURBS surfaces to fit the point data and reconstruct the surface of the measured object. Based on the minimum-zone (least region) principle for assessing the form error of a surface, we obtain a mathematical model of the surface form error and then, using an improved genetic algorithm, calculate the form error of the object. According to the given tolerance, we can judge whether the product is acceptable or not. Finally, the whole system is implemented in VC++ 6.0, with a user interface for calculating the form error.
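The following sketch reduces the minimum-zone evaluation to flatness and lets a basic genetic algorithm search the reference-plane slopes that minimize the width of the deviation band; the NURBS reconstruction step and the thesis's improved GA operators are not reproduced, and the measured points are synthetic.

```python
# Sketch: minimum-zone flatness via a simple genetic algorithm over the
# reference-plane slopes (a, b); the objective is max(e) - min(e) of the
# point deviations, which the vertical offset cancels out of.
import numpy as np

rng = np.random.default_rng(2)
pts = rng.uniform(0, 100, (200, 2))                       # (x, y) positions
z = 0.002 * pts[:, 0] - 0.001 * pts[:, 1] + rng.normal(0, 0.01, 200)

def band_width(ab):
    """Width of the deviation band for reference-plane slopes (a, b)."""
    e = z - (ab[0] * pts[:, 0] + ab[1] * pts[:, 1])
    return e.max() - e.min()

def ga_flatness(pop_size=40, generations=60, bound=0.01):
    pop = rng.uniform(-bound, bound, (pop_size, 2))
    for _ in range(generations):
        fitness = np.array([band_width(ind) for ind in pop])
        order = np.argsort(fitness)                        # smaller is better
        parents = pop[order[:pop_size // 2]]
        # Crossover: average random pairs of parents; mutate with Gaussian noise.
        idx = rng.integers(0, len(parents), (pop_size, 2))
        pop = (parents[idx[:, 0]] + parents[idx[:, 1]]) / 2
        pop += rng.normal(0, bound / 20, pop.shape)
        pop[0] = parents[0]                                # elitism: keep the best
    fitness = np.array([band_width(ind) for ind in pop])
    return pop[np.argmin(fitness)], fitness.min()

slopes, flatness = ga_flatness()
print(f"estimated flatness error: {flatness:.4f}")
```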
This research can be applied in industrial production to increase product quality and the level of automation. Other potential applications include artificial intelligence, medicine, etc.
|