1.
Barocke Titelgraphik am Beispiel der Verlagsstadt Köln, 1570-1700: Funktion, Sujet, Typologie [Baroque title-page graphics as exemplified by the publishing city of Cologne, 1570-1700: function, subject, typology]. Frese, Anette. January 1989.
Dissertation (Art History)--Universität Köln, 1986.
2.
Internet-based resource discovery in chemo-bioinformatics. Gkoutos, Georgios Vasileios. January 2002.
No description available.
3.
Automatic classification and metadata generation for world-wide web resources. Jenkins, Charlotte. January 2002.
The aims of this project are to investigate the possibility and potential of automatically classifying Web documents according to a traditional library classification scheme, and to investigate the extent to which automatic classification can be used in automatic metadata generation on the Web. The Wolverhampton Web Library (WWLib) is a search engine that classifies UK Web pages according to the Dewey Decimal Classification (DDC); it is introduced as an example application that would benefit from an automatic classification component such as the one described in the thesis. Different approaches to information resource discovery and resource description on the Web are reviewed, as are traditional Information Retrieval (IR) techniques relevant to resource discovery on the Web. The design, implementation and evaluation of an automatic classifier that classifies Web pages according to DDC are documented. The evaluation shows that automatic classification is possible and could be used to improve the performance of a search engine. The classifier is then extended to perform automatic metadata generation using the Resource Description Framework (RDF) and Dublin Core. A proposed RDF data model, schema and automatically generated RDF syntax are documented, and automatically generated RDF metadata describing a range of automatically classified documents is shown. The research shows that automatic classification is possible and could potentially be used to enable context-sensitive browsing in automated Web search engines; the classifications could also be used to generate context-sensitive metadata tailored specifically to the search engine domain.
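As a rough illustration of the pipeline this abstract describes, the Python sketch below pairs a naive keyword-scoring classifier over a handful of DDC classes with Dublin Core metadata serialized as RDF/XML. The class labels, keyword sets, scoring rule and example page are illustrative assumptions, not the classifier actually built for WWLib.

```python
# Hypothetical sketch: keyword-scoring DDC classification plus Dublin Core
# metadata in RDF/XML. Keywords, classes and the example are assumptions.
import re
from collections import Counter

DDC_KEYWORDS = {
    "000 Computer science, information & general works":
        {"computer", "software", "internet", "information", "data"},
    "500 Natural sciences and mathematics":
        {"physics", "chemistry", "biology", "species", "theorem"},
    "700 The arts":
        {"painting", "music", "sculpture", "theatre", "engraving"},
}

def classify(text: str) -> str:
    """Return the DDC class whose keyword set best matches the page text."""
    tokens = Counter(re.findall(r"[a-z]+", text.lower()))
    return max(DDC_KEYWORDS,
               key=lambda ddc: sum(tokens[w] for w in DDC_KEYWORDS[ddc]))

def dublin_core_rdf(url: str, title: str, ddc: str) -> str:
    """Serialize the classification as minimal RDF/XML with Dublin Core."""
    return (
        '<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"\n'
        '         xmlns:dc="http://purl.org/dc/elements/1.1/">\n'
        f'  <rdf:Description rdf:about="{url}">\n'
        f'    <dc:title>{title}</dc:title>\n'
        f'    <dc:subject>{ddc}</dc:subject>\n'
        '  </rdf:Description>\n'
        '</rdf:RDF>'
    )

page = "A guide to internet search software and structured data."
print(dublin_core_rdf("http://example.org/guide", "Example guide",
                      classify(page)))
```

A real classifier would need the full DDC hierarchy and a far richer term vocabulary; the sketch only fixes the shape of the input and output.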
4.
A moderate revolutionary of 1848: a study of the republicanism of Louis Antoine Garnier-Pages. Archer, Prue H. January 1972.
Thesis (B.A.(Hons.))--University of Adelaide, Dept. of History, 1972.
5.
Entwicklung und Implementierung einer statistischen Auswertung für die Dekanatsdatenbank der Fakultät Elektrotechnik und Informationstechnik [Development and implementation of a statistical evaluation for the dean's office database of the Faculty of Electrical Engineering and Information Technology]. Karimian Sichani, Mandana. January 2000.
Universität Stuttgart, Studienarbeit (student research project), 2000.
6.
The title page as the source of information for bibliographic description: an analysis of its visual and linguistic characteristics. Jeng, Ling Hwey.
Thesis (Ph. D.)--University of Texas at Austin, 1987. / Vita. Description based on print version record. Includes bibliographical references (leaves 227-233).
7.
The colophon: History and analysis. Unknown date.
"It is delightful to read for pleasure and information at the same time, but only rarely does the reader find material that satisfies him equally on both counts. Too often is presented 'more matter with less art' or form with too little substance. A happy combination conducive to both pleasure and instruction, however, may be found in The Colophon, A Book Collector's Quarterly. This magazine, which for thirteen years provided readers with entertaining articles concerning bibliography, book illustration, and fine printing, was able to survive the depression, yet was unable to be continued during the period when the nation was preparing for war. A detailed consideration of its history and an evaluation of its contents is the burden of the paper, a project which would seem appropriate in the training of a librarian in that the evaluation and selection of magazines is a part of almost every librarian's duties and is of as much importance as the selection of books"--Introduction. / "May, 1958." / Typescript. / "Submitted to the Graduate Council of Florida State University in partial fulfillment of the requirements for the degree of Master of Arts." / Advisor: Agnes Gregory, Professor Directing Paper. / Includes bibliographical references (leaves 87-89).
8.
Vergleichende Implementierung einer verteilten Anwendung unter Nutzung von CORBA/IIOP, RMI und JSP [Comparative implementation of a distributed application using CORBA/IIOP, RMI and JSP]. Tandjung, Kristian. January 2001.
Universität Stuttgart, Diplomarbeit (diploma thesis), 2001.
9.
Segmentation de pages web, évaluation et applications / Web page segmentation, evaluation and applications. Sanoja Vargas, Andrés. 22 January 2015.
Web pages are becoming more complex than ever, largely because they are generated by Content Management Systems (CMS). Analyzing them, i.e. automatically identifying and classifying elements such as the main content and menus, is therefore difficult. A solution to this problem is Web page segmentation, the process of dividing a Web page into visually and semantically coherent segments called blocks. The quality of a Web page segmenter is measured by its correctness and its genericity, i.e. the variety of Web page types it is able to segment. Our research focuses on enhancing this quality and measuring it in a fair and accurate way. We first propose a conceptual model for segmentation, as well as Block-o-Matic (BoM), our Web page segmenter. We then propose an evaluation model that takes both the content and the geometry of blocks into account in order to measure the correctness of a segmentation algorithm against a predefined ground truth. The framework allows any segmenter to be tested; we experimentally evaluated four state-of-the-art algorithms on four types of pages. The results show that BoM performs best among the four segmentation algorithms tested, and that a segmenter's performance depends on the type of page to be segmented. We present two applications of BoM. Pagelyzer uses BoM to compare two versions of a Web page and decide whether they are similar; it is our team's main contribution to the European project Scape (FP7-IP). We also developed a tool for migrating Web pages from HTML4 to HTML5 in the context of Web archives.
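To make the evaluation idea concrete, here is a minimal Python sketch of a purely geometric correctness measure: predicted blocks are matched to ground-truth blocks by rectangle intersection-over-union (IoU). The IoU rule and the 0.5 threshold are assumptions for illustration only; the model described above also takes block content into account.

```python
# Illustrative sketch, not BoM's actual metric: score a segmentation
# against a ground truth by matching block rectangles with IoU.
from dataclasses import dataclass

@dataclass
class Block:
    x: float  # left edge, in pixels
    y: float  # top edge
    w: float  # width
    h: float  # height

def iou(a: Block, b: Block) -> float:
    """Intersection-over-union of two block rectangles."""
    ix = max(0.0, min(a.x + a.w, b.x + b.w) - max(a.x, b.x))
    iy = max(0.0, min(a.y + a.h, b.y + b.h) - max(a.y, b.y))
    inter = ix * iy
    union = a.w * a.h + b.w * b.h - inter
    return inter / union if union > 0 else 0.0

def correctness(predicted: list[Block], truth: list[Block],
                threshold: float = 0.5) -> float:
    """Fraction of ground-truth blocks recovered by some predicted block."""
    hits = sum(1 for t in truth
               if any(iou(p, t) >= threshold for p in predicted))
    return hits / len(truth) if truth else 1.0

truth = [Block(0, 0, 960, 80), Block(0, 90, 960, 600)]  # header, content
pred = [Block(0, 0, 960, 85), Block(0, 88, 960, 610)]
print(correctness(pred, truth))  # 1.0: both ground-truth blocks matched
```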
10.
Project and team handler (PTH). Alhabashneh, Mohammad Abad Alhameed. January 2006.
There is always a need for easy-to-follow processes that yield accurate solutions without consuming excessive time. Software engineering today offers many different approaches to the development process. This project is concerned with managing a software development process in a reliable, secure and efficient way. Existing software offers project managers and administrators some help in working more productively and communicating effectively. Using such systems, it is possible to keep track of all phases of development, including task distribution, to make maximum use of previous hands-on experience, and to increase productivity so as to deliver a finished product in minimum time. No existing solution, however, fulfills all the desirable criteria. This paper describes the motivation, design and implementation of an improved development management system built with Active Server Pages and Microsoft Internet Information Services over a Microsoft Access database backend, developed following a waterfall software development process. The resulting system is described and evaluated. It should benefit software houses: teams can communicate over the Web, gaining efficiency by avoiding meetings called solely to distribute tasks among employees, with the additional advantage of location-transparent team management through the Internet.
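The original system was written in classic Active Server Pages over an Access database, and no code survives in the abstract; as a hedged sketch of the task-distribution core such a system needs, the Python below models projects, tasks and assignments. All names and fields are hypothetical.

```python
# Hypothetical sketch of a task-distribution data model; the actual PTH
# system stored this in a Microsoft Access database behind ASP pages.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    phase: str                   # e.g. "design", "implementation", "test"
    assignee: str | None = None  # employee responsible, once distributed

@dataclass
class Project:
    name: str
    tasks: list[Task] = field(default_factory=list)

    def assign(self, task_name: str, employee: str) -> None:
        """Record which employee is responsible for a task."""
        for t in self.tasks:
            if t.name == task_name:
                t.assignee = employee
                return
        raise KeyError(task_name)

    def unassigned(self) -> list[Task]:
        """Tasks still waiting to be distributed among the team."""
        return [t for t in self.tasks if t.assignee is None]

pth = Project("PTH demo", [Task("login page", "implementation"),
                           Task("schema design", "design")])
pth.assign("schema design", "alice")
print([t.name for t in pth.unassigned()])  # ['login page']
```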