1 |
A generic framework for the development of 3D information visualization applications / Leissler, Martin. January 2004 (has links)
Darmstadt, Techn. University, Diss., 2004. / Files in PDF format.
|
2 |
Improving Enterprise Data Governance Through Ontology and Linked Data / DeStefano, R.J. 16 April 2016 (has links)
In the past decade, the role of data has increased exponentially, from being the output of a process to becoming a true corporate asset. As the business landscape becomes increasingly complex and the pace of change increasingly fast, companies need a clear awareness of their data assets, their movement, and how they relate to the organization in order to make informed decisions, reduce cost, and identify opportunity. The increased complexity of corporate technology has also created a high level of risk, as data moving across a multitude of systems is more likely to impact dependent processes and systems should something go wrong or be changed. The result of this increased difficulty in managing corporate data assets is poor enterprise data quality, whose impacts range in the billions of dollars of waste and lost opportunity to businesses.

Tools and processes exist to help companies manage this phenomenon; however, data projects are often subject to high amounts of scrutiny as senior leadership struggles to identify return on investment. While there are many tools and methods to increase a company's ability to govern data, this research stands by the principle that you can't govern what you don't know. This lack of awareness of the corporate data landscape impacts the ability to govern data, which in turn impacts overall data quality within organizations.

This research proposes a means for companies to better model the landscape of their data, processes, and organizational attributes through the use of linked data, via the Resource Description Framework (RDF), and ontology. The outcome of adopting such techniques is an increased level of data awareness within the organization, resulting in an improved ability to govern corporate data assets. It does this primarily by addressing corporate leadership's low tolerance for taking on large-scale data-centric projects. The nature of linked data, with its incremental and decentralized approach to storing information, combined with a rich ecosystem of open-source or low-cost tools, reduces the financial barriers to entry for these initiatives. Additionally, linked data's distributed nature and flexible structure help foster maximum participation throughout the enterprise in capturing information about data assets. This increased participation improves the quality of the information captured by empowering more of the individuals who handle the data to contribute.

Ontology, in conjunction with linked data, provides an incredibly powerful means to model the complex relationships between an organization, its people, processes, and technology assets. Combined with the graph-based nature of RDF, the model lends itself to presenting concepts such as data lineage, allowing an organization to see the true reach of its data. This research further proposes an ontology based on data governance standards, visualization examples and queries against data to simulate common data governance situations, as well as guidelines to assist in its implementation in an enterprise setting.

Adopting such techniques will allow an enterprise to accurately reflect the data assets, stewardship information and integration points that are so necessary to institute effective data governance.
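As a concrete illustration of the technique this abstract describes, here is a minimal sketch, not the dissertation's actual ontology: the `dg:` namespace and the class and property names are hypothetical. It records data assets, a steward, and lineage edges as RDF triples with Python's rdflib, then walks the lineage with a SPARQL property path:

```python
# Minimal sketch of modeling data assets, stewardship, and lineage as RDF.
# The dg: namespace and all class/property names here are invented for
# illustration; they are not the ontology proposed in the dissertation.
from rdflib import Graph, Literal, Namespace, RDF

DG = Namespace("http://example.org/datagov#")

g = Graph()
g.bind("dg", DG)

# Describe two systems, a steward, and the feeds between systems
# (each dg:derivedFrom triple is one lineage edge).
g.add((DG.CustomerDB, RDF.type, DG.DataAsset))
g.add((DG.CustomerDB, DG.steward, Literal("Jane Smith")))
g.add((DG.ReportingMart, RDF.type, DG.DataAsset))
g.add((DG.ReportingMart, DG.derivedFrom, DG.CustomerDB))
g.add((DG.ExecDashboard, DG.derivedFrom, DG.ReportingMart))

# Walk the lineage graph: everything ExecDashboard ultimately depends on.
q = """
PREFIX dg: <http://example.org/datagov#>
SELECT ?upstream WHERE { dg:ExecDashboard dg:derivedFrom+ ?upstream . }
"""
for row in g.query(q):
    print(row.upstream)  # prints the URIs of ReportingMart and CustomerDB
```

Because each triple stands alone, the model can grow incrementally as individual stewards contribute facts, which is exactly the low-barrier, decentralized quality the abstract emphasizes.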
|
3 |
Health informatics on the Web / Ebenezer, Catherine January 2002 (has links)
An overview of web-based resources for health informatics in the UK.
|
4 |
Measuring information gain in the objective force / Baird, Joseph A. 06 1900 (has links)
Approved for public release; distribution is unlimited. / Many researchers are attempting to quantify or understand the value of information, especially for the Army as it enters its transformation. Information can be decomposed into various qualities; three of these, timeliness, accuracy, and completeness, form the basis for this thesis. The thesis uses a simulation framework developed by the author to analyze these three components of information. The scenario selected is a typical vignette of an Objective Force company-sized element conducting offensive operations against threat elements. Knowledge of the threat was compromised by the presence of decoy elements as well as previously damaged or killed systems (battle damage assessment, BDA). In this scenario the fires are initiated from standoff ranges. The initial and running assessments of the threat composition are based on information provided by sensors on board the unit's organic unmanned aerial vehicles (UAVs). Analysis of the simulation results helps in understanding how components of information quality affect the overall effectiveness of the force, as reflected in an efficiency measure. Additionally, critical thresholds for accuracy, completeness, and timeliness of information are pinpointed to inform Objective Force decision makers. / Major, United States Army
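The abstract does not reproduce the author's simulation framework, but a toy model with purely illustrative parameters shows how the three information-quality components might be swept to locate critical thresholds in an efficiency measure:

```python
# A hedged sketch, not the thesis's framework: a toy Monte Carlo model in
# which an engagement succeeds only if the target was detected (completeness),
# is real rather than a decoy or dead system (accuracy), and the report
# arrived in time (timeliness). With independent events the expected value
# is simply the product of the three rates; the simulation just illustrates
# the threshold-sweeping idea.
import random

def engagement_efficiency(accuracy, completeness, timeliness, trials=10_000):
    """Fraction of simulated engagements that strike a real, live target."""
    hits = 0
    for _ in range(trials):
        detected = random.random() < completeness
        real = random.random() < accuracy
        fresh = random.random() < timeliness
        if detected and real and fresh:
            hits += 1
    return hits / trials

# Sweep accuracy while holding the other two components fixed.
for acc in (0.5, 0.7, 0.9):
    print(acc, engagement_efficiency(acc, completeness=0.8, timeliness=0.9))
```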
|
5 |
Credit risk models under partial information / Modèles de risque de crédit en information partielle / Callegaro, Giorgia 20 October 2010 (has links)
This Ph.D. thesis consists of five independent parts devoted to modeling and studying problems related to default risk under partial information. The first part constitutes the introduction. The second part is devoted to the computation of the survival probabilities of a firm, conditional on the information available to the investor, in a structural model under partial information; we exploit a hybrid numerical technique based on the Monte Carlo method and optimal quantization. In the third part we deal, by means of Dynamic Programming, with a discrete-time problem of maximizing expected utility from terminal wealth in a market where defaultable assets are traded; contagion risk between default times is modeled, as well as model uncertainty, by working under partial information. In the fourth part we study the problem linked to the uncertainty of the investment horizon: in a complete market model subject to default risk, we solve, both with the martingale method and with Dynamic Programming, three problems of maximizing expected utility from consumption, when the investment horizon is fixed, finite but uncertain, and infinite. Finally, the fifth part deals with a purely theoretical problem: in the context of enlargement of filtrations, our aim is to recover, in a specific setting, the known results on the characterization of martingales, on the decomposition of martingales with respect to the reference filtration as semimartingales in the progressively and initially enlarged filtrations, and on the Predictable Representation Theorem.
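As a baseline illustration of the second part's problem, here is a minimal sketch assuming full information and a standard first-passage structural model with illustrative parameters; the thesis's hybrid Monte Carlo/optimal-quantization scheme under partial information is substantially more involved than this plain Monte Carlo estimate:

```python
# Minimal sketch: Monte Carlo estimate of the probability that firm value V,
# modeled as geometric Brownian motion, stays above a default barrier up to
# time T (a Black-Cox-style structural model). All parameters are
# illustrative. Note the discrete grid can miss barrier crossings between
# steps, so this slightly overestimates survival; refinements such as a
# Brownian-bridge correction address that.
import math
import random

def survival_probability(v0=100.0, barrier=70.0, mu=0.03, sigma=0.25,
                         T=1.0, steps=250, paths=20_000):
    dt = T / steps
    survived = 0
    for _ in range(paths):
        v = v0
        alive = True
        for _ in range(steps):
            z = random.gauss(0.0, 1.0)
            v *= math.exp((mu - 0.5 * sigma**2) * dt
                          + sigma * math.sqrt(dt) * z)
            if v <= barrier:  # first passage below the barrier = default
                alive = False
                break
        survived += alive
    return survived / paths

print(survival_probability())  # survival estimate, depends on parameters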
|
6 |
Applying Marchand’s Information Orientation Theory to Sigma Kudos—an Information Product Company / Ke, Yuan January 2011 (has links)
Marchand, Kettinger and Rollins’s (2001) definition of Information Orientation gives an overall understanding of a company’s information management. Unlike Market Orientation, Customer Orientation, and similar constructs, Information Orientation Theory focuses on how the interaction of people, information, and technology affects business performance. It seeks to establish an infrastructure of information technology application capabilities within a business organization that enables effective information use and improved business performance. In this study, we apply Marchand’s Information Orientation Theory to the information technology application system of Sigma Kudos, an information logistics company, using a case-study methodology. We found that Marchand’s Information Orientation Theory and its constructs can be used effectively to examine and measure the information management capability of the company. The study also surfaced several aspects that information product companies need to consider. The main finding is that resources and vision/strategy should be integrated into Information Orientation Theory, as they are also key areas with a significant impact on a company’s business performance.
|
7 |
Enterprise Information Security - Backup Systems Planning and Practice / Lin, Gary 05 July 2002 (has links)
It is well understood that competitiveness is the foundation of business. Efficient information acquisition, distribution, and protection not only improves a business’s competitiveness but also extends business value to both partners and customers. Consequently, information security has become a rigorous and sustained challenge for business.
Thanks to the rapid evolution of information technology, businesses now rely on it widely for their operations. The September 11 catastrophe in the US brought a significant and unforeseen impact: a reassessment of information security with respect to both backup systems and disaster recovery planning. This document explores the status quo of domestic enterprises in this regard, as well as possible obstacles to implementation. Through field research we observed clear differentiation among the industries investigated. We also offer recommendations and raise awareness by applying a generally acknowledged standard, the BS7799 rules and policies. With that in place, enterprises would be able to move faster toward globalization.
For a long time, IT professionals have tended to use tape or jukeboxes as the primary data backup media; today these tools are only one of several alternatives. Drawing on current field work, I introduce high-level technological system frameworks, practices, and experience from international key players in this field. Enterprises are also recommended to begin a Business Impact Analysis (BIA) to outline a proper disaster recovery and contingency plan, for the sake of substantial and continual support of business interests and long-term benefits.
|
8 |
Measuring information gain in the objective force / Baird, Joseph A. January 2003 (has links) (PDF)
Thesis (M.S. in Operations Research)--Naval Postgraduate School, June 2003. / Thesis advisor(s): Eugene P. Paulo, Alvin F. Crowder, Susan M. Sanchez. Includes bibliographical references (p. 49). Also available online.
|
9 |
A system to support clerical review, correction and confirmation assertions in entity identity information management / Chen, Cheng 12 August 2015 (has links)
Clerical review of Entity Resolution (ER) is crucial for maintaining the entity identity integrity of an Entity Identity Information Management (EIIM) system. However, the clerical review process presents several problems. These include Entity Identity Structures (EIS) that are difficult to read and interpret, the excessive time and effort needed to review a large Identity Knowledgebase (IKB), and the duplication of effort in repeatedly reviewing the same EIS within a single EIIM review cycle or across multiple review cycles. Although the original EIIM model envisioned and demonstrated the value of correction assertions, these are applied to correct errors after they have been found. The original EIIM design did not focus on the features needed to support the process of clerical review that finds these errors.

The research presented here extends and enhances the original EIIM model in two significant ways. The first is a design for a pair of confirmation assertions that complement the original set of correction assertions. The confirmation assertions confirm correct linking decisions so that they can be excluded from further clerical review. The second is a design and demonstration of a comprehensive visualization system that supports clerical review and both correction and confirmation assertion configurations in EIIM. This dissertation also describes how the confirmation assertions and the new visualization system have been successfully integrated into the OYSTER open-source EIIM framework.
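A hedged sketch of the core idea behind confirmation assertions, using invented class names rather than OYSTER's actual API: once a reviewer confirms an EIS, it drops out of subsequent review cycles, eliminating the duplicated effort the abstract describes.

```python
# Illustrative only: a toy review queue in which confirmation assertions
# remove confirmed entity identity structures (EIS) from future clerical
# review. Class and method names are hypothetical, not OYSTER's API.
from dataclasses import dataclass

@dataclass
class EntityIdentityStructure:
    eis_id: str
    references: list          # identity references linked into this entity
    confirmed: bool = False   # set by a confirmation assertion

class ReviewQueue:
    def __init__(self, knowledgebase):
        self.kb = knowledgebase

    def pending(self):
        """Only unconfirmed structures need human review this cycle."""
        return [eis for eis in self.kb if not eis.confirmed]

    def confirm(self, eis_id):
        """Apply a confirmation assertion: the reviewer accepts the links."""
        for eis in self.kb:
            if eis.eis_id == eis_id:
                eis.confirmed = True

kb = [EntityIdentityStructure("E1", ["r1", "r2"]),
      EntityIdentityStructure("E2", ["r3"])]
queue = ReviewQueue(kb)
queue.confirm("E1")
print([e.eis_id for e in queue.pending()])  # -> ['E2']
```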
|
10 |
The Global Research Village: A view from the Periphery / Arunachalam, Subbiah January 2002 (has links)
There is a vast difference between the rich and poor countries in every respect. The difference is very pronounced in scientific and technical research, in terms of both volume and impact. Indeed, the distribution of science is even more skewed than the distribution of wealth among nations. Science in the developing countries suffers from poor funding, poor laboratory and library facilities, low productivity and poor visibility. Developing country scientists have access to only a tiny fraction of the information they need, and their own contribution to science is hardly noticed by others. They are often the also-rans in world science and are rarely members of international invisible colleges or collaboratories. It is important that these countries strengthen their scientific research and that their scientists become fully integrated members of the worldwide network of science.
Unfortunately, the transformations effected in the conduct of science with the advent of the new ICTs (such as high-bandwidth Internet) and the ever-increasing cost of subscriptions to journals and secondary services are widening the gulf between the industrialized and developing countries. Ironically, the steep rise in the cost of S&T information has helped Third World scientists in a way, as it forced scientists and librarians in the advanced countries to devise measures to overcome the "serials crisis", many of which can benefit Third World scientists. These include, among others, the Open Archives and E-print Initiatives, the Public Library of Science, the Budapest Open Access Initiative, SPARC (the Scholarly Publishing and Academic Resources Coalition), and BioMed Central. Also, eminent scientists like Bruce Alberts, editors like Richard Smith, and world leaders like Gro Harlem Brundtland are championing the cause of enhanced access to information for Third World scientists. In response to such moves, commercial publishers of journals have allowed free delayed electronic access to a few high-impact journals through institutions such as the HighWire Press of Stanford University. Under WHO's Health InterNetwork, more than 25 commercial publishers have agreed to provide free (or low-cost) web access to about 2,000 biomedical journals for scientists, faculty and students working in universities, hospitals and other public institutions in the poor countries.
To benefit from these initiatives, scientists in the Third World need access to PCs and high-bandwidth Internet, and many of them do not have it. As Bruce Alberts suggests, even if it means subsidising it, such access must be ensured. Agencies such as the Third World Academy of Sciences, the Inter Academy Panel and the Inter Academy Council, and foundations such as the Soros Foundation, Rockefeller Foundation, Andrew Mellon Foundation, and the Bill and Melinda Gates Foundation should work in unison to facilitate the free flow of S&T information for the benefit of scientists and people everywhere. Scientists everywhere should stop publishing in expensive commercial journals and support efforts aimed at democratising access to scientific information. All this is easier said than done. Commercial publishers will not easily let go of the stranglehold they enjoy now, and those who want to bring about drastic changes are dispersed around the world and cannot really act as a cohesive body that can take on the might of the commercial publishers. Mere idealism cannot win.
Scientists in developing countries should take advantage of recent initiatives to open up free and low-cost access to scientific and technical information, examine the pros and cons of the different possibilities that have become available, choose the right options, and enlist the support of key organizations at the national, regional and international levels. They should become proactive.
This is a background paper commissioned by the International Development Research Centre (IDRC).
|