41.
Developing distributed applications with distributed heterogenous databases. Dixon, Eric Richard, 19 May 2010.
This report identifies how Tuxedo fits into the scheme of distributed database processing. Tuxedo is an On-Line Transaction Processing (OLTP) system. Tuxedo was studied because it is the oldest and most widely used transaction processing system on UNIX. That means that it is established, extensively tested, and has the most tools available to extend its capabilities. The disadvantage of Tuxedo is that newer UNIX OLTP systems are often based on more advanced technology. For this reason, other OLTPs were examined to compare their additional capabilities with those offered by Tuxedo.
As discussed in Sections I and II, Tuxedo is modeled according to X/Open's Distributed Transaction Processing (DTP) model. The DTP model includes three pieces: Application Programs (APs), Transaction Monitors (TMs), and Resource Managers (RMs). Tuxedo provides a TM in the model and uses the XA specification to communicate with RMs (e.g., Informix). Tuxedo's TX specification, which defines communication between APs and TMs, is also being considered by X/Open as the standard interface between APs and TMs; there is currently no standard interface between those two pieces. Tuxedo conforms to all of X/Open's current standards related to the model.
Like the other major OLTPs for UNIX, Tuxedo is based on the client/server model. Tuxedo expands that support to include both synchronous and asynchronous service calls, and calls that extension the enhanced client/server model. Tuxedo also expands its OLTP support to allow distributed transactions to include databases on IBM-compatible Personal Computers (PCs) and proprietary mainframe (Host) systems. Tuxedo calls this extension Enterprise Transaction Processing (ETP). The name enterprise comes from the fact that, since Tuxedo supports database transactions spanning UNIX, PC, and Host computers, transactions can span the computer systems of entire businesses, or enterprises.
Tuxedo is not as robust as the distributed database system model presented by Date. Tuxedo requires programmer participation in providing the capabilities that Date says the distributed database manager should provide. The coordinating process is the process that coordinates a global transaction. According to Date's model, agents exist on remote sites participating in the transaction in order to handle the calls to the local resource manager. In Tuxedo, the programmer must provide that agent code in the form of services.
Tuxedo does provide location transparency, but not in the form Date describes. Date describes location transparency as controlled by a global catalog. In Tuxedo, location transparency is provided by the location of servers as specified in the Tuxedo configuration file. Tuxedo also does not provide replication transparency as specified by Date. In Tuxedo, the programmer must write services which maintain replicated records.
Date also describes five problems faced by distributed database managers. The first problem is query processing. Tuxedo provides capabilities to fetch records from databases, but does not provide the capabilities to do joins across distributed databases. The second problem is update propagation. Tuxedo does not provide replication transparency, but it does provide enough capabilities for programmers to reliably maintain replicated records. The third problem is concurrency control, which is supported by Tuxedo. The fourth problem is the commit protocol; Tuxedo uses the two-phase commit protocol. The fifth problem is the global catalog, which Tuxedo does not have.
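The two-phase commit protocol mentioned above can be illustrated with a short, hedged sketch. The Python fragment below is a simplified, hypothetical coordinator loop, not Tuxedo's actual API or implementation: in phase one every participating resource manager is asked to prepare and votes, and only if all vote yes does the coordinator issue a global commit; otherwise it rolls everything back. The participant names are invented for illustration.

```python
# Simplified two-phase commit sketch (illustrative only; not Tuxedo's API).
# Each "participant" stands in for a resource manager such as a database.

class Participant:
    """Hypothetical resource manager taking part in a global transaction."""
    def __init__(self, name):
        self.name = name

    def prepare(self):
        # Phase 1: persist enough state to guarantee a later commit, then vote.
        # A real RM may vote "no" on conflicts or failures.
        print(f"{self.name}: prepared")
        return True

    def commit(self):
        print(f"{self.name}: committed")      # Phase 2a: make changes durable

    def rollback(self):
        print(f"{self.name}: rolled back")    # Phase 2b: undo prepared work


def two_phase_commit(participants):
    """Coordinator for one global transaction."""
    votes = [p.prepare() for p in participants]   # Phase 1: collect votes
    if all(votes):
        for p in participants:                    # Phase 2: everyone commits
            p.commit()
        return "committed"
    for p in participants:                        # Any "no" vote aborts all
        p.rollback()
    return "rolled back"


if __name__ == "__main__":
    print(two_phase_commit([Participant("informix_db"), Participant("remote_db")]))
```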
The other comparison presented in the paper was between Tuxedo and the other major UNIX OLTPs: Transarc's Encina, Top End, and CICS. Tuxedo is the oldest and has the largest market share. This gives Tuxedo the advantage of being the most thoroughly tested and the most stable. Tuxedo also has the most tools available to extend its capabilities. The disadvantage Tuxedo has is that since it is the oldest, it is based on the oldest technology.
Transarc's Encina is the most advanced UNIX OLTP. Encina is based on DCE and supports multithreading. However, Encina has been slow to market and has had stability problems because of its advanced features. Also, since Encina is based on DCE, its success is tied to the success of DCE. Top End is less advanced than Encina, but more advanced than Tuxedo. It is also much more stable than Encina. However, Top End is only now being ported from the NCR machines on which it was originally built. CICS is not yet commercially available. CICS is good for companies with CICS code to port to UNIX and CICS programmers who are already experts. The disadvantage to CICS is that companies which work with UNIX already and do not use CICS will find the interface less natural than Tuxedo, which originated under UNIX. / Master of Science
42.
Monitoring Dengue Outbreaks Using Online Data. Chartree, Jedsada, 05 1900.
Internet technology has affected humans' lives in many disciplines. The search engine is one of the most important Internet tools in that it allows people to search for what they want. Search queries entered in a web search engine can be used to predict dengue incidence. This vector-borne disease causes severe illness and kills a large number of people every year. This dissertation utilizes the capabilities of search queries related to dengue and climate to forecast the number of dengue cases. Several machine learning techniques are applied for data analysis, including Multiple Linear Regression, Artificial Neural Networks, and the Seasonal Autoregressive Integrated Moving Average (SARIMA). Predictive models produced from these machine learning methods are measured for their performance to find which technique generates the best model for dengue prediction. The results of experiments presented in this dissertation indicate that search query data related to dengue and climate can be used to forecast the number of dengue cases. The performance measurement of predictive models shows that Artificial Neural Networks outperform the others. These results will help public health officials in planning to deal with the outbreaks.
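A minimal sketch of the kind of nowcasting comparison described above, using scikit-learn: a multiple linear regression and a small neural network are fit to weekly features and scored with mean absolute error. The feature columns (query volume, temperature, rainfall) and all data are synthetic and purely illustrative; the dissertation's actual datasets, preprocessing, SARIMA model, and tuned architectures are not reproduced here.

```python
# Illustrative nowcasting sketch: predict weekly dengue cases from
# search-query and climate features (synthetic data, hypothetical columns).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n_weeks = 120

# Hypothetical predictors: dengue-related search volume, temperature, rainfall.
X = np.column_stack([
    rng.poisson(50, n_weeks),       # weekly "dengue fever" query volume
    rng.normal(30, 2, n_weeks),     # mean temperature (deg C)
    rng.gamma(2.0, 20.0, n_weeks),  # rainfall (mm)
])
# Synthetic target loosely driven by the predictors plus noise.
y = 2.0 * X[:, 0] + 1.5 * X[:, 2] + rng.normal(0, 10, n_weeks)

X_train, X_test = X[:100], X[100:]
y_train, y_test = y[:100], y[100:]

models = {
    "Multiple Linear Regression": LinearRegression(),
    "Artificial Neural Network": MLPRegressor(hidden_layer_sizes=(16,),
                                               max_iter=5000, random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    mae = mean_absolute_error(y_test, model.predict(X_test))
    print(f"{name}: MAE = {mae:.1f} cases")
```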
43.
On-line real-time information system in manufacturing -- key to survive? January 1990.
by Leung, Brain Shui-kei, Ng, Timmy Kwok-wai. / Thesis (M.B.A.)--Chinese University of Hong Kong, 1990. / Bibliography: leaves 72-73. / ABSTRACT --- p.ii / TABLE OF CONTENTS --- p.iii / LIST OF ILLUSTRATIONS --- p.vi / LIST OF TABLES --- p.vii / ACKNOWLEDGEMENTS --- p.viii / Chapter I. --- INTRODUCTION --- p.1 / Research Methodology --- p.4 / Chapter II. --- FACTORY-FLOOR COMPUTERIZATION OF ELECTRONICS MANUFACTURING INDUSTRY IN HONG KONG --- p.5 / An Overview of the Worldwide Electronics Manufacturing Industry --- p.5 / Electronics Manufacturing Industry in Hong Kong --- p.11 / Production Control Problems in Electronics Manufacturing Firms in Hong Kong --- p.13 / The Survey --- p.15 / The Solution: On-Line Real-Time Information System --- p.24 / Chapter III. --- A STUDY OF THE APPLICABILITY OF THE SUGGESTED ON-LINE REAL-TIME INFORMATION SYSTEM TO A MEDIUM SIZED ELECTRONICS MANUFACTURING FIRM IN HONG KONG / The Company --- p.29 / A Study of the Existing Production Controlling System --- p.30 / Problems with the Existing Production Controlling System --- p.31 / "OLRTIS" --- p.35 / The Design of OLRTIS --- p.36 / Implementation of the Proposed OLRTIS --- p.47 / Cost/Benefits Analysis --- p.49 / Chapter IV. --- PRACTICAL CONSIDERATION --- p.56 / Top Management Attitude --- p.56 / Training of Staff and Workers --- p.57 / Acceptance/Resistance of the New System --- p.57 / "Garbage In, Garbage Out" --- p.59 / Impact of the New System on Management Organization --- p.59 / Human Resources --- p.60 / Chapter V. --- CONCLUSION --- p.61 / APPENDIXES --- p.63 / BIBLIOGRAPHY --- p.72
44.
A prototype to illustrate interaction with a personnel database. Rashid, Haroon, January 2010.
Digitized by Kansas Correctional Industries
45.
Quality of service support for progressive video transmission over Internet. Kim, Minjung, 01 December 2003.
No description available.
46.
Towards automatic understanding and integration of web databases for developing large-scale unified access systems. He, Hai. January 2006.
Thesis (Ph. D.)--State University of New York at Binghamton, Computer Science Department, 2006. / Includes bibliographical references.
47.
Design and Evaluation of Web-Based Economic Indicators: A Big Data Analysis Approach. Blázquez Soriano, María Desamparados, 15 January 2020.
[EN] In the Digital Era, the increasing use of the Internet and digital devices is completely transforming the way of interacting in the economic and social framework. Myriad individuals, companies and public organizations use the Internet for their daily activities, generating a stream of fresh data ("Big Data") principally accessible through the World Wide Web (WWW), which has become the largest repository of information in the world. These digital footprints can be tracked and, if properly processed and analyzed, could help to monitor in real time a wide range of economic variables.
In this context, the main goal of this PhD thesis is to generate economic indicators, based on web data, which are able to provide regular, short-term predictions ("nowcasting") about some business activities that are basic for the growth and development of an economy. Concretely, three web-based economic indicators have been designed and evaluated: first, an indicator of firms' export orientation, which is based on a model that predicts if a firm is an exporter; second, an indicator of firms' engagement in e-commerce, which is based on a model that predicts if a firm offers e-commerce facilities on its website; and third, an indicator of firms' survival, which is based on two models that indicate the probability of survival of a firm and its hazard rate. To build these indicators, a variety of data from corporate websites have been retrieved manually and automatically, and subsequently have been processed and analyzed with Big Data analysis techniques.
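As an illustration of the kind of classification model behind such indicators, the sketch below trains a logistic regression to flag e-commerce adoption from a few website-derived features. The feature names and data are hypothetical and synthetic; the thesis's actual web variables, firm sample, and model specifications are not reproduced here.

```python
# Illustrative sketch: classify whether a firm's website offers online sales,
# using hypothetical website-derived features (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
n_sites = 500

# Hypothetical features extracted from corporate websites:
# cart/checkout keyword present, number of pages, mentions of payment methods.
has_cart_keyword = rng.integers(0, 2, n_sites)
n_pages = rng.integers(5, 500, n_sites)
payment_mentions = rng.integers(0, 10, n_sites)
X = np.column_stack([has_cart_keyword, n_pages, payment_mentions])

# Synthetic label: e-commerce adoption loosely tied to the features.
logit = -2.0 + 3.0 * has_cart_keyword + 0.4 * payment_mentions + 0.002 * n_pages
y = (rng.random(n_sites) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("hold-out accuracy:", round(accuracy_score(y_te, clf.predict(X_te)), 3))
print("P(e-commerce) for a new site:", clf.predict_proba([[1, 120, 4]])[0, 1])
```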
Results show that the selected web data are highly related to the economic variables under study, and that the web-based indicators designed in this thesis capture their real values to a great extent, making them valid for use by academia, firms and policy-makers. Additionally, the digital and online nature of web-based indicators makes it possible to provide timely, inexpensive predictions about the economy. In this way, they are advantageous with respect to traditional indicators.
This PhD thesis has contributed to generating knowledge about the viability of producing economic indicators from corporate website data. The indicators designed here are expected to contribute to the modernization of official statistics and to help policy-makers and business managers make earlier, better-informed decisions. / Blázquez Soriano, MD. (2019). Design and Evaluation of Web-Based Economic Indicators: A Big Data Analysis Approach [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/116836 / Compendio
48.
We the undersigned: anonymous dissent and the struggle for personal identity in online petitions. Riley, Will, 12 February 2009.
Anonymous signatures pose a significant threat to the legitimacy of the online petition as a persuasive form of political communication. While anonymous signatures address some privacy concerns for online petitioners, they often fail to identify petitioners as numerically distinct and socially relevant persons. Since anonymous signatures often fail to personally identify online petitioners, they often fail to provide sufficient reason for targeted political authorities to review and respond to their grievances. To recover the personal rhetoric of the online petition in a way that strikes a balance between the publicity and privacy concerns of petitioners, we should reformat online petitions as pseudonymous social networks of personal testimony between petitioners and targeted political authorities. To this end, the pseudonymous signatures of online petitions should incorporate social frames, co-authored complaints and demands, multimedia voice, and revisable support.
49.
A framework of an effective online help system to support nurses using a nursing information system. Qiu, Yiyu. January 2007.
Thesis (M.Info.Tech.-Res.)--University of Wollongong, 2007. / Typescript. Includes bibliographical references.
50.
Novel Online Data Cleaning Protocols for Data Streams in Trajectory, Wireless Sensor Networks. Pumpichet, Sitthapon, 12 November 2013.
The promise of Wireless Sensor Networks (WSNs) is the autonomous collaboration of a collection of sensors to accomplish specific goals that a single sensor cannot achieve. Sensor networking serves a range of applications by providing raw data as the basis for further analyses and actions. Imprecision in the collected data can badly mislead the decision-making process of sensor-based applications, resulting in ineffectiveness or failure of the application objectives. Because inherent WSN characteristics normally spoil the raw sensor readings, many research efforts attempt to improve the accuracy of the corrupted or "dirty" sensor data. The dirty data need to be cleaned or corrected. However, the existing data cleaning solutions restrict themselves to static WSNs, where deployed sensors rarely move during operation.

Nowadays, many emerging applications relying on WSNs need sensor mobility to enhance application efficiency and usage flexibility. The locations of deployed sensors need to be dynamic, and each sensor functions independently and contributes its own resources. Sensors mounted on vehicles to monitor traffic conditions are one prospective example. Sensor mobility causes transient changes in network topology and in the correlations among sensor streams. Because they rely on static relationships among sensors, the existing methods for cleaning sensor data in static WSNs are invalid in such mobile scenarios. A data cleaning solution that considers sensor movement is therefore actively needed.

This dissertation aims to improve the quality of sensor data by considering the consequences of various trajectory relationships of autonomous mobile sensors in the system. First, we address the dynamic network topology due to sensor mobility. The concept of a virtual sensor is presented and used for spatio-temporal selection of neighboring sensors to help clean sensor data streams; this is one of the first methods to clean data in mobile sensor environments. We also study the mobility pattern of moving sensors relative to the boundaries of sub-areas of interest, and develop a belief-based analysis to determine reliable sets of neighboring sensors to improve cleaning performance, especially when node density is relatively low. Finally, we design a novel sketch-based technique to clean data from internal sensors, where spatio-temporal relationships among sensors cannot yield data correlations among sensor streams.
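A toy sketch of spatio-temporal cleaning in the spirit described above: readings from sensors that were recently close to the dirty sensor are combined, weighted by distance, to produce a corrected value. The protocol details (virtual sensors, belief analysis, sketch summaries) belong to the dissertation; the code below only illustrates the general idea, and the radius, time window, and readings are hypothetical.

```python
# Toy spatio-temporal cleaning sketch for mobile sensor streams.
# A dirty reading is replaced by a distance-weighted estimate computed from
# neighbors observed within a radius and time window (illustrative only).
import math

def clean_reading(dirty, neighbors, radius=50.0, window=10.0):
    """dirty: (x, y, t, value); neighbors: list of (x, y, t, value) tuples."""
    x0, y0, t0, _ = dirty
    weights, values = [], []
    for x, y, t, v in neighbors:
        dist = math.hypot(x - x0, y - y0)
        if dist <= radius and abs(t - t0) <= window:
            weights.append(1.0 / (1.0 + dist))   # closer neighbors count more
            values.append(v)
    if not weights:
        return dirty[3]                          # no evidence: keep original
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# Hypothetical example: an implausible temperature spike corrected from
# readings reported by nearby mobile sensors around the same time.
dirty = (10.0, 12.0, 100.0, 95.0)                # spurious 95-degree reading
neighbors = [(12.0, 10.0, 98.0, 27.5),
             (30.0, 15.0, 101.0, 28.1),
             (200.0, 5.0, 99.0, 26.0)]           # too far away, ignored
print(round(clean_reading(dirty, neighbors), 2))
```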