1

Design And Implementation Of Mobile Patient Data Collection And Transmission System For An Emergency Ambulance

Kosen, Emre, 01 June 2004
In this thesis, a low-cost system called Mobile Ambulance is designed and implemented to collect a patient's medical data in a moving ambulance and transmit it ahead. The aim of the system is to decrease the waiting time for critical-care patients to be seen at the emergency department (ED) and, at the same time, to equip the emergency physician with the essential medical data before the patient arrives at the ED. Mobile Ambulance is a multi-tiered distributed application composed of three components: an ambulance component that captures the patient's essential medical data (EMD) and transmits it to the ED (transmission is wireless via General Packet Radio Service, GPRS); a synchronization component (synch for short) that persists incoming data into the back-end database and warns the emergency physician; and a service component that analyzes the patient's EMD.
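As a hedged illustration of the three-component flow described in this abstract, the sketch below shows one way the pieces could fit together. Everything concrete in it (the EMD fields, JSON as the wire format, SQLite as the back-end store, a console alert) is an assumption for illustration only, not the thesis's actual design:

```python
import json
import sqlite3
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class EssentialMedicalData:
    """Hypothetical shape of the EMD captured in the ambulance (fields assumed)."""
    patient_id: str
    heart_rate: int
    systolic_bp: int
    diastolic_bp: int
    spo2: float
    captured_at: str

def capture_emd(patient_id: str) -> EssentialMedicalData:
    # Stand-in for reading vital signs from onboard monitoring devices.
    return EssentialMedicalData(patient_id, 92, 118, 76, 0.97,
                                datetime.now(timezone.utc).isoformat())

def ambulance_component(emd: EssentialMedicalData) -> bytes:
    # The thesis transmits over GPRS; the JSON wire format here is an assumption.
    return json.dumps(asdict(emd)).encode("utf-8")

def synch_component(payload: bytes, db: sqlite3.Connection) -> None:
    """Persist incoming EMD into the back-end database and warn the physician."""
    emd = json.loads(payload)
    db.execute("CREATE TABLE IF NOT EXISTS emd (patient_id TEXT, heart_rate INTEGER, "
               "systolic_bp INTEGER, diastolic_bp INTEGER, spo2 REAL, captured_at TEXT)")
    db.execute("INSERT INTO emd VALUES (?, ?, ?, ?, ?, ?)",
               (emd["patient_id"], emd["heart_rate"], emd["systolic_bp"],
                emd["diastolic_bp"], emd["spo2"], emd["captured_at"]))
    db.commit()
    print(f"ALERT: incoming patient {emd['patient_id']} (heart rate {emd['heart_rate']})")

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    synch_component(ambulance_component(capture_emd("P-001")), conn)
```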
2

Novel Application Models and Efficient Algorithms for Offloading to Clouds

González Barrameda, José Andrés, January 2017
The application offloading problem for Mobile Cloud Computing aims at improving the mobile user experience by leveraging the resources of the cloud. The execution of the mobile application is offloaded to the cloud, saving energy on the mobile device or speeding up the execution of the application. We improve the accuracy and performance of application offloading solutions in three main directions. First, we propose a novel fine-grained application model that supports complex module dependencies such as sequential, conditional and parallel module executions. The model also allows for multiple offloading decisions that are tailored to the current application, network, or user context. As a result, the model is more precise in capturing the structure of the application and supports more complex offloading solutions. Second, we propose three cost models, namely average-based, statistics-based and interval-based cost models, defined for the proposed application model. The average-based approach models each module cost by its expected value, and the expected cost of the entire application is estimated considering each of the three module dependencies. The novel statistics-based cost model employs Cumulative Distribution Functions (CDFs) to represent the costs of the modules and of the mobile application, the latter estimated from the costs and dependencies of the modules. This cost model opens the door to new statistics-based optimization functions and constraints, whereas the state of the art only supports optimizations based on the average running cost of the application. Furthermore, this cost model can be used to perform statistical analysis of the performance of the application in different scenarios, such as varying network data rates. The last cost model, the interval-based one, represents module costs as intervals in order to address cost uncertainty while having lower requirements and computational complexity than the statistics-based model. The cost of the application is estimated as an expected maximum cost via a linear optimization function. Finally, we present offloading decision algorithms for each cost model. For the average-based model, we present a fast optimal dynamic programming algorithm. For the statistics-based model, we present another fast optimal dynamic programming algorithm for the scenario where the optimization function meets specific properties. For the interval-based cost model, we present a robust formulation that solves a linear number of linear optimization problems. Our evaluations verify the accuracy of the models and show higher cost savings for our solutions when compared to the state of the art.
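To make the average-based cost model concrete, here is a rough sketch, not taken from the thesis, of how an expected application cost could be evaluated over sequential, conditional and parallel module dependencies for a given offloading (placement) decision. The cost numbers, the tuple-based application structure and the max-cost treatment of parallel branches are all assumptions:

```python
SEQ, COND, PAR, MOD = "seq", "cond", "par", "mod"

def expected_cost(node, placement):
    """Expected cost of a composition node under a placement decision.

    node is one of:
      ("mod", name, local_cost, cloud_cost)   a single module
      ("seq", [children])                     sequential execution
      ("cond", [(prob, child), ...])          conditional branches (probs sum to 1)
      ("par", [children])                     parallel branches (cost of the slowest)
    placement maps each module name to "local" or "cloud".
    """
    kind = node[0]
    if kind == MOD:
        _, name, local_cost, cloud_cost = node
        return local_cost if placement[name] == "local" else cloud_cost
    if kind == SEQ:
        return sum(expected_cost(child, placement) for child in node[1])
    if kind == COND:
        return sum(prob * expected_cost(child, placement) for prob, child in node[1])
    if kind == PAR:
        return max(expected_cost(child, placement) for child in node[1])
    raise ValueError(f"unknown node kind: {kind}")

# Toy application: m1, then either m2 (70%) or m3 (30%), then m4 and m5 in parallel.
app = (SEQ, [
    (MOD, "m1", 4.0, 1.5),
    (COND, [(0.7, (MOD, "m2", 6.0, 2.0)), (0.3, (MOD, "m3", 3.0, 1.0))]),
    (PAR, [(MOD, "m4", 5.0, 2.5), (MOD, "m5", 2.0, 2.0)]),
])
placement = {"m1": "local", "m2": "cloud", "m3": "cloud", "m4": "cloud", "m5": "local"}
print(expected_cost(app, placement))  # 4.0 + (0.7*2.0 + 0.3*1.0) + max(2.5, 2.0) = 8.2
```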
3

An Application Framework for Monitoring Care Processes

Baarah, Aladdin, 17 December 2013
Care process monitoring is important in healthcare domains to provide precise and detailed analytics on the patients, providers, and resources participating in a care process and on their status. These analytics are used to keep track of whether the quality-of-care goals set by healthcare organizations are satisfied and to ensure that legislative and organizational guidelines are followed. The complexity of care process monitoring can vary depending on whether the care process takes place in a hospital or out in the community, and on the complexity of the information technology infrastructure that is in place to support the care process. A Care Process Monitoring Application (CPMA) is a software application which collects and integrates data from various sources while a care process is being provided, in order to report on metrics that measure how well the performance goals and guidelines for the care process are being met. In our research, we have studied how CPMAs are built in order to improve the quality of their engineering. The significant challenge in this context is how to engineer a CPMA so that the engineering process is repeatable, produces a CPMA of consistently high quality, and requires less time, effort and complexity. This thesis proposes an application framework for care process monitoring that collects and integrates events from event sources, maintains the individual and aggregate states of the care process, and populates a metrics data mart to support performance reporting. Our contributions are the following: a state-based application meta-model of care process monitoring, a care process monitoring architectural pattern, and a behavior-driven development methodology for CPMAs based on our meta-model and architectural pattern. Our results are validated through three case studies in which we collaborated with two different healthcare organizations, and with their clinicians and researchers, to build and deploy CPMAs for two care processes (one hospital-based, the other community-based).
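As a hedged illustration of the event-to-state-to-metrics flow summarized above (not the framework's actual API), the sketch below invents the care-process states, the event shape and the metric rows purely for demonstration:

```python
from collections import Counter

# Hypothetical care-process state machine: allowed transitions between states.
TRANSITIONS = {
    None: {"referred"},
    "referred": {"triaged"},
    "triaged": {"in_treatment"},
    "in_treatment": {"discharged"},
}

class CareProcessMonitor:
    """Collects events, maintains individual and aggregate state, emits metric rows."""

    def __init__(self):
        self.patient_state = {}      # patient_id -> current state (individual state)
        self.aggregate = Counter()   # state -> number of patients currently in it
        self.metrics_mart = []       # rows destined for a metrics data mart

    def on_event(self, patient_id, new_state, timestamp):
        current = self.patient_state.get(patient_id)
        if new_state not in TRANSITIONS.get(current, set()):
            # Record, but do not apply, events that arrive out of order.
            self.metrics_mart.append(("out_of_order_event", patient_id, new_state, timestamp))
            return
        if current is not None:
            self.aggregate[current] -= 1
        self.patient_state[patient_id] = new_state
        self.aggregate[new_state] += 1
        self.metrics_mart.append(("state_entered", patient_id, new_state, timestamp))

monitor = CareProcessMonitor()
monitor.on_event("patient-7", "referred", "2013-11-02T09:15:00")
monitor.on_event("patient-7", "triaged", "2013-11-02T09:40:00")
print(+monitor.aggregate)   # unary + drops zero counts -> Counter({'triaged': 1})
```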
4

網際網路超媒體資料庫工作量模型產生之研究 / Web-Based Hypermedia Database Benchmark Workload Development

Lin, Yen-Fang (林嬿芳), Unknown Date
Web-based Hypermedia Information Systems (WHIS) enable users to share information through a variety of media, and every such system has its own back-end database in which many forms of data are stored. Hypermedia databases have been created in growing numbers in recent years, but there is no easy way to determine which one provides higher performance. Benchmarking is a vital tool for measuring and evaluating the performance of hypermedia databases. Many benchmark workload models exist for testing general-purpose databases, but few are suitable for hypermedia databases. In this context, we have developed a more generalized, four-component workload model for hypermedia database benchmarking. The components are the object model, the application model, the navigation model, and the control model; each models one key aspect of the hypermedia database workload requirements. We then built a prototype of this workload model to show that the hypermedia benchmark workload model is feasible. This formal and systematic workload method can help users predict and profile the performance of hypermedia database systems. Finally, we discuss our workload model and its relationships with other object-oriented database benchmarks.
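To illustrate how the four components could drive a benchmark run, here is a speculative sketch; the object types, operation mix, link-following probabilities and client counts are invented placeholders, not the parameters of the thesis:

```python
import random

object_model = {            # object types stored in the hypermedia database
    "page":  {"avg_size_kb": 12},
    "image": {"avg_size_kb": 180},
    "video": {"avg_size_kb": 4200},
}

application_model = {       # mix of operations issued against the database
    "read": 0.8, "insert": 0.15, "update": 0.05,
}

navigation_model = {        # probability of following a link to each object type
    "page":  {"page": 0.6, "image": 0.3, "video": 0.1},
    "image": {"page": 0.9, "image": 0.1, "video": 0.0},
    "video": {"page": 1.0, "image": 0.0, "video": 0.0},
}

control_model = {"clients": 4, "requests_per_client": 5, "seed": 42}

def generate_workload():
    """Yield (client, operation, object_type) tuples for one benchmark run."""
    rng = random.Random(control_model["seed"])
    ops, weights = zip(*application_model.items())
    for client in range(control_model["clients"]):
        current = "page"                       # each session starts on a page
        for _ in range(control_model["requests_per_client"]):
            op = rng.choices(ops, weights)[0]
            yield client, op, current
            nxt = navigation_model[current]    # follow a link to the next object type
            current = rng.choices(list(nxt), list(nxt.values()))[0]

for step in generate_workload():
    print(step)
```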
5

BASEL II 與銀行企業金融授信實務之申請進件模型 / Basel II and an Application Model for Banks' Corporate Lending Practice

Chen, Chin-Yun (陳靖芸), Unknown Date
Lending is one of a bank's main sources of profit. With the trend toward internationalization and the government's active promotion of economic liberalization, the domestic financial environment has changed dramatically and competition among financial institutions for corporate lending business has intensified. Domestic economic growth has also slowed in recent years, and the domestic financial storm that broke out in 2000 set off a domino effect of financial crises among corporate groups, caused by excessive credit expansion and over-leveraging at large enterprises, which raised debt ratios and led to repayment difficulties; in addition, banks' credit reviews of corporate loans often rest on the myth that large enterprises cannot fail. How to identify the early symptoms of corporate financial distress and guard against it in advance is therefore one focus of this study in building an application model for corporate lending. Furthermore, the Basel Capital Accord revised in 2002 (Basel II) focuses on implementing bank risk management; the Bank for International Settlements decided to put the new accord into effect in 2006, and Taiwan's amended Regulations Governing the Capital Adequacy of Banks took effect on 31 December 2006. Domestic banks therefore need to build internal risk-assessment systems suited to their own product characteristics, market segmentation, customer profiles, and business models. The second focus of this study is thus to build, under current domestic regulations, an application model that meets the requirements of the foundation internal ratings-based approach to credit risk. The study uses corporate borrowers of a bank for which financial statements are available and builds the model from financial-ratio variables (for example, liquidity ratio, debt ratio, ROA and ROE). Principal component analysis is first used to group all the variables into seven categories: financial structure, operating ability, profitability, solvency, long-term capital indicators, liquidity, and cash flow. A logistic regression model is then estimated, and the results for these factors are reported.
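The modelling pipeline in this abstract (principal component analysis followed by logistic regression on financial ratios) could look roughly like the sketch below. The synthetic borrower data, the 20-ratio width and the library choice are assumptions, and the actual study groups variables into its seven named categories rather than feeding raw components into the regression:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))          # stand-in for 20 financial ratios per borrower
y = (X[:, 0] - 0.5 * X[:, 3] + rng.normal(scale=0.5, size=500) > 0).astype(int)  # default flag

model = make_pipeline(
    StandardScaler(),                   # financial ratios sit on very different scales
    PCA(n_components=7),                # seven factor groups, as in the study
    LogisticRegression(),
)
model.fit(X, y)
print("in-sample accuracy:", model.score(X, y))
```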
