  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Data Privacy Preservation in Collaborative Filtering Based Recommender Systems

Wang, Xiwei 01 January 2015 (has links)
This dissertation studies data privacy preservation in collaborative filtering based recommender systems and proposes several collaborative filtering models that aim to preserve user privacy from different perspectives. An empirical study of multiple classical recommendation algorithms presents the basic idea of the models and explores their performance on real-world datasets. The algorithms investigated include a popularity based model, an item similarity based model, a singular value decomposition based model, and a bipartite graph model. Top-N recommendations are evaluated to examine prediction accuracy. With more customer preference data, recommender systems can better profile customers' shopping patterns, which in turn produces product recommendations with higher accuracy; precautions must therefore be taken to address the privacy issues that arise when data is shared between two vendors. The study shows that matrix factorization techniques are, by their nature, well suited to data privacy preservation. In this dissertation, singular value decomposition (SVD) and nonnegative matrix factorization (NMF) are adopted as the fundamental collaborative filtering techniques for making privacy-preserving recommendations. The proposed SVD based model uses missing value imputation, randomization, and truncated SVD to perturb the raw rating data. The NMF based models, iAux-NMF and iCluster-NMF, take auxiliary information about users and items into account to help with missing value imputation and privacy preservation; they also support efficient incremental data updates. Many online vendors allow people to leave feedback on products, which constitutes users' public preferences.
However, because users' public and private preferences are connected, if a recommender system fails to distinguish real customers from attackers, the private preferences of real customers can be exposed. This dissertation addresses an attack model in which an attacker holds real customers' partial ratings and tries to obtain their private preferences by deceiving the recommender system. To resolve this problem, trustworthiness information is incorporated into NMF based collaborative filtering to detect attackers and deliver suitably different recommendations to normal users and to attackers, thereby effectively protecting users' private preferences.
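The truncated-SVD perturbation idea described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the dissertation's actual model: the column-mean imputation, the matrix layout (users × items), and the rank parameter are all assumptions.

```python
import numpy as np

def perturb_ratings(ratings, rank=2):
    """Sketch of truncated-SVD rating perturbation: impute missing
    entries (NaN) with the column mean, then rebuild the matrix from
    its top-`rank` singular factors, discarding fine-grained detail."""
    filled = ratings.copy()
    col_means = np.nanmean(ratings, axis=0)
    nan_mask = np.isnan(filled)
    # Fill each NaN with the mean of its column (item).
    filled[nan_mask] = np.take(col_means, np.where(nan_mask)[1])

    U, s, Vt = np.linalg.svd(filled, full_matrices=False)
    # Truncation keeps only the dominant structure, so exact
    # individual ratings are no longer recoverable.
    return U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank, :]
```

The perturbed matrix has the same shape as the input and can be shared with a second vendor in place of the raw ratings.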
2

Optimization Strategies for Data Warehouse Maintenance in Distributed Environments

Liu, Bin 30 April 2002 (has links)
Data warehousing is becoming an increasingly important technology for information integration and data analysis. Given the dynamic nature of modern distributed environments, both source data updates and schema changes are likely to occur autonomously and even concurrently in different data sources. Current approaches to maintaining a data warehouse in such dynamic environments schedule maintenance processes sequentially, in isolation, and each maintenance process handles only a single source update. This limits the performance of current data warehouse maintenance systems in a distributed environment, where the maintenance of source updates incurs network delay as well as I/O costs for each maintenance query. In this thesis, we propose two optimization strategies that can greatly improve data warehouse maintenance performance for a set of source updates in such dynamic environments; both support source data updates and schema changes. The first strategy, the parallel data warehouse maintainer, schedules multiple maintenance processes concurrently. Based on the DWMS_Transaction model, we formalize the constraints that arise in maintaining data and schema changes concurrently and propose several parallel maintenance process schedulers. The second strategy, the batch data warehouse maintainer, groups multiple source updates and maintains them within one maintenance process. We propose a technique for compacting the initial sequence of updates and then generating delta changes for each source, along with an algorithm to adapt and maintain the data warehouse extent using these delta changes. A further optimization applies shared queries within the maintenance process. We have designed and implemented both strategies and incorporated them into the existing DyDa/TxnWrap system.
We have conducted extensive experiments on both parallel and batch processing of a set of source updates to study the performance achievable under various system settings. We find that parallel maintenance gains roughly 40-50% over sequential processing in environments with single-CPU machines and little network delay, i.e., without requiring any additional hardware resources. Batch processing achieves an improvement of 400-500% over sequential maintenance, though at the cost of less frequent refreshes of the data warehouse content.
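The update-compaction step of the batch maintainer can be illustrated with a toy sketch. This is only the intuition, under assumed update tuples of the form `(op, key, value)`; the thesis's DyDa/TxnWrap representation, which also covers schema changes, is considerably richer.

```python
from collections import OrderedDict

def compact_updates(updates):
    """Sketch of update compaction: reduce a sequence of
    (op, key, value) source updates to one net change per key.
    An insert followed by a delete cancels out; successive
    changes to the same key collapse to the last value."""
    net = OrderedDict()
    for op, key, value in updates:
        if op == "insert":
            net[key] = ("insert", value)
        elif op == "update":
            prev = net.get(key)
            if prev and prev[0] == "insert":
                net[key] = ("insert", value)  # still a net insert
            else:
                net[key] = ("update", value)
        elif op == "delete":
            prev = net.get(key)
            if prev and prev[0] == "insert":
                del net[key]  # insert + delete cancel out entirely
            else:
                net[key] = ("delete", None)
    return [(op, k, v) for k, (op, v) in net.items()]
```

Compacting before maintenance means each surviving delta pays the network and I/O cost of a maintenance query only once, which is where the batch strategy's speedup comes from.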
3

Hit and Bandwidth Optimal Caching for Wireless Data Access Networks

Akon, Mursalin January 2011 (has links)
For many data access applications, the availability of the most up-to-date information is a fundamental and rigid requirement. Despite many technological improvements, wireless channels (bandwidth) remain the scarcest, and hence most expensive, resources in wireless networks, and data access from remote sites depends heavily on them. With affordable smart mobile devices and the tremendous popularity of various Internet-based services, demand for data from these devices is growing very fast; in many cases it is becoming impossible for wireless data service providers to satisfy this demand with current network infrastructures. An efficient caching scheme at the client side can mitigate the problem by reducing the amount of data transferred over the wireless channels. However, an update event makes the associated cached data objects obsolete and useless to applications. The frequencies of data updates, as well as of data accesses, play essential roles in cache access and replacement policies: intuitively, frequently accessed and infrequently updated objects should be preferred when deciding what to preserve in the cache. Modeling this intuition is challenging, however, particularly in a network environment where updates are injected both by the server and by clients distributed across the network. In this thesis, we make three inter-related contributions. First, we propose two enhanced cache access policies. These policies ensure strong consistency of the cached data objects through proactive or reactive interactions with the data server, and at the same time collect information about the access and update frequencies of hosted objects to support efficient deployment of the cache replacement policy. Second, we design a replacement policy that decides which object to evict when a new object must be accommodated in a fully occupied cache.
The statistical information collected by the access policies drives this decision, which is modeled around the idea of preserving frequently accessed but infrequently updated objects. Third, we show analytically that a cache management scheme combining the proposed replacement policy with either of the cache access policies guarantees an optimal amount of data transmission by increasing the number of effective hits in the cache system. Results from both analysis and extensive simulations demonstrate that the proposed policies outperform the popular Least Frequently Used (LFU) policy in terms of both effective hits and bandwidth consumption. Moreover, our flexible system model makes the proposed policies equally applicable to applications on existing 3G as well as upcoming LTE, LTE Advanced, and WiMAX wireless data access networks.
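The eviction intuition above, keeping objects that are accessed often but updated rarely, can be sketched in a few lines. The ratio score below is an assumption for illustration only; the thesis derives its own metric from the collected statistics.

```python
def choose_victim(cache_stats):
    """Sketch of a replacement rule in the spirit described above:
    evict the object with the worst access-to-update ratio.
    `cache_stats` maps object id -> (access_freq, update_freq)."""
    def score(obj_id):
        accesses, updates = cache_stats[obj_id]
        # +1 in the denominator avoids division by zero for
        # objects that have never been updated.
        return accesses / (1 + updates)
    return min(cache_stats, key=score)
```

A rarely-read, frequently-invalidated object scores lowest and is evicted first, whereas plain LFU would consider only the access count.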
5

Atualização de dados de entrada aplicada à previsão de vazões de curto prazo / Input data updating applied to short-term streamflow forecasting

Ticona Gutierrez, Juan Carlos January 2015 (has links)
This study reviews the problems observed in rainfall-runoff modeling that contribute to the uncertainty in the initial conditions of flow forecasting, surveys the state of the art of some of the short-term flow forecasting models used in Brazil, and reviews the data-updating methodologies employed in previous work. Its main focus, however, is the development of an input-data updating methodology based on correcting the deviation between the simulated flow of a hydrological model and the observed flow by perturbing the precipitation input data. The case study comprises three basins: the Ijuí, Tesouras, and Canoas river basins. These basins were chosen because they have distinct physical and climatic characteristics and because previous studies exist with the hydrological model used in this work. The method was evaluated in three steps: 1) using synthetic series; 2) using real series; 3) flow forecasting with data updating. The first two steps used the model in update ("off-line") mode and the last in forecast ("on-line") mode. Applying the method requires stopping conditions, so two sets of stopping criteria were proposed, and a suitable fixed set was established to enable future applications to other models or other case studies. The short-term forecasting technique was based on forecast rainfall, assuming a known ("perfect") rain forecast. Daily forecasts of up to 7 days were generated over 20 consecutive days for two events with different characteristics in each of the case-study basins.
In forecast mode the results were promising: the initial goal was achieved by both proposed sets of stopping criteria. A significant gain was obtained up to the fourth forecast day, with improvements during the rising limbs of the hydrograph, although during low-flow periods the gain was nearly nil. The work also demonstrated the feasibility of using the IPH II model to generate flow forecasts based on rainfall forecasts.
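The deviation-correction loop described in this abstract can be caricatured as follows. Everything here is an assumption for illustration: the multiplicative correction, the damping factor, and the two stopping criteria (tolerance and iteration cap) are stand-ins, and `simulate` is a placeholder for a real hydrological model such as IPH II.

```python
def update_rainfall(rain, simulate, observed, max_iter=20, tol=0.05):
    """Sketch of input-data updating: scale the rainfall series until
    the simulated flow approaches the observed flow, stopping either
    when the relative deviation is within `tol` (criterion 1) or after
    `max_iter` iterations (criterion 2)."""
    adjusted = list(rain)
    for _ in range(max_iter):
        sim = simulate(adjusted)
        deviation = (observed - sim) / observed
        if abs(deviation) <= tol:        # stopping criterion 1
            break
        # Damped multiplicative perturbation of the rainfall input.
        factor = 1.0 + 0.5 * deviation
        adjusted = [r * factor for r in adjusted]
    return adjusted
```

With the input series corrected this way before the forecast horizon, the model starts from initial conditions consistent with the observed flow, which is what drives the gains reported up to the fourth forecast day.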
8

Návrh algoritmů pro modul informačního systému / Design of algorithms for an information system module

Weinlichová, Jana January 2008 (has links)
This master's thesis deals with the design of algorithms for a new module of a company information system. The beginning of the thesis characterizes the ways in which information systems can be described, and the IBM Lotus Notes environment of the described system is briefly introduced. The next chapter covers the object-oriented analysis and design of the information system module using UML diagrams in the Enterprise Architect modeling tool. The third chapter analyzes and designs the module's connection to the current system, specifically the updating of data in a form. The thesis presents the designed algorithms in the Lotus Domino Designer environment, using the LotusScript and SQL languages and the Lotus Domino Connector to access the database via ODBC. The last part proposes using a mapping tool to map the ICT infrastructure, applying the change management process of the ITIL methodology to manage changes in the developing system effectively.
9

Erstellung von Echtzeitmotormodellen aus den Konstruktionsdaten von Verbrennungsmotoren / Creation of real-time engine models from combustion-engine design data

Kämmer, Alexander 25 January 2004 (has links) (PDF)
Engine management systems in modern vehicles are becoming increasingly extensive and complex. The electronic control units (ECUs), the central components of these systems, derive their functionality from hardware and software and are the result of a long development and production process. Road tests and engine test-bench trials for testing ECUs are very time- and cost-intensive. An alternative is to test ECUs outside their real environment, operating the ECU on a hardware-in-the-loop test bench. The large number of individual and interlinked ECU functions requires a structured and reproducible test procedure. Such tests are, however, only possible after an engine prototype has been completed, since the parameters of the existing models are derived from measured test-bench data. A further question concerns model depth: today's models are based on data averaged over one working cycle, so investigations such as the instantaneous acceleration of crankshafts are not possible. For these reasons, this work examines strategies for the model-based testing of engine management systems and considers the time-saving potential of a new engine model of great depth.
