241

Robust mixtures of regression models

Bai, Xiuqin January 1900 (has links)
Doctor of Philosophy / Department of Statistics / Kun Chen and Weixin Yao / This proposal contains two projects related to robust mixture models. In the first project, we propose a new robust mixture of regression models (Bai et al., 2012). Existing methods for fitting mixture regression models assume a normal distribution for the error and then estimate the regression parameters by the maximum likelihood estimate (MLE). In this project, we demonstrate that the MLE, like the least squares estimate, is sensitive to outliers and heavy-tailed error distributions. We propose a robust estimation procedure and an EM-type algorithm to estimate the mixture regression models. Using a Monte Carlo simulation study, we demonstrate that the proposed estimation method is robust and works much better than the MLE when there are outliers or the error distribution has heavy tails, while working comparably to the MLE when there are no outliers and the error is normal. In the second project, we propose a new robust mixture of linear mixed-effects models. The traditional mixture of linear mixed-effects models, which assumes Gaussian distributions for the random effects and the error terms, is sensitive to outliers. We propose a mixture of linear mixed-effects models with t-distributions to robustify the estimation procedure. An EM algorithm is provided to find the MLE under the assumption of t-distributions for the error terms and the random effects. Furthermore, we propose to choose the degrees of freedom of the t-distribution adaptively using the profile likelihood. In a simulation study, we demonstrate that the proposed model works comparably to the traditional estimation method when there are no outliers and the errors and random effects are normally distributed, but works much better when there are outliers or the distributions of the errors and random effects have heavy tails.
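The abstract sketches the approach without the computational details. As a rough illustration only, here is a minimal EM-type sketch in Python for a mixture of linear regressions with t-distributed errors, one standard way to robustify the normal-MLE fit; the function name, the fixed degrees of freedom, and the crude initialization are assumptions for the example, not the thesis's actual procedure.

```python
import numpy as np
from scipy.stats import t as t_dist

def em_t_mixture_regression(X, y, K=2, df=3.0, n_iter=200, seed=0):
    """EM for a K-component mixture of linear regressions with
    t-distributed errors; X is (n, p), with a column of ones for intercepts."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = rng.normal(size=(K, p))               # crude random start
    sigma = np.full(K, y.std() + 1e-6)
    pi = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E-step: responsibilities under t-distributed errors
        dens = np.stack([pi[k] * t_dist.pdf(y, df, loc=X @ beta[k], scale=sigma[k])
                         for k in range(K)], axis=1)
        r = dens / np.maximum(dens.sum(axis=1, keepdims=True), 1e-300)
        # latent scale weights from the normal scale-mixture form of the t
        delta2 = np.stack([((y - X @ beta[k]) / sigma[k]) ** 2
                           for k in range(K)], axis=1)
        u = (df + 1.0) / (df + delta2)           # downweights outlying points
        # M-step: weighted least squares per component
        for k in range(K):
            w = r[:, k] * u[:, k]
            Xw = X * w[:, None]
            beta[k] = np.linalg.solve(Xw.T @ X, Xw.T @ y)
            sigma[k] = np.sqrt((w * (y - X @ beta[k]) ** 2).sum() / r[:, k].sum())
        pi = r.mean(axis=0)
    return beta, sigma, pi
```

The t-weights u shrink the influence of points with large standardized residuals, which is what makes such a fit robust where the normal-theory MLE is not.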
242

Developing a psychological model of end-users' experience with news Web sites

Aranyi, Gabor January 2012 (has links)
The primary aim of the research project presented in this thesis was to develop and test a comprehensive psychological model of interaction experience with news Web sites. Although news media have been publishing on the Web increasingly since the second half of the 1990s and news sites have become a favoured source of news for many, there is a lack of knowledge about news sites in terms of interaction-experience constructs and their structural relationships. The project aimed to examine people’s use of news sites from the perspective of interaction-experience research by developing a model and, based on this model, to provide guidance for designers of news sites. The project comprises three research phases: (1) exploratory phase, (2) modelling phase and (3) experimental phase. In the exploratory phase, a review of literature and an exploratory study of interaction experience with news Web sites were conducted. The latter explored how users of a particular news site interact with the site and which aspects of their experience they report. Data for the exploratory study were collected with an online questionnaire and by recording participants’ use of a news site under think-aloud instructions. In the modelling phase, an online questionnaire was used to collect answers to psychometric scales that were selected based on the literature review and the exploratory study. A measurement model was formulated to test the relationship between measurement items and the measurement scales, and structural models were formulated to test hypotheses related to the structural relationships of variables. Following the test results, a model of interaction experience with news sites was formulated to predict outcome measures of interaction experience from variables measuring aspects of interaction experience. Components of interaction experience, in turn, were predicted from measures of perceived news-site characteristics. In the experimental phase, an experiment was conducted to test the model of interaction experience with news sites in a controlled setting. Additionally, measures of person- and context characteristics were included in the prediction of components of interaction experience. The model of interaction experience with news sites was supported and accounted for a medium to substantial amount of variance in outcome measures. Finally, design guidance was derived from the model to advance interaction-experience knowledge, and conclusions were drawn regarding the model, in relation to existing research.
243

Essays on Computational Problems in Insurance

Ha, Hongjun 31 July 2016 (has links)
This dissertation consists of two chapters. The first chapter establishes an algorithm for calculating capital requirements. The calculation of capital requirements for financial institutions usually entails a reevaluation of the company's assets and liabilities at some future point in time for a (large) number of stochastic forecasts of economic and firm-specific variables. The complexity of this nested valuation problem leads many companies to struggle with its implementation. This chapter proposes and analyzes a novel approach to the computational problem based on least-squares regression and Monte Carlo simulation. Our approach is motivated by a well-known method for pricing non-European derivatives. We study the convergence of the algorithm and analyze the resulting estimates of practically important risk measures. Moreover, we address the problem of how to choose the regressors, and show that an optimal choice is given by the left singular functions of the corresponding valuation operator. Our numerical examples demonstrate that the algorithm can produce accurate results at relatively low computational cost, particularly when relying on the optimal basis functions. The second chapter discusses another application of regression-based methods, in the context of pricing variable annuities. Advanced life insurance products with exercise-dependent financial guarantees present challenging problems for pricing and risk management. In particular, because the guarantees are complex and practical valuation frameworks include a variety of stochastic risk factors, conventional methods based on discretizing the underlying (Markov) state space may not be feasible. As a practical alternative, this chapter explores the applicability of Least-Squares Monte Carlo (LSM) methods, familiar from American option pricing, in this context. Unlike the previous literature, we consider optionality beyond surrendering the contract, focusing on popular withdrawal benefits (so-called GMWBs) within variable annuities. We introduce different LSM variants, particularly the regression-now and regression-later approaches, and explore their viability and potential pitfalls. We commence our numerical analysis in a basic Black-Scholes framework, where we compare the LSM results to those from a discretization approach. We then extend the model to include various relevant risk factors and compare the results to those from the basic framework.
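As background for how such regression-based valuation works, here is a deliberately simplified sketch of the regression-now idea in a Black-Scholes toy setting: realized discounted payoffs are regressed on a basis in the time-tau state, replacing the inner simulations of a nested approach. The monomial basis, the single put liability, and all parameters are illustrative assumptions; the chapter's optimal basis (the left singular functions of the valuation operator) is not reproduced here.

```python
import numpy as np

def lsm_capital_proxy(n_outer=100_000, S0=100.0, K=100.0, r=0.02,
                      sigma=0.2, tau=1.0, T=2.0, degree=4, seed=1):
    """Regression-now LSM sketch: approximate the conditional value
    V_tau = E[exp(-r(T-tau)) * payoff | S_tau] by least-squares regression
    of realized discounted payoffs on a polynomial basis in S_tau."""
    rng = np.random.default_rng(seed)
    z1 = rng.standard_normal(n_outer)            # outer scenarios to tau
    S_tau = S0 * np.exp((r - 0.5 * sigma**2) * tau + sigma * np.sqrt(tau) * z1)
    z2 = rng.standard_normal(n_outer)            # one continuation per path
    S_T = S_tau * np.exp((r - 0.5 * sigma**2) * (T - tau)
                         + sigma * np.sqrt(T - tau) * z2)
    payoff = np.exp(-r * (T - tau)) * np.maximum(K - S_T, 0.0)  # put liability
    basis = np.vander(S_tau / S0, degree + 1)    # simple monomial basis
    coef, *_ = np.linalg.lstsq(basis, payoff, rcond=None)
    V_tau = basis @ coef                         # proxy for the conditional value
    return np.quantile(V_tau, 0.995)             # a VaR-style capital figure
```

One regression over the outer scenarios replaces an inner simulation per scenario, which is where the computational savings the chapter analyzes come from.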
244

The relationship between lean service, activity-based costing and business strategy and their impact on performance

Hadid, Wael January 2014 (has links)
Lean systems have drawn the attention of researchers and practitioners since their emergence in the 1950s. This is reflected in the increasing number of companies attempting to implement lean practices and the large number of researchers investigating their effectiveness and identifying the contextual factors that affect implementation. The rising level of interest in lean systems has led to the emergence of three distinct streams of literature. The first stream has focused on the effectiveness of lean systems. However, this literature is limited, as it has mainly examined the additive impact of lean practices on operational performance in the manufacturing context. The second stream has focused on the role of the accounting system in the lean context. In this body of literature, researchers agree on the superiority of the activity-based costing (ABC) system over the traditional accounting system in supporting the implementation of lean practices. However, most studies in this strand of literature are either conceptual or case-based. The third stream has focused on the fit between business strategy and lean systems. However, inconclusive results have been reported on the suitability of lean systems for firms adopting a differentiation strategy versus a cost leadership strategy. The aim of this study is to develop and empirically test a conceptual model which integrates the three streams of literature to extend their focus and overcome their limitations. More specifically, the model developed in the current study highlights not only the additive impact of lean practices but also the possible synergy among those practices in improving both the operational and financial performance of service firms. In addition, the model brings to light the potential intervening role of ABC in the strategy-lean association. After identifying and reviewing the relevant literature, socio-technical systems theory and contingency theory were used to develop the conceptual model and the associated hypotheses. A questionnaire instrument was designed to collect empirical data, which was supplemented by objective data from the Financial Analysis Made Easy database, in order to test the conceptual model empirically using partial least squares structural equation modelling (PLS-SEM). The findings indicated that while the technical practices of lean service improved only the operational performance of service firms, the social practices enhanced both operational and financial performance. In addition, the two sets of practices positively interacted to improve firm performance over and above the improvement achieved from each set separately. Moreover, ABC was found to have a positive association with lean practices, and consequently an indirect positive relation with operational performance. Finally, both the differentiation and cost leadership strategies had a direct positive relationship with lean practices. However, while ABC was found to partially mediate the differentiation-lean association, it suppressed the cost leadership-lean association, leading to a case of inconsistent mediation. 
The current study contributes to the literature at several levels. First, at the theoretical level, it develops a conceptual framework which crosses different streams of literature: mainly the lean systems literature, the management accounting literature (with a focus on ABC) and the business strategy literature. Unlike previous studies, by integrating the perspectives of socio-technical systems theory and contingency theory, the model (i) highlights not only the additive but also the synergistic effect of lean service practices on firm performance, (ii) brings to light the direct impact of ABC and business strategy on lean service practices, and the intervening role of ABC through which business strategy is assumed to have an indirect influence on lean practices as well, and (iii) offers an alternative view of how ABC can improve firm performance by enhancing other organisational capabilities (lean practices) which are expected to improve performance. Second, at the methodological level, unlike previous studies, this study includes a large number of lean service practices and contextual variables in order to report more precisely on the lean-performance association. In addition, the inclusion of the financial performance dimension (measured by secondary data) in the model, besides operational performance, is critical to understanding the full capability of lean service to improve firm performance. Further, employing a powerful statistical technique (PLS-SEM) lends more credibility to the results reported in this study. Third, at the empirical level, this study is conducted in the UK service sector. As such, it is one of the very few studies to report on lean service and examine how the adoption of ABC and a specific type of business strategy can affect its implementation, using empirical survey data from this context.
245

Supervised Descent Method

Xiong, Xuehan 01 September 2015 (has links)
In this dissertation, we focus on solving nonlinear least squares problems using a supervised approach. In particular, we developed the Supervised Descent Method (SDM), performed a thorough theoretical analysis, and demonstrated its effectiveness on optimizing analytic functions and on four real-world applications: Inverse Kinematics, Rigid Tracking, Face Alignment (frontal and multi-view), and 3D Object Pose Estimation. In Rigid Tracking, SDM was able to take advantage of more robust features, such as HOG and SIFT; such non-differentiable image features were out of reach for previous work, which relied on gradient-based methods for optimization. In Inverse Kinematics, where we minimize a non-convex function, SDM achieved significantly better convergence than gradient-based approaches. In Face Alignment, SDM achieved state-of-the-art results; moreover, it is extremely computationally efficient, which makes it applicable to many mobile applications. In addition, we provided a unified view of several popular methods, including SDM, on sequential prediction, reformulating them as a sequence of function compositions. Finally, we suggest some future research directions for SDM and sequential prediction.
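As a concrete illustration of the idea of learning descent directions from data, here is a generic sketch for solving h(x) = y in the least-squares sense: a cascade of linear maps is fit by ridge regression from feature residuals to parameter updates. The ridge regularization, stage count, and sampling scheme are assumptions for the example, not the dissertation's exact formulation.

```python
import numpy as np

def train_sdm(h, sample_target, n_train=2000, n_stages=5, noise=0.5,
              ridge=1e-6, seed=0):
    """Learn a cascade of descent maps R_k mapping residuals h(x_k) - y
    to parameter updates, so no analytic gradient of h is needed.
    sample_target(rng, n) is a user-supplied sampler of true parameters."""
    rng = np.random.default_rng(seed)
    x_star = sample_target(rng, n_train)                 # (n, d) true params
    y_star = np.array([h(x) for x in x_star])            # (n, m) target features
    x = x_star + noise * rng.standard_normal(x_star.shape)   # perturbed starts
    maps = []
    for _ in range(n_stages):
        resid = np.array([h(xi) for xi in x]) - y_star   # (n, m)
        target = x_star - x                              # ideal updates
        A = resid.T @ resid + ridge * np.eye(resid.shape[1])
        R = np.linalg.solve(A, resid.T @ target)         # ridge regression
        maps.append(R)
        x = x + resid @ R                                # apply learned descent
    return maps

def apply_sdm(h, maps, x0, y):
    """Run the learned cascade from x0 toward a solution of h(x) = y."""
    x = x0.copy()
    for R in maps:
        x = x + (h(x) - y) @ R
    return x
```

Because the maps are learned from (residual, update) pairs, h only ever needs to be evaluated, never differentiated, which is what lets SDM exploit non-differentiable features such as HOG and SIFT.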
246

Sparse Linear Modeling of Speech from EEG / Gles Linjära Modellering av Tal från EEG

Tiger, Mattias January 2014 (has links)
For people with hearing impairments, attending to a single speaker against a multi-talker background can be very difficult, and it is something current hearing aids can barely help with. Recent studies have shown that the audio stream a person is focusing on can be identified among the surrounding audio streams using EEG and linear models. This raises the possibility of using EEG to unconsciously control future hearing aids, so that the attended sounds are enhanced while the rest are damped. For such hearing aids to be useful in everyday life, they should rely on something other than a motion-sensitive, precisely placed EEG cap. This could possibly be achieved by placing the electrodes together with the hearing aid in the ear. One of the leading hearing aid manufacturers, Oticon, and its research lab, Erikholm Research Center, have recorded an EEG data set of people listening to sentences, in which electrodes were placed in and closely around the ears. We analyzed the data set by applying a range of signal processing approaches, mainly in the context of audio estimation from EEG. Two types of sparse linear models based on L1-regularized least squares are formulated and evaluated, providing automatic dimensionality reduction in that they significantly reduce the number of channels needed. The first model is based on linear combinations of spectrograms and the second on linear temporal filtering. We investigated the usefulness of the in-ear electrodes and found some positive indications: all models explored rank the in-ear electrodes as the most important, or among the more important, of the 128 electrodes in the EEG cap. This could be a positive indication of the future possibility of using only electrodes in the ears for hearing aids.
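To make the second model type concrete, here is a minimal sketch of sparse linear temporal filtering with an L1 penalty, using scikit-learn's Lasso; the lag count, the regularization strength, and the per-channel importance summary are illustrative assumptions rather than the thesis's exact models.

```python
import numpy as np
from sklearn.linear_model import Lasso

def fit_eeg_to_audio(eeg, envelope, lags=32, alpha=0.01):
    """L1-regularized least squares from lagged EEG to the audio envelope.
    eeg: (n_samples, n_channels); envelope: (n_samples,). The L1 penalty
    drives most channel/lag weights to zero, giving the automatic
    dimensionality reduction described above."""
    n, c = eeg.shape
    # stack lagged copies of every channel: (n - lags, c * lags)
    X = np.hstack([eeg[lags - k : n - k, :] for k in range(lags)])
    y = envelope[lags:]
    model = Lasso(alpha=alpha, max_iter=10_000).fit(X, y)
    # aggregate |weight| per channel, e.g. to rank the in-ear electrodes
    w = np.abs(model.coef_).reshape(lags, c).sum(axis=0)
    return model, w / max(w.sum(), 1e-12)
```

Ranking channels by their aggregated absolute weights is one simple way to ask, as the thesis does, whether the in-ear electrodes end up among the most important.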
247

TEACHING AN ALGEBRAIC EQUATION TO HIGH SCHOOL STUDENTS WITH MODERATE TO SEVERE INTELLECTUAL DISABILITY

Chapman, Suzannah M. 01 January 2016 (has links)
The purpose of this study was to examine the effectiveness of using the system of least prompts and concrete representations to teach students with moderate and severe disabilities (MSD) to solve simple linear equations. A multiple-probe (days) across participants, single case research design was used to evaluate the effectiveness of task analytic instruction along with concrete representation on teaching students with MSD to solve algebraic equations. The results showed the system of least prompts and concrete representations were effective in teaching students with MSD to solve simple linear equations.
248

The role of the human nasal cavity in patterns of craniofacial covariation and integration

Lindal, Joshua 18 January 2016 (has links)
Climate has a selective influence on nasal cavity morphology. Due to the constraints of cranial integration, naturally selected changes in one structure necessitate changes in others in order to maintain structural and functional cohesion. The relationships between climate and skull/nasal cavity morphology have been explored, but the integrative role of nasal variability within the skull as a whole has not. This thesis presents two hypotheses: 1) patterns of craniofacial integration observed in 2D can be reproduced using 3D geometric morphometric techniques; 2) the nasal cavity exhibits a higher level of covariation with the lateral cranial base than with other parts of the skull, since differences in nasal morphology and basicranial breadth have both been linked to climatic variables. The results support the former hypothesis, but not the latter; covariation observed between the nasal cavity and other cranial modules may suggest that these relationships are characterized by a unique integrative relationship. / February 2016
249

Aspects of model development using regression quantiles and elemental regressions

Ranganai, Edmore 03 1900 (has links)
Dissertation (PhD)--University of Stellenbosch, 2007. / ENGLISH ABSTRACT: It is well known that ordinary least squares (OLS) procedures are sensitive to deviations from the classical Gaussian assumptions (outliers) as well as to data aberrations in the design space. The two major data aberrations in the design space are collinearity and high leverage. Leverage points can also induce or hide collinearity in the design space; such leverage points are referred to as collinearity-influential points. As a consequence, over the years many diagnostic tools have been developed to detect these anomalies, together with alternative procedures to counter them. To counter deviations from the classical Gaussian assumptions, many robust procedures have been proposed. One such class of procedures is the Koenker and Bassett (1978) Regression Quantiles (RQs), which are natural extensions of order statistics to the linear model. RQs can be found as solutions to linear programming problems (LPs). The basic optimal solutions to these LPs (which are RQs) correspond to elemental subset (ES) regressions, which consist of subsets of minimum size for estimating the necessary parameters of the model. On the one hand, some ESs correspond to RQs; on the other hand, the literature shows that many OLS statistics (estimators) are related to ES regression statistics (estimators). There is therefore an inherent relationship among the three sets of procedures. The relationship between the ES procedure and the RQ one has been noted almost "casually" in the literature, while the latter has been fairly widely explored. Using these existing relationships between the ES procedure and the OLS one, as well as new ones, collinearity, leverage and outlier problems in the RQ setting were investigated. A lasso procedure was also proposed as a variable selection technique in the RQ setting, and some tentative, promising results were given for it. Single-case diagnostics were considered, as well as their relationships to multiple-case ones. In particular, multiple cases of the minimum size needed to estimate the parameters of the model were considered, corresponding to an RQ (ES). In this way regression diagnostics were developed for both ESs and RQs. The main problems that affect RQs adversely are collinearity and leverage, due to the nature of the computational procedures and the fact that RQs' influence functions are unbounded in the design space but bounded in the response variable. As a consequence, RQs have a high affinity for leverage points and a high exclusion rate of outliers; the influence picture exhibited in the presence of both leverage points and outliers is the net result of these two antagonistic forces. Although RQs are bounded in the response variable (and therefore fairly robust to outliers), outlier diagnostics were also considered in order to obtain a more holistic picture. The investigation comprised analytic derivations as well as simulation, with applications to artificially generated data sets and standard data sets from the literature. These revealed that the ES-based statistics can be used to address problems arising in the RQ setting with some degree of success. However, due to the interdependence between the different aspects, namely between leverage and collinearity and between leverage and outliers, "solutions" are often dependent on the particular situation. 
In spite of this complexity, the research did produce some fairly general guidelines that can be fruitfully used in practice.
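For readers unfamiliar with the LP formulation mentioned above, here is a minimal sketch of computing a single regression quantile with scipy; splitting the residual into nonnegative parts u and v turns the check-loss minimization into a linear program. The tolerance used to read off the elemental subset is an assumption of the example.

```python
import numpy as np
from scipy.optimize import linprog

def regression_quantile(X, y, tau=0.5):
    """Koenker-Bassett regression quantile via linear programming:
    min_b sum rho_tau(y - Xb)  <=>  min tau*1'u + (1-tau)*1'v
    subject to Xb + u - v = y, u >= 0, v >= 0, b free."""
    n, p = X.shape
    c = np.concatenate([np.zeros(p), tau * np.ones(n), (1 - tau) * np.ones(n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    beta = res.x[:p]
    # the basic optimal solution interpolates p observations
    basis = np.flatnonzero(np.isclose(y - X @ beta, 0.0, atol=1e-8))
    return beta, basis
```

The p interpolated observations returned in `basis` form the elemental subset the RQ corresponds to, which is precisely the ES/RQ link the dissertation builds on.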
250

Στεγανογραφία ψηφιακών εικόνων / Digital image steganography

Μπαλκούρας, Σωτήριος 14 October 2013 (has links)
The growth of the internet in recent years has brought changes in the size and quality of the available content. Users are literally flooded with information, which may take various forms such as text, audio, images and video. The wide spread of the internet, easy searching over large amounts of information and the presentation of content in a user-friendly way have contributed to an ever-growing demand for images, video and music. With the digitization of most of the content that users handle in both their personal and professional lives, new steganography techniques have been developed for exchanging hidden information, a concept widely known since antiquity. This thesis implements two of the most popular steganography algorithms: LSB (Least Significant Bit) and LBP (Local Binary Pattern). The system developed is available on the web and can be used by any user who wishes to hide information (text or an image) inside an image. The system implements the full steganography cycle, allowing the user not only to hide the desired information but also to perform the reverse process, i.e. to recover the hidden information. 
The process is simple: the sender (the one hiding the message) uploads the cover image to the system, enters a secret key, which must be known in order to retrieve the message, and of course the message itself, i.e. the information to be hidden. The receiver then recovers the message by uploading the stego image to the system along with the secret key agreed with the sender. Finally, a number of usage scenarios are run and measurements taken, which show the performance of each algorithm, and the corresponding comparisons are made. The system implemented in this thesis can be extended with additional steganography methods, as well as with an extension of the LBP algorithm so that all three colour components are used to hide the information. It would also be of particular interest to offer this process as a web service, so that it can be used independently and introduced as a standalone piece of software in any platform that supports web services.
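As an illustration of the first of the two algorithms, here is a minimal LSB embed/extract sketch; the thesis's web system also involves a secret key (which could, for instance, seed a pseudo-random permutation of pixel indices) and a stored message length, both omitted here as simplifying assumptions.

```python
import numpy as np

def lsb_embed(image, message):
    """Hide message bytes in the least significant bits of a uint8 image.
    Capacity is image.size // 8 bytes; returns a new stego image."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = image.flatten()                      # copy; cover stays untouched
    if bits.size > flat.size:
        raise ValueError("message too large for this cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite LSBs
    return flat.reshape(image.shape)

def lsb_extract(stego, n_bytes):
    """Recover n_bytes of hidden data from the stego image's LSBs."""
    bits = stego.flatten()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()
```

Because only the lowest bit of each pixel value changes, the stego image is visually indistinguishable from the cover, which is the property the thesis's performance measurements quantify.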
