  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
941

Critical Firm-based Enablers-Mediators-Outcomes (CFEMOs): a new integrated model for product innovation performance drivers in the context of U.S. restaurants

Ali, Mohamed Farouk Shehata January 2016 (has links)
This study develops an original theoretical model of critical, managerially controllable factors with high potential for achieving significant improvements in the intermediate and ultimate outcomes of product innovation efforts. To this end, the author draws on the relevant empirical literature and integrates four complementary theoretical perspectives: the critical success factors (CSFs) approach, the resource-based view (RBV), the input-process-output (IPO) model, and the systems approach. The model (hereafter CFEMOs) aims to explicate the simultaneous direct and indirect/mediated interrelationships among the product innovation's critical firm-based enablers (new-product fit to the firm's skills and resources, internal cross-functional integration, and top-management support), process execution proficiency, and performance outcomes (operation-level, product-level, and firm-level performance). Additionally, it aims to predict variation in process execution proficiency and the performance outcomes. The CFEMOs model was empirically tested using an online survey completed by 386 U.S. restaurant owners/senior executives on their recently innovated new menu items. Using partial least squares structural equation modelling, the statistical analysis substantiated that, compared to the models of the extant empirical studies, the CFEMOs model has a broader scope and superior predictive power. It simultaneously explains 72% of process execution proficiency, 67% of new menu-item superiority (quality, speed to market, and cost-efficiency), 76% of new menu-item performance (customer satisfaction, sales, and profits), and 75% of the new menu-item's contribution to overall restaurant performance (sales, profits, and market share).
Furthermore, this study established that restaurateurs who concurrently succeed in enhancing their internal cross-functional integration, top-management support, and new-product fit to the firm's skills and resources (ranked in descending order of importance) achieve high process execution proficiency, which subsequently grants them superior operation-level, product-level, and firm-level performance. The thesis concludes by providing several key original contributions and crucial implications for product innovation research and practice, as well as several promising avenues for future research.
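The predictive power reported above is given as shares of variance explained (R² values from the PLS-SEM estimation). As a minimal illustration of that statistic only, not of the thesis's estimation procedure, the coefficient of determination can be computed as:

```python
def r_squared(actual, predicted):
    """Coefficient of determination: share of variance in `actual`
    explained by `predicted`, i.e. 1 - SS_res / SS_tot."""
    mean = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean) ** 2 for a in actual)
    return 1 - ss_res / ss_tot

# Toy numbers, not the thesis data: predictions that track the outcome
# closely yield an R² near 1.
print(r_squared([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8]))
```

An R² of 0.72 for process execution proficiency means the model's predictors account for 72% of its variance in this sense.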
942

Task Oriented Privacy-preserving (TOP) Technologies Using Automatic Feature Selection

Jafer, Yasser January 2016 (has links)
A large amount of digital information collected and stored in datasets creates vast opportunities for knowledge discovery and data mining. These datasets, however, may contain sensitive information about individuals, and it is therefore imperative to ensure that their privacy is protected. Most research in privacy-preserving data publishing makes no assumption about the analysis task to be applied to the dataset. In many domains, however, such as healthcare and finance, it is possible to identify the analysis task beforehand, and incorporating knowledge of the ultimate analysis task may improve the quality of the anonymized data while still protecting the privacy of individuals. Furthermore, the existing research that does consider the ultimate analysis task (e.g., classification) is not suitable for high-dimensional data. We show that automatic feature selection (a well-known dimensionality-reduction technique) can be utilized to address both privacy and utility simultaneously. In doing so, we show that feature selection can enhance existing privacy-preserving techniques addressing k-anonymity and differential privacy, protecting privacy while reducing the amount of modification applied to the dataset and hence, in most cases, achieving higher utility. We consider incorporating the concept of privacy-by-design within the feature selection process, and propose techniques that turn filter-based and wrapper-based feature selection into privacy-aware processes. To this end, we build a layer of privacy on top of the regular feature selection process and obtain a privacy-preserving feature selection that is guided not only by accuracy but also by the amount of protected private information. In addition to considering privacy after feature selection, we introduce a framework for a privacy-aware feature selection evaluation measure.
That is, we incorporate privacy during feature selection and obtain a list of candidate privacy-aware attribute subsets that consider (and satisfy) both efficacy and privacy requirements simultaneously. Finally, we propose a multi-dimensional, privacy-aware evaluation function that incorporates efficacy, privacy, and dimensionality weights and enables the data holder to obtain the best attribute subset according to its preferences.
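A filter-based, privacy-aware selection of the kind described can be sketched as a scoring rule that rewards relevance to the target while penalizing leakage about a sensitive attribute. The scoring below (mutual information with a penalty weight `lam`) is an illustrative stand-in, not the thesis's actual evaluation measure:

```python
from collections import Counter
from math import log2

def mutual_info(xs, ys):
    """Mutual information (in bits) between two discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def privacy_aware_ranking(features, target, sensitive, lam=1.0):
    """Rank features by relevance to `target` minus a privacy penalty
    proportional to what they reveal about `sensitive`."""
    scores = {name: mutual_info(col, target) - lam * mutual_info(col, sensitive)
              for name, col in features.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Tiny synthetic example: "habit" predicts the class and leaks nothing
# about the sensitive attribute, so it outranks "zip_code".
features = {"zip_code": [0, 0, 1, 1], "habit": [0, 1, 0, 1]}
ranked = privacy_aware_ranking(features, target=[0, 1, 0, 1], sensitive=[0, 0, 1, 1])
print(ranked)  # → ['habit', 'zip_code']
```

Raising `lam` shifts the ranking toward privacy; lowering it toward pure predictive utility, mirroring the efficacy/privacy weights of the proposed evaluation function.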
943

Analýza mezery DPH / The Analysis of the VAT gap

Zídková, Hana January 2013 (has links)
The dissertation describes methods for calculating the VAT gap: the difference between the theoretical VAT liability in the whole economy and accrued VAT receipts, expressed relative to the theoretical VAT liability. It approximates the extent of VAT evasion. The VAT gap in the Czech Republic in the years 2002 to 2010 averaged 16 to 17%, and rose steadily from 11% to 26% over the last five years of that period. The thesis further analyses the determinants of the VAT gap in the EU member states in the years 2000 to 2006. The share of VAT in total tax revenues and the tax quota were identified as the main tax factors reducing the VAT gap. On the other hand, the complexity of the VAT system, expressed by the number of VAT rates, the difference between them, and the number of preliminary questions referred to the European Court of Justice, increases the VAT gap. A further significant factor, influencing the VAT gap in the negative direction, is the Corruption Perceptions Index.
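The gap definition above is simple arithmetic; a minimal sketch, using hypothetical figures rather than the actual Czech data:

```python
def vat_gap(theoretical_liability, accrued_receipts):
    """Relative VAT gap: the share of the theoretical VAT liability
    that was not actually collected."""
    return (theoretical_liability - accrued_receipts) / theoretical_liability

# Illustrative numbers only: a liability of 100 with 83 collected.
gap = vat_gap(theoretical_liability=100.0, accrued_receipts=83.0)
print(f"VAT gap: {gap:.0%}")  # prints: VAT gap: 17%
```

The hard part of the exercise, estimating the theoretical liability from national accounts, is what the dissertation's methods actually address.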
944

Sur quelques problèmes algorithmiques relatifs à la détermination de structure à partir de données de spectrométrie de masse / Topics in mass spectrometry based structure determination

Agarwal, Deepesh 18 May 2015 (has links)
Mass spectrometry (MS), an analytical technique initially invented to deal with small molecules, has emerged over the past decade as a key approach in structural biology. Recent advances have made it possible to transfer large macromolecular assemblies into the vacuum without dissociation, raising challenging algorithmic problems. This thesis contributes to three such problems.
The first contribution deals with stoichiometry determination (SD): determining the number of copies of each subunit of an assembly from mass measurements. We address the interval SD problem, where the target mass belongs to an interval accounting for mass-measurement uncertainties. We present a constant-memory-space algorithm (DIOPHANTINE) and an output-sensitive dynamic-programming algorithm (DP++), outperforming state-of-the-art methods on both integer-type and float-type problems. The second contribution deals with the inference of pairwise contacts between subunits, using a list of sub-complexes whose composition is known. We introduce the Minimum Connectivity Inference (MCI) problem and present two algorithms solving it. We also show excellent agreement between the contacts reported by these algorithms and those determined experimentally. The third contribution deals with Minimum Weight Connectivity Inference (MWCI), a variant in which candidate edges carry weights reflecting their likelihood. In particular, we present a bootstrap algorithm that reports a set of edges with improved sensitivity and specificity relative to those obtained by solving MCI.
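The interval SD problem can be illustrated by a naive enumeration over copy numbers. This brute-force sketch, with made-up subunit masses, only states the problem that DIOPHANTINE and DP++ solve far more efficiently:

```python
def stoichiometries(masses, lo, hi):
    """Enumerate copy-number vectors (c1, ..., ck) such that
    lo <= sum(ci * mi) <= hi.

    Naive branch-and-bound; pruning relies on masses being positive.
    """
    out = []

    def rec(i, total, counts):
        if i == len(masses):
            if lo <= total <= hi:
                out.append(tuple(counts))
            return
        c = 0
        while total + c * masses[i] <= hi:  # prune: adding more only grows the mass
            rec(i + 1, total + c * masses[i], counts + [c])
            c += 1

    rec(0, 0.0, [])
    return out

# Hypothetical subunit masses (in kDa) and a target mass window:
sols = stoichiometries([10.0, 25.0], lo=44.0, hi=46.0)
print(sols)  # → [(2, 1)]
```

The interval [lo, hi] plays the role of the mass-measurement uncertainty; the thesis's algorithms avoid the exponential enumeration this sketch performs.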
945

ORGANISATION LIFE CYCLE AND COUNTRY SOCIOECONOMIC CHARACTERISTICS IMPACT ON TOP MANAGEMENT TEAM CHARACTERISTICS / Vliv životního cyklu organizace a socio-ekonomické charakteristiky země na charakteristiku vrcholového managementu

Velinov, Emil Iordanov January 2009 (has links)
The dissertation examines the impact of the Organizational Life Cycle (OLC) and Country Socio-Economic Characteristics (CSEC) on Top Management Team (TMT) characteristics. It first elaborates and establishes the theoretical link between the OLC, CSEC, and TMT characteristics. Second, a quantitative empirical study is conducted to test the impact of the OLC phases and CSEC on TMT characteristics. The dissertation outlines a detailed research methodology, based on the state of the art in the areas of OLC, TMT, and CSEC, implemented to answer the key research questions within the scope of the thesis. The data set covers the 300 largest Swiss, German, and Czech companies at year-end 2011, including detailed data on country socio-economic characteristics and the career backgrounds of all TMT members (executive boards) at these companies at the end of 2011. A detailed procedure is developed to accurately classify organizations into different life cycle phases, drawing extensively on existing literature and scales. Multilevel data analysis techniques are employed to understand how the different organization life cycle phases influence both the level of TMT characteristics and changes in TMT composition and diversity due to inbound and outbound mobility of top managers over time. Substantial research synergies and knowledge-transfer effects are expected to emanate from this dissertation. Regression and correlation analyses are applied for each phase of the companies' OLC in Switzerland, Germany, and the Czech Republic. The dissertation finds that the more mature the company, the more diversified its TMT, regardless of country.
The country also plays its own role in the relationship between the OLC and TMT characteristics: Switzerland and Germany are more diversified than the Czech Republic in terms of TMT characteristics such as gender diversity, age diversity, nationality diversity, educational background, TMT dominant functions, and TMT career length. The thesis contributes to research by revealing relationships among TMT, CSEC, and OLC theories. It also develops methods and techniques for finding the interconnections between the OLC phases, CSEC, and TMT characteristics, and outlines future research gaps in the area of TMT.
946

Evaluating Speedup in Parallel Compilers

Komathukattil, Deepa V 01 January 2012 (has links)
Parallel programming is prevalent in every field, mainly to speed up computation, and advancements in multiprocessor technology fuel this trend. Modern compilers, however, are still largely single-threaded and do not take advantage of the machine resources available to them. Much work has been done on compilers that add parallel constructs to the programs they compile, enabling those programs to exploit parallelism at run time; auto-parallelization of loops by a compiler is one such example. Researchers have done very little work toward parallelizing the compilation process itself. The research done here focuses on parallel compilers that target computation speedup by parallelizing program compilation during the lexical analysis and semantic analysis phases. Parallelization brings with it issues such as synchronization, concurrency, and communication overhead. In the semantic analysis phase, these issues are of particular relevance during the construction of the symbol table. Research on a concurrent compiler developed at the University of Toronto in 1991 proposed three techniques to address the generation of the symbol table [Seshadri91]. The goal here is to implement a parallel compiler using concepts from those techniques as references. The research done here will augment the earlier work and measure the performance speedup obtained.
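One of the synchronization concerns named above, concurrent construction of the symbol table, can be sketched with a coarse single-lock design. This is an illustrative baseline only, not one of the three [Seshadri91] techniques:

```python
import threading

class ConcurrentSymbolTable:
    """Symbol table guarded by one table-wide lock, so several
    semantic-analysis workers can insert and look up safely."""

    def __init__(self):
        self._symbols = {}
        self._lock = threading.Lock()

    def insert(self, name, info):
        with self._lock:
            if name in self._symbols:
                raise KeyError(f"redeclaration of {name!r}")
            self._symbols[name] = info

    def lookup(self, name):
        with self._lock:
            return self._symbols.get(name)

# Several workers declaring (hypothetical) disjoint identifiers:
table = ConcurrentSymbolTable()
workers = [threading.Thread(target=table.insert, args=(f"var{i}", {"type": "int"}))
           for i in range(8)]
for w in workers:
    w.start()
for w in workers:
    w.join()
print(table.lookup("var3"))  # → {'type': 'int'}
```

A single lock serializes every access; per-scope or per-bucket locking reduces that contention, which is exactly the kind of trade-off a parallel semantic-analysis phase must measure.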
947

Tempo de armazenamento e manejo do painel no valor nutritivo de silagens de milho / Storage period and face management on the nutritional value of corn silage

Daniel Junges 06 October 2014 (has links)
In Experiment I, the aim was to evaluate the effects of microbial additives and length of storage on the quality of corn silage. Whole corn plants were ensiled without additives (control) or with inoculants containing homofermentative lactic acid bacteria (Lactobacillus plantarum + Enterococcus faecium + Pediococcus acidilactici, plus cellulolytic and hemicellulolytic enzymes) or heterofermentative lactic acid bacteria (Lactobacillus buchneri), applied at 1 × 10⁵ cfu/g. Treated forages were packed and stored in experimental silos for 3, 7, 15, 30, 60, 210, 390, 480, or 570 days and evaluated for chemical composition, fermentation end-products, microbial counts, fermentation losses, aerobic stability, and in situ ruminal degradability. The inoculants did not affect most of the variables studied. However, L. buchneri increased the acetic acid concentration and decreased aerobic deterioration of the silages, as indicated by lower heat accumulation during exposure to air. Soluble carbohydrates decreased across the storage period, reflecting the conversion of soluble sugars into fermentation end-products. Concentrations of prolamin decreased as expected, whereas ammonia and soluble protein concentrations increased over the storage period, indicating proteolysis in the corn silage. The silage pH declined rapidly in the first seven days of storage and remained stable thereafter, unlike the acetic acid concentration, which increased with storage time. Counts of lactic acid bacteria and yeasts decreased during storage. Gas production and dry matter loss increased with the length of storage. Storage time improved all aerobic stability variables, with most of the significant gains observed during the first 60 days of storage. Ruminal degradability of starch, and consequently of dry matter, increased over the storage period. In Experiment II, the aim was to evaluate the influence of the silage unloading strategy on the performance of dairy cows. Corn silage from a bunker silo was separated at unloading into silage from the upper half of the silo (top) and silage from the lower half (bottom) and used to compose total mixed rations fed to 24 lactating cows allocated to 12 randomized blocks in a cross-over design with 21-day periods. Cows were housed in a tie-stall barn. All diets contained 60% corn silage (DM basis) and were iso-nitrogenous (16.5% CP) and iso-starch (17.0% starch). Dry matter intake and milk yield and composition were determined from day 15 to day 21 of each period. Although silage from the bottom of the silo led to higher total-tract DM digestibility and lower milk urea nitrogen concentration (8.95 vs. 11.35 mg/dL), most of the evaluated variables, including dry matter intake and milk yield, were not affected by treatment.
Under optimal silo management, the strategy used to unload corn silage does not affect the performance of dairy cows.
948

Security Auditing and Testing of two Android Client-Server Applications

Engström Ericsson, Matilda January 2020 (has links)
How secure is your application? How can you evaluate whether it is secure? The threats are many and may be hard to find. In a world where more and more is automated, how does manual labour contribute to security auditing of applications? This study assesses two proof-of-concept Android client-server applications, developed by students to suit the needs of a fictitious Police Department and Fire Department, respectively. The approach is unconventional yet supported by well-established theory: the gist of a vulnerability assessment methodology originally developed to assess the security of middleware is applied to the entire architecture of these client-server applications. How the manual labour contributed to the end results, in comparison to automated tools and a list of known threats, is then evaluated.

It is concluded that the applications exhibit several of the Open Web Application Security Project (OWASP) Top 10 Mobile Risks, and that automated tools find most of those vulnerabilities. However, relying on automation may lead to a false sense of security, which in turn may cause developers to lose understanding of why vulnerabilities occur and how to mitigate them. Understanding how the design and architecture of the application influence its security is key.

As of Android 9.0, applications use SSL-encrypted communication by default. According to Android Studio developer documentation, only 40% of Android users were affected by this change in 2020, leaving a majority of users unaware of whether or how their data is protected, a situation also observed in the analysis results of this thesis work. One should consider whether and how to inform users of how their data is handled, and not only in newer Android versions or with regard to SSL communication.

This work also shows that developers' decisions may be greatly affected by time-pressed situations, which is reflected upon in the last chapter.
Another important finding was that the third-party software Sinch, which enabled voice and video communication in one of the applications, sent the users' IP addresses and usernames in clear text during the binding request when the Session Traversal Utilities for NAT (STUN) protocol was used.
949

Project X: All-in-one WAF testing tool

Anantaprayoon, Amata January 2020 (has links)
A Web Application Firewall (WAF) is used to protect a web application (web app). One advantage of a WAF is that it can detect possible attacks even if no validation is implemented in the web app itself. But how can a WAF protect the web app if the WAF itself is vulnerable? In general, four methods are used to test a WAF: fuzzing, payload execution, bypassing, and footprinting. Several open-source WAF testing tools exist, but each appears to offer only one or two of these methods, meaning a tester must obtain multiple tools and learn how each works in order to test a WAF with all of them. This project addresses that difficulty by developing a WAF testing tool called ProjectX that offers all four testing methods. ProjectX has been tested in a testing environment, and the results show that it fulfils its requirements. Moreover, ProjectX is available on GitHub for any developer who wants to improve it or add functionality to it.
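A payload-execution test of the kind such a tool automates boils down to sending known-malicious inputs and observing whether the WAF blocks them. A minimal sketch, with a hypothetical endpoint and a deliberately tiny payload corpus (this is not ProjectX's code):

```python
import urllib.parse

# Deliberately tiny, illustrative payload corpus; real testing tools
# ship far larger ones per attack class.
PAYLOADS = {
    "sqli": "' OR 1=1--",
    "xss": "<script>alert(1)</script>",
    "path_traversal": "../../etc/passwd",
}

def build_probe_urls(base_url, param="q"):
    """Build one probe URL per payload class for payload-execution
    testing: send each and check whether the WAF blocks it (e.g. an
    HTTP 403) or lets it through to the web app."""
    return {kind: f"{base_url}?{urllib.parse.urlencode({param: payload})}"
            for kind, payload in PAYLOADS.items()}

urls = build_probe_urls("http://waf.test/search")  # hypothetical target
print(list(urls))  # → ['sqli', 'xss', 'path_traversal']
```

The actual HTTP requests are omitted to keep the sketch offline; in practice the response status and body for each probe are compared against an unprotected baseline.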
950

On the empirical measurement of inequality / De la mesure empirique des inégalités

Flores, Ignacio 25 January 2019 (has links)
The first chapter presents historical series of Chilean top income shares over a period of half a century, mostly using data from tax statistics and national accounts. The study contradicts evidence based on survey data, according to which inequality has fallen constantly over the past 25 years. Rather, it changes direction, increasing from around the year 2000. Chile ranks as one of the most unequal countries among both OECD and Latin American countries over the whole period of study. The second chapter measures the underestimation of factor income in distributive data. I find that households receive only half of national gross capital income, as opposed to corporations. Due to heterogeneous non-response and misreporting, surveys capture only 20% of it, versus 70% of labor income. This understates inequality estimates, which become insensitive to the capital share and its distribution. I formalize this system based on accounting identities, then compute marginal effects and contributions to changes in fractile shares. The third chapter presents a method to adjust surveys, which generally fail to capture the top of the income distribution. It has several advantages over previous approaches: it is consistent with standard survey calibration methods; it has explicit probabilistic foundations and preserves the continuity of density functions; it provides an option to overcome the limitations of bounded survey supports; and it preserves the microdata structure of the survey, maintaining the representativeness of socio-demographic variables. Our procedure is illustrated with applications in five countries, covering both developed and less-developed contexts.
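The top-correction idea of the third chapter can be caricatured as substituting a parametric Pareto tail above a merging point. This sketch (arbitrary threshold and tail coefficient) conveys only the idea of a parametric top; the actual method instead reweights observations and preserves the survey's microdata structure:

```python
import random

def replace_top_with_pareto(incomes, threshold, alpha, rng=None):
    """Replace incomes above `threshold` with draws from a Pareto tail
    anchored at the threshold, via inverse-CDF sampling."""
    rng = rng if rng is not None else random.Random(0)
    corrected = []
    for y in incomes:
        if y > threshold:
            u = rng.random()  # u in [0, 1)
            y = threshold * (1 - u) ** (-1 / alpha)  # Pareto(alpha) draw
        corrected.append(y)
    return corrected

# Arbitrary toy incomes; only the value above the merging point changes.
incomes = [10.0, 20.0, 50.0, 1000.0]
adjusted = replace_top_with_pareto(incomes, threshold=100.0, alpha=2.0)
print(adjusted[:3])  # → [10.0, 20.0, 50.0]
```

In applied work the threshold and `alpha` are estimated from external data such as tax records, and continuity of the density at the merging point is the constraint the chapter's method is designed to satisfy.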
