About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Remote Technical Support Needs for Hospital Personnel : Using Q-methodology to Examine Remote Support Solutions in Healthcare / Behoven av fjärrstyrda tekniska supportlösningar för sjukvårdspersonal : En tillämpning av Q-metodologi för att undersöka fjärrstyrda tekniska supportlösningar i sjukvården

Fendukly, Mattias January 2018 (has links)
Remote control and management functions are widely utilized in multiple industries. They have allowed people to connect and interact to solve technical problems more efficiently. However, healthcare organizations have not utilized remote control and management functions to a degree similar to other industries: telephone calls and e-mail are still two mainstream ways of working when it comes to solving technical support issues in-house. In order to understand what technical personnel and clinical users at a hospital desire in new solutions, this master's thesis project aimed at finding the existing needs in terms of remote control and management functions. To find these needs, Q-methodology was applied to collect subjective data from healthcare personnel about a software tool that aims at providing remote control and management functions. In addition to finding and defining the needs, this thesis also aimed at examining how well such systems can address these needs. Applying this methodology, three factors were found, representing three different attitudes regarding the needs for remote functions: "Technical Communication is Significant", "Functionality Appreciative and Experienced" and "Do it fast!". These factors and their interpretation help practitioners become aware of, and systematically evaluate, remote support solutions.
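As a rough illustration of the analysis step behind such a study, the sketch below shows a minimal by-person factor extraction over a set of Q-sorts in Python/NumPy. The function name, the participants-by-statements input layout, the fixed choice of three factors and the omission of the usual factor rotation are illustrative assumptions, not the procedure actually used in the thesis.

```python
import numpy as np

def extract_factors(q_sorts, n_factors=3):
    """Correlate participants' Q-sorts (rows) and take the leading principal
    components as factors; Q-methodology factors are extracted by-person,
    i.e. from the participant-by-participant correlation matrix."""
    corr = np.corrcoef(q_sorts)                      # participants x participants
    eigvals, eigvecs = np.linalg.eigh(corr)          # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:n_factors]    # keep the largest ones
    loadings = eigvecs[:, order] * np.sqrt(eigvals[order])
    return loadings                                  # participant loadings per factor

# Toy example: 12 participants ranking 30 statements on a -4..+4 grid.
rng = np.random.default_rng(0)
sorts = rng.integers(-4, 5, size=(12, 30))
print(extract_factors(sorts).shape)                  # (12, 3)
```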

La protection des libertés individuelles sur le réseau internet / The protection of individuals' rights on the internet

Criqui-Barthalais, Géraldine 07 December 2018 (has links)
This study considers the internet as a new territory where rights guaranteed to each individual in physical space can be promoted: not only free speech and privacy, but also the Habeas Corpus prerogative writ, which protects against unlawful imprisonment, and the right to freedom of movement. Thus, proceeding by analogy, the dissertation intends to promote two specific digital rights: the freedom to connect to the internet and the freedom to surf the web. The freedom to connect should be part of a public service which promotes this access through public policies. Moreover, barring someone from using the internet should only be decided by a judge. The freedom to surf should protect web users against unreasonable restrictions. Thus, measures blocking illegal websites should not come through self-regulation but through a legal framework which defines how administrative authorities are entitled to decide such restrictions. The protection of these two rights entails further obligations. Individuals must be able to access the internet anonymously, and they must be aware of how the government monitors their actions on the web. This study tries to outline the content of measures aiming to frame network addressing mechanisms. Identity checks based on the IP address should be subject to a strict legal regime. The study concludes that individuals have to be protected from surveillance when data reveals their choices among websites while they are connected. Internet access providers, but also search engines and browsers, must delete this data. Only special measures taken by a public entity, or by someone entitled to control the web user, may lead to this kind of data retention.

Desenvolvimento e caracterização de uma mão robótica acionada por atuadores de liga com memória de forma / Development and characterization of a robotic hand actuated by shape memory alloy actuators

Silva, André Felipe Cavalcante 28 August 2015 (has links)
This work is motivated by studies showing a high rate of rejection among amputees of upper-limb prostheses, due to various problems such as weight, high noise and lack of anthropomorphism. In this context, this work presents the development of a robotic hand whose actuation is performed not by conventional actuators but by wires of a shape memory alloy (SMA). The mechanical structure of the robotic hand was designed in CAD software and subsequently manufactured in ABS polymer through rapid prototyping using a three-dimensional printer. The design was partially based on the physiological characteristics of the human hand, especially regarding the angles formed by the fingers' phalanges. A mechanical system was developed to compactly accommodate the thin wires of a NiTi SMA, referred to in this work as an Artificial Muscle (AM), which simplified the packaging of the SMA wires. The fingers are driven by activating the AMs, which are connected to cables arranged along the lower part of the finger structure; when activated, they produce the flexion movement. The return of each phalanx, i.e. the extension movement of the fingers, occurs passively: elastic elements installed in the upper part of the phalanges are responsible for this movement. To monitor the angle formed by each phalanx, a resistive sensor embedded in the phalanx was used, whose electrical resistance varies with the angle. On top of this system, a fuzzy-logic-based controller was developed and proved effective in monitoring the position of the robotic hand's fingers. The performance of the robotic hand can be considered adequate, as it partially achieved the angles targeted in the design.
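The abstract does not give the controller's rule base, so the following is only a generic sketch of how a fuzzy position controller for an SMA-driven finger might map angle error to a heating duty cycle; the membership functions, rule consequents and function names are invented for illustration and are not taken from the thesis.

```python
def tri(x, a, b, c):
    """Triangular membership function with corners a <= b <= c."""
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def fuzzy_duty(angle_error_deg):
    """Map the finger-angle error (set point minus measured angle, in degrees)
    to a heating duty cycle (%) for the SMA wire, using three rules and a
    weighted-average (Sugeno-style) defuzzification."""
    mu = {
        "small":  tri(angle_error_deg, -5.0, 0.0, 15.0),
        "medium": tri(angle_error_deg, 5.0, 25.0, 45.0),
        "large":  tri(angle_error_deg, 35.0, 60.0, 90.0),
    }
    duty = {"small": 0.0, "medium": 40.0, "large": 80.0}   # rule consequents
    total = sum(mu.values()) + 1e-9
    return sum(mu[k] * duty[k] for k in mu) / total        # ~0 % once on target

# Example: measured angle of 20 degrees against a 60 degree set point.
print(round(fuzzy_duty(60.0 - 20.0), 1))
```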

Gerador a relutância variável em conexão com a rede elétrica para injeção de potência ativa / Switched reluctance generator connected to the electrical grid for active power injection

Viajante, Ghunter Paulo 27 September 2013 (has links)
This work presents a contribution to the study of the Switched Reluctance Machine operated as a generator connected to the grid to inject active power. The main objective is to show the behavior of the Switched Reluctance Generator under various operating conditions, as well as the development of an electronic converter to connect it to the low-voltage grid. To this end, a mathematical model of the Switched Reluctance Generator that includes magnetic circuit saturation is presented, together with a computational model for steady-state and transient analysis. A strategy for controlling the generated voltage by varying the magnetization angle, acting only on the top switches of the Asymmetrical Half-Bridge converter, is also presented. An intermediate stage was added to the switching control strategy to reduce the amount of energy provided by the excitation source and to obtain better use of the electromechanical conversion. A detailed description of the control equations of the DC-AC stage, the PLL algorithm and the design of the injected-current compensator is presented and discussed. Finally, an experimental platform was built in the laboratory to verify the theoretical and simulation studies. / Doctor of Science (Doutor em Ciências)
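The thesis's PLL and current-compensator designs are not reproduced in the abstract; purely as a hedged illustration of the grid-synchronization idea, the sketch below implements a textbook synchronous-reference-frame PLL in Python/NumPy. The gains, sampling rate and nominal frequency are placeholder values, not the values used in the work.

```python
import numpy as np

def srf_pll(v_abc, dt, f_nominal=60.0, kp=100.0, ki=5000.0):
    """Synchronous-reference-frame PLL: the q-axis grid voltage acts as the
    phase error, and a PI controller drives it to zero so that the estimated
    angle locks onto the grid phase."""
    theta, integ = 0.0, 0.0
    angles = np.empty(len(v_abc))
    for k, (va, vb, vc) in enumerate(v_abc):
        v_alpha = (2.0 / 3.0) * (va - 0.5 * vb - 0.5 * vc)      # Clarke transform
        v_beta = (vb - vc) / np.sqrt(3.0)
        vq = -v_alpha * np.sin(theta) + v_beta * np.cos(theta)  # Park, q-axis
        integ += ki * vq * dt                                   # PI controller
        omega = 2.0 * np.pi * f_nominal + kp * vq + integ
        theta = (theta + omega * dt) % (2.0 * np.pi)
        angles[k] = theta
    return angles

# Example: a balanced 60 Hz grid sampled at 10 kHz for 0.2 s.
t = np.arange(0.0, 0.2, 1e-4)
v = np.stack([np.cos(2 * np.pi * 60 * t + p)
              for p in (0.0, -2 * np.pi / 3, 2 * np.pi / 3)], axis=1)
theta_hat = srf_pll(v, 1e-4)
```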

Characterizing the Third-Party Authentication Landscape : A Longitudinal Study of how Identity Providers are Used in Modern Websites / Longitudinella mätningar av användandet av tredjepartsautentisering på moderna hemsidor

Josefsson Ågren, Fredrik, Järpehult, Oscar January 2021 (has links)
Third-party authentication services are becoming more common since they ease the login procedure by not forcing users to create a new login for every website that uses authentication. Even though this simplifies the login procedure, users still have to be conscious of what data is shared between the identity provider (IDP) and the relying party (RP). This thesis presents a tool for collecting data about third-party authentication that outperforms previously built tools with regard to accuracy, precision and recall. The developed tool was used to collect information about third-party authentication on a set of websites. The collected data revealed that the third-party login services offered by Facebook and Google are the most common and that Twitter's login service is significantly less common. Twitter's login service shares the most data about users with the RPs and often grants the RPs permission to perform write actions on the user's Twitter account. In addition to our large-scale automatic data collection, three manual data collections were performed and compared to earlier manual data collections spanning a nine-year period. The longitudinal comparison showed that over the nine-year period the login services offered by Facebook and Google have been dominant. It is clear that less information about users is shared today than in earlier years for Apple, Facebook and Google. Twitter is the only IDP that has not changed its permission policies, which could be the reason why usage of the Twitter login service on websites has decreased. The results presented in this thesis help provide a better understanding of what personal information is exchanged by IDPs, which can guide users to make well-informed decisions on the web.
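The measurement tool itself is not described in detail in the abstract; as a much simplified stand-in, the snippet below shows how the presence of common identity providers can be fingerprinted from a page's static HTML. The endpoint patterns and the function name are assumptions for illustration, and a real crawler (like the one in the thesis) would need browser automation to catch dynamically loaded login widgets.

```python
import re
import requests

# Well-known IDP login/OAuth endpoints used as a crude static fingerprint.
IDP_PATTERNS = {
    "Google":   r"accounts\.google\.com",
    "Facebook": r"facebook\.com/dialog/oauth|connect\.facebook\.net",
    "Twitter":  r"(api\.)?twitter\.com/oauth",
    "Apple":    r"appleid\.apple\.com/auth",
}

def detect_idps(url):
    """Return the identity providers referenced in the page's static HTML."""
    html = requests.get(url, timeout=10).text
    return [name for name, pattern in IDP_PATTERNS.items()
            if re.search(pattern, html)]

# Example: print(detect_idps("https://example.com/login"))
```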

Multi-Constellation GNSS Scintillation at Mid-Latitudes

Jean, Marc Henri 15 December 2016 (has links)
Scintillation of Global Positioning System (GPS) signals has been extensively studied at low- and high-latitude regions of the Earth. Past studies have shown that amplitude scintillation is severe at low latitudes and phase scintillation is severe at high latitudes. Unlike the low- and high-latitude regions, mid-latitude scintillation has not been extensively studied; further, it has been suggested that mid-latitude scintillation is negligible. The purpose of this research is to challenge this belief. A multi-constellation, multi-frequency receiver that tracks American, Russian, and European satellites was used to monitor scintillation activity at the Virginia Tech Space Center. Analysis was performed on data collected on various days and compared to past research done at high, mid, and low latitudes. The results are discussed in this thesis. / Master of Science
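For readers unfamiliar with the standard scintillation indices referred to above, the sketch below computes the amplitude index S4 and a simple phase index sigma-phi from raw intensity and carrier-phase samples; the 50 Hz sampling rate, 60-second windows and mean/linear detrending are common conventions assumed here, not details taken from the thesis.

```python
import numpy as np

def s4_index(intensity, fs=50, window_s=60):
    """Amplitude scintillation index per window:
    S4 = sqrt((<SI^2> - <SI>^2) / <SI>^2), with the signal intensity SI
    detrended here by dividing each window by its mean."""
    n = int(fs * window_s)
    out = []
    for i in range(0, len(intensity) - n + 1, n):
        si = intensity[i:i + n] / np.mean(intensity[i:i + n])
        out.append(np.sqrt((np.mean(si**2) - np.mean(si)**2) / np.mean(si)**2))
    return np.array(out)

def sigma_phi(phase_rad, fs=50, window_s=60):
    """Phase scintillation index: standard deviation of the detrended carrier
    phase (a linear detrend stands in for the usual high-pass filtering)."""
    n = int(fs * window_s)
    t = np.arange(n)
    out = []
    for i in range(0, len(phase_rad) - n + 1, n):
        seg = phase_rad[i:i + n]
        trend = np.polyval(np.polyfit(t, seg, 1), t)
        out.append(np.std(seg - trend))
    return np.array(out)
```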

Deep neural networks for natural language processing and its acceleration

Lin, Zhouhan 08 1900 (has links)
This thesis by articles consists of four articles which contribute to the field of deep learning, specifically to the acceleration of training through low-precision networks and to the application of deep neural networks to natural language processing. In the first article, we investigate a neural network training scheme that eliminates most of the floating-point multiplications. This approach consists of binarizing or ternarizing the weights in the forward propagation and quantizing the hidden states in the backward propagation, which converts multiplications into sign changes and binary shifts. Experimental results on small- to medium-sized datasets show that this approach results in even better performance than standard stochastic gradient descent training, paving the way to fast, hardware-friendly training of neural networks. In the second article, we propose a structured self-attentive sentence embedding that extracts interpretable sentence representations in matrix form. We demonstrate improvements on 3 different tasks: author profiling, sentiment classification and textual entailment. Experimental results show that our model yields a significant performance gain compared to other sentence embedding methods on all 3 tasks. In the third article, we propose a hierarchical model with a dynamic computation graph for sequential data that learns to construct a tree while reading the sequence. The model learns to create adaptive skip connections that ease the learning of long-term dependencies by constructing recurrent cells in a recursive manner. The network can be trained either with supervision, by providing gold tree structures, or through reinforcement learning. We provide preliminary experiments on 3 different tasks: a novel Math Expression Evaluation (MEE) task, a well-known propositional logic task, and language modelling tasks. Experimental results show the potential of the proposed approach. In the fourth article, we propose a novel constituency parsing method using neural networks. The model predicts the parse tree structure by predicting a real-valued scalar, named the syntactic distance, for each split position in the input sentence. The order of the relative values of these syntactic distances then determines the parse tree structure by specifying the order in which the split points are selected, recursively partitioning the input in a top-down fashion. The proposed approach achieves competitive performance on the Penn Treebank dataset and state-of-the-art performance on the Chinese Treebank dataset.
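The fourth article's top-down decoding step can be illustrated with a few lines of Python: given per-split syntactic distances, the parse tree follows from recursively splitting at the largest remaining distance. The toy distances below are hand-picked for illustration; in the actual method they are predicted by a trained network.

```python
def build_tree(words, distances):
    """Top-down binary constituency construction from syntactic distances.
    distances[i] scores the split point between words[i] and words[i + 1];
    the largest distance is split first, then each side is parsed recursively."""
    if len(words) == 1:
        return words[0]
    split = max(range(len(distances)), key=lambda i: distances[i])
    left = build_tree(words[:split + 1], distances[:split])
    right = build_tree(words[split + 1:], distances[split + 1:])
    return (left, right)

print(build_tree(["the", "cat", "sat", "down"], [1.0, 3.0, 2.0]))
# (('the', 'cat'), ('sat', 'down'))
```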

行動支付之探討-以電信運營商為例 / A Study of Mobile Payment: The Case of Telecom Operators

陳忠義 Unknown Date (has links)
With the increasing popularity of smartphones and handheld devices, and with improvements in 4G mobile network speeds and in the security, convenience and immediacy of mobile payments, mobile payment has in recent years become one of the hottest topics among the telecommunications, financial, and mobile hardware and software manufacturing industries, and diversified payment patterns have emerged in response. Combining personal mobile devices with financial payment tools, so that users can easily make small payments without needing to carry a wallet, is the future trend; the proportion of mobile payments in countries around the world is gradually increasing, moving towards a cashless society. Taiwan's telecom operators are also trying to carve out a niche for themselves on the mobile payment battlefield by launching Direct Carrier Billing (DCB) and Near Field Communication (NFC) related mobile payment application services to increase innovative service experiences and customer adhesion; the ultimate goal is to raise their Average Revenue Per User (ARPU). This research uses qualitative research methods, collecting, collating and analyzing secondary data to understand successful cases of foreign telecom operators and draw lessons from their ideas and practices, and then examines in depth the current domestic telecom operators' high bad-debt risk in DCB and the development status and dilemmas of NFC e-wallets and other mobile payment services. Finally, two suggestions are made. The first is to promote the establishment of a "Telecom Joint Credit Information Center" to jointly reduce bad-debt risk, while analyzing the consumption items of DCB subscribers for precise marketing and higher ARPU. The second is to promote a mobile payment platform that uses the mobile number for digital identity authentication (Mobile Connect), which would not only increase telecom operators' revenue but also satisfy consumers' needs for speed, convenience and security.

Content Distribution in Social Groups

Aggarwal, Saurabh January 2014 (has links) (PDF)
We study Social Groups consisting of self-interested, inter-connected nodes looking for common content. Social Groups can be observed in various socio-technological networks, such as cellular-network-assisted Device-to-Device communications, cloud-assisted Peer-to-Peer networks, hybrid Peer-to-Peer content distribution networks and Direct Connect networks. Each node wants to acquire a universe of segments at the least cost. Nodes can either access an expensive link to the content distributor for downloading data segments, or use the well-connected, low-cost inter-node network for exchanging segments among themselves. Activation of an inter-node link requires cooperation among the participating nodes and reduces the cost of downloading for the nodes. However, due to uploading costs, non-reciprocating nodes are reluctant to upload segments in spite of their interest in downloading segments from others. We define the Give-and-Take (GT) criterion, which prohibits non-reciprocating behaviour in Social Groups for all nodes at all instants. In the "Full Exchange" case studied, two nodes can exchange copies of their entire segment sets if each node gains at least one new segment from the other. Incorporating the GT criterion in the Social Group, we study the problem of downloading the universe at the least cost from the perspective of a new node having no data segments. We analyze this NP-hard problem and propose algorithms for choosing the initial segments to be downloaded from the content distributor and the sequence of nodes for exchange. We compare the performance of these algorithms with a few existing P2P downloading strategies in terms of cost and running time. In the second problem, we attempt to reduce the load on the content distributor by choosing a schedule of inter-node link activations such that the number of nodes holding the universe is maximized. Link activation decisions are taken by a central entity, the facilitator, to achieve the social optimum. We present the asymptotically optimal Randomized algorithm, as well as other algorithms, such as the Greedy Links algorithm and the Polygon algorithm, which are optimal under special scenarios of interest. We compare the performance of all proposed algorithms with the optimal value of the objective and observe that computationally intensive algorithms exhibit better performance. Further, we consider the problem of decentralized scheduling of links, where link activation decisions are made by the participating nodes in a distributed manner. While conforming to the GT criterion for inter-node exchanges, each node's objective is to maximize its utility. Each node tries to find a pairing partner by preferentially exploring nodes for link formation. Unpaired nodes choose to download a segment over the expensive link with a Segment Aggressiveness Probability (SAP). We present linear-complexity decentralized algorithms for nodes to choose their best strategy, and a decentralized randomized algorithm that works in the absence of the facilitator and performs close to optimal for a large number of nodes. We define the Price of Choice to benchmark the performance of Social Groups (consisting of non-aggressive nodes) against the optimal. We evaluate the performance of the various algorithms and characterize the behavioural regime that yields the best results for both the node and the Social Group.
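The Give-and-Take criterion for the full-exchange case translates directly into a few lines of Python; the sketch below is only a restatement of the definition above, with segment sets represented as Python sets.

```python
def can_exchange(a, b):
    """GT criterion for a full exchange: the inter-node link may only be
    activated if each node gains at least one segment it does not hold."""
    return bool(b - a) and bool(a - b)

def full_exchange(a, b):
    """If the GT criterion holds, both nodes end up with the union of the
    two segment sets; otherwise the (non-reciprocating) link is not formed."""
    if not can_exchange(a, b):
        return a, b
    union = a | b
    return union, union

# Example: each node gains one new segment, so the exchange is allowed.
print(full_exchange({1, 2}, {2, 3}))   # ({1, 2, 3}, {1, 2, 3})
```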

Improved in silico methods for target deconvolution in phenotypic screens

Mervin, Lewis January 2018 (has links)
Target-based screening projects for bioactive (orphan) compounds have in many cases been shown to be insufficiently predictive of in vivo efficacy, leading to attrition in clinical trials. Partly for this reason, phenotypic screening has undergone a renaissance in both academia and the pharmaceutical industry. One key shortcoming of this paradigm shift is that the protein targets modulated must be elucidated afterwards, which is often a costly and time-consuming procedure. In this work, we have explored both improved methods and real-world case studies of how computational methods can help in the target elucidation of phenotypic screens. One limitation of previous methods has been the inability to assess the applicability domain of the models, that is, when the assumptions made by a model are fulfilled and for which input chemicals the models are reliable. Hence, a major focus of this work was to explore methods for calibrating machine learning algorithms using Platt scaling, isotonic regression scaling and Venn-Abers predictors, since the probabilities from well-calibrated classifiers can be interpreted at a confidence level and predictions specified at an acceptable error rate. Additionally, many current protocols only offer probabilities of affinity, so another key area for development was to extend the target prediction models with functional prediction (activation or inhibition). This extra level of annotation is important since the activation or inhibition of a target may positively or negatively impact the phenotypic response in a biological system. Furthermore, many existing methods do not utilize the wealth of bioactivity information held for orthologue species. We therefore also performed an in-depth analysis of orthologue bioactivity data and its relevance and applicability for expanding compound and target bioactivity space in predictive studies. The resulting protocol was trained with 13,918,879 compound-target pairs, comprises 1,651 targets, and has been made publicly available on GitHub. The methodology was then applied to aid the target deconvolution of AstraZeneca phenotypic readouts, in particular for rationalizing cytotoxicity and cytostaticity in the High-Throughput Screening (HTS) collection. Results from this work highlighted which targets are frequently linked to the cytotoxicity and cytostaticity of chemical structures, and provided insight into which compounds to select or remove from the collection for future screening projects. Overall, this project has furthered the field of in silico target deconvolution by improving the performance and applicability of current protocols and by rationalizing cytotoxicity, which has been shown to influence attrition in clinical trials.
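As a small illustration of the calibration techniques named above, the sketch below applies Platt scaling and isotonic regression to a generic classifier with scikit-learn; the synthetic data, the random-forest base model and the 0.9 confidence threshold are placeholders, and Venn-Abers prediction is not shown since it is not part of scikit-learn.

```python
import numpy as np
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Stand-in data; in the target-prediction setting X would hold compound
# fingerprints and y per-target activity labels.
X, y = make_classification(n_samples=2000, n_features=50, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

base = RandomForestClassifier(n_estimators=200, random_state=0)

# Platt scaling ("sigmoid") and isotonic regression, each fitted with
# internal cross-validation so the calibrator is trained on held-out folds.
platt = CalibratedClassifierCV(base, method="sigmoid", cv=5).fit(X_train, y_train)
iso = CalibratedClassifierCV(base, method="isotonic", cv=5).fit(X_train, y_train)

# Calibrated probabilities can then be thresholded at a chosen confidence level.
p_platt = platt.predict_proba(X_test)[:, 1]
confident_hits = np.where(p_platt >= 0.9)[0]
```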
