1.
Drug Repositioning through the Development of Diverse Computational Methods using Machine Learning, Deep Learning, and Graph Mining
Thafar, Maha A. (30 June 2022)
The rapidly increasing number of existing drugs with genomic, biomedical, and pharmacological data makes computational analyses possible, which reduces the search space for drugs and facilitates drug repositioning (DR). Thus, artificial intelligence, machine learning, and data mining have been used to identify biological interactions such as drug-target interactions (DTIs), drug-disease associations, and drug response. The prediction of these biological interactions is seen as a critical phase needed to make drug development more sustainable. Furthermore, late-stage drug development failures are usually a consequence of ineffective targets; thus, proper target identification is needed. This dissertation addresses three crucial problems associated with the DR pipeline and presents several novel computational methods developed for DR.
First, we developed three network-based DTI prediction methods using machine learning, graph embedding, and graph mining. These methods significantly improved prediction performance, and the best-performing method reduced the error rate by more than 33% across all datasets compared to the best state-of-the-art method. Second, because it is more insightful to predict continuous values that indicate how tightly a drug binds to a specific target, we conducted a comparison study of current regression-based methods that predict drug-target binding affinities (DTBA). We discussed how to develop more robust DTBA methods and subsequently developed Affinity2Vec, the first regression-based method that formulates the entire task as a graph-based problem and combines several computational techniques from feature representation learning, graph mining, and machine learning, without requiring 3D structural data of proteins. Affinity2Vec outperforms the state-of-the-art methods. Finally, since drug development failure is associated with suboptimal target identification, we developed the first deep-learning-based computational method (OncoRTT) to identify cancer-specific therapeutic targets for the ten most common cancers worldwide. Implementing our approach required creating a suitable dataset that could be used by the computational method to identify oncology-related DTIs. Thus, we created the OncologyTT datasets to build and evaluate our OncoRTT method. Our methods demonstrated their efficiency by achieving high prediction performance and identifying therapeutic targets for several cancer types.
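As an illustration of the regression-based DTBA setting described above (a minimal sketch, not the actual Affinity2Vec implementation), the following toy example concatenates hypothetical drug and target embeddings into pair features and fits a least-squares regressor to synthetic binding affinities. All dimensions, data, and the linear model are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical learned representations: 20 drugs and 15 targets,
# each embedded in an 8-dimensional feature space (all invented).
drug_emb = rng.normal(size=(20, 8))
target_emb = rng.normal(size=(15, 8))

# Build (drug, target) pair features by concatenation, a common
# baseline in regression-based binding-affinity prediction.
pairs = [(i, j) for i in range(20) for j in range(15)]
X = np.array([np.concatenate([drug_emb[i], target_emb[j]]) for i, j in pairs])

# Synthetic "binding affinities": a linear signal plus noise.
true_w = rng.normal(size=X.shape[1])
y = X @ true_w + 0.1 * rng.normal(size=len(X))

# Ordinary least squares as the affinity regressor.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
rmse = float(np.sqrt(np.mean((X @ w - y) ** 2)))
print(f"train RMSE: {rmse:.3f}")
```

In the methods surveyed in the dissertation, the pair features would come from graph embeddings and similarity networks rather than random vectors, and the regressor would be a stronger learner than ordinary least squares.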
Overall, in this dissertation, we developed several computational methods to solve biomedical domain problems, specifically drug repositioning, and demonstrated their efficiencies and capabilities.
2.
Conversational artificial intelligence - demystifying statistical vs linguistic NLP solutions
Panesar, Kulvinder (05 October 2020)
This paper aims to demystify the hype and attention surrounding chatbots and their association with conversational artificial intelligence. Both are slowly emerging as a real presence in our lives, driven by impressive technological developments in machine learning, deep learning, and natural language understanding. However, what is under the hood, and how far and to what extent chatbot/conversational artificial intelligence solutions can work, is our question. Natural language is the most easily understood knowledge representation for people, but certainly not the best for computers because of its inherently ambiguous, complex, and dynamic nature. We critique the knowledge representation of heavily statistical chatbot solutions against linguistic alternatives. In order to react intelligently to the user, natural language solutions must critically consider other factors such as context, memory, intelligent understanding, previous experience, and personalized knowledge of the user. We delve into the spectrum of conversational interfaces and focus on a strong artificial intelligence concept. This is explored via a text-based conversational software agent with a deep strategic role: to hold a conversation and enable the mechanisms needed to plan, decide what to do next, and manage the dialogue to achieve a goal. To demonstrate this, a deep linguistically aware and knowledge-aware text-based conversational agent (LING-CSA) presents a proof of concept of a non-statistical conversational AI solution.
3.
SEMANTIC INTELLIGENCE FOR KNOWLEDGE-BASED COMPLIANCE CHECKING OF UNDERGROUND UTILITIES
Xin Xu (30 July 2020)
<p>Underground utilities must comply with the requirements stipulated in utility regulations to ensure their structural integrity and avoid interference with and disruption of utility services. Noncompliance with the regulations could cause disastrous consequences, such as pipeline explosions and pipeline contamination, that can lead to hundreds of deaths and huge financial losses. However, the current practice of utility compliance checking relies on manual efforts to examine lengthy textual regulations, interpret them subjectively, and check against massive and heterogeneous utility data. It is time-consuming, costly, and error-prone. There remains a critical need for an effective mechanism to help identify regulatory non-compliances in new utility designs or existing pipelines to limit possible negative impacts. Motivated by this critical need, this research aims to create an intelligent, knowledge-based method to automate compliance checking for underground utilities.</p>
<p>The overarching goal is to build semantic intelligence to enable knowledge-based, automated compliance checking of underground utilities by integrating semantic web technologies, natural language processing (NLP), and domain ontologies. The three specific objectives are: (1) designing an ontology-based framework for integrating massive and heterogeneous utility data for automated compliance checking, (2) creating a semi-automated method for utility ontology development, and (3) devising a semantic NLP approach for interpreting textual utility regulations. Objective 1 establishes the knowledge-based skeleton for utility compliance checking. Objectives 2 and 3 build semantic intelligence into the framework resulting from Objective 1 for improved performance in utility compliance checking.</p>
<p>Utility compliance checking is the action of examining geospatial data of utilities and their surroundings against textual utility regulations. The integration of heterogeneous geospatial data of utilities as well as textual data remains a significant challenge. Objective 1 is dedicated to addressing this challenge. An ontology-based framework has been designed to integrate heterogeneous data and automate compliance checking through semantic, logic, and spatial reasoning. The framework consists of three key components: (1) four interlinked ontologies that provide the semantic schema to represent heterogeneous data, (2) two data convertors to transform data from proprietary formats into a common and interoperable format, and (3) a reasoning mechanism with spatial extensions for detecting non-compliances. The ontology-based framework was tested on a sample utility database, and the results proved its effectiveness.</p>
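<p>To make the compliance-checking idea concrete, the following simplified sketch checks a hypothetical clearance rule over toy utility records. It is a plain-Python caricature, not the dissertation's ontology- and reasoning-based implementation; the utility names, coordinates, and the 1 m clearance threshold are all assumptions for illustration.</p>

```python
import math

# Hypothetical utility records: (name, x_m, y_m, depth_m).
utilities = [
    ("water_main_1", 0.0, 0.0, 1.2),
    ("gas_line_7", 0.5, 0.0, 1.3),
    ("sewer_3", 10.0, 2.0, 2.0),
]

MIN_CLEARANCE_M = 1.0  # assumed regulatory horizontal clearance

def horizontal_distance(a, b):
    # Planar distance between the (x, y) locations of two records.
    return math.hypot(a[1] - b[1], a[2] - b[2])

# Pairwise check of every utility against every other: a spatial
# stand-in for the framework's reasoning-based non-compliance detection.
violations = []
for i in range(len(utilities)):
    for j in range(i + 1, len(utilities)):
        d = horizontal_distance(utilities[i], utilities[j])
        if d < MIN_CLEARANCE_M:
            violations.append((utilities[i][0], utilities[j][0], round(d, 2)))

print(violations)  # flags the water main and gas line as too close
```

<p>In the actual framework, such a rule would be expressed declaratively over the ontologies and evaluated by the reasoning mechanism with spatial extensions, rather than hard-coded as a loop.</p>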
<p>Two supplementary methods were devised to build the semantic intelligence in the ontology-based framework. The first is a novel method that integrates the top-down strategy and NLP to address two semantic limitations in existing ontologies for utilities: lack of compatibility with existing utility modeling initiatives and relatively small vocabulary sizes. Specifically, a base ontology is first developed by abstracting the modeling information in the CityGML Utility Network ADE through a series of semantic mappings. Then, a novel integrated NLP approach is devised to automatically learn the semantics from domain glossaries. Finally, the semantics learned from the glossaries are incorporated into the base ontology to produce a domain ontology for utility infrastructure. For case demonstration, a glossary of water terms was learned to enrich the base ontology (formalized from the ADE), and the resulting ontology was evaluated to be an accurate, sufficient, and shared conceptualization of the domain.</p>
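<p>The glossary-to-ontology enrichment step can be sketched as follows. This toy example uses a plain dictionary in place of a real ontology and a single keyword rule in place of the learned semantics; all class names, terms, and definitions are hypothetical.</p>

```python
# A toy "base ontology" as a class -> superclass mapping,
# loosely standing in for one abstracted from a modeling standard.
base_ontology = {"UtilityNetwork": None, "Pipe": "UtilityNetwork"}

# A tiny domain glossary of term -> definition pairs (invented).
glossary = {
    "water main": "a principal pipe in a water distribution system",
    "service line": "a pipe connecting a water main to a building",
}

def classify(definition):
    # Simplified semantic rule: if the definition names a known
    # superclass keyword, attach the new term under that class.
    return "Pipe" if "pipe" in definition else "UtilityNetwork"

# Fold each glossary term into the base ontology as a new class.
for term, definition in glossary.items():
    cls = term.title().replace(" ", "")
    base_ontology[cls] = classify(definition)

print(sorted(base_ontology))
```

<p>The dissertation's integrated NLP approach learns such attachments from the glossary text itself rather than from a hand-written keyword rule.</p>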
<p>The second is an ontology- and rule-based NLP approach for automated interpretation of textual regulations on utilities. The approach integrates ontologies to capture both domain and spatial semantics from utility regulations, which contain a variety of technical jargon/terms and spatial constraints regarding the location and clearance of utility infrastructure. The semantics are then encoded into pattern-matching rules for extracting the requirements from the regulations. An ontology- and deontic logic-based mechanism has also been integrated to facilitate the semantic and logic-based formalization of utility-specific regulatory knowledge. The proposed approach was tested in interpreting the spatial configuration-related requirements in utility accommodation policies, and the results proved it to be an effective means of interpreting utility regulations to ensure the compliance of underground utilities.</p>
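<p>A minimal flavor of such pattern-matching rules, with an invented regulation sentence and a single hand-written regular expression standing in for the ontology-driven rule set described above:</p>

```python
import re

# Hypothetical regulation sentence of the kind the approach parses.
text = ("Underground gas pipelines shall maintain a minimum horizontal "
        "clearance of 12 inches from water mains.")

# A simplified extraction rule: capture the subject, the kind of
# clearance, the quantity and unit, and the object utility type.
rule = re.compile(
    r"(?P<subject>[\w\s]+?) shall maintain a minimum (?P<kind>horizontal|vertical) "
    r"clearance of (?P<value>\d+) (?P<unit>inches|feet) from (?P<object>[\w\s]+?)\."
)

m = rule.search(text)
req = {k: m.group(k).strip() for k in ("subject", "kind", "value", "unit", "object")}
print(req)
```

<p>In the full approach, the terms and spatial relations in such rules come from the domain and spatial ontologies, so the rules generalize across the varied phrasings found in real utility accommodation policies.</p>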
<p>The main outcome of this research is a novel knowledge-based computational platform with semantic intelligence for regulatory compliance checking of underground utilities, which is also the primary contribution of this research. The knowledge-based computational platform provides a declarative way, rather than the otherwise procedural/hard-coded implementation approach, to automate the overall process of utility compliance checking, and is expected to replace the conventional costly and time-consuming skill-based practice. Utilizing this computational platform for utility compliance checking will help eliminate non-compliant utility designs at the very early stage and identify non-compliances in existing utility records for timely correction, thus leading to enhanced safety and sustainability of the massive utility infrastructure in the U.S.</p>
4.
Learning Latent Temporal Manifolds for Recognition and Prediction of Multiple Actions in Streaming Videos using Deep Networks
Nair, Binu Muraleedharan (03 June 2015)
No description available.