311 |
From Stable to Lab—Investigating Key Factors for Sudden Deaths Caused by Streptococcus suis
Hennig-Pauka, Isabel; Imker, Rabea; Mayer, Leonie; Brügmann, Michael; Werckenthin, Christiane; Weber, Heike; Menrath, Andrea; de Buhr, Nicole. 11 April 2023
Swine stocks are endemically infected with the major porcine pathogen Streptococcus (S.)
suis. The factors governing the transition from colonization of the tonsils by S. suis to
exacerbation of disease have not yet been elucidated. We analyzed the sudden deaths of fattening
pigs kept under extensive husbandry conditions in a zoo. The animals died suddenly of septic shock
and showed disseminated intravascular coagulopathy. Genotypic and phenotypic characterizations
of the isolated S. suis strains, a tonsillar isolate and an invasive cps type 2 strain, were conducted.
S. suis isolated from the dead pigs belonged to cps type 2 strain ST28, whereas one tonsillar S. suis
isolate harvested from a healthy animal belonged to ST1173. Neither S. suis growth, induction of
neutrophil extracellular traps, nor survival in blood could explain the sudden deaths. Reconstituted
blood assays with serum samples from pigs of different age groups from the zoo stock suggested
varying protection of individuals against pathogenic cps type 2 strains, especially in younger pigs.
These findings highlight the benefit of further characterization of the causative strains in each case by
sequence typing before autologous vaccine candidate selection.
|
312 |
Modelling and Quantitative Analysis of Performance vs Security Trade-offs in Computer Networks: An investigation into the modelling and discrete-event simulation analysis of performance vs security trade-offs in computer networks, based on combined metrics and stochastic activity networks (SANs)
Habib Zadeh, Esmaeil. January 2017
Performance modelling and evaluation have long been considered of paramount
importance to computer networks, from design through development, tuning and
upgrading. These networks, however, have evolved significantly since their first introduction
a few decades ago. The ubiquitous Web in particular, with its fast-emerging and
unprecedented services, has become an integral part of everyday life. All of this,
however, comes at the cost of substantially increased security risks, and cybercrime
is now a pervasive threat to today’s internet-dependent societies. Given the frequency
and variety of attacks, as well as the threat of new, more sophisticated and destructive
future attacks, security has become a prevalent and mounting concern in the design
and management of computer networks, and is now at least as important as performance.
Unfortunately, there is no one-size-fits-all solution to security challenges. One security
defence system can only help to battle against a certain class of security threats. For overall security, a holistic approach including both reactive and proactive
security measures is commonly suggested. As such, network security may have
to combine multiple layers of defence at the edge, in the network, and in its
constituent individual nodes.
Performance and security, however, are inextricably intertwined as security measures
require considerable amounts of computational resources to execute. Moreover, in
the absence of appropriate security measures, frequent security failures are likely
to occur, which may catastrophically affect network performance, not to mention
serious data breaches among many other security related risks.
In this thesis, we study optimisation problems for the trade-offs between performance
and security, analogous to those that exist between performance and dependability. While
performance metrics are widely studied and well-established, those of security are
rarely defined in a strict mathematical sense. We therefore aim to conceptualise and
formulate security by analogy with dependability so that, like performance, it can
be modelled and quantified.
Having employed a stochastic modelling formalism, we propose a new model for a
single node of a generic computer network that is subject to various security threats.
We believe this nodal model captures both performance and security aspects of a
computer node more realistically, in particular the intertwinements between them.
We adopt a simulation-based modelling approach in order to identify, on the basis
of combined metrics, optimal trade-offs between performance and security and facilitate
more sophisticated trade-off optimisation studies in the field.
We find that system parameters can be identified that optimise these abstract combined
metrics, even though they are optimal neither for performance nor for security individually.
Based on the proposed simulation modelling framework, credible numerical
experiments are carried out, indicating the scope for further work extensions for a
systematic performance vs security tuning of computer networks.
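The core idea of such a combined metric can be sketched in a few lines: mean response time degrades as more security processing is added, while the expected cost of security failures falls, so an abstract weighted sum is minimised at an interior security level. The functional forms, rates, and weights below are illustrative assumptions for a single node, not parameters from the thesis:

```python
import math

def mean_response_time(level, base_service=1.0, arrival_rate=0.5):
    """M/M/1-style mean response time; security checks inflate service time.

    `level` is an abstract amount of security processing (an assumption)."""
    service_rate = 1.0 / (base_service * (1.0 + 0.3 * level))
    if arrival_rate >= service_rate:
        return math.inf  # queue unstable: the performance cost is unbounded
    return 1.0 / (service_rate - arrival_rate)

def expected_breach_cost(level, base_risk=5.0):
    """Hypothetical expected cost of security failures, decaying with defences."""
    return base_risk * math.exp(-1.2 * level)

def combined_metric(level, weight=1.0):
    """Abstract combined performance-security cost to be minimised."""
    return mean_response_time(level) + weight * expected_breach_cost(level)

# Sweep discrete security levels and pick the trade-off optimum.
levels = [i / 10 for i in range(21)]
best = min(levels, key=combined_metric)
```

With these assumed parameters the optimum lies strictly between the unsecured and maximally secured configurations, which is exactly the trade-off behaviour the thesis studies with stochastic activity networks.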
|
313 |
Perineuronal nets and the inhibitory circuitry of the auditory midbrain: evidence for subtypes of GABAergic neurons
Beebe, Nichole L. 26 July 2016
No description available.
|
314 |
[en] USE OF PETRI NET TO MODEL RESOURCE ALLOCATION IN PROCESS MINING / [pt] USO DE REDES DE PETRI NA MODELAGEM DE ALOCAÇÃO DE RECURSOS EM MINERAÇÃO DE PROCESSOS
BEATRIZ MARQUES SANTIAGO. 22 November 2019
[pt] Business Process Management é a ciência de observar como o trabalho é realizado em determinada organização garantindo produtos consistentes e se aproveitando de oportunidades de melhoria. Atualmente, boa parte dos processos são realizados em frameworks, muitos com armazenamento de arquivos de log, no qual é disponibilizada uma grande quantidade de informação que pode ser explorada de diferentes formas e com diferentes objetivos, área denominada como Mineração de Processos. Apesar de muitos desses dados contemplarem o modo como os recursos são alocados para cada atividade, o foco maior dos trabalhos nessa área é na descoberta do processo e na verificação de conformidade do mesmo. Nesta dissertação é proposto um modelo em petri net que incorpora a alocação de recurso, de forma a poder explorar as propriedades deste tipo de modelagem, como por exemplo a definição de todos os estados possíveis. Como aplicação do modelo, realizou-se um estudo comparativo entre duas políticas, uma mais especialista, de alocação de recurso, e outra mais generalista usando simulações de Monte Carlo com distribuição de probabilidade exponencial para o início de novos casos do processo e para estimação do tempo de execução do par recurso atividade. Sendo assim, para avaliação de cada política foi usado um sistema de pontuação que considera o andamento do processo e o tempo total de execução do mesmo. / [en] Business Process Management is the science of observing how the work is performed in a given organization ensuring consistent products and seeking opportunities for improvement. Currently, most of the processes are performed in frameworks, many with log files, in which a large amount of data is available. These data can be explored in different ways and with different objectives, giving rise to the Process Mining area. 
Although many of these data record how resources are allocated to each activity, the major focus of previous work is on process discovery techniques and conformance checking. In this thesis a Petri net model that incorporates resource allocation is proposed, exploring the properties of this type of modeling, such as the definition of all possible states. To validate the model, it is applied in a
comparative study between two resource allocation policies, one considering the expertise of each resource and another with a more generalist allocation. The arrival of new cases and the resource-activity pair execution times were estimated by Monte Carlo simulations with an exponential probability distribution. For the evaluation of each policy, a scoring system was used that considers the progress of the process and its total execution time.
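The Monte Carlo comparison described above can be sketched as follows. The two-activity process and the mean execution times are hypothetical stand-ins; only the exponentially distributed execution times and the specialist-vs-generalist contrast come from the abstract:

```python
import random

# Hypothetical mean execution times per activity: under the specialist
# policy each activity goes to the resource that is best at it, while the
# generalist policy assigns any resource, at a higher average cost.
SPECIALIST_MEAN = {"a": 1.0, "b": 1.0}
GENERALIST_MEAN = 1.5

def total_execution_time(policy, n_cases=5000, seed=7):
    """Summed exponential execution times over simulated process cases."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_cases):
        for activity in ("a", "b"):
            mean = SPECIALIST_MEAN[activity] if policy == "specialist" else GENERALIST_MEAN
            total += rng.expovariate(1.0 / mean)  # exponential with the given mean
    return total
```

A scoring system like the one in the dissertation would also weight case progress; here the comparison is reduced to total execution time, under which the specialist policy wins by construction.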
|
315 |
Effects of Glycosaminoglycans on DNase-Mediated Degradation of DNA, DNA-Histone Complexes, and NETs
Sohrabipour, Sahar. January 2020
Neutrophil extracellular traps (NETs) are a link between infection and coagulation in sepsis. The major structural component of NETs is nucleosomes, consisting of DNA and histones. NETs not only act as a scaffold to trap platelets, but NET components also promote coagulation and impair fibrinolysis. Thus, removal of extracellular DNA by DNases may be a potential therapeutic strategy for sepsis. Since heparin is used for thromboprophylaxis in sepsis and may also be a potential anti-sepsis therapy, we investigated the mechanisms by which various forms of heparins modulate DNase function.
There are two types of DNases in vivo: DNase I (produced by exocrine and endocrine glands) and DNase1L3 (secreted by immune cells). DNase I cleaves free DNA, whereas DNase1L3 preferentially cleaves DNA in complex with proteins such as histones. In this study, we investigated how DNase I and DNase1L3 activities are modulated by the following heparins: unfractionated heparin (UFH), enoxaparin (a low-molecular-weight heparin), Vasoflux (a low-molecular-weight, non-anticoagulant heparin), and fondaparinux (the pentasaccharide unit).
Using agarose gel experiments, we showed that UFH, enoxaparin, and Vasoflux enhance the ability of DNase I to digest DNA-histone complexes (presumably by displacing DNA from histones), whereas fondaparinux does not. These findings are consistent with the KD values of the binding of heparin variants to histones, with fondaparinux having >1000-fold lower affinity for histones compared to the other heparins. Taken together, our data suggests that the ability of heparin to enhance DNase I-mediated digestion of DNA-histone complexes is size-dependent and independent of the pentasaccharide region of heparin. With respect to DNase1L3, we observed that it is able to digest histone-bound DNA, and that all heparins, except fondaparinux, inhibited DNase1L3-mediated digestion of histone-bound DNA.
Next, we visualized the degradation of NETs by fluorescence microscopy. DNase I (± heparin variants) completely degraded NETs, presumably by digesting extracellular chromatin at histone-free linker regions, thereby releasing nucleosome units. DNase1L3 also degraded NETs, but not as effectively as DNase I, and was inhibited by all heparins except fondaparinux. Finally, we showed that DNase I levels are decreased and DNase1L3 levels are elevated in septic patients. Taken together, our findings demonstrate that heparin modulates the function of DNases, and that endogenous DNase levels are altered in sepsis pathophysiology. / Thesis / Master of Science (MSc) / Sepsis, a life-threatening condition due to hyperactivation of the immune system in response to infection, results in widespread inflammation and blood clotting. During sepsis, immune cells release sticky strands of DNA that block blood vessels and damage organs. Two different enzymes in the blood (DNase I and DNase1L3) can digest these DNA strands, and may represent a new class of anti-sepsis drugs. Our goal was to determine how heparins, commonly used blood thinners, alter the function of these enzymes. We found that (a) larger-sized heparins improved the activity of DNase I towards DNA-histone complexes and do not require any specific portion of heparin, (b) DNase I is more efficient than DNase1L3 in digesting DNA strands released from immune cells, and (c) levels of DNase I and DNase1L3 are altered in septic patients. Taken together, our studies provide new insights into how these enzymes function.
|
316 |
Uncertainty-aware dynamic reliability analysis framework for complex systems
Kabir, Sohag; Yazdi, M.; Aizpurua, J.I.; Papadopoulos, Y. 18 October 2019
Critical technological systems exhibit complex dynamic characteristics such as time-dependent
behavior, functional dependencies among events, sequencing and priority of causes that may alter the effects
of failure. Dynamic fault trees (DFTs) have been used in the past to model the failure logic of such systems,
but the quantitative analysis of DFTs has assumed the existence of precise failure data and statistical
independence among events, which are unrealistic assumptions. In this paper, we propose an improved
approach to reliability analysis of dynamic systems, allowing for uncertain failure data and statistical and
stochastic dependencies among events. In the proposed framework, DFTs are used for dynamic failure
modeling. Quantitative evaluation of DFTs is performed by converting them into generalized stochastic Petri
nets. When failure data are unavailable, expert judgment and fuzzy set theory are used to obtain reasonable
estimates. The approach is demonstrated on a simplified model of a cardiac assist system. / DEIS H2020 Project under Grant 732242.
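When failure data are missing, expert estimates can be encoded as triangular fuzzy numbers and reduced to a crisp rate for use in the stochastic Petri net. This is a generic sketch of that step, not the paper's exact aggregation scheme; the opinion values are invented for illustration:

```python
def aggregate_triangular(opinions):
    """Average several triangular fuzzy numbers (a, m, b) component-wise."""
    n = len(opinions)
    return tuple(sum(o[i] for o in opinions) / n for i in range(3))

def defuzzify_centroid(tfn):
    """Centroid of a triangular fuzzy number -> crisp failure-rate estimate."""
    a, m, b = tfn
    return (a + m + b) / 3.0

# Hypothetical expert opinions on a component's failure rate (per hour),
# each given as (lower bound, most likely value, upper bound).
opinions = [(1e-5, 2e-5, 4e-5), (2e-5, 3e-5, 5e-5), (1e-5, 2.5e-5, 4e-5)]
fused = aggregate_triangular(opinions)
rate = defuzzify_centroid(fused)
```

The crisp rate can then parameterise a transition of the generalized stochastic Petri net in place of a measured failure rate.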
|
317 |
Revisiting aryl N-methylcarbamate acetylcholinesterase inhibitors as potential insecticides to combat the malaria-transmitting mosquito, Anopheles gambiae
Hartsel, Joshua Alan. 31 May 2011
My graduate work focused on the syntheses and pharmacology of species-selective aryl methylcarbamate acetylcholinesterase inhibitors to combat the malaria-transmitting mosquito, Anopheles gambiae. We identified six novel carbamates that demonstrated levels of target selectivity exceeding our project milestone of 100-fold. Among the C2-substituted phenylcarbamates examined (class II), 2'-(2-ethylbutoxy)phenyl N-methylcarbamate (9bd*) was extraordinarily selective (570-fold ± 72). The high level of selectivity observed for many of the class II carbamates was attributed to a helical displacement within the active site of An. gambiae acetylcholinesterase, which is able to accommodate carbamates with larger C2-substituted secondary β-branching side chains. Conversely, this type of side chain forms unfavorable interactions within the active site of human acetylcholinesterase. The C3-substituted carbamates (class I), such as terbam (9c), were less selective than many of the class II carbamates; however, class I carbamates related to terbam (9c) were highly toxic to An. gambiae. In particular, the contact toxicity measured for 9c (LC₅₀ = 0.037 mg/mL) was equal to that of the commonly used agricultural insecticide propoxur (9a, LC₅₀ = 0.037 mg/mL). In total, seventy aryl carbamates were screened for their inhibition potency and contact toxicity towards An. gambiae.
The common final step in all of these syntheses was the carbamoylation of a phenol, which normally proceeded in 70 to 90% yield. Thirty-seven novel carbamates are reported out of the seventy-two prepared. Although sixteen of the phenols were commercially available, the others were prepared with known and adapted synthetic methodologies. The emerging structure-activity relationships led us to focus on the synthesis of 3-tert-alkylphenols (Class I) and 2-alkoxy- or 2-alkylthio-substituted phenols (Class II). Three methods particularly stand out: First, we applied the methods of Tanaka to prepare 3-tert-alkylphenols wherein a methyl group was replaced by a trifluoromethyl group. Second, we adapted the methods of Tanaka to prepare 3-tert-alkylphenols that lack fluorine substitution. This method is competitive with the little-known method of Reetz for converting aryl ketones to the corresponding 1,1-dimethylalkyl group, and allows one to access electron-rich tert-alkyl-substituted aromatics that are not accessible by Friedel-Crafts alkylation (Friedel-Crafts restricted). Third, we found a convenient and high-yielding method for selective S-alkylation of 2-mercaptophenol. In addition to the synthesis of carbamates, the preparation of one hundred three intermediates, phenols, and electron-rich tert-alkyl arenes is reported. / Ph. D.
|
318 |
Improving Bio-Inspired Frameworks
Varadarajan, Aravind Krishnan. 5 October 2018
In this thesis, we present improvements to two different bio-inspired algorithms. The first is enhancing the performance of bio-inspired test generation for circuits described in RTL Verilog, specifically for branch coverage. We seek to improve upon an existing framework, BEACON, in terms of performance. BEACON is an Ant Colony Optimization (ACO) based test generation framework. Like other ACO frameworks, BEACON has good scope for performance improvement through parallel computing. We exploit the available parallelism using both multi-core Central Processing Units (CPUs) and Graphics Processing Units (GPUs). Using our new multithreaded approach, we can reduce test generation time by a factor of 25 compared to the original implementation for a wide variety of circuits. We also provide a 2-dimensional factoring method for BEACON to improve the available parallelism and yield additional speedup. The second bio-inspired algorithm we address is for Deep Neural Networks. With the increasing prevalence of neural networks in artificial intelligence and mission-critical applications such as self-driving cars, questions arise about their reliability and robustness. We have developed a test-generation-based technique and metric to evaluate the robustness of a neural network's outputs based on its sensitivity to its inputs. This is done by generating inputs which the neural network finds difficult to classify but which are relatively apparent to human perception. We measure the degree of difficulty of generating such inputs to calculate our metric. / MS / High-level Hardware Design Languages (HDLs) have allowed designers to implement complicated hardware designs with considerably less effort. Unfortunately, design verification for the same circuits has failed to scale gracefully in terms of time and effort.
Formal methods have become more difficult to apply due to the exponential complexity of path explosion, and concrete test generation frameworks face new issues such as the increased volume of simulations required. The advent of parallel computing using General Purpose Graphics Processing Units (GPGPUs) has led to improved performance for various applications. We propose to leverage both the multi-core CPU and the GPGPU for RTL test generation. This is achieved by implementing a test generation framework that can utilize the SIMD-type parallelism available on GPGPUs and the task-level parallelism available on CPUs. The speedup achieved comes both from the test generation framework itself and from refactoring the hardware model for multi-threaded test generation. For this purpose, we translate the RTL Verilog into C++ and CUDA compilable programs. Experimental results show that considerable speedup can be achieved for test generation without loss of coverage.
In recent years, machine learning and artificial intelligence have taken a substantial leap forward with the advent of Deep Neural Networks (DNNs). Unfortunately, apart from accuracy and FTest numbers, very few metrics exist to qualify a DNN. This becomes a reliability issue, as DNNs are frequently used in safety-critical applications. It is difficult to interpret how the parameters of a trained DNN store the knowledge from the training inputs. Therefore it is also difficult to infer whether a DNN has learned parameters which might cause an output neuron to misfire wrongly, i.e., a bug. An exhaustive search of the input space of a DNN is not only infeasible but also misleading. Thus, in our work, we apply test generation techniques to generate new test inputs, based on existing training and testing sets, to qualify the underlying robustness. Attempts to generate these inputs are guided only by the prediction probability values at the final output layer. We observe that, depending on the amount of perturbation and the time needed to generate these inputs, we can differentiate between DNNs of varying quality.
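The robustness idea can be sketched with a toy classifier standing in for the DNN: perturb an input until the predicted class flips, guided only by the output probabilities, and use the perturbation magnitude required as the score. Everything here (the linear softmax model, the fixed perturbation direction, the step size) is an illustrative assumption rather than the framework's actual search:

```python
import math

def predict_probs(x, weights):
    """Toy two-class softmax classifier standing in for the DNN under test."""
    scores = [sum(w * xi for w, xi in zip(row, x)) for row in weights]
    exps = [math.exp(s) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def robustness_score(x, weights, direction, step=0.01, max_steps=1000):
    """Smallest perturbation magnitude along `direction` that flips the
    predicted class; a larger score suggests a more robust prediction."""
    base = max(range(2), key=lambda c: predict_probs(x, weights)[c])
    for k in range(1, max_steps + 1):
        eps = k * step
        perturbed = [xi + eps * di for xi, di in zip(x, direction)]
        if max(range(2), key=lambda c: predict_probs(perturbed, weights)[c]) != base:
            return eps  # first magnitude at which the class flips
    return math.inf     # never flipped within the search budget
```

For an identity-weight classifier at x = (1, 0), pushing mass from the first feature to the second flips the decision once the perturbation passes the midpoint, so the score directly reflects how far the input sits from the decision boundary.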
|
319 |
Stochastic Petri Net Models of Service Availability in a PBNM System for Mobile Ad Hoc Networks
Bhat, Aniket Anant. 15 July 2004
Policy based network management is a promising approach for provisioning and management of quality of service in mobile ad hoc networks. In this thesis, we focus on performance evaluation of this approach in the context of the amount of service received by certain nodes called policy execution points (PEPs) or policy clients from certain specialized nodes called policy decision points (PDPs) or policy servers. We develop analytical models for the study of the system behavior under two scenarios: a simple Markovian scenario, where we assume that the random variables associated with system processes follow an exponential distribution, and a more complex non-Markovian scenario, where we model the system processes according to general distribution functions as observed through simulation. We illustrate that the simplified Markovian model provides a reasonable indication of the trend of the service availability seen by policy clients and highlight the need for an exact analysis of the system without relying on Poisson assumptions for system processes. In the case of the more exact non-Markovian analysis, we show that our model gives a close approximation to the values obtained via empirical methods. Stochastic Petri Nets are used as performance evaluation tools in the development and analysis of these system models. / Master of Science
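Under the simplified Markovian assumption, the fraction of time a policy client can obtain service from a policy server can be captured by a two-state (up/down) continuous-time Markov chain, whose closed form is easy to check against simulation. The failure and repair rates below are illustrative values, not parameters from the thesis:

```python
import random

def steady_state_availability(fail_rate, repair_rate):
    """Two-state CTMC (up/down): long-run fraction of time a PEP can reach
    its PDP, assuming exponential up-times and down-times."""
    return repair_rate / (fail_rate + repair_rate)

def simulated_availability(fail_rate, repair_rate, cycles=20000, seed=1):
    """Monte Carlo check: alternate exponential up and down periods."""
    rng = random.Random(seed)
    up = down = 0.0
    for _ in range(cycles):
        up += rng.expovariate(fail_rate)      # time until service is lost
        down += rng.expovariate(repair_rate)  # time until service is restored
    return up / (up + down)
```

With exponentially distributed sojourn times the simulation converges to the closed form; replacing `expovariate` with a general distribution is the kind of change that forces the non-Markovian analysis described above.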
|
320 |
Analysis Techniques for Concurrent Programming Languages
Tamarit Muñoz, Salvador. 2 September 2013
Los lenguajes concurrentes están cada día más presentes en nuestra sociedad,
tanto en las nuevas tecnologías como en los sistemas utilizados de manera cotidiana. Más aún, dada la actual distribución de los sistemas y su arquitectura interna,
cabe esperar que este hecho siga siendo una realidad en los próximos años. En
este contexto, el desarrollo de herramientas de apoyo al desarrollo de programas
concurrentes se vuelve esencial. Además, el comportamiento de los sistemas concurrentes es especialmente difícil de analizar, por lo que cualquier herramienta que
ayude en esta tarea, aun cuando sea limitada, será de gran utilidad. Por ejemplo, podemos encontrar herramientas para la depuración, análisis, comprobación,
optimización, o simplificación de programas. Muchas de ellas son ampliamente
utilizadas por los programadores hoy en día.
El propósito de esta tesis es introducir, a través de diferentes lenguajes de
programación concurrentes, técnicas de análisis que puedan ayudar a mejorar la
experiencia del desarrollo y publicación de software para modelos concurrentes.
En esta tesis se introducen tanto análisis estáticos (aproximando todas las posibles ejecuciones) como dinámicos (considerando una ejecución en concreto). Los
trabajos aquí propuestos difieren lo suficiente entre sí para constituir ideas totalmente independientes, pero manteniendo un nexo común: el hecho de ser un
análisis para un lenguaje concurrente. Todos los análisis presentados han sido
definidos formalmente y se ha probado su corrección, asegurando que los resultados obtenidos tendrán el grado de fiabilidad necesario en sistemas que lo requieran,
como por ejemplo, en sistemas críticos. Además, se incluye la descripción de las
herramientas software que implementan las diferentes ideas propuestas. Esto le da
al trabajo una utilidad más allá del marco teórico, permitiendo poner en práctica
y probar con ejemplos reales los diferentes análisis.
Todas las ideas aquí presentadas constituyen, por sí mismas, propuestas aplicables en multitud de contextos y problemas actuales. Además, individualmente sirven de punto de partida para otros análisis derivados, así como para la adaptación
a otros lenguajes de la misma familia. Esto le da un valor añadido a este trabajo,
como bien atestiguan algunos trabajos posteriores que ya se están beneficiando de
los resultados obtenidos en esta tesis. / Concurrent languages are increasingly present in our society, both in new
technologies and in the systems used on a daily basis. Moreover, given the
current systems distribution and their internal architecture, one can expect
that this remains so in the coming years. In this context, the development of
tools to support the implementation of concurrent programs becomes essential.
Furthermore, the behavior of concurrent systems is particularly difficult
to analyse, so that any tool that helps in this task, even if in a limited way,
will be very useful. For example, one can find tools for debugging, analysis,
testing, optimisation, or simplification of programs, which are widely used
by programmers nowadays.
The purpose of this thesis is to introduce, through various concurrent programming
languages, some analysis techniques that can help to improve the
experience of the software development and release for concurrent models.
This thesis introduces both static (approximating all possible executions) and
dynamic (considering a specific execution) analysis. The topics considered
here differ enough from each other to be fully independent. Nevertheless,
they have a common link: they can be used to analyse properties of a concurrent
programming language. All the analyses presented here have been
formally defined and their correctness has been proved, ensuring that the
results will have the reliability degree which is needed for some systems (for
instance, for critical systems). It also includes a description of the software
tools that implement the different ideas proposed. This gives the work a usefulness
well beyond the theoretical aspect, allowing us to put it in practice
and to test the different analyses with real-world examples. All the ideas presented here are, by themselves, approaches that can be applied
in many current contexts and problems. Moreover, individually they
serve as a starting point for other derived analysis, as well as for the adaptation
to other languages of the same family. This gives an added value to
this work, a fact confirmed by some later works that are already benefiting
from the results obtained in this thesis. / Tamarit Muñoz, S. (2013). Analysis Techniques for Concurrent Programming Languages [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/31651
|