  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
91

Surrogate endpoints of survival in metastatic carcinoma

Nordman, Ina IC, Clinical School - St Vincent's Hospital, Faculty of Medicine, UNSW January 2008 (has links)
In most randomised controlled trials (RCTs), a large number of patients must be followed over many years for the clinical benefit of a drug to be accurately quantified (1). Using an early proxy, or surrogate endpoint, in place of the direct endpoint of overall survival (OS) could theoretically shorten the duration of RCTs and minimise the exposure of patients to ineffective or toxic treatments (2, 3). This thesis examined the relationship between surrogate endpoints and OS in metastatic colorectal cancer (CRC), advanced non-small cell lung cancer (NSCLC) and metastatic breast cancer (MBC). A review of the literature identified 144 RCTs in metastatic CRC, 189 in advanced NSCLC and 133 in MBC. The publications were generally of poor quality, with incomplete reporting of many key variables, making comparisons between studies difficult. The introduction of the CONSORT statement was associated with improvements in the quality of reporting. For CRC (337 arms), NSCLC (429 arms) and MBC (290 arms) there were strong relationships between OS and progression-free survival (PFS), time to progression (TTP), disease control rate (DCR), response rate (RR) and partial response (PR). Correlation was also demonstrated between OS and complete response (CR) in CRC and duration of response (DOR) in MBC. However, while strong relationships were found, the proportion of variance explained by the models was small. Prediction bands constructed to determine the surrogate threshold effect size indicated that large improvements in the surrogate endpoints were needed to predict gains in overall survival. PFS and TTP showed the most promise as surrogates. The gain in PFS and TTP required to predict a significant gain in overall survival was between 1.2 and 7.0 months and between 1.8 and 7.7 months respectively, depending on trial size and tumour type. DCR was a better potential predictor of OS than RR. 
The results of this study could be used to design future clinical trials with particular reference to the selection of surrogate endpoint and trial size.
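The surrogate-threshold analysis described in this abstract can be sketched in a few lines: regress trial-level OS effects on surrogate (e.g. PFS) effects and scan for the smallest surrogate gain whose lower prediction band excludes zero. This is an illustrative sketch under simplifying assumptions (unweighted ordinary least squares, a normal rather than t quantile), not the thesis's actual model; the function name and toy data are hypothetical.

```python
import numpy as np

def surrogate_threshold_effect(d_pfs, d_os, z=1.96):
    """Estimate the surrogate threshold effect (STE): the smallest gain in the
    surrogate endpoint (here, months of PFS) at which the lower 95% prediction
    band for the corresponding OS gain stays above zero.
    d_pfs, d_os: per-trial treatment effects (e.g. difference in median months).
    """
    x, y = np.asarray(d_pfs, float), np.asarray(d_os, float)
    n = len(x)
    # Ordinary least-squares fit of the OS effect on the surrogate effect.
    b1, b0 = np.polyfit(x, y, 1)
    resid = y - (b0 + b1 * x)
    s2 = resid @ resid / (n - 2)            # residual variance
    xbar, sxx = x.mean(), ((x - x.mean()) ** 2).sum()

    def lower_band(x0):
        # Lower prediction band for a new trial with surrogate effect x0.
        se = np.sqrt(s2 * (1 + 1 / n + (x0 - xbar) ** 2 / sxx))
        return b0 + b1 * x0 - z * se

    # Scan for the smallest x0 whose predicted OS gain is reliably positive.
    grid = np.linspace(0, 12, 1201)
    ok = [x0 for x0 in grid if lower_band(x0) > 0]
    return ok[0] if ok else None
```

With noisy or sparse trial data the band widens and the threshold grows, which is the mechanism behind the abstract's finding that large surrogate improvements were needed to predict survival gains.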
92

Design and Synthesis of Peptidomimics Constrained in Helical and Sheet Conformations using a Novel Covalent Surrogate for the Peptide Main Chain Hydrogen Bond

Nallapati, Lakshmi Aparna January 2015 (has links) (PDF)
This thesis, entitled “Design and Synthesis of Peptidomimics Constrained in Helical and Sheet Conformations Using a Novel Covalent Surrogate for the Peptide Main Chain Hydrogen Bond”, is divided into six chapters. Chapter 1: Introduction to Ordered Conformations of Peptides and Strategies for Constraining Short Peptides in Ordered Conformations. The first chapter describes the different types of protein secondary structures and introduces the prominent strategies developed thus far to constrain short peptides in ordered, secondary-structure-like conformations, with specific emphasis on helical and parallel β-sheet folds. Chapter 2: Design of Structure and General Methodology for the Synthesis of Novel H-Bond Surrogate Constrained Cyclic α-Helical Mimics. Here we develop the first design of the propyl linker as a covalent surrogate for the peptide H-bond. The first synthetic methodology is described for constraining the shortest peptide sequences (tripeptides) in α-helix-like conformations. The macrolactamization strategy proved to work best as the final cyclization step. All residues of the turn are completely retained in the constrained sequence, unlike in any earlier method. More importantly, no metals are involved as catalysts in any of the synthetic transformations, removing the problem of metal-bound cyclic structures, which had rendered the earlier models unusable as drug leads. Gly-rich peptides have been constrained as extreme cases of highest chain entropy and least helix propensity. Both secondary- and tertiary-amide-containing peptides have been synthesized using this protocol. Notably, macrolactamization was found to be better than the Fukuyama-Mitsunobu N-alkylation protocol for the final cyclization step. 
Chapter 3: Synthesis of C-terminal Extended HBS-Constrained Helical Turn Mimics – Validation of the Versatility of the Current Synthetic Protocol. The developed cyclization protocol is extended to the synthesis of C-terminal extended α-helical turn mimics using a solution-phase peptide synthesis procedure. Peptides that extend beyond the helical turn by a high-entropy Gly residue at the C-terminus are synthesized. The versatility of the synthetic methodology in accommodating sterically constrained amino acid residues – in the form of a phenylalanine residue – at any of the positions i+1, i+2 or i+3 of the constrained helical turn is demonstrated. The synthesized peptides are easily isolated in high purity and good yields, without the need for column chromatography; this is due to the presence of the N-terminal amino group, salts of which are easily triturated to remove all other organic impurities. Chapter 4: Synthesis and CD Conformational Analyses of HBS-Constrained α-Helical Turn Mimics Containing Residues with Improved Helical Propensities. The alanine residue has the highest helix propensity among the natural α-amino acid residues. Its enthalpic contribution to the helical conformation is 1 kcal/mol more than that of the Gly residue, which has the least propensity. Incorporation of an Ala residue into the Gly-rich cyclic sequences, either in the middle of the constrained tripeptide or as the C-terminal extended residue, has been accomplished. Comparison of the CD spectra of the synthesized cyclic α-helical turn peptides reveals that a tertiary amide linkage at the C-terminal amino appendage of the propyl linker is essential for helicity to be observed. Helicity improves upon introduction of the first extended residue. 
The constrained and C-terminal extended α-helical turn mimics show consistently high helicity irrespective of the helix propensities of the component residues, showing that the covalent propyl linker surrogate for the H-bond overwhelms the natural propensities of the individual amino acid residues in stabilizing the helical turn and offers far better structural organization to this end. Chapter 5: Synthesis of the Shortest HBS-Constrained 3₁₀- and π-Helical Peptide Analogues. The unique versatility of the novel covalent propyl linker surrogate for the peptide H-bond is exhibited by its ability to constrain dipeptides in 3₁₀-helix-like structures. This is the first and only HBS model that can achieve this synthetic target, as the synthetic protocol allows the conservation of both residues as-is in the constrained helical turn. Similarly, the trapping of a pentapeptide in a C-terminal extended, rare and unstable π-helix-like cyclic structure using the current HBS linker is achieved. Considering the high entropic cost of cyclizing such a long 16-membered chain into a constrained structure, this again exhibits the versatility of the HBS design and the synthetic methodology developed here. Chapter 6: First Design and Synthesis of Novel H-Bond Surrogate Constrained Parallel β-Sheet Mimics. H-bonding interactions stabilize another prevalently observed secondary structure besides helices, namely the β-sheet. Parallel β-sheets, which almost qualify as super-secondary structures owing to their high contact orders, are tough to mimic in models, unlike the easier antiparallel β-sheets. Here we replace the inter-strand peptide H-bond between parallel β-strands to create excised templates as parallel β-sheet nucleators. The propyl linker acts as a dynamic linker in these models, and the two amino groups are protected with bulky sulphonamides in order to provide a Thorpe-Ingold effect on the peptide chain.
The protocol for synthesizing these models is described, along with the different analogues synthesized. This is the first instance of the synthesis of parallel β-sheet mimics using covalent surrogates for the peptide H-bond.
93

CBAS: A Multi-Fidelity Surrogate Modeling Tool For Rapid Aerothermodynamic Analysis

Tyler Scott Adams (18423228) 23 April 2024 (has links)
The need to develop reliable hypersonic capabilities is of critical importance today. Among the most prominent tools used in recent efforts to overcome the challenges of developing hypersonic vehicles are NASA's Configuration Based Aerodynamics (CBAERO) and surrogate modeling techniques. This work presents the development of a tool, CBAERO Surrogate (CBAS), which leverages the advantages of both CBAERO and surrogate models to create a simple and streamlined method for building an aerodynamic database for any given vehicle geometry. CBAS interfaces with CBAERO directly and builds Kriging or Co-Kriging surrogate models for key aerodynamic parameters without significant user or computational effort. Two applicable geometries representing hypersonic vehicles were used within CBAS and the resulting Kriging and Co-Kriging surrogate models evaluated against experimental data. The results show that the Kriging model predictions are accurate to CBAERO's level of fidelity, while the Co-Kriging model predictions fall within 0.5%-5% of the experimental data. The Co-Kriging models produced by CBAS are 10%-50% more accurate than CBAERO and the Kriging models, offering a higher-fidelity solution while maintaining low computational expense. Based on these initial results, promising advances may be obtained in future work by applying CBAS to additional applications.
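The Kriging and Co-Kriging models this abstract refers to can be illustrated compactly. Below is a minimal sketch of ordinary Kriging (Gaussian-process regression with a squared-exponential kernel) plus a simple additive multi-fidelity correction of the kind Co-Kriging formalises: Krige the discrepancy between sparse high-fidelity data and a cheap low-fidelity model, then add it back. This is not the CBAS implementation; function names, the kernel choice, and the fixed scaling factor `rho` are assumptions for illustration.

```python
import numpy as np

def kriging_fit_predict(X, y, Xnew, length_scale=1.0, noise=1e-8):
    """Ordinary Kriging: GP posterior mean with a squared-exponential kernel."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length_scale**2)
    K = k(X, X) + noise * np.eye(len(X))      # nugget for numerical stability
    L = np.linalg.cholesky(K)                 # stable solve via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return k(Xnew, X) @ alpha                 # posterior mean at new points

def cokriging_predict(X_hi, y_hi, low_model, Xnew, rho=1.0, **kw):
    """Simple multi-fidelity correction in the spirit of Co-Kriging: Krige the
    residual between high-fidelity data and a scaled low-fidelity model."""
    resid = y_hi - rho * low_model(X_hi)
    return rho * low_model(Xnew) + kriging_fit_predict(X_hi, resid, Xnew, **kw)
```

In a CBAS-like workflow the "low-fidelity model" role is played by CBAERO evaluations, and the sparse high-fidelity anchor points by higher-fidelity CFD or experiment, which is what lets the corrected model beat either source alone.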
94

Náhradní mateřství / Surrogate maternity

Masaříková, Andrea January 2011 (has links)
Surrogate maternity ABSTRACT This thesis deals with surrogacy, a highly topical and widely discussed issue at the present time. The thesis is divided into three main chapters: the first is devoted to assisted reproduction, the second to surrogacy, and the third deals with the determination of parentage in cases of artificial insemination. The first part, concerning assisted reproduction, contains a short analysis of the issue from the medical and legal points of view and also gives an overview of the causes of infertility and their therapies. The second chapter surveys the legal regulation of surrogacy abroad, which could serve as inspiration for both the current and future legislation of the Czech Republic. This chapter also pays attention to the particular legal institutes that currently govern this subject matter, especially contracts between a surrogate mother and the requesting couple and adoption by the surrogate mother, and briefly covers criminal legislation. A view of the change of legislation in connection with the adoption of the new civil code is part of this chapter as well. The third and last chapter is devoted to the determination of parenthood. As regards paternity, three basic presumptions are recognised, which are, however, modified by the legislation on assisted reproduction in some...
95

Delayed Transfer Entropy applied to Big Data / Delayed Transfer Entropy aplicado a Big Data

Dourado, Jonas Rossi 30 November 2018 (has links)
The recent popularization of technologies such as smartphones, wearables, the Internet of Things, social networks and video streaming has increased data creation. Dealing with extensive data sets led to the creation of the term Big Data, often defined as describing situations where data volume, acquisition rate or representation demands nontraditional approaches to data analysis or requires horizontal scaling for data processing. Analysis is the most important Big Data phase, with the objective of extracting meaningful and often hidden information. One example of hidden information in Big Data is causality, which can be inferred with Delayed Transfer Entropy (DTE). Despite DTE's wide applicability, it demands high processing power, which is aggravated by large datasets such as those found in Big Data. This research optimized DTE performance and modified existing code to enable DTE execution on a computer cluster. With the Big Data trend in sight, these results may enable the analysis of bigger datasets or stronger statistical evidence.
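The quantity being parallelised above has a compact plug-in estimator for discrete series. The sketch below computes Delayed Transfer Entropy with history length 1, TE(X→Y, d) = Σ p(y(t+1), y(t), x(t−d)) · log₂[p(y(t+1)|y(t), x(t−d)) / p(y(t+1)|y(t))]; it illustrates the estimator, not the thesis's optimised cluster implementation, and the function name is hypothetical.

```python
from collections import Counter
from math import log2

def delayed_transfer_entropy(x, y, delay=1):
    """Plug-in Delayed Transfer Entropy for discrete (e.g. binarised) series,
    with source and target histories of length 1."""
    triples = Counter()
    n = 0
    for t in range(delay, len(y) - 1):
        triples[(y[t + 1], y[t], x[t - delay])] += 1  # joint counts
        n += 1
    # Marginal counts needed for the conditional probabilities.
    pairs_yy, pairs_yx, singles = Counter(), Counter(), Counter()
    for (y1, y0, x0), c in triples.items():
        pairs_yy[(y1, y0)] += c
        pairs_yx[(y0, x0)] += c
        singles[y0] += c
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_y1_given_both = c / pairs_yx[(y0, x0)]        # p(y1 | y0, x0)
        p_y1_given_y = pairs_yy[(y1, y0)] / singles[y0]  # p(y1 | y0)
        te += p_joint * log2(p_y1_given_both / p_y1_given_y)
    return te
```

Scanning `delay` and taking the argmax recovers the interaction lag between two signals; the computational burden the thesis addresses comes from repeating this scan over many signal pairs and long series.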
97

Developing Efficient Strategies for Automatic Calibration of Computationally Intensive Environmental Models

Razavi, Seyed Saman January 2013 (has links)
Environmental simulation models have been playing a key role in civil and environmental engineering decision making processes for decades. The utility of an environmental model depends on how well the model is structured and calibrated. Model calibration is typically in an automated form where the simulation model is linked to a search mechanism (e.g., an optimization algorithm) such that the search mechanism iteratively generates many parameter sets (e.g., thousands of parameter sets) and evaluates them through running the model in an attempt to minimize differences between observed data and corresponding model outputs. The challenge arises when the environmental model is computationally intensive to run (with run-times of minutes to hours, for example), as then any automatic calibration attempt would impose a large computational burden. Such a challenge may force model users to accept sub-optimal solutions and fail to achieve the best model performance. The objective of this thesis is to develop innovative strategies to circumvent the computational burden associated with automatic calibration of computationally intensive environmental models. The first main contribution of this thesis is developing a strategy called “deterministic model preemption” which opportunistically evades unnecessary model evaluations in the course of a calibration experiment and can save a significant portion of the computational budget (even as much as 90% in some cases). Model preemption monitors the intermediate simulation results while the model is running and terminates (i.e., pre-empts) the simulation early if it recognizes that further running the model would not guide the search mechanism. This strategy is applicable to a range of automatic calibration algorithms (i.e., search mechanisms) and is deterministic in that it leads to exactly the same calibration results as when preemption is not applied. 
One other main contribution of this thesis is developing and utilizing the concept of “surrogate data”, which is basically a reasonably small but representative proportion of a full set of calibration data. This concept is inspired by the existing surrogate modelling strategies where a surrogate model (also called a metamodel) is developed and utilized as a fast-to-run substitute of an original computationally intensive model. A framework is developed to efficiently calibrate hydrologic models to the full set of calibration data while running the original model only on surrogate data for the majority of candidate parameter sets, a strategy which leads to considerable computational saving. To this end, mapping relationships are developed to approximate the model performance on the full data based on the model performance on surrogate data. This framework is applicable to the calibration of any environmental model for which appropriate surrogate data and mapping relationships can be identified. As another main contribution, this thesis critically reviews and evaluates the large body of literature on surrogate modelling strategies from various disciplines as they are the most commonly used methods to relieve the computational burden associated with computationally intensive simulation models. To reliably evaluate these strategies, a comparative assessment and benchmarking framework is developed which presents a clear computational-budget-dependent definition for the success/failure of surrogate modelling strategies. Two large families of surrogate modelling strategies are critically scrutinized and evaluated: “response surface surrogate” modelling which involves statistical or data-driven function approximation techniques (e.g., kriging, radial basis functions, and neural networks) and “lower-fidelity physically-based surrogate” modelling strategies which develop and utilize simplified models of the original system (e.g., a groundwater model with a coarse mesh). 
This thesis raises fundamental concerns about response surface surrogate modelling and demonstrates that, although they might be less efficient, lower-fidelity physically-based surrogates are generally more reliable as they to some extent preserve the physics involved in the original model. Five different surface water and groundwater models are used across this thesis to test the performance of the developed strategies and elaborate on the discussions. However, the strategies developed are typically simulation-model-independent and can be applied to the calibration of any computationally intensive simulation model that has the required characteristics. This thesis leaves the reader with a suite of strategies for efficient calibration of computationally intensive environmental models while providing some guidance on how to select, implement, and evaluate the appropriate strategy for a given environmental model calibration problem.
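The deterministic model preemption strategy described in this abstract lends itself to a compact sketch. For an objective that only grows as the simulation advances (e.g. a summed squared error over time steps), the run can be pre-empted as soon as the partial sum exceeds the best objective found so far, since the final value could only be worse; this monotone-bound property is what makes the trick exact. The sketch below is illustrative, not the thesis's code; `simulate_step` is a hypothetical callable returning one model output per time step.

```python
def preemptive_sse(simulate_step, observed, best_so_far):
    """Evaluate a candidate parameter set's summed squared error, pre-empting
    the simulation once the partial SSE exceeds the incumbent objective.
    Returns (objective, number_of_steps_actually_simulated)."""
    sse = 0.0
    for t, obs in enumerate(observed):
        sim = simulate_step(t)          # advance the model one step
        sse += (sim - obs) ** 2
        if sse > best_so_far:           # monotone bound: safe to stop early
            return float('inf'), t + 1  # pre-empted; rank below incumbent
    return sse, len(observed)
```

Because the partial-sum bound is exact, a search algorithm using `preemptive_sse` visits exactly the same sequence of candidates as one that runs every simulation to completion, which is the "deterministic" property the abstract emphasises.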
99

Heurística surrogate para o problema de carregamento de paletes do produtor / Surrogate heuristic for the manufacturer's pallet loading problem

Kitamura, Bruna de Lima Alcântara. January 2009 (has links)
Advisor: Silvio Alexandre de Araujo / Committee: Reinaldo Morabito, Geraldo Nunes Silva / Abstract: The aim of this work is to study a particular case of the cutting and packing problem, the so-called Manufacturer's Pallet Loading Problem. Initially, a formulation proposed in the literature is evaluated with a computer package. Subsequently, Lagrangian and surrogate heuristics are studied and a method for updating the surrogate multipliers is adapted to this problem. The importance of studying the manufacturer's pallet loading problem is that, due to the scale and scope of some logistics systems, a small increase in the number of products loaded on each pallet can result in substantial savings. The motivation for studying the proposed surrogate update method is that, besides the fact that the adaptation carried out here has not previously appeared in the literature, a later application of this heuristic within a branch and bound procedure may yield better results than other heuristics. / Master's thesis
100

Métodos exatos baseados em relaxação lagrangiana e surrogate para o problema de carregamento de paletes do produtor / Exact methods based on Lagrangean and surrogate relaxation for the manufacturer's pallet loading problem

Oliveira, Lilian Kátia de 13 December 2004 (has links)
Universidade Federal de Sao Carlos / The purpose of this work is to develop exact methods, based on Lagrangean and Surrogate relaxation, with good performance for solving the manufacturer's pallet loading problem. This problem consists of orthogonally arranging the maximum number of rectangles of sizes (l,w) and (w,l) into a larger rectangle (L,W) without overlapping. The methods involve a tree search procedure of branch and bound type and use, in each node of the branch and bound tree, bounds derived from Lagrangean and/or Surrogate relaxations of a 0-1 linear programming formulation. Subgradient optimization algorithms are used to optimize such bounds. Problem reduction tests and Lagrangean and Surrogate heuristics are also applied during the subgradient optimization to obtain good feasible solutions. Computational experiments were performed with instances from the literature and real instances obtained from a carrier. The results show that the methods are able to solve these instances, on average, more quickly than other exact methods, including the software GAMS/CPLEX.
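The subgradient machinery used to optimize these relaxation bounds can be sketched on the simplest case: a single dualised knapsack-style constraint, for which the Lagrangean and surrogate relaxations coincide. Dualising a.x <= b in max c.x over binary x with multiplier u >= 0 gives L(u) = u·b + Σᵢ max(0, cᵢ − u·aᵢ), an upper bound for every u, and b − a.x(u) is a subgradient of L at u. The sketch and toy instance below are illustrative, not the thesis's multi-constraint formulation.

```python
import numpy as np

def lagrangian_dual_knapsack(c, a, b, steps=200):
    """Subgradient minimisation of the Lagrangian dual bound of a 0-1 knapsack:
        max c.x  s.t.  a.x <= b,  x in {0,1}^n.
    Returns the best (smallest) upper bound found."""
    c, a = np.asarray(c, float), np.asarray(a, float)
    u, best_bound = 0.0, float('inf')
    for k in range(1, steps + 1):
        x = (c - u * a > 0).astype(float)   # inner max: take items with positive reduced profit
        bound = u * b + np.maximum(0.0, c - u * a).sum()
        best_bound = min(best_bound, bound) # every L(u) is a valid upper bound
        g = b - a @ x                       # subgradient of L at u
        u = max(0.0, u - (1.0 / k) * g)     # diminishing step, projected to u >= 0
    return best_bound
```

In a branch and bound method like those described above, a bound of this kind is computed at each node and the node is fathomed whenever the bound falls below the best feasible solution found by the Lagrangean/Surrogate heuristics.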
