1. Optimizing the Tailored Treatment of Breast Cancer. Amir, Eitan (6 December 2012)
Background: Breast cancer is a diverse disease. Over the past three decades it has been increasingly appreciated that therapy should be targeted to specific patient and tumour characteristics. In recent years the evaluation of tailored therapy has been dominated by the development of new drug therapies which, when successful, have been marketed at high prices. There have been few successful attempts to optimize currently available therapies. This thesis explores the optimization of currently available therapies in three domains: efficacy, toxicity and supportive care.
Methods: Three independent studies were undertaken. First, a prospective cohort study was conducted to assess the impact of re-biopsy of recurrent breast cancer on physicians' choice of therapy and on patient satisfaction. The second study comprised a systematic review and meta-analysis of randomized trials exploring the toxicities associated with different endocrine therapy options for early breast cancer, with the aim of identifying patients who may be harmed by certain drugs. Finally, a randomized feasibility study was conducted to evaluate de-escalated intravenous bisphosphonates in women with low-risk breast cancer metastatic to bone.
Results: All studies met their objectives in showing that the tailored use of available therapies can be optimized. The prospective study of the impact of re-biopsy showed that treatment decisions were modified in 14% of women, and patient satisfaction with the process of re-biopsy was high. The meta-analysis of the toxicities of endocrine therapy identified cardiovascular disease as a statistically significant toxicity of aromatase inhibitors, suggesting that patients with established cardiovascular disease or its risk factors should reduce their exposure to these drugs. Finally, the randomized feasibility study showed that it is possible to conduct randomized trials of de-escalated bisphosphonates in women with low-risk breast cancer, with no signal that reducing the frequency of treatment was associated with untoward outcomes.
Conclusions: It is possible to optimize the tailored therapy of breast cancer using currently available treatments. This may lead to improved patient outcomes while using existing resources. Further studies assessing the optimization of other treatments are warranted.

2. Optimized plant distribution and 5S model that allows SMEs to increase productivity in textiles. Ruiz, Silvana; Simón, Allison; Sotelo, Fernando; Raymundo, Carlos (1 January 2019)
In Peru, the textile sector generates between 350 and 400 thousand direct jobs, representing 1.9% of gross domestic product (GDP) and just over 10% of manufacturing. SMEs are characterized by family ownership, low levels of investment in new technologies, and limited financial resources. This context has left SMEs behind large companies in implementing Lean Manufacturing. Textile manufacturing companies facing low productivity, excessive use of physical space, and unnecessary movement and transport use the tools of Lean Manufacturing and plant distribution to solve these problems. Many of the problems found in companies are related to the disorganization of processes, material flow, and layout. Companies have therefore seen the need to apply different strategic tools, among them Lean Manufacturing, to help them increase the efficiency of their processes and become more competitive in their markets. Several authors conclude that the plant distributions SMEs currently have are not adequate for increasing productivity; however, the improvement models presented lack information on how to create a new company layout step by step. Because of this, this article details the steps that SMEs can follow to build a plant distribution model under the SLP (Systematic Layout Planning) tool.
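As a rough sketch of the quantitative comparison that underlies layout decisions of this kind, the following example scores candidate layouts by load-distance (material flow multiplied by rectilinear distance between workstations). The flow matrix and coordinates are invented for illustration and are not taken from the article.

```python
import numpy as np

# Invented flow matrix: trips per day between three workstations
flow = np.array([[0, 40, 10],
                 [40, 0, 25],
                 [10, 25, 0]])

def load_distance(positions):
    """Total material-handling effort: sum of flow * rectilinear distance."""
    total = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            d = abs(positions[i][0] - positions[j][0]) + \
                abs(positions[i][1] - positions[j][1])
            total += flow[i, j] * d
    return total

current = [(0, 0), (10, 0), (0, 10)]   # assumed coordinates, in metres
proposed = [(0, 0), (5, 0), (5, 5)]
print("current:", load_distance(current), "proposed:", load_distance(proposed))
```

A lower score means less material handling, which is the kind of evidence a step-by-step SLP procedure would use to justify a new layout.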

3. Comparison and analysis of FDA reported visual outcomes of the three latest platforms for LASIK: wavefront guided Visx iDesign, topography guided WaveLight Allegro Contoura, and topography guided Nidek EC-5000 CATz. Moshirfar, Majid; Shah, Tirth; Skanchy, David; Linn, Steven; Kang, Paul; Durrie, Daniel
Purpose: To compare and analyze the differences in visual outcomes between the Visx iDesign Advanced WaveScan Studio™ System, the Alcon WaveLight Allegro Topolyzer, and the Nidek EC-5000 using Final Fit™ Custom Ablation Treatment Software, from the summary of safety and effectiveness data submitted to the US Food and Drug Administration (FDA). Methods: In this retrospective comparative study, 334 eyes from the Visx iDesign, 212 eyes from the Alcon Contoura, and 135 eyes from the Nidek CATz platforms were analyzed for primary and secondary visual outcomes. These outcomes were compared via side-by-side graphical and tabular representation of the FDA data. Statistical significance was calculated when appropriate to assess differences; a P-value <0.05 was considered statistically significant. Results: The mean postoperative uncorrected distance visual acuity (UDVA) at 12 months was 20/19.25 ± 8.76, 20/16.59 ± 5.94, and 20/19.17 ± 4.46 for Visx iDesign, Alcon Contoura, and Nidek CATz, respectively. In at least 90% of treated eyes at 3 months and 12 months, all three lasers showed either no change or a gain in corrected distance visual acuity (CDVA). Mesopic contrast sensitivity at 6 months showed a clinically significant increase of 41.3%, 25.1%, and 10.6% for eyes treated with Visx iDesign, Alcon Contoura, and Nidek CATz, respectively. Photopic contrast sensitivity at 6 months showed a clinically significant increase of 19.2%, 31.9%, and 10.6%, respectively. Conclusion: The FDA data for the three platforms show that all three were excellent with respect to efficacy, safety, accuracy, and stability, although there are some differences between the platforms on certain outcome measurements. Overall, patients treated with all three lasers showed significant improvements in primary and secondary visual outcomes after LASIK surgery.
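For readers wanting to reproduce this kind of between-platform comparison from summary statistics, the sketch below applies Welch's t-test to the reported 12-month UDVA Snellen denominators for Visx iDesign (n = 334) and Alcon Contoura (n = 212). Whether the authors used this particular test, and whether all enrolled eyes were available at 12 months, are assumptions.

```python
from scipy.stats import ttest_ind_from_stats

# Reported means and SDs are Snellen denominators (20/x), from the abstract
t, p = ttest_ind_from_stats(mean1=19.25, std1=8.76, nobs1=334,   # Visx iDesign
                            mean2=16.59, std2=5.94, nobs2=212,   # Alcon Contoura
                            equal_var=False)                     # Welch's t-test
print("t = %.2f, p = %.4f, significant at 0.05: %s" % (t, p, p < 0.05))
```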

4. Optimized Control of Steam Heating Coils. Ali, Mir Muddassir (December 2011)
Steam has been widely used as the source of heating in commercial buildings and industries throughout the twentieth century. Even though contemporary designers have moved to hot water as the primary choice for heating, a large number of facilities still use steam. Medical campuses with on-site steam generation and extensive distribution systems often include a number of buildings designed prior to the mid-1980s. The steam is typically used for preheat, because its high heat content heats the air quickly and prevents coils from freezing in locations with extreme winter weather.
The present work provides a comprehensive description of the various types of steam heating systems, steam coils, and valves to facilitate the engineer's understanding of these steam systems.
A large percentage of the steam coils used in buildings are supplied with medium-pressure steam; Veterans Integrated Service Network and Army Medical Command medical facilities are examples that use medium-pressure steam for heating. The current design manual for these medical facilities recommends that steam at 30 psig be provided to these coils. In certain cases, although the steam heating coil is designed for 5 psig steam, higher-pressure steam is observed at the coil. A higher steam pressure may lead to excessive heating, system inefficiency due to increased heat loss, simultaneous heating and cooling, and increased maintenance cost.
Field experiments were conducted to evaluate the effect of lowering steam pressure on system performance. A 16% reduction in temperature rise across the coil was found when the steam pressure in the coil was reduced from 15 psig to 5 psig. The temperature rise with lower-pressure steam was still sufficient to prevent coil freeze-up even in the most severe weather conditions. Additional benefits of reduced steam pressure are reduced flash steam losses (flash steam is vapor, or secondary steam, formed when hot condensate from the coil is discharged into a lower-pressure area, i.e., the condensate return line), reduced radiation losses, increased flow of air through the coil (thereby reducing air stratification), and reduced energy losses in the event of actuator failure.
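The physics behind this result can be illustrated with rounded saturated-steam temperatures from standard steam tables; even at 5 psig the coil surface sits far above freezing and above any entering air temperature, which is why the lower pressure still prevents freeze-up. The figures below are illustrative table values, not measurements from the thesis.

```python
# Rounded saturation temperatures from standard steam tables (deg F),
# keyed by gauge pressure in psig
SAT_TEMP_F = {0: 212, 5: 227, 15: 250, 30: 274}

for psig in (5, 15, 30):
    print("%2d psig steam: coil surface near %d F (+%d F vs 5 psig)"
          % (psig, SAT_TEMP_F[psig], SAT_TEMP_F[psig] - SAT_TEMP_F[5]))
```

The step from 5 psig to 15 psig buys only about 23 F of extra surface temperature while increasing flash and radiation losses, consistent with the measured 16% change in temperature rise.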
The work also involved evaluating the existing control strategies for the steam heating system. New control strategies were developed and tested to address the shortcomings of the existing sequences. Improved temperature control and occupant comfort, elimination of valve hunting, and reduced energy consumption were benefits realized by implementing these measures.

5. Simulation and Analysis of the Characteristics of Thermal Fluid Cycles for Natural Refrigerants R-600a and R-290 Applied to an Air-Conditioning System. Wu, Chun-Yi (6 July 2000)
The characteristics of the thermofluid flow cycle for the natural refrigerants R-600a and R-290 applied to an air-conditioning system are studied in this project. In the system performance analysis, exergy analysis incorporating heat transfer and fluid mechanics is adopted to analyze the exergy transfer and destruction of each component and of the whole system.
The simulation parameters in this research include room temperature, outdoor temperature, and the type of refrigerant. With all other conditions held constant, the numerical results show that the coefficient of performance (COP) and the energy efficiency ratio (EER) increase when the room temperature increases or the outdoor temperature decreases. Under identical simulation conditions, the COP and EER with R-600a are better than those with R-290. The exergy analysis shows that the flow exergy through the compressor and expansion valve decreases due to the friction of the fluid flow, while the flow exergy through the condenser and evaporator decreases due to finite-temperature heat transfer and energy carried away by the exterior air. The destruction of flow exergy due to the irreversibility of frictional fluid flow is relatively small compared with that due to heat transfer. Exergy analysis thus makes clear the exergy change within each component of an air-conditioning system, which is very useful in the design of air-conditioning systems and their optimization.
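As a minimal illustration of the COP comparison, the sketch below evaluates an ideal vapor-compression cycle for both refrigerants, assuming the open-source CoolProp property library is available. The operating temperatures are invented, and the thesis's actual simulation additionally models heat transfer and flow friction, so this is a simplification.

```python
from CoolProp.CoolProp import PropsSI

def ideal_cycle_cop(fluid, t_evap=280.0, t_cond=320.0):
    """COP of an ideal vapor-compression cycle (temperatures in kelvin)."""
    p_cond = PropsSI('P', 'T', t_cond, 'Q', 0, fluid)   # condensing pressure
    h1 = PropsSI('H', 'T', t_evap, 'Q', 1, fluid)       # evaporator exit (sat. vapor)
    s1 = PropsSI('S', 'T', t_evap, 'Q', 1, fluid)
    h2 = PropsSI('H', 'P', p_cond, 'S', s1, fluid)      # isentropic compression
    h3 = PropsSI('H', 'T', t_cond, 'Q', 0, fluid)       # condenser exit (sat. liquid)
    return (h1 - h3) / (h2 - h1)                        # cooling effect / work

for fluid in ('R600a', 'R290'):
    print(fluid, "ideal COP: %.2f" % ideal_cycle_cop(fluid))
```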

6. Calibration of prepared environment for optical navigation. Panilet Panipichai, Jinnu (2015)
The main objective of this thesis is to evaluate the accuracy and precision of a machine vision system used to calibrate a prepared environment for optical navigation. The environment is created from rotationally independent, optimized colour reference labels (symbols); any number of symbols can be used. A symbol carries 8-bit information (0 to 255) and can be designed for different values using MATLAB algorithms. An optical navigation system enters the environment and captures the symbols, which are then decoded to determine their geographical positions relative to the reference position of the system, again using MATLAB algorithms. The system is then moved to a known position and the same set of symbols is captured, decoded, and located. The process is repeated for several positions of the system to determine precision and accuracy. Finally, the results are analysed.
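The mapping from a captured symbol to its 8-bit value might look like the following sketch. The thesis's actual symbol geometry and MATLAB decoder are not described here, so the cell layout and bit order below are hypothetical.

```python
# Hypothetical decoding of an 8-cell reference symbol into its 0-255 value
def decode_symbol(cells):
    """cells: 8 booleans read clockwise from the symbol's anchor cell (assumed)."""
    value = 0
    for bit in cells:               # most significant bit first (assumed)
        value = (value << 1) | int(bit)
    return value

print(decode_symbol([1, 0, 1, 1, 0, 0, 1, 0]))  # -> 178
```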

7. Optimized Reduced Models for Discrete Fracture Networks Used in Modeling Particle Flow and Transport. Lopez-Merizalde, Jaime (January 2020)
Discrete fracture networks (DFNs) can be modeled with polygonal representations that are useful for geophysical modeling of nuclear waste containment and hydrofracturing. Flow and transport calculations are possible but computationally expensive, limiting the feasibility of model uncertainty quantification. Graphs are used to reduce model complexity and computation time. We present the formulation of a graph as a reduced model for DFNs and pose the inversion problem central to this research. We also present a novel alternative to Darcy's law on graphs using the well-known Brinkman formulation on the continuum.
We apply the Levenberg-Marquardt algorithm to optimize graphs, calibrating them to observed data through the inversion problem. We present the deficiencies of physically motivated graphs and show how optimized graphs produce better results overall. Our solution finds lumped parameters representing the fracture properties and is used to reduce the computational time required for particle transport calculations. Breakthrough curves produced from the obtained solutions closely match the high-fidelity model. We present examples of creating these reduced models for DFNs with 500 fractures to illustrate the methodology and the optimization scheme used to obtain an improved result over a previous formulation.
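A minimal sketch of this calibration loop, using SciPy's Levenberg-Marquardt implementation with an invented toy forward model standing in for the graph flow solver, might look like this. The incidence matrix, observed times, and log-conductance parameterization are all assumptions for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy forward model (hypothetical): breakthrough time along each of three
# observation paths is a weighted sum of per-edge resistances 1/k_e.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])          # path-edge incidence (assumed)
t_obs = np.array([3.0, 5.0, 4.0])        # observed breakthrough times

def residuals(log_k):
    k = np.exp(log_k)                    # log parameterization keeps k positive
    t_sim = A @ (1.0 / k)                # simulated breakthrough times
    return t_sim - t_obs

# Levenberg-Marquardt calibration of the lumped edge parameters
sol = least_squares(residuals, x0=np.zeros(3), method='lm')
print("calibrated edge conductances:", np.exp(sol.x))
```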

8. Optimized combination model and algorithm of parking guidance information configuration. Mei, Zhenyu; Tian, Ye (January 2011)
Operators of parking guidance and information (PGI) systems often have difficulty providing the best car park availability information to drivers in periods of high demand. A new PGI configuration model based on the optimized combination method was proposed by analyzing parking choice behavior. This article first describes a parking choice behavioral model incorporating drivers' perceptions of waiting times at car parks based on PGI signs. This model was used to predict the influence of PGI signs on the overall performance of the traffic system. Relationships were then developed for estimating the arrival rates at car parks based on driver characteristics, car park attributes, and the car park availability information displayed on PGI signs. A mathematical program was formulated to determine the optimal PGI sign display configuration that minimizes total travel time, and a genetic algorithm was used to identify solutions that significantly reduced queue lengths and total travel time compared with existing practices. These procedures were applied to an existing PGI system operating in Deqing Town and Xiuning City, where significant reductions in the total travel time of parking vehicles were achieved with the PGI signs configured accordingly. This would reduce traffic congestion and lead to various environmental benefits.
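The genetic-algorithm step can be sketched as follows, with an invented stand-in for the article's travel-time model. The encoding (one bit per car park shown on the sign), the demand weights, and the penalty are all assumptions for illustration, not the article's actual formulation.

```python
import random

def total_travel_time(config):
    """Hypothetical objective: config[i] == 1 means car park i is displayed."""
    base = [12.0, 9.0, 15.0, 11.0]          # assumed demand weights (minutes)
    shown = sum(config) or 1
    return sum(b / shown if c else b * 1.5  # crude penalty for hidden car parks
               for b, c in zip(base, config))

def ga(n_parks=4, pop_size=20, gens=50, p_mut=0.1):
    pop = [[random.randint(0, 1) for _ in range(n_parks)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=total_travel_time)           # rank by objective
        parents = pop[:pop_size // 2]             # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_parks)    # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([1 - g if random.random() < p_mut else g
                             for g in child])     # bit-flip mutation
        pop = parents + children
    return min(pop, key=total_travel_time)

print("best sign configuration:", ga())
```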

9. Image processing and forward propagation using binary representations, and robust audio analysis using deep learning. Pedersoli, Fabrizio (15 March 2019)
The work presented in this thesis consists of three main topics: document segmentation and classification into text and score, efficient computation with binary representations, and deep learning architectures for polyphonic music transcription and classification.
In the case of musical documents, an important problem is separating text from musical score by detecting the corresponding bounding boxes. A new algorithm is proposed for pixel-wise classification of digital documents into musical score and text, based on a bag-of-visual-words approach and random forest classification. A robust technique for identifying bounding boxes of text and music score from the pixel-wise classification is also proposed.
For efficient processing of learned models, we turn our attention to binary representations. When dealing with binary data, the use of bit-packing and bit-wise computation can reduce computational time and memory requirements considerably; efficiency is a key factor when processing large-scale datasets and in industrial applications. We propose a bit-packed representation for binary images that encodes both pixels and square neighborhoods, and design SPmat, an optimized framework for binary image processing, around it.
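The underlying idea can be illustrated with NumPy's bit-packing routines; the sketch below is not the SPmat API, but it shows how a single bitwise operation on packed bytes processes eight pixels at a time while using an eighth of the memory.

```python
import numpy as np

img_a = np.random.rand(64, 64) > 0.5   # two random binary images
img_b = np.random.rand(64, 64) > 0.5

packed_a = np.packbits(img_a, axis=1)  # 8 pixels per byte -> 64x8 bytes
packed_b = np.packbits(img_b, axis=1)

# Intersection of the two images, computed 8 pixels at a time
packed_and = packed_a & packed_b
intersection = np.unpackbits(packed_and, axis=1).astype(bool)

assert np.array_equal(intersection, img_a & img_b)
print("memory: %d bytes packed vs %d bytes unpacked"
      % (packed_a.nbytes, img_a.nbytes))
```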
Bit-packing and bit-wise computation can also be used for efficient forward propagation in deep neural networks. Quantized deep neural networks have recently been proposed with the goal of improving computational time and memory requirements while maintaining classification performance as far as possible. A particular type of quantized neural network is the binary neural network, in which the weights and activations are constrained to -1 and +1. In this thesis, we describe and evaluate Espresso, a novel optimized framework for fast inference of binary neural networks that takes advantage of bit-packing and bit-wise computations. Espresso is self-contained, written in C/CUDA, and provides optimized implementations of all the building blocks needed to perform forward propagation.
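The core trick such frameworks exploit is that when both operands are constrained to {-1, +1}, a dot product reduces to an XNOR followed by a popcount. The Python/NumPy sketch below illustrates the arithmetic only; it is not Espresso's C/CUDA implementation.

```python
import numpy as np

n = 64
x = np.random.choice([-1, 1], size=n)   # binary activations
w = np.random.choice([-1, 1], size=n)   # binary weights

# Encode -1 -> 0 and +1 -> 1, then pack 8 values per byte
xb = np.packbits(x > 0)
wb = np.packbits(w > 0)

# XNOR marks positions where the signs agree; popcount sums them
agree = np.unpackbits(~(xb ^ wb)).sum()  # number of matching bits
dot = 2 * agree - n                       # recover the {-1,+1} dot product

assert dot == int(np.dot(x, w))
print("binary dot product:", dot)
```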
Following their recent success, we further investigate deep neural networks. They have achieved state-of-the-art results and outperformed traditional machine learning methods in many applications such as computer vision, speech recognition, and machine translation. However, in the case of music information retrieval (MIR) and audio analysis, shallow neural networks are commonly used, and the effectiveness of deep and very deep architectures for MIR and audio tasks has not been explored in detail. It is also not clear what the best input representation for a particular task is. We therefore investigate deep neural networks for the following audio analysis tasks: polyphonic music transcription, musical genre classification, and urban sound classification. We analyze the performance of common classification network architectures using different input representations, paying specific attention to residual networks. We also evaluate the robustness of these models on degraded audio, using different combinations of training and testing data. Through experimental evaluation we show that residual networks provide consistent performance improvements when analyzing degraded audio across different representations and tasks. Finally, we present a convolutional architecture based on U-Net that can improve the polyphonic music transcription performance of different baseline transcription networks.