The use of response surface methodology and artificial neural networks for the establishment of a design space for a sustained release salbutamol sulphate formulation
Chaibva, Faith Anesu (January 2010)
Quality by Design (QbD) is a systematic approach recommended for the development of quality pharmaceutical products. The QbD approach commences with the definition of a quality target drug profile and predetermined objectives that are then used to direct the formulation development process, with an emphasis on understanding the pharmaceutical science and manufacturing principles that apply to a product. The design space is directly linked to the use of QbD for formulation development and is the multidimensional combination and interaction of input variables and process parameters that have been demonstrated to provide an assurance of quality. The objective of these studies was to apply the principles of QbD as a framework for the optimisation of a sustained release (SR) formulation of salbutamol sulphate (SBS), and for the establishment of a design space using Response Surface Methodology (RSM) and Artificial Neural Networks (ANN). SBS is a short-acting β₂-agonist used for the management of asthma and chronic obstructive pulmonary disease (COPD). An SR formulation of SBS may provide clinical benefits in the management of these respiratory disorders. Asthalin® 8 ER (Cipla Ltd., Mumbai, Maharashtra, India) was selected as the reference formulation for these studies. An Ishikawa or cause-and-effect diagram was used to identify formulation and process factors with the potential to affect product quality. Key areas of concern that must be monitored include the raw materials, the manufacturing equipment and processes, and the analytical and assessment methods employed. Laboratory conditions and manufacturing processes were carefully monitored and any deviation from protocol was recorded, and the equipment used to assess dosage form performance, including dissolution apparatus, balances and hardness testers, underwent regular maintenance.
Preliminary studies assessing the potential utility of Methocel® K100M, alone and in combination with other matrix-forming polymers, revealed that the combination of this polymer with xanthan gum and Carbopol® has the potential to modulate the release of SBS at a specific rate over a period of 12 hr. A central composite design using Methocel® K100M, xanthan gum, Carbopol® 974P and Surelease® as the granulating fluid was constructed to evaluate fully the impact of these formulation variables on the rate and extent of SBS release from the manufactured formulations. The results revealed that although Methocel® K100M and xanthan gum had the greatest retardant effect on drug release, interactions between the polymers were also important determinants of the measurable responses. An ANN model was trained for optimisation using the data generated from the central composite study. The efficiency of the network was optimised by assessing the impact of the number of nodes in the hidden layer of a three-layer Multi-Layer Perceptron (MLP). A network with nine nodes in the hidden layer had the best predictive ability and was suitable for application to formulation optimisation studies. Pharmaceutical optimisation was conducted using both the RSM and the trained ANN models. The two optimisation procedures yielded two different formulation compositions, which were subjected to in vitro dissolution testing using USP Apparatus 3. Although the compositions derived from the two procedures differed, both solutions gave reproducible results with dissolution profiles similar to that of the reference formulation. RSM and ANN were further investigated as possible means of establishing a design space for formulation compositions that would yield dosage forms with in vitro release profiles comparable to the reference product.
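The central composite layout described above can be sketched in code. This is a generic illustration in coded units, not the thesis's actual run table; the factor names and the number of centre points are assumptions:

```python
import itertools

def central_composite_design(k, alpha=None, n_center=4):
    """Coded levels for a rotatable central composite design in k factors:
    2**k factorial corners at +/-1, 2*k axial points at +/-alpha, and
    n_center replicate runs at the origin."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25  # rotatability criterion
    corners = [list(p) for p in itertools.product([-1.0, 1.0], repeat=k)]
    axial = []
    for i in range(k):
        for sign in (-alpha, alpha):
            point = [0.0] * k
            point[i] = sign
            axial.append(point)
    center = [[0.0] * k for _ in range(n_center)]
    return corners + axial + center

# Four coded formulation factors (e.g. Methocel K100M, xanthan gum,
# Carbopol 974P and Surelease levels): 16 corners + 8 axial + 4 centre runs
design = central_composite_design(4)
print(len(design))  # 28 runs
```

Each row is then mapped from coded units back to actual polymer levels before manufacture; the quadratic RSM model is fitted to the responses measured at these runs.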
Constraint plots were used to determine the bounds of the formulation variables that would result in dosage forms with the desired release profile. ANN simulations with hypothetical formulations generated within a small region of the experimental domain were investigated as a means of understanding the impact of varying the composition of the formulation on the resultant dissolution profiles. Although both methods were suitable for the establishment of a design space, ANN may be better suited for this purpose because of the manner in which it handles data. As more information about the behaviour of a formulation and its processes is generated over the product lifecycle, ANN may be used to evaluate the impact of formulation and process variables on measurable responses. ANN may therefore be suitable for the optimisation of pharmaceutical formulations and the establishment of a design space in line with ICH Pharmaceutical Development [1], Quality Risk Management [2] and Pharmaceutical Quality Systems [3] guidelines.
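The abstract does not name the profile-comparison metric, but similarity of a test profile to a reference product is conventionally quantified with the f2 similarity factor used by the FDA and EMA, with f2 ≥ 50 taken to indicate similar profiles. A minimal sketch, with invented percent-dissolved values:

```python
import math

def f2_similarity(reference, test):
    """f2 similarity factor between two dissolution profiles.

    reference, test: percent-dissolved values at matched time points.
    f2 = 100 for identical profiles; f2 >= 50 conventionally means similar.
    """
    if len(reference) != len(test):
        raise ValueError("profiles must share the same time points")
    n = len(reference)
    mean_sq = sum((r - t) ** 2 for r, t in zip(reference, test)) / n
    return 50.0 * math.log10(100.0 / math.sqrt(1.0 + mean_sq))

# Hypothetical 12 hr profiles sampled every 2 hr (values invented for
# illustration, not taken from the thesis)
ref_profile  = [18, 34, 49, 62, 74, 88]
test_profile = [16, 31, 47, 60, 73, 86]
print(round(f2_similarity(ref_profile, test_profile), 1))
```

A design space built on this criterion keeps every formulation composition whose predicted profile scores f2 ≥ 50 against the reference.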
Global Resource Management of Response Surface Methodology
Miller, Michael Chad (4 March 2014)
Statistical research can be more difficult to plan than other kinds of projects, since the research must adapt as knowledge is gained. This dissertation establishes a formal language and methodology for designing experimental research strategies under limited resources. It is a mathematically rigorous extension of a sequential and adaptive form of statistical research called response surface methodology. It uses sponsor-given information, conditions, and resource constraints to decompose an overall project into individual stages. At each stage, a "parent" decision-maker determines what design of experimentation to run for its stage of research and adapts to the feedback from that research's potential "children", each of whom deals with a different possible state of knowledge resulting from the parent's experimentation. This dissertation extends the real-world rigor of the statistical field of design of experiments to develop a deterministic, adaptive algorithm that produces reproducible, testable, defensible, resource-constrained, multi-stage experimental schedules without expending physical resources.
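The parent/children structure can be made concrete with a toy scheduler. This is not the dissertation's algorithm, only a sketch of the idea that a deterministic plan must fit every branch of follow-up experimentation within a fixed run budget; all stage names and run counts are invented:

```python
from dataclasses import dataclass, field

@dataclass
class Stage:
    name: str
    runs: int                 # experimental runs this stage consumes
    children: list = field(default_factory=list)  # possible knowledge states

def schedule(stage, budget):
    """Depth-first check that every root-to-leaf branch fits the run budget.

    Returns the worst-case total runs over all branches, or raises if any
    branch would exceed the budget -- a toy version of deterministic,
    resource-constrained multi-stage planning.
    """
    remaining = budget - stage.runs
    if remaining < 0:
        raise ValueError(f"stage {stage.name!r} exceeds the remaining budget")
    if not stage.children:
        return stage.runs
    return stage.runs + max(schedule(child, remaining) for child in stage.children)

# A parent screening design whose two possible outcomes lead to different
# follow-up designs (hypothetical numbers):
plan = Stage("screening", 8, [
    Stage("augment-to-CCD", 10),
    Stage("steepest-ascent", 6, [Stage("confirmation", 4)]),
])
print(schedule(plan, 20))  # worst-case runs across both branches
```

Because the tree and budget are fixed up front, the resulting schedule is reproducible and testable before any physical resource is spent.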
A Framework for the Determination of Weak Pareto Frontier Solutions under Probabilistic Constraints
Ran, Hongjun (9 April 2007)
A framework is proposed that combines separately developed multidisciplinary optimization, multi-objective optimization, and joint probability assessment methods in a decoupled way, to solve joint probabilistic constraint, multi-objective, multidisciplinary optimization problems representative of realistic conceptual design problems of design alternative generation and selection. The intent is to find the Weak Pareto Frontier (WPF) solutions, which include additional compromised solutions besides those identified by a conventional Pareto frontier. The framework starts by constructing fast and accurate surrogate models of the different disciplinary analyses. A new hybrid method is formed that combines second-order Response Surface Methodology (RSM) with Support Vector Regression (SVR). The three SVR parameters that must be pre-specified are selected automatically using a modified information criterion based on model fitting error, prediction error, and model complexity; the prediction error is estimated inexpensively with a new method called Random Cross Validation. This modified information criterion is also used to select the best surrogate model for a given problem from among the RSM, SVR, and hybrid methods. A new neighborhood search method based on Monte Carlo simulation is proposed to find valid designs that satisfy the deterministic constraints and are consistent in the coupling variables featured in a multidisciplinary design problem, while decoupling the three loops required by the multidisciplinary, multi-objective, and probabilistic features. Two schemes have been developed: one finds the WPF by generating enough valid design solutions that some WPF solutions are included among them; the other finds the WPF directly from the WPF of each consistent design zone.
The probabilities of satisfying the probabilistic constraints (PCs) are then estimated, and the WPF and corresponding design solutions are found. Various examples demonstrate the feasibility of this framework.
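The distinction between weak and strong Pareto optimality can be illustrated with a small filter over candidate designs. A minimal sketch for a minimisation problem, with invented objective values: a point is kept unless some other point is strictly better in every objective, so the weak frontier admits compromised points that a strong Pareto filter would discard:

```python
def weak_pareto_frontier(points):
    """Keep the weakly Pareto-optimal points of a minimisation problem.

    A point is weakly Pareto optimal when no other point is strictly
    better in *all* objectives simultaneously.
    """
    frontier = []
    for i, p in enumerate(points):
        strictly_dominated = any(
            all(q_k < p_k for q_k, p_k in zip(q, p))
            for j, q in enumerate(points) if j != i
        )
        if not strictly_dominated:
            frontier.append(p)
    return frontier

# Two objectives to minimise. (3, 3) is strictly beaten by (2, 2) in both
# objectives and is dropped; (2, 4) survives because no point beats it in
# *both* objectives strictly -- it is weakly, but not strongly, optimal.
candidates = [(1, 4), (2, 2), (3, 3), (2, 4), (4, 1)]
print(weak_pareto_frontier(candidates))
```

In the framework above, this filtering step would run over the valid, consistent designs found by the Monte Carlo neighborhood search.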
Identification of Tobacco-Related Compounds in Tobacco Products and Human Hair
Rainey, Christina (4 September 2014)
Indiana University-Purdue University Indianapolis (IUPUI)
The analysis of tobacco products and their usage is well researched and has implications in analytical chemistry, forensic science, toxicology, and medicine. As such, analytical methods must be developed to extract compounds of interest from tobacco products and biological specimens in order to determine tobacco exposure.
In 2009, R.J. Reynolds Tobacco Co. released a line of dissolvable tobacco products marketed as a smoking alternative. The dissolvables were extracted and prepared by ultrasonic extraction, derivatization, and headspace solid phase microextraction (SPME), with analysis by gas chromatography-mass spectrometry (GC-MS). The results show that the compounds present include nicotine, flavoring compounds, humectants, and binders. Humectant concentrations vary among tobacco types depending on the intended use. Humectants were quantified in various tobacco types by GC, "splitting" the column flow between a flame ionization detector (FID) and an MS with a microfluidic splitter to take advantage of the selectivity of the MS. The results demonstrated excellent correlation between FID and MS and showed that MS provides a higher level of selectivity and ensures peak purity. Chemometrics was also used to distinguish products by tobacco type.
Hair is a common type of evidence in forensic investigations, and it is often subjected to mitochondrial DNA (mtDNA) analysis. Preliminary data was gathered on potential “lifestyle” markers for smoking status as well as any indications of subject age, gender, or race by investigating the organic “waste” produced during a mtDNA extraction procedure. The normally discarded organic fractions were analyzed by GC-MS and various lipids and fatty acids were detected.
Next, a total vaporization-SPME (TV-SPME) method was theorized, developed, and optimized for the specific determination of nicotine and its metabolite, cotinine. The principle of TV-SPME is to completely vaporize an organic extract, eliminating partitioning between the sample and the headspace and thereby simplifying the thermodynamic equilibrium. Parameters such as sample volume, incubation temperature, and extraction time were optimized to achieve the maximum analyte signal. Response surface methodology (RSM) is a statistical modelling approach that is useful for predicting and determining the optimum values of variables needed to achieve the ideal response; RSM was used to optimize the TV-SPME technique for the analysis of nicotine and cotinine.
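As a sketch of that final optimisation step: once a second-order RSM model has been fitted, the optimum of each factor follows from the fitted coefficients. The coefficients below are invented for illustration (a real model would also carry interaction terms), and the factor names merely echo the parameters listed above:

```python
def stationary_point(b1, b11):
    """Optimum of a one-factor quadratic y = b0 + b1*x + b11*x**2
    in coded (-1..+1) units; for a maximum, b11 must be negative."""
    if b11 == 0:
        raise ValueError("no curvature term, so no interior optimum")
    return -b1 / (2.0 * b11)

# Hypothetical fitted (b1, b11) pairs for the three TV-SPME factors --
# these numbers are invented, not taken from the dissertation:
fitted = {
    "sample_volume":   (0.42, -0.60),
    "incubation_temp": (0.18, -0.25),
    "extraction_time": (-0.10, -0.50),
}
optimum = {name: round(stationary_point(b1, b11), 3)
           for name, (b1, b11) in fitted.items()}
print(optimum)
```

The coded optima are then converted back to physical units (mL, °C, min) to give the settings that maximise the predicted analyte signal.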
Lastly, quantitation of nicotine and cotinine in human hair typically requires large sample sizes and extensive extraction procedures. Hence, a method using small sample sizes and a simple alkaline digestion followed by TV-SPME-GC-MS has been developed. Hair samples were collected from anonymous volunteers and nicotine and cotinine were identified and quantitated in the hair of tobacco users.