551 |
Quantitative analysis and characterization of intracellular gene delivery mechanisms / Quantitative analysis and characterization of cellular gene delivery mechanics / Varga, Csanad M. (Csanad Mathias), 1976- January 2003 (has links)
Thesis (Ph. D. in Bioengineering)--Massachusetts Institute of Technology, Biological Engineering Division, 2003. / Leaf 117 blank. / Includes bibliographical references. / A goal for gene delivery research is to design vectors capable of (a) delivering transgenes to target cells, (b) yielding efficient gene expression, and (c) minimizing any immune, inflammatory, or cytotoxic response. Current research has focused on developing such vehicles using end gene expression as the benchmark. While transgene protein production is the overall objective of successful gene delivery, such qualitative treatment of gene delivery, especially for non-viral vectors, may result in unoptimized vectors and leave potential rate-limiting steps unidentified. Quantitative analysis of the gene delivery pathway is essential for the characterization, comparison, and design of vectors. The complex mechanisms of gene delivery, particularly at the cellular level, contain multiple potentially rate-limiting steps on the path to successful gene expression. Through quantitative methodologies, vector efficacy can be related to molecular characteristics and specific processes within the gene delivery pathway. These potentially rate-limiting steps include, but are not limited to, cell surface association, subcellular trafficking, endosomal escape, nuclear translocation, vector unpackaging, and gene expression. Design of synthetic gene delivery vectors seeks to develop molecular systems mimicking virus-like infection behavior, including cell membrane attachment and rapid internalization followed by endosomal escape, nuclear localization, and finally gene expression. To explore such opportunities for vector optimization and design, a model human hepatocellular carcinoma cell line was transfected by sets of transfection agents complexed with a plasmid. Time courses of plasmid numbers were determined both from whole cells and from isolated nuclei by real-time quantitative PCR. / (cont.)
This enabled determination of values for parameters characterizing the key intracellular trafficking processes for a validated mass-action kinetic model of cellular gene delivery, in concert with model parameter values obtained from literature data. Quantitative parameter sensitivity analyses were performed for the individual gene delivery vectors, permitting elucidation of the particular rate-limiting processes specific to each vector. The resulting model predictions were then extended to test the effect of increased delivery by polyethylenimine-based gene delivery vectors, and thus model utility. Additionally, viral vector performance was measured, providing insight into the extreme efficiency of such vectors. No single process was found to be rate-limiting for all vectors, nor was the rate-limiting process necessarily the kinetically slowest one. Thus, a single design factor is unlikely to improve all types of vectors; rather, each vector must be improved with respect to its own specific rate-limiting process(es), and improvements in vehicle design may best arise from quantitative analysis of the contributions of each process to the integrated system operation. / by Csanad M. Varga. / Ph. D. in Bioengineering
|
552 |
Mitigating container security risk using real-time monitoring with active Radio Frequency Identification and sensors / Schlesinger, Adam Ian January 2005 (has links)
Thesis (M. Eng. in Logistics)--Massachusetts Institute of Technology, Engineering Systems Division, 2005. / Includes bibliographical references (leaves 57-60). / The global village in which we live enables increased trade and commerce across regions, but also brings a complicated new set of challenges, such as terrorist activity, human and drug smuggling, and theft in foreign or domestic locations. Containers travel the globe, across all seven continents. In the wake of intensified security concerns since the September 11, 2001 attacks, tracking containers and their contents presents an increasing concern for those institutions and personnel charged with ensuring their security. This thesis analyzes the risks associated with global container transport. The concept of an e-container is set forth as a risk mitigation technology that uses real-time monitoring of a container's physical status acquired from an array of embedded RFID-enabled sensors. A framework is suggested that relates sensor-identified signatures and phenomena to behaviors representing breaches in container security. A theoretical model suggests which sensors are required to identify the individual breaches in order to mitigate container security risk. / by Adam Ian Schlesinger. / M. Eng. in Logistics
|
553 |
The end of core : should disruptive innovation in telecommunication invoke discontinuous regulation? / Should disruptive innovation in telecommunication invoke discontinuous regulation? / Vaishnav, Chintan January 2010 (has links)
Thesis (Ph. D.)--Massachusetts Institute of Technology, Engineering Systems Division, 2010. / This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. / Cataloged from student submitted PDF version of thesis. Page 247 blank. / Includes bibliographical references (p. 239-246). / This research analyzes how a telecommunications regulator can balance regulation with innovation, at a reasonable cost. This question has gained critical importance for telecom regulators as unregulated Internet technologies, such as voice and video over the Internet, disrupt regulated traditional technologies, such as telephony and television, along with the regulator's historical paradigm. The existing U.S. telecommunications regulations were created in the integral age. In that paradigm, functional components that constitute a service compliant with regulation resided inside the network core; each operator was vertically integrated and controlled the total functionality necessary to deliver a service; a few such operators controlled the industry; they faced low competition and were under limited pressure to adopt innovation; and consumers had limited choice. The Internet has introduced a polar opposite paradigm: the modular age. In this paradigm, functional components that constitute a service are dispersed across the network core and edges; each firm controls only a subset of the total functionality necessary to constitute a service; many modular firms interoperate to deliver a service; firms compete fiercely and are under great pressure to innovate; and consumers enjoy far greater choice due to multi-modal competition among multiple technologies. Although the transition from an integral to a modular age dramatically flips the environment, the regulatory response to this dramatic shift has been hesitant to move beyond its old intellectual roots.
Consequently, this thesis describes and analyzes the new telecommunications paradigm and explores its implications for an appropriate regulatory paradigm. The research uses the regulation of voice communications in the United States as a representative case. We analyze the new telecommunications paradigm as a dynamic complex system. Our research approach rests upon four principles of systems: two organizational principles (hierarchy and feedback) and two behavioral principles (emergent behavior and strategic and statistical behavior). The telecommunications system is viewed as one of the many subsystems that together fulfill the objectives of a society. The dynamics of the telecommunications system itself are conceptualized as those resulting from the interactions of four subsystems: regulatory dynamics, corporate strategy dynamics, consumer dynamics, and technology dynamics. The regulatory objectives to be fulfilled are conceived as an emergent property of such a system of systems. To carry out this research, we have developed a system-level dynamic feedback model and two case studies. As modular entrants of Internet-based technology disrupt integrated incumbents of traditional technology, bewildering dynamic complexity complicates decision-making by policymakers, managers, consumers, and technologists alike. Our model makes understandable the emergent behavior amidst the uncertainty that surrounds such a disruption phenomenon. The model formulations are behavioral. They are derived from the existing theories of technology and industry disruption, where possible. Where those theories have a gap, the decision processes of stakeholders, gleaned from unstructured interviews, are formalized mathematically as the basis for the model formulations. The resulting structure is a fully endogenous systems model of regulation, competition, and innovation in telecommunications. In the first case study we analyze the regulatory environment of the pre- vs.
post-Internet periods, both quantitatively and qualitatively. For the analysis, public comments in response to the Telecommunications Act of 1996 Notice for Proposed Rulemaking (NPRM) are compared with those in response to the IP-Enabled Services NPRM published in 2004. The analysis demonstrates how the differences between the integral and modular ages are reflected in the regulatory record. The second case study analyzes how market, technology, organizational, and regulatory uncertainties affect technology and industry disruption. For this case, we use a combination of industrial statistics and content analysis of media publications. The analysis demonstrates the limits to technology and industry disruption. The case studies complement the model in two ways: first, they facilitate further refinement of the systems model; second, they empirically validate the arguments deduced from model analysis. Through this research we answer three questions: (1) Can the regulatory structure designed in an integral age, in its objectives, obligations (requirements), and enforcement mechanisms, work for a modular age? (2) How can regulators and managers improve decision making amidst the uncertainty surrounding the disruption of an integrated technology and industry by a modular one? (3) What is the new role of the telecommunications regulator, and how can it be fulfilled in the modular age of the Internet? Our analysis shows that the current regulatory structure is inadequate for responding to the challenges the modular age poses. Firstly, the current objectives are appropriate but cannot be met unless regulators move beyond merely efficiency-centered thinking and begin to address objectives at the societal level. Secondly, the current obligations may attain short-term goals, but have undesirable long-term consequences. Devising obligations that are appropriate in the long term requires regulators to discontinue myopic measures such as incremental regulation of new technologies.
Finally, the current enforcement mechanisms are blunted by the dynamic complexity of the modular age. Enforcing regulations effectively in the modular age necessitates adding to the regulatory quiver new mechanisms that are more versatile than the merely adversarial command-and-control mechanisms. Through model analysis, we demonstrate how a lack of understanding of the various uncertainties, and misperceptions of feedback in a complex system where regulators, firms, consumers, and technologists constantly interact, could lead to decisions that are costly for regulators as well as managers. Yet, as we demonstrate, with a better grasp of the dynamic complexity involved, they can significantly improve decision-making to meet the challenges of the modular age. We argue that the most critical role for the telecommunications regulator in the new telecommunications paradigm is to sustain a balance between regulation and innovation, at a reasonable cost. Achieving such a balance in a modular structure is not trivial because of several natural tendencies. First, achieving high compliance at low cost is difficult because in highly modular architectures and industries, coordination costs, such as the time to build consensus, can be inordinately large. Second, keeping the innovation level high is difficult because it requires fighting the natural tendency of modular firms to gain and abuse market power. We propose a combination of two policy levers, Limiting Significant Market Power (SMP) Accumulation and Building Broad-based Consensus around Regulatory Issues, that most effectively achieves the desired balance and remains inadequately explored in the United States.
We contend that implementing these policy levers will require, first, a more broadly construed antitrust regulation in the United States that will ensure higher modularity, and, second, a telecommunications regulatory agency that is empowered and organized to pursue objectives at the societal level and to build broad-based consensus among divergent interests in a highly modular structure. / by Chintan Vaishnav. / Ph.D.
|
554 |
Enzymatic and analytical tools for the characterization of chondroitin sulfate and dermatan sulfate glycosaminoglycans / Pojasek, Kevin R. (Kevin Robert), 1976- January 2003 (has links)
Thesis (Ph. D. in Applied Biosciences and Biotechnology)--Massachusetts Institute of Technology, Biological Engineering Division, 2003. / Includes bibliographical references (p. 113-123). / Glycosaminoglycans (GAGs) are complex polysaccharides that reside in the extracellular matrix and on the surfaces of all cells. The same complexity that contributes to the diversity of GAG function has also hindered their chemical characterization. Recent progress in coupling bacterial GAG-degrading enzymes with sensitive analytical techniques has led to a revolution in understanding the structure-function relationship for an important subset of GAGs, namely heparin/heparan sulfate-like glycosaminoglycans (HSGAGs). The study of chondroitin sulfate and dermatan sulfate (CS/DS), an equally important subset of GAGs, has lagged behind, partially due to a lack of enzymatic and analytical tools akin to those developed for HSGAGs. The Flavobacterial heparinases have proven indispensable in characterizing the fine structure of HSGAGs responsible for their different biological functions. As a continuation of ongoing research, a combination of chemical modification, peptide mapping, and site-directed mutagenesis was employed to explore the role of histidine in the activity of heparinase III. Of the thirteen histidines in the enzyme, His295 and His510 were found to be critical for the degradation of heparan sulfate by heparinase III. As a first step to developing the chondroitinases as enzymatic tools for the characterization of CS/DS oligosaccharides, recombinant expression and purification schemes were developed for chondroitinases AC and B from Flavobacterium heparinum. The recombinant enzymes were characterized using biochemical techniques, and kinetic parameters were determined for their respective CS/DS substrates. / (cont.)
By combining modeling of a tetrasaccharide substrate into the active site of chondroitinase B with site-directed mutagenesis studies, a variety of residues were identified as critical for substrate binding and catalysis. A subsequent co-crystal structure of chondroitinase B with a DS-derived hexasaccharide revealed a catalytic role for a calcium ion and provided further clarity into the role of individual active site amino acids. Additionally, using a variety of defined DS-derived oligosaccharides coupled with sensitive analytical techniques, chondroitinase B was identified as an endolytic, non-random, non-processive enzyme that preferentially cleaves longer oligosaccharides compared to shorter ones. Taken together, these studies represent a critical step in developing the chondroitinases as enzymatic tools for the characterization of CS/DS oligosaccharides in a fashion akin to the use of the heparinases to characterize HSGAGs. / by Kevin R. Pojasek. / Ph. D. in Applied Biosciences and Biotechnology
|
555 |
Coastal communities and climate change : a dynamic model of risk perception, storms, and adaptation / Franck, Travis Read January 2009 (has links)
Thesis (Ph. D.)--Massachusetts Institute of Technology, Engineering Systems Division, 2009. / This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. / Cataloged from student submitted PDF version of thesis. / Includes bibliographical references (p. 303-311). / Climate change impacts, including sea-level rise and changes in tropical storm frequency and intensity, will pose significant challenges to city planners and coastal zone managers trying to make wise investment and protection decisions. Meanwhile, policymakers are working to mitigate impacts by regulating greenhouse gas emissions. To design effective policies, policymakers need more accurate information than is currently available to understand how coastal communities will be affected by climate change. My research aims to improve coastal impact and adaptation assessments, which inform climate and adaptation policies. I relax previous assumptions of probabilistic annual storm damage and rational economic expectations, variables in previous studies that are suspect given the stochastic nature of storm events and the real-world behavior of people. I develop a dynamic stochastic adaptation model that includes explicit storm events and boundedly rational storm perception. I also include endogenous economic growth, population growth, public adaptation measures, and relative sea-level rise. The frequency and intensity of stochastic storm events can change a region's long-term economic growth pattern and introduce the possibility of community decline. Previous studies using probabilistic annual storm damage are unable to show this result. Additionally, I consider three decision makers (coastal managers, infrastructure investors, and residents) who differ regarding their perception of storm risk. The decision makers' perception of risk varies depending on their rationality assumptions. / (cont.)
Boundedly rational investors and residents perceive storm risk to be higher immediately after a storm event, which can drive down investment, decrease economic growth, and increase economic recovery time, showing that previous studies provide overly optimistic economic predictions. Rationality assumptions are shown to change economic growth and recovery time estimates. Including stochastic storms and variable rationality assumptions will improve adaptation research and, therefore, coastal adaptation and climate change policies. / by Travis Read Franck. / Ph.D.
|
556 |
Quantitative performance-based evaluation of a procedure for flexible design concept generation / Cardin, Michel-Alexandre, 1979- January 2011 (has links)
Thesis (Ph. D.)--Massachusetts Institute of Technology, Engineering Systems Division, 2011. / Cataloged from PDF version of thesis. / Includes bibliographical references (p. 152-163). / This thesis presents an experimental methodology for objective and quantitative design procedure evaluation based on anticipated lifecycle performance of design concepts, and a procedure for flexible design concept generation. The methodology complements existing evaluation methodologies by measuring anticipated performance via efficient computer modeling techniques. The procedure, in contrast to others, stimulates flexible design concept generation by combining a short lecture on flexibility with a prompting ideation mechanism. Controlled collaborative experiments had participants suggest alternative solutions to a design problem under different treatment conditions. Experimental conditions used the procedure for flexibility, while control conditions relied on prior training in science and engineering only, and free undirected ideation. Measures included the quantity of flexible design concepts generated, anticipated economic performance improvements compared to a benchmark design, participants' subjective impressions of satisfaction with the process and results, and results quality assessments. Seventy-one designers divided among twenty-six teams performed the experiments, which involved a simplified real estate infrastructure design problem. Application of the methodology demonstrated effective and efficient evaluation of the design procedure based on anticipated performance of design concepts. The lecture and prompting mechanism significantly improved anticipated performance compared to the benchmark design, by nearly thirty-six percent. The prompting mechanism significantly improved generation of valuable flexible design concepts. The lecture significantly improved user satisfaction with the process and results, as well as results quality assessments.
Even though prompting demonstrably improved anticipated performance and concept generation, it had no effect on participants' satisfaction with the process and results unless combined with the lecture. Also, prompting did not lead participants to expect better results quality. This demonstrates the need for thorough and rigorous procedure evaluations based on both subjective user impressions and objective quantitative measurements. A preliminary analysis suggests that the proposed experimental platform can be used to study the influence of uncertainty- and flexibility-related words on discussion content, although more work is necessary to fully validate the approach. / by Michel-Alexandre Cardin. / Ph.D.
|
557 |
System design and the cost of architectural complexity / Sturtevant, Daniel Joseph January 2013 (has links)
Thesis (Ph. D.)--Massachusetts Institute of Technology, Engineering Systems Division, 2013. / Cataloged from PDF version of thesis. / Includes bibliographical references (p. 159-166). / Many modern systems are so large that no one truly understands how they work. It is well known in the engineering community that architectural patterns (including hierarchies, modules, and abstraction layers) should be used in design because they play an important role in controlling complexity. These patterns make a system easier to evolve and keep its separate portions within the bounds of human understanding so that distributed teams can operate independently while jointly fashioning a coherent whole. This study set out to measure the link between architectural complexity (the complexity that arises within a system due to a lack or breakdown of hierarchy or modularity) and a variety of costs incurred by a development organization. A study was conducted within a successful software firm. Measures of architectural complexity were taken from eight versions of their product using techniques recently developed by MacCormack, Baldwin, and Rusnak. Significant cost drivers including defect density, developer productivity, and staff turnover were measured as well. The link between cost and complexity was explored using a variety of statistical techniques. Within this research setting, we found that differences in architectural complexity could account for 50% drops in productivity, three-fold increases in defect density, and order-of-magnitude increases in staff turnover. Using the techniques developed in this thesis, it should be possible for firms to estimate the financial cost of their complexity by assigning a monetary value to the decreased productivity, increased defect density, and increased turnover it causes. As a result, it should be possible for firms to more accurately estimate the potential dollar-value of refactoring efforts aimed at improving architecture. / by Daniel J. Sturtevant. / Ph.D.
|
558 |
How should indicators be found for scenario monitoring? / He, Zheng, M. Eng. Massachusetts Institute of Technology January 2013 (has links)
Thesis (M. Eng. in Logistics)--Massachusetts Institute of Technology, Engineering Systems Division, 2013. / Cataloged from PDF version of thesis. / Includes bibliographical references (p. 76-78). / Scenario planning is a widely used approach for developing long-term strategies. The typical scenario process involves developing scenarios, identifying strategies whose success is contingent on the scenario, and monitoring the environment regularly to know which scenario(s) may become more likely. Hence it becomes necessary to find a way to monitor the business environment in order to inform the process of making strategic decisions under uncertainty. This thesis proposes to use a set of nested indicators to monitor the environment. The approach consists of a seven-step process to build composite indicators and link them with scenarios. Individual indicators are selected based on intuitive theoretical frameworks. Different weights are assigned to the individual indicators using factor analysis. Composite indicators are then built by linear aggregation of the individual indicators. The composite indicators are used to assess the changes in the driving forces over time. Such changes serve as the basis for judging whether the level of the driving forces is high or low. Those levels are then used to infer which scenario is likely to come to pass. This thesis used a set of four scenarios to illustrate the application of the approach. Those scenarios were built for a chemical company's supply chain in the Asia/Pacific region in 2025. The result suggested that the environment of the sub-region in the monitoring year was more like a "Collaborative World" or a mix of "Collaborative World" and "Demanding World", and that the environment was more likely evolving into those two scenarios than into the others. / by Zheng He. / M. Eng. in Logistics
|
559 |
A decision-support model for managing the fuel inventory of a Panamanian generating company / Perez-Franco, Roberto, 1976- January 2004 (has links)
Thesis (M. Eng. in Logistics)--Massachusetts Institute of Technology, Engineering Systems Division, 2004. / Includes bibliographical references (leaf 89). / Bahia Las Minas Corp (BLM) is a fuel-powered generating company in the Panamanian power system. The purpose of this thesis is to design and evaluate a decision-support model for managing the fuel inventory of this company. First, we research BLM and its fuel replenishment methods. Then we define the problem, its objective function, assumptions, parameters, and constraints. After identifying the most important given information (fuel price forecast, demand forecast, and current inventory levels), we define the equations that relate these inputs with the order sizes, and the availability and reserve constraints. Due to the large number of constraints, we devise a mechanism to calculate lower limits for the aggregate order sizes that prevent violations of the constraints beyond user-defined limits. We prepare a model in Excel for use with a single fuel type. This model takes stochastic forecasts of demand and fuel prices, and determines the best size for the weekly fuel order. After testing the model under several different scenarios, we conclude that it responds correctly to changes in price and demand. The complete discussion of these results can be found in the body of the thesis. Finally, we present some recommendations for BLM, both in relation to this replenishment problem and to its supply chain in general. / by Roberto Perez-Franco. / M. Eng. in Logistics
|
560 |
Progression of chondrocyte signaling responses to mechanical stimulation in 3-D gel culture / Chai, Diana H January 2008 (has links)
Thesis (Ph. D.)--Massachusetts Institute of Technology, Biological Engineering Division, 2008. / This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. / Includes bibliographical references (leaves 148-156). / Mechanical stimulation of 3-D chondrocyte cultures increases extracellular matrix (ECM) production and mechanical stiffness in regenerating cartilage. The goal of this study was to examine the progression of chondrocyte signaling responses to mechanical stimulation in 3-D culture during tissue regeneration. To investigate the role of integrins in chondrocyte mechanotransduction, function-blocking antibodies and small-molecule antagonists were used to disrupt integrin-matrix interactions during dynamic compression of chondrocytes in 3-D agarose culture. At early days in culture, blocking αvβ3 integrin abolished dynamic compression stimulation of proteoglycan synthesis, independent of effects in free-swell culture, while blocking α5β1 integrins abolished the effect of compression only when blocking in free-swell increased proteoglycan synthesis. This suggests that disrupting αvβ3 and α5β1 interactions with the ECM influences proteoglycan synthesis through distinct pathways, and that αvβ3 more directly influences the mechanical response. To further distinguish individual mechanotransduction pathways, we investigated the temporal gene transcription response of chondrocytes to ramp-and-hold compression on Days 1, 10, and 28 in 3-D agarose culture. Clustered and individual gene expression profiles changed temporally and in magnitude over time in culture. Day 1 cultures differed from Days 10 and 28, reflecting changes in the cell microenvironment with development of pericellular and extracellular matrices. Comparisons with the response of intact tissue to compression suggested similar regulatory mechanisms.
We further investigated MAP kinase (ERK1/2, p38, JNK) and Akt activation on Days 1 and 28 in agarose culture through phosphorylation state-specific Western blotting. / (cont.) Compression induced transient ERK1/2 phosphorylation on both days, with Day 28 levels similar to intact tissue. Unlike in tissue, only slight transient p38 phosphorylation was observed on Day 28, and SEK phosphorylation was undetected. Akt was uniquely regulated in intact cartilage compared to the MAPKs, with decreased total Akt levels over time under static compression. In contrast, compression transiently decreased pAkt levels in agarose cultures, with no changes in total Akt. Changes in the chondrocyte responses to compression with time in agarose culture suggest that cells sense different forces and respond differently with time; further studies may help optimize mechanical loading for tissue-engineering purposes. These studies provide a basis for further examination of mechanotransduction in cartilage. / by Diana H. Chai. / Ph.D.
|