381 |
Modeling of the dispensing-based tissue scaffold fabrication processes. Li, Minggan, 11 August 2010.
Tissue engineering is an emerging area that aims to create artificial tissues or organs by employing methods from biology, engineering, and materials science. In tissue engineering, scaffolds are three-dimensional (3D) structures made from biomaterials with highly interconnected pore networks or microstructures, and they provide the mechanical and biological cues that guide cell differentiation toward the formation of the desired three-dimensional tissues or functional organs. Hence, the tissue scaffold plays a critical role in tissue engineering. However, fabricating such scaffolds has proven to be a challenging task. One important barrier is the inability to fabricate scaffolds with the designed pore size and porosity needed to mimic the microstructure of native tissue. Another issue is the prediction of process-induced cell damage in cell-involved scaffold fabrication processes. By addressing these key issues, this research work aims to develop methods and models to represent the dispensing-based solid freeform scaffold fabrication process with and without the presence of living cells.
The microstructure of scaffolds, characterized by pore size and porosity, has been shown to significantly affect the biological and mechanical properties of the formed tissues. As such, the ability to predict and control scaffold pore size and porosity during the fabrication process is of great importance. In the first part of this research, the flow behaviours of the scaffold materials were investigated and a model of the flow rate of material dispensed during scaffold fabrication was developed. On this basis, a mathematical model was developed to represent the pore size and porosity of the fabricated scaffolds. Scaffold fabrication experiments using colloidal gels with different hydroxylapatite volume fractions were carried out, and the results agreed with the model simulations, indicating the effectiveness of the developed models. The availability of these models makes it possible to control the scaffold fabrication process rigorously, instead of relying on the trial-and-error procedures reported previously.
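As an illustration of the two ingredients described above, the following Python sketch combines the standard power-law (Ostwald-de Waele) capillary flow expression for a cylindrical dispensing needle with a simple geometric estimate of strand diameter, pore size, and porosity for an idealized 0/90 degree laydown pattern. These are textbook approximations with placeholder parameter values; they are not the thesis's actual calibrated models.

# Sketch: power-law capillary flow rate and idealized 0/90 scaffold geometry.
# Parameter values are illustrative placeholders, not the thesis's calibrated model.
import math

def flow_rate_power_law(dP, R, L, K, n):
    """Volumetric flow rate (m^3/s) of a power-law fluid (tau = K * gamma_dot^n)
    driven through a cylindrical needle of radius R and length L by pressure dP."""
    wall_shear_stress = dP * R / (2.0 * L)
    return (math.pi * n * R**3 / (3.0 * n + 1.0)) * (wall_shear_stress / K) ** (1.0 / n)

def scaffold_geometry(Q, carriage_speed, strand_spacing, layer_height):
    """Idealized 0/90 laydown: deposited strand cross-section area = Q / carriage speed."""
    strand_area = Q / carriage_speed                      # m^2
    strand_diameter = 2.0 * math.sqrt(strand_area / math.pi)
    pore_size = strand_spacing - strand_diameter          # in-plane pore width
    porosity = 1.0 - strand_area / (strand_spacing * layer_height)
    return strand_diameter, pore_size, porosity

Q = flow_rate_power_law(dP=2.0e5, R=2.0e-4, L=2.5e-2, K=100.0, n=0.5)
print(scaffold_geometry(Q, carriage_speed=5e-3, strand_spacing=8e-4, layer_height=2.5e-4))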
In the scaffold fabrication process with living cells present, cells are continuously subjected to mechanical forces. If the forces exceed a certain level and/or are applied beyond certain time periods, cell damage may result. In the second part of this research, a method to quantify cell damage in the bio-dispensing process is developed. The method consists of two steps: the first is to establish cell damage models, or laws, relating cell damage to the hydrostatic pressure and shear stress applied to the cells; the second is to represent the process-induced forces that cells experience during the bio-dispensing process and then apply the established cell damage law to model the percentage of cell damage in the process. Based on the developed method, the percentages of cell damage in scaffold fabrication processes employing two types of dispensing needles, i.e., tapered and cylindrical needles, were investigated and compared. The difference in cell damage under high and low shear stress conditions was also investigated, and a method was developed to establish the cell damage law directly from the bio-dispensing process. To validate these methods and models, experiments fabricating scaffolds incorporating Schwann cells or 3T3 fibroblasts were carried out, and the measured percentages of cell damage were compared with the simulation results. The validated models allow one to determine the influence of process parameters, such as the applied air pressure and the needle geometry, on cell damage, and then to optimize these parameters to preserve cell viability and/or achieve the desired cell distribution within the scaffolds.
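To make the two-step idea concrete, here is a minimal sketch that (i) evaluates a generic power-law damage relation of the form damage = C * tau^a * t^b, a common empirical form for shear-induced cell damage, and (ii) feeds it with the wall shear stress and mean residence time in a cylindrical dispensing needle. The constants C, a, b and all process parameters are placeholders; this is not the cell damage law or the operating conditions established in the thesis.

# Sketch: percent cell damage from needle wall shear stress and exposure time,
# using a generic power-law damage law. All constants are placeholders.
import math

def percent_damage(tau, t, C=1.0e-4, a=1.5, b=0.5):
    """Empirical power-law damage law: damage (%) = C * tau^a * t^b, capped at 100 %."""
    return min(100.0, C * tau**a * t**b)

def needle_conditions(dP, R, L, Q):
    """Wall shear stress (Pa) and mean residence time (s) in a cylindrical needle."""
    tau_wall = dP * R / (2.0 * L)            # force balance on the needle bore
    mean_velocity = Q / (math.pi * R**2)
    residence_time = L / mean_velocity
    return tau_wall, residence_time

tau, t = needle_conditions(dP=2.0e5, R=2.0e-4, L=2.5e-2, Q=3.0e-10)
print("tau = %.0f Pa, t = %.2f s, damage = %.1f %%" % (tau, t, percent_damage(tau, t)))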
|
382 |
A scripting interface for doubly linked face list based polygonal meshes. Tett, Stuart Tosten, 15 May 2009.
This thesis presents a scripting language interface for modeling manifold meshes represented by a Doubly Linked Face List (DLFL). With a scripting language, users can create procedurally generated meshes that would otherwise be tedious or impractical to create with a graphical user interface. I have implemented a scripting language interface that lets the user write stand-alone scripts as well as script interactively within a graphical environment.
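To illustrate the kind of procedural mesh generation that motivates a scripting interface, here is a short Python sketch that builds a quad-mesh torus as plain vertex and face lists. It deliberately does not use the thesis's DLFL scripting API (whose function names are not reproduced here); it only shows why a loop in a script beats placing faces by hand in a GUI.

# Sketch: procedurally generate a quad-mesh torus as vertex/face lists.
# Generic Python, not the thesis's DLFL scripting API.
import math

def torus(R=2.0, r=0.7, nu=24, nv=12):
    vertices, faces = [], []
    for i in range(nu):
        theta = 2.0 * math.pi * i / nu
        for j in range(nv):
            phi = 2.0 * math.pi * j / nv
            vertices.append(((R + r * math.cos(phi)) * math.cos(theta),
                             (R + r * math.cos(phi)) * math.sin(theta),
                             r * math.sin(phi)))
    for i in range(nu):
        for j in range(nv):
            a = i * nv + j
            b = ((i + 1) % nu) * nv + j
            c = ((i + 1) % nu) * nv + (j + 1) % nv
            d = i * nv + (j + 1) % nv
            faces.append((a, b, c, d))          # one quad per (i, j) cell
    return vertices, faces

verts, faces = torus()
print(len(verts), "vertices,", len(faces), "faces")   # 288 vertices, 288 faces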
|
383 |
Loss modeling for pricing catastrophic bonds. Sircar, Jyotirmoy, 15 May 2009.
It is important to be able to quantify potential seismic damage to structures and to communicate risk in a comprehensible way to all stakeholders. The risks associated with damage to constructed facilities from catastrophic disasters can be hedged using financial instruments such as Catastrophic (CAT) bonds. This work uses the loss ratio (Lr), defined as the ratio of the repair cost to the total replacement cost, to represent structural and non-structural damage caused by earthquakes.
A loss estimation framework is presented that directly relates seismic hazard to seismic response, to damage, and hence to losses. A key feature of the loss estimation approach is the determination of losses without the need for fragility curves. A Performance-Based Earthquake Engineering (PBEE) approach to assessing the seismic vulnerability of structures, relating an intensity measure (IM) to its associated engineering demand parameter (EDP), is used to define the demand model. An empirically calibrated tripartite loss model, in the form of a power curve with upper and lower cut-offs, is developed and used in conjunction with the demand model to estimate loss ratios. The loss model is calibrated and validated for different types of bridges and buildings. The loss ratios for the various damage states account for epistemic uncertainty as well as the price surge that follows a major hazardous event. The loss model is then transformed into a composite seismic hazard-loss relationship, which is used to estimate financial losses from expected structural losses.
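A minimal sketch of a tripartite power-curve loss model with lower and upper cut-offs is given below; the coefficients, cut-off values, and demand values are illustrative placeholders, not the empirically calibrated values from this work.

# Sketch: tripartite loss model -- power curve in the engineering demand
# parameter (EDP) with lower and upper cut-offs. Coefficients are placeholders.

def loss_ratio(edp, a=0.08, b=1.6, lr_min=0.01, lr_max=1.0):
    """Loss ratio Lr = repair cost / replacement cost, clipped to [lr_min, lr_max]."""
    lr = a * edp**b
    return max(lr_min, min(lr_max, lr))

# Example: loss ratios over a range of demand values (arbitrary units)
for edp in (0.1, 0.5, 1.0, 2.0, 5.0):
    print("EDP = %.1f  ->  Lr = %.3f" % (edp, loss_ratio(edp)))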
The seismic hazard-loss model is then used to assess the expected spread, that is, the interest-rate premium above the risk-free (prime) rate, required to price two types of CAT bonds: indemnity CAT bonds and parametric CAT bonds. It is concluded that CAT bonds can play a major role in hedging the financial risk associated with damage to a civil engineering facility as a result of a catastrophe. However, a potential investor seeks a high degree of confidence when investing in CAT bonds, because there is large uncertainty surrounding the probability of occurrence of an event.
|
384 |
Analysis of Precipitation Using Satellite Observations and Comparisons with Global Climate Models. Murthi, Aditya, May 2010.
In this study, the space-time relationship of precipitation fields is examined by testing Taylor's "frozen field" hypothesis (TH). Specifically, the hypothesis supposes that if a spatio-temporal precipitation field with a covariance Cov(r, tau) that is stationary in both space r and time tau moves with a constant velocity v, then the temporal covariance at time lag tau equals the spatial covariance at space lag v tau, that is, Cov(0, tau) = Cov(v tau, 0). Of specific interest is whether there is a cut-off, or decorrelation, time scale below which the TH holds for a given mean flow velocity v. The validity of the TH is tested for precipitation fields using high-resolution gridded NEXRAD radar reflectivity data over the southeastern United States, employing two different statistical approaches. The first method is based on rigorous hypothesis testing, while the second is based on a simple correlation analysis that neglects possible dependencies among the correlation estimates. The data set has an approximate horizontal resolution of 4 km x 4 km and a temporal resolution of 15 minutes, and the period of study is 4 days. The results of both statistical methods suggest that the TH might hold for the shortest space and time scales resolved by the data (4 km and 15 minutes), but that it does not hold for longer periods or larger spatial scales.
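The following Python sketch illustrates the covariance identity being tested, using a synthetic smooth field that is advected essentially unchanged (a nearly "frozen" field): the temporal correlation at lag tau then closely matches the spatial correlation at shift v tau. This is only a toy demonstration of the identity on synthetic data, not the hypothesis-testing procedure applied to the NEXRAD data.

# Sketch: Taylor's frozen-field identity Cov(0, tau) = Cov(v*tau, 0) on a
# synthetic field advected with constant velocity v (grid cells per time step).
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
base = gaussian_filter(rng.standard_normal((128, 128)), sigma=6)   # smooth "rain" field
v = 3                                                              # cells per time step
frames = [np.roll(base, v * t, axis=1)
          + 0.05 * gaussian_filter(rng.standard_normal((128, 128)), sigma=6)
          for t in range(8)]                                       # frozen advection + weak noise

def corr(x, y):
    return np.corrcoef(x.ravel(), y.ravel())[0, 1]

for lag in (1, 2, 4):
    temporal = corr(frames[0], frames[lag])                         # correlation at time lag tau
    spatial = corr(frames[0], np.roll(frames[0], v * lag, axis=1))  # correlation at space lag v*tau
    print("lag %d: temporal r = %.3f, spatial r = %.3f" % (lag, temporal, spatial))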
The fidelity of global climate models in accurately simulating seasonal mean precipitation in the tropics is investigated through comparisons with satellite observations. Specifically, six-year-long (2000-2005) simulations are performed using a high-resolution (36-km) Weather Research and Forecasting (WRF) model and the Community Atmosphere Model (CAM) at T85 resolution, and the results are compared with satellite observations from the Tropical Rainfall Measuring Mission (TRMM). The primary goal is to study the annual cycle of rainfall over four land regions of the tropics, namely the Indian monsoon region, the Amazon, tropical Africa, and the North American monsoon region. The results indicate that the WRF model systematically underestimates the magnitude of monthly mean rainfall over most tropical land regions but captures the seasonal timing correctly. CAM, on the other hand, produces rainfall magnitudes closer to the observations, but its rainfall peak leads or lags the observations by a month or two. Some of these regional biases can be attributed to erroneous circulation and to moisture surpluses or deficits in the lower troposphere in both models. Overall, the results indicate that employing a higher spatial resolution (36 km) does not significantly improve the simulation of precipitation. We speculate that a combination of several physics parameterizations and a lack of model tuning gives rise to the observed differences between the models and the observations.
|
385 |
Development and Investigation of Synthetic Skin Simulant Platform (3SP) in Friction Blister Applications. Guerra, Carlos, December 2010.
Skin is the largest organ of the human body. It is the first line of defense between the body's vulnerable organs and tissues and the environment. Healthy skin is paramount to avoiding infection and disease; therefore, any breach in the skin represents a significant risk to the health and comfort of its owner. Friction blisters are one of the most common modes of damage to human skin. In some extreme cases, such as for those who suffer from Epidermolysis Bullosa, friction blisters are a very common and painful occurrence.
Prior research on blister formation has been performed mostly at an observational level. In some cases, blisters have been deliberately created on human volunteers or animal test subjects. However, these studies are very difficult to reproduce because of the legal issues surrounding human and animal testing and the fact that no two people will have the same response to an external stimulus. Other studies have followed athletes or soldiers using different textile fabrics for socks or clothing to determine which have significant effects.
Concurrent studies have focused on mimicking human skin for haptics research in product development. These have made great strides in introducing engineering properties such as coefficient of friction (COF) and elastic modulus into the field of skin study. While these studies are very useful to understanding the properties and mechanisms of human skin in rubbing applications, their primary audience is the cosmetics industry or product developers.
There is a significant opportunity to take a similar approach of applying an engineering viewpoint to repeatably model the onset and formation of blisters on human skin. The authors have developed the Synthetic Skin Simulant Platform (3SP) to fulfill this role. The 3SP is a three-layer composite of elastomeric materials that outputs a visually recognizable blister upon sufficiently strong shear loading.
Through two factorial experiments conducted on a custom wear-testing table, the authors determined which variables were most significant to blister formation in the 3SP. The results showed that COF and dermal stiffness are the primary contributors. This agrees with prior literature on the significance of COF, and it suggests that dermal stiffness is a significant factor that merits examination in future blister research.
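As an illustration of how main effects are extracted from such a two-level factorial experiment, the sketch below computes the main effects and the interaction for two factors (COF and dermal stiffness) in a 2x2 design. The response values are invented placeholders, not measurements from the wear-testing table.

# Sketch: main effects and interaction in a 2^2 factorial experiment.
# Factors: A = coefficient of friction, B = dermal stiffness (low/high coded -1/+1).
# Response values (e.g., a blister severity score) are invented placeholders.

runs = {  # (A, B) -> mean response over replicates
    (-1, -1): 1.2,
    (+1, -1): 3.4,
    (-1, +1): 2.0,
    (+1, +1): 5.1,
}

def effect(weights):
    """Average response difference between the +1 and -1 halves of a contrast."""
    return sum(w * runs[k] for k, w in weights.items()) / 2.0

main_A = effect({k: k[0] for k in runs})          # COF main effect
main_B = effect({k: k[1] for k in runs})          # dermal stiffness main effect
interaction = effect({k: k[0] * k[1] for k in runs})

print("COF effect %.2f, stiffness effect %.2f, interaction %.2f"
      % (main_A, main_B, interaction))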
Finally, the authors ran another experiment to ascertain the influence of textile fabrics and surface treatments on blister formation in the 3SP. The results demonstrated that surface treatments of corn starch and an aloe-based lubricant were effective at mitigating blister formation on the 3SP. Furthermore, the results show that the fabric factor also borders on statistical significance for blistering.
|
386 |
Modeling The NOx Emissions In A Low NOx Burner While Fired With Pulverized Coal And Dairy Biomass Blends. Uggini, Hari, May 2012.
New regulations such as the Clean Air Interstate Rule (CAIR) will pose greater challenges for coal-fired power plants with regard to pollution reduction. These new regulations plan to impose stricter limits on NOx emissions. The current regulations by themselves already require cleanup technology; newer regulations will require the development of new and economical technologies.
Using a blend of traditional fuels and biomass is a promising technology for reducing NOx emissions. Experiments conducted previously at the Coal and Biomass Energy Lab at Texas A&M reported that dairy biomass can be an effective reburn fuel, with NOx reduction of up to 95%; however, little work has been done to model such a process with feedlot biomass blended with the main burner fuel. The present work concerns the development of a zero-dimensional low-NOx burner (LNB) model to predict NOx emissions while firing a blend of coal and dairy biomass. Two models were developed. Model I assumes that the main burner fuel is completely oxidized to CO, CO2, and H2O, and that fuel-bound nitrogen is released as HCN, NH3, and N2; these partially burnt products mix with tertiary air, undergo chemical reactions specified by kinetics, and burn to complete combustion. Model II assumes that the main burner solid fuel, along with primary and secondary air, mixes gradually with recirculated gases and burns partially, so that the products leaving the main burner include partially burnt solid particles and fuel-bound nitrogen partially converted to N2, HCN, and NH3. These products then mix gradually with tertiary air and undergo further oxidation-reduction reactions to complete the combustion. The results presented are based on Model I and were compared with experimental findings for validation.
Results from the model recommend the following conditions for optimal NOx reduction: the equivalence ratio should be above 0.95 and the mixing time should be below 100 ms. Based on Model I, the results indicate that increasing the percentage of dairy biomass in the blend increases NOx formation, due to the assumption that fuel-nitrogen compounds (HCN, NH3) do not undergo oxidation in the main burner zone. It is therefore suggested that Model II be adopted in future work.
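For reference, the sketch below shows the kind of stoichiometric bookkeeping behind the equivalence-ratio recommendation: the stoichiometric air requirement of a coal/dairy-biomass blend is computed from its ultimate analysis and compared with the supplied air. The fuel compositions and flow rates are illustrative placeholders, not the fuels or operating conditions used in this work.

# Sketch: stoichiometric air requirement and equivalence ratio for a
# coal / dairy-biomass blend, from ultimate analysis (mass fractions).
# The compositions and flow rates below are placeholders.

M_AIR = 28.97        # kg/kmol, molar mass of air
O2_IN_AIR = 0.21     # mole fraction of O2 in air

def stoich_air(c, h, o, s):
    """kg of air per kg of fuel for complete combustion (C->CO2, H->H2O, S->SO2)."""
    n_o2 = c / 12.011 + h / 4.032 + s / 32.06 - o / 31.998   # kmol O2 per kg fuel
    return (n_o2 / O2_IN_AIR) * M_AIR

# Placeholder ultimate analyses (mass fractions of C, H, O, S)
coal = dict(c=0.72, h=0.05, o=0.10, s=0.01)
dairy = dict(c=0.45, h=0.055, o=0.35, s=0.005)

def blend(frac_biomass):
    """Mass-weighted ultimate analysis of the blend."""
    return {k: (1 - frac_biomass) * coal[k] + frac_biomass * dairy[k] for k in coal}

def equivalence_ratio(fuel_rate_kg_s, air_rate_kg_s, frac_biomass):
    """phi = (fuel/air)_actual / (fuel/air)_stoich; phi > 1 is fuel-rich."""
    b = blend(frac_biomass)
    afr_stoich = stoich_air(b['c'], b['h'], b['o'], b['s'])
    afr_actual = air_rate_kg_s / fuel_rate_kg_s
    return afr_stoich / afr_actual

print(equivalence_ratio(fuel_rate_kg_s=0.01, air_rate_kg_s=0.095, frac_biomass=0.10))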
|
387 |
Photochemical modeling and analysis of meteorological parameters during ozone episodes in the Kao-Ping Area, Taiwan. Ho, Yi-Ta, 21 May 2004.
A three-dimensional (3D) photochemical grid model, CAMx-2.0 (1998), was employed to analyze the spatial and temporal variations of ambient ozone during ozone episodes (ozone concentrations above 120 ppbv) in the Kao-Ping airshed in 2000-2001. Sensitivity analyses of ozone concentrations to emission reductions in volatile organic compounds (VOC) and nitrogen oxides (NOx) were performed, and the relationships between ozone concentrations and meteorological parameters were examined. Furthermore, the transport routes were studied using an inverse trajectory method.
Examination of meteorological parameters and ozone trends reveals that warm temperatures, sufficient sunlight, low wind, and high surface pressure are the conditions that tend to trigger ozone episodes in the Kao-Ping area in the autumn and winter seasons. Seasonal patterns of surface ozone include a summer minimum and two maxima, one in autumn and one from late winter to the middle of spring, consistent with the low mixing heights in autumn and winter and the large mixing height in summer.
Predicted hourly ozone concentrations agree reasonably well with measured data. Assessment of the effect of the initial and boundary conditions on the model's performance revealed that the model can be improved by specifying an ozone concentration of 70 ppbv rather than 30 ppbv at the top boundary, while treating daytime and nighttime ozone concentrations separately in the lateral boundary conditions. The sensitivity analysis shows a VOC-sensitive regime in Kaohsiung City. In addition to locally emitted pollutants, the inverse trajectory analysis shows that most pollutants in Kaohsiung City come from Kaohsiung County, followed by Tainan County and Ping-Tung County.
In autumn, the air quality is worst in Ping-Tung County, where ozone episodes occur most frequently. Because the prevailing wind is northerly or north-easterly in autumn, most pollutants are transported from the upwind areas, including Kaohsiung City and Kaohsiung County. The sensitivity analysis shows a NOx-sensitive regime for Ping-Tung City, consistent with Sillman's results (1999), indicating that freshly emitted pollutants are typically (but not always) characterized by VOC-sensitive chemistry and evolve towards NOx-sensitive chemistry as the air parcels move downwind.
|
388 |
Modeling PIM with UML and Patterns. Su, Hsiao-Sheng, 02 May 2006.
Software modeling with the Unified Modeling Language (UML) and the Model Driven Architecture (MDA) concept has become the new paradigm of modern systems analysis and design. Several CASE tools have been introduced to facilitate the transformation from a platform independent model (i.e., class diagrams and sequence diagrams) to a platform specific model, and thereby enhance the efficiency of system development.
This research builds on the MDA concept and presents a systematic methodology that integrates UML and MDA with patterns to refine the PIM cohesively. A real-world case applying the integrated techniques is presented. With this approach, system developers can increase the reuse of the PIM and thereby enhance the efficiency of system development.
|
389 |
Structure-function study of L-lactate dehydrogenase and molecular systematics of five turtle species. zo, Ho-wan, 10 July 2001.
Abstract
Five species belonging to the order Chelonia, representing two families and four genera, namely the Taiwanese soft-shelled turtle, Pelodiscus sinensis japonicus; the American soft-shelled turtle, Apalone ferox; the alligator snapping turtle, Macroclemys temminck; the pitted shelled turtle, Carettochelys insculpta; and the side-necked turtle, Chelodina siebernrocki, were investigated in order to fully understand the structural basis for the multiple lactate dehydrogenase (LDH) isozymes in turtles and soft-shelled turtles. Starch gel electrophoretic patterns of LDH isozymes from muscle, heart, liver, testis, and eye were analyzed. Chelonia possess the two fundamental LDH loci, A (muscle) and B (heart), as is the case in all other vertebrates. The major forms of LDH isozymes in the tissues of Chelonia are the homotetrameric LDH-A4 and B4, although some of these Chelonia do not form the two heterotetrameric A3B1 and/or A1B3 isozymes. This phenomenon is also observed among some lower vertebrates and fishes of other classes.
I have determined the LDH-A and LDH-B cDNA sequences of the protein-coding regions from these five species. The 3D structure of the tetrameric LDH isozymes from the Taiwanese soft-shelled turtle was predicted by homology modeling, and the substituted residues at the subunit contact sites were examined in order to explain the different multiple forms of tetrameric LDH isozymes present in the various species.
LDH isozymes are encoded by housekeeping genes in most eukaryotic cells, and LDH DNA or protein sequences can therefore serve as an ideal marker for studying molecular phylogenetics and evolution among different organisms. However, whether this marker can also be used to investigate the systematic relationships among closely related species remains to be demonstrated. In this study, the newly determined LDH-A and LDH-B cDNA sequences and their deduced protein sequences from several different turtles and soft-shelled turtles, together with previously published LDH sequences, are analyzed by the phylogenetic tree reconstruction methods of neighbor-joining, minimum evolution, maximum parsimony, and maximum likelihood. The results confirm the traditional, morphology-based classification of these Chelonia into two different families within the same order Chelonia. Furthermore, the results clearly separate these Chelonia species into side-necked turtles, rough-shelled hidden-necked turtles, and smooth-shelled hidden-necked turtles. Finally, the results also demonstrate that LDH can indeed be used as a molecular systematic marker for analyzing closely related species.
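As an example of the distance-based part of such an analysis, the sketch below builds a neighbor-joining tree from an aligned set of LDH sequences using Biopython. The input file name is a placeholder, and the minimum evolution, maximum parsimony, and maximum likelihood analyses reported here are not covered by this snippet.

# Sketch: neighbor-joining tree from an LDH sequence alignment with Biopython.
# "ldh_alignment.fasta" is a placeholder file of pre-aligned cDNA or protein sequences.
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

alignment = AlignIO.read("ldh_alignment.fasta", "fasta")   # one record per species

calculator = DistanceCalculator("identity")                # simple identity-based distance
distance_matrix = calculator.get_distance(alignment)

constructor = DistanceTreeConstructor()
nj_tree = constructor.nj(distance_matrix)                  # neighbor-joining topology

Phylo.draw_ascii(nj_tree)                                  # text rendering of the tree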
|
390 |
Stochastic generation of biologically accurate brain networks. Aluri, Aravind, 12 April 2006.
Basic circuits, which form the building blocks of the brain, have been identified in recent literature. We propose to treat these basic circuits as "stochastic generators" whose instances serve to wire a portion of the mouse brain. Much as genes generate proteins by providing templates for their construction, we view the catalog of basic circuits as providing templates for wiring up the neurons of the brain. This thesis work involves a) defining a framework for the stochastic generation of brain networks, b) generating sample networks from the basic circuits, and c) visualizing the generated networks.
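A minimal Python sketch of this template-instantiation idea follows: a small catalog of "basic circuit" templates is sampled repeatedly, and each instance is bound to concrete neurons of the matching types. The templates, neuron types, and counts are illustrative placeholders, not the catalog or the mouse-brain data used in the thesis.

# Sketch: treating "basic circuits" as stochastic templates that are instantiated
# to wire a network. Templates and neuron populations are placeholders.
import random

# Each template: list of directed edges between abstract roles
TEMPLATES = {
    "feedforward_inhibition": [("excitatory", "inhibitory"), ("excitatory", "principal"),
                               ("inhibitory", "principal")],
    "recurrent_excitation":   [("principal", "principal")],
}

# A small population of neurons, each tagged with a type
NEURONS = ([("E%d" % i, "excitatory") for i in range(20)] +
           [("I%d" % i, "inhibitory") for i in range(5)] +
           [("P%d" % i, "principal") for i in range(20)])

def by_type(t):
    return [name for name, typ in NEURONS if typ == t]

def instantiate(template, rng):
    """Map each abstract role in the template onto a concrete neuron of that type."""
    roles = {role for edge in template for role in edge}
    binding = {role: rng.choice(by_type(role)) for role in roles}
    return [(binding[a], binding[b]) for a, b in template]

def generate_network(n_instances=50, seed=0):
    rng = random.Random(seed)
    edges = set()
    for _ in range(n_instances):
        name = rng.choice(list(TEMPLATES))
        edges.update(instantiate(TEMPLATES[name], rng))
    return sorted(edges)

print(len(generate_network()), "synapses generated")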
|