431 |
School as a Center for Community: Establishing Neighborhood Identity through Public Space and Educational Facility / Goykhman, Fred 10 November 2008 (has links)
“Safety is an opportunity for people to open their minds” -Jin Baek, 2008
For my thesis I will design an education facility that strives to meet today's security needs and provides a safe-feeling place for growth. In identifying the problem, I found two main causes for the conditions described in today's schools: improper adaptation and uniform building type.
Improper adaptation refers to surface applications of security rather than integration with the social fabric of the school's communal requirements. Unfortunate incidents have pushed the response to heightened security around schools toward fortressing that disrupts human activity. Metal detectors, restricted areas, and alarmed doors are possibly necessary but often poorly considered attributes of school design; in concentration they create a trapping, prison-like feeling where the building should suggest a place of voluntary education and inspiration for the future. I will utilize CPTED (Crime Prevention Through Environmental Design) strategies, code research, new building technologies, materials, systems, arrangements, precedent studies, and testing through simulation or experiment in the form of an installation. Using these resources, I can determine possible solutions and interventions.
Uniform building type sets a counterproductive precedent; in fact, it is one of the reasons for improper adaptation. Today we must look at places where young people want to be, and splice the desired attributes of those places into modern schools. Through interviews with school administrators, building officials, students, faculty, psychologists, builders, and other construction professionals, I can identify the mandatory requirements. Implementing security and safety attributes as part of the concept, and knowing trends in technology, can help secure educational facilities while maintaining the qualities that are conducive to a learning environment.
As stated by Holly Richmond in Contract magazine, February 2006 edition, "Students are the most crucial design element in today's schools," says Kerry Leonard, principal and senior planner at O'Donnell, Wicklund, Pigozzi and Peterson Architects in Chicago and chair of the advisory group for the AIA Committee on Architecture for Education. "Understanding how people learn and creating environments that respond to this knowledge is the best building block to start from."
|
432 |
A Computational Geometry Approach to Digital Image Contour Extraction / Tejada, Pedro J. 01 May 2009 (has links)
We present a method for extracting contours from digital images, using techniques from computational geometry. Our approach is different from traditional pixel-based methods in image processing. Instead of working directly with pixels, we extract a set of oriented feature points from the input digital images, then apply classical geometric techniques, such as clustering, linking, and simplification, to find contours among these points. Experiments on synthetic and natural images show that our method can effectively extract contours, even from images with considerable noise; moreover, the extracted contours have a very compact representation.
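The clustering-and-linking stage can be illustrated with a toy sketch. This is a hedged illustration, not the thesis's actual algorithm: `link_points`, its greedy strategy, and the gap threshold are assumptions for exposition only.

```python
import math

def link_points(points, max_gap=2.0):
    """Greedy nearest-neighbour linking of 2D feature points into chains.

    A toy stand-in for the clustering/linking stage: starting from an
    unvisited point, repeatedly attach the closest unvisited point while
    it lies within max_gap; each finished chain approximates one contour.
    """
    unvisited = set(range(len(points)))
    chains = []
    while unvisited:
        i = unvisited.pop()
        chain = [points[i]]
        while True:
            last = chain[-1]
            best, best_d = None, max_gap
            for j in unvisited:
                d = math.dist(last, points[j])
                if d <= best_d:
                    best, best_d = j, d
            if best is None:
                break
            unvisited.remove(best)
            chain.append(points[best])
        chains.append(chain)
    return chains

# Two well-separated groups of points yield two chains.
pts = [(0, 0), (1, 0), (2, 0), (10, 10), (11, 10)]
print([len(c) for c in sorted(link_points(pts), key=len)])  # → [2, 3]
```

Real feature points carry orientation as well, which a production linker would use to reject implausible continuations; this sketch keys on distance alone.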
|
433 |
LAMINAR AND TURBULENT STUDY OF COMBUSTION IN STRATIFIED ENVIRONMENTS USING LASER BASED MEASUREMENTS / Grib, Stephen William 01 January 2018 (has links)
Practical gas turbine engine combustors create extremely non-uniform, highly stratified flowfields, making it imperative that similar environments be well understood. Laser diagnostics were applied in a variety of stratified environments, featuring temperature or chemical composition gradients, to better understand autoignition, extinction, and flame stability behavior. This work ranged from laminar, steady flames to turbulent flame studies using time-resolved measurements.
Edge flames, formed in the presence of species stratification, were studied by first developing a simple measurement technique capable of estimating an important quantity for edge flames, the advective heat flux, using only velocity measurements. Both hydroxyl planar laser-induced fluorescence (OH PLIF) and particle image velocimetry (PIV) were used, along with numerical simulations, in the development of this technique. Interacting triple flames were also created in a laboratory-scale burner producing a laminar, steady flowfield with symmetric equivalence ratio gradients. Studies were conducted to characterize and model the propagation speed as a function of the flame base curvature and the separation distance between neighboring flames. OH PLIF, PIV, and Rayleigh scattering measurements were used to characterize the propagation speed, and a model was developed which accurately represents the propagation speed for three different fuels. Negative edge flames were first studied by developing a one-dimensional model capable of reproducing the energy equation along the stoichiometric line, subject to different boundary conditions. Unsteady, laminar negative edge flames were also simulated with periodic boundary conditions to assess the difference between the steady and unsteady cases. The diffusive heat loss was unbalanced with the chemical heat release and advective heat flux gain terms, which led to the flame advancing and receding. The temporal derivative balanced the energy equation and also aided the understanding of negative edge flame speeds. Turbulent negative edge flame velocities were measured for extinguishing flames in a separate experiment, as a function of the bulk advective heat flux through the edge and the turbulence level. A burner that creates statistically stationary negative edge flames was designed and built for this study.
The edge velocity was dependent on both the bulk advective heat flux and turbulence levels. The negative edge flame velocities were obtained with high speed stereo-view chemiluminescence and two dimensional PIV measurements.
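As a hedged sketch (generic combustion symbols, not necessarily the thesis's exact notation), the one-dimensional energy balance along the stoichiometric line discussed above takes the schematic form:

```latex
\underbrace{\rho c_p \frac{\partial T}{\partial t}}_{\text{temporal}}
  \;=\;
\underbrace{-\,\rho c_p\, u\, \frac{\partial T}{\partial x}}_{\text{advective gain}}
  \;+\;
\underbrace{\frac{\partial}{\partial x}\!\left(\lambda \frac{\partial T}{\partial x}\right)}_{\text{diffusive loss}}
  \;+\;
\underbrace{\dot{q}'''}_{\text{heat release}}
```

When the diffusive loss outweighs the advective gain plus heat release, the temporal term turns negative and the edge recedes; when the balance reverses, it advances. This sign bookkeeping is what the unsteady simulations probe.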
Autoignition stabilization was studied in the presence of both temperature and species stratification, using a simple laminar flowfield. OH and CH2O PLIF measurements showed autoignition characteristics ahead of the flame base. Numerical chemical and flow simulations also revealed low-temperature chemistry ahead of the flame base, leading to the conclusion that low-temperature chemistry dominates the stabilization behavior. An energy budget analysis was conducted which described this stabilization behavior.
|
434 |
Numerical Simulation of Dropped Cylindrical Objects into Water in Two Dimensions (2D) / Zhen, Yi 20 December 2018 (has links)
Dropped objects are identified as one of the top ten causes of fatalities and serious injuries in the oil and gas industry. It is important to understand the dynamics of dropped objects under water in order to accurately predict their motion and protect underwater structures and facilities from damage. In this thesis, we study the nondimensionalization of the dynamic equations of dropped cylindrical objects. Nondimensionalization helps to reduce the number of free parameters, identify the relative size of parameter effects, and gain deeper insight into the essential nature of the dynamics of dropped cylindrical objects under water. The resulting simulations of dimensionless trajectories confirm that drop angle, trailing edge, and drag coefficient have significant effects on the trajectories and landing locations of dropped cylindrical objects under water.
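As an illustration of the kind of nondimensionalization meant (a sketch with generic symbols, not the thesis's exact equations), consider the translational equation for a cylinder of mass m, added mass m_a, volume V, characteristic length L, drag coefficient C_d, and frontal area A sinking in water of density rho:

```latex
% Dimensional form (generic symbols, assumed for illustration):
(m + m_a)\,\frac{dv}{dt} \;=\; (m - \rho V)\,g \;-\; \tfrac{1}{2}\,\rho\, C_d\, A\, v\,\lvert v\rvert
% Choosing v^* = v/\sqrt{gL} and t^* = t\sqrt{g/L} gives:
(1 + \hat{m}_a)\,\frac{dv^*}{dt^*} \;=\; (1 - \hat{B}) \;-\; \hat{C}\, v^*\lvert v^*\rvert,
\qquad
\hat{m}_a = \frac{m_a}{m},\quad
\hat{B} = \frac{\rho V}{m},\quad
\hat{C} = \frac{\rho\, C_d\, A\, L}{2m}
```

The many dimensional parameters collapse into a few dimensionless groups (an added-mass ratio, a buoyancy ratio, and a drag number), which is what makes it possible to rank the relative influence of drag and geometry on the trajectory.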
|
435 |
Analysis of entry phase in intermittent machining / Agic, Adnan January 2018 (has links)
Cutting forces and vibrations are essential parameters in the assessment of a cutting process. As the energy consumption in the machining process is directly affected by the magnitude of the cutting forces, it is of vital importance to design cutting edges and select process conditions that will maintain high tool performance through reduced energy consumption. Vibrations are often the cause of poor results in terms of accuracy, low reliability due to sudden failures, and bad environmental conditions caused by noise. The goal of this work is to find out how the cutting edge and cutting conditions affect the entry conditions of the machining operation. This is done utilizing experimental methods and appropriate theoretical approaches applied to the cutting forces and vibrations. The research was carried out through three main studies, beginning with a force build-up analysis of the cutting edge entry into the workpiece in intermittent turning. This was followed by a second study, concentrated on modelling of the entry phase, which was explored through experiments and the theory developed in the first study. The third part focused on the influence of the radial depth of cut upon the entry of the cutting edge into the workpiece in a face milling application. The methodology for the identification of unfavourable cutting conditions is also explained herein. Important insights into the force build-up process help address the correlation between the cutting geometries and the rise time of the cutting force. The influence of the nose radius for a given cutting tool and workpiece configuration during the initial entry is revealed. The critical angle, i.e. the position of the face milling cutter that results in unfavourable entry conditions, has been explained, emphasizing the importance of the selection of cutting conditions. Finally, the theoretical methods utilized for the evaluation of the role of cutting edge geometry within entry phase dynamics have been explored.
This has revealed the trends that are of interest for selection of cutting conditions and cutting edge design.
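The rise time of the cutting force mentioned above can be estimated from a sampled force signal. The sketch below is illustrative only (the `rise_time` helper, the 10-90% convention, and the synthetic ramp are assumptions, not the thesis's actual procedure):

```python
def rise_time(samples, dt, lo=0.1, hi=0.9):
    """10-90% rise time of a force build-up signal.

    Finds the first samples crossing lo*F_max and hi*F_max and returns
    the elapsed time between them (linear interpolation between samples
    is omitted for brevity).
    """
    f_max = max(samples)
    i_lo = next(i for i, f in enumerate(samples) if f >= lo * f_max)
    i_hi = next(i for i, f in enumerate(samples) if f >= hi * f_max)
    return (i_hi - i_lo) * dt

# Synthetic force build-up sampled at 10 kHz: linear ramp to 500 N over 1 ms.
dt = 1e-4
force = [min(500.0, 500.0 * i / 10) for i in range(30)]
print(rise_time(force, dt))  # → 0.0008 (0.8 ms for this linear ramp)
```

A shorter rise time at entry corresponds to a more abrupt force build-up, the quantity the first study correlates with cutting geometry.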
|
436 |
The Committee on Taste and Leisure / Barrie, Katherine E 01 January 2019 (has links)
Within my studio practice I have been examining the aesthetics of leisure spaces, the implications of good and bad taste, and what it means to live one’s best life. Considering the history of design motifs and the influence of color upon the human psyche, my thesis exhibition of abstract paintings contains references to patterns, design movements, and modes of artifice that have historically been seen as brazen and tacky. These include nods to the Memphis Design group, faux marble, terrazzo, stucco, and artificial sand. Each has held an important place in the history of designed spaces, and at one time or another they were deeply celebrated before being criticized. I am drawn to the parallels between the surface treatment of furniture and architectural spaces, and the surface of a canvas. My use of materials includes a mixture of high- and lowbrow to reinterpret media such as highly pigmented acrylic paint, natural and artificial sand, volcanic pumice, and hardware store products for DIY home improvement. I use a formal, modernist painting language to elevate the artificial and superficial to the hierarchy associated with the moral underpinnings of modernism. By being entirely serious about the unserious, this work aims to question the value we assign to play and why tastefulness rarely aligns with fun.
|
437 |
Knowledge management as a competitive edge in a global economy : a case study of Thuto ke Lefa training / Kanjere, Maria Matshidiso January 2010 (has links)
Thesis (MBA) --University of Limpopo, 2010 / Knowledge management is an important component of any organization. It includes
knowledge creation, knowledge sifting and knowledge sharing. Thus every organization
has a way of creating, disseminating and preserving its own knowledge. Organizations
that thrive in the 21st century are those that have realized the significance of managing
knowledge and have systems in place to encourage creativity.
Most organizations overlook setting time aside for employees to
share knowledge and expertise from their different fields. Knowledge, if well managed,
has a direct bearing on the growth and development of an organization. Gone are the
days when organizations succeeded only on the basis of working hard; the emphasis
now is on working smart. Technology has made life easier, simpler, and more
innovative. Together with globalization, however, it has also made the world
smaller. For instance, sales can take place anywhere in the world at any time.
Distances, as well as time-zone differences, are no longer a barrier to growing a
company. Hence, there are virtual companies.
Companies should take it upon themselves to appoint people who are capable of
uplifting their knowledge base and enhancing their organization’s intellectual property;
they should appoint people who learn fast and who can adjust to internal as
well as external forces of change. Thus, training and
development should form part and parcel of a company that is prepared to move ahead
of its competitors. By properly addressing the constant changes taking place in
the market, the company should develop a special way of doing business and
special knowledge that will put it ahead of its competitors.
Special knowledge and expertise are capable of generating more revenue for the
company. Revenues are no longer determined only by production factors but also
by the competitive knowledge that the company possesses. This implies that for
the company to do well, knowledge has to be well managed, as it is what the
company uses to compete in a global economy.
The global economy is affected by a number of dynamics which have to be addressed
by smart companies in order for them to stay in business. Knowledge at that level
changes fast, is transmitted rapidly, and can quickly become obsolete. Thus
companies have to keep abreast of what is taking place in the markets and also
become trendsetters in their area of operation. This is because competition in the
global market is stiff; companies face competitors from various conglomerates at
local and international levels.
Therefore this study focuses on how knowledge is transformed and managed at
Thuto Ke Lefa for competitive advantage as well as for economic benefit. Thuto
Ke Lefa Training Company is a national company that is based in Polokwane; it has
other branches in Mpumalanga and Gauteng Province. The company specializes in
providing service to the public through developing the skills of the workforce in the
public as well as in the private sector. The company was founded in 2003 by Mr
Mashakobo Johannes Moja and his wife Eunice Moja. Thuto Ke Lefa Training Company
is a registered company which is accredited by the Education, Training and
Development Practitioners Sector (ETDP SETA).
A case study of Thuto Ke Lefa revealed that the company is well resourced in terms of
technology even though some areas have to be beefed up. Various search engines are
available for staff to access and create knowledge. However, knowledge is not well
coordinated as there is no knowledge manager and there is no centralized place for
dissemination and storage of knowledge. Employees do not have a resource person
or office that oversees the creation of knowledge.
The fact that knowledge is not well coordinated at Thuto Ke Lefa causes the
company to perform below its actual potential, as some important knowledge may be
underutilized. This also makes it difficult for the company to measure its
capability and capacity in terms of knowledge. As a result, knowledge is not
fully used to the company's advantage.
It was therefore recommended through the research that Thuto Ke Lefa should have a
knowledge management office or resource person who will coordinate and manage
knowledge. This will enable the company to use its knowledge resources competitively.
The literature review also indicated that knowledge is managed less effectively
in the training sector than in other sectors.
It will therefore be imperative for the management of Thuto Ke Lefa to inculcate the
culture of knowledge creation, knowledge sharing and knowledge storage through the
correct devices. Incentives and rewards should be given to those individuals who work
tirelessly to create knowledge. Time should also be set aside for the sharing of
knowledge. Experts in different fields should be in a position to know what is going on
in the other projects of the company so as to make meaningful contributions.
|
438 |
Three-Dimensional Digital Image Processing And Reconstruction Of Granular Particles / Rivas, Jorge A 26 October 2005 (has links)
This thesis presents a method for digitization of the two-dimensional shape of granular particles by means of photomicroscopy and image processing techniques, implemented using a software package from Media Cybernetics, Inc.: Image-Pro Plus 5.1 and the add-ins Scope-Pro 5.0, SharpStack 5.0 and 3D Constructor 5.0. With these tools, it was possible to implement an efficient semi-automated routine that allows the digitization of large numbers of two-dimensional silhouettes of particles in minimal time, without compromising the quality and reliability of the shapes obtained. Different sample preparation techniques, illumination systems, deconvolution algorithms, mathematical functions, filtering techniques, and programming commands are brought into play in order to transform the shape of the two-dimensional projection of particles (captured as a set of successive images acquired at different planes of focus) into a binary format (black and white). At the same time, measurements and statistical information such as grain size distribution can be analyzed from the shapes obtained for a particular granular soil. This information includes, but is not limited to, perimeter, area, diameter (minimum, maximum and mean), caliper (longest, smallest and mean), roundness, aspect ratio, and fractal dimension. Results are presented for several sands collected from different places around the world. In addition, some alternatives for three-dimensional shape reconstruction, such as X-ray nanotomography and serial sectioning, are discussed.
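Measurements such as area, perimeter, and roundness can be computed directly from a binary silhouette. The sketch below is a minimal stand-in, not the Image-Pro Plus implementation; the pixel-edge perimeter and the `shape_metrics` helper are illustrative assumptions:

```python
import math

def shape_metrics(mask):
    """Area, perimeter and roundness of a binary silhouette.

    Area counts foreground pixels; the perimeter counts foreground pixel
    edges that border background (a crude boundary length); and
    roundness = 4*pi*A / P^2 is 1 for a circle and smaller otherwise.
    """
    rows, cols = len(mask), len(mask[0])
    area = perim = 0
    for r in range(rows):
        for c in range(cols):
            if not mask[r][c]:
                continue
            area += 1
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if not (0 <= rr < rows and 0 <= cc < cols) or not mask[rr][cc]:
                    perim += 1
    roundness = 4 * math.pi * area / perim ** 2
    return area, perim, roundness

# A 4x4 solid square: area 16, perimeter 16, roundness pi/4.
square = [[1] * 4 for _ in range(4)]
a, p, r = shape_metrics(square)
print(a, p, round(r, 3))  # → 16 16 0.785
```

Production tools refine the perimeter estimate (pixel-edge counting overestimates boundary length for diagonal edges), but the derived descriptors follow the same pattern.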
|
439 |
On the impact and applicability of network edge computing to reduce network latencies of worldwide client applications / Horsthemke, Stephan January 2020 (has links)
This project evaluates the applicability of network edge computing to reduce global latencies of client applications. It determines the degree of latency reduction that network edge computing can provide compared to common cloud computing architectures. Furthermore, this project examines whether Compute@Edge, an exemplary and modern edge computing service, enables the replacement of many latency-sensitive cloud systems through adequate versatility and a reasonable cost-benefit ratio. Compute@Edge is a new, serverless edge computing platform by Fastly built on WebAssembly. A prototype that replicates a globally utilized server of Spotify was implemented on Compute@Edge. To compare the latencies of cloud and edge computing, an experiment captured the latencies of the prototype and the original system using a Spotify client that generated almost 26 million data points from all over the world. Alongside the experiment, the implementation of the prototype allows accurate insights into the possibilities of Compute@Edge and whether WebAssembly is a promising approach for edge computing. Successes of this work include data showing that network edge computing can reduce latencies significantly. It offers arguments to ramp up the usage of edge computing, WebAssembly and Compute@Edge for applications that require low latencies. The results of the experiment show that network edge computing is capable of reducing network latency compared to cloud computing by at least 38%. The lower latencies combined with the versatility and feasibility of Compute@Edge show that modern edge platforms enable a much higher utility for applications like Spotify.
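The headline comparison (relative latency reduction at a given quantile) can be computed with a small sketch. This is an illustration only; the `pct_reduction` helper, the nearest-rank quantile, and the toy samples are assumptions, not the thesis's measurement pipeline:

```python
def pct_reduction(cloud_ms, edge_ms, q=0.5):
    """Percent latency reduction of edge vs. cloud at quantile q.

    Sorts both latency samples, takes the q-quantile by nearest rank,
    and reports the relative reduction in percent.
    """
    def quantile(xs, q):
        s = sorted(xs)
        return s[min(len(s) - 1, int(q * len(s)))]
    c, e = quantile(cloud_ms, q), quantile(edge_ms, q)
    return 100.0 * (c - e) / c

# Toy samples: median cloud latency 100 ms vs. 60 ms at the edge.
cloud = [80, 90, 100, 110, 120]
edge = [40, 50, 60, 70, 80]
print(pct_reduction(cloud, edge))  # → 40.0
```

Latency studies usually report several quantiles (p50, p95, p99), since edge deployments often help the tail far more than the median.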
|
440 |
Computer Aided Long-Bone Segmentation and Fracture Detection / Donnelley, Martin, martin.donnelley@gmail.com January 2008 (has links)
Medical imaging has advanced at a tremendous rate since x-rays were discovered in 1895. Today, x-ray machines produce extremely high-quality images for radiologists to interpret. However, the methods of interpretation have only recently begun to be augmented by advances in computer technology. Computer aided diagnosis (CAD) systems that guide healthcare professionals to making the correct diagnosis are slowly becoming more prevalent throughout the medical field.
Bone fractures are a relatively common occurrence. In most developed countries the number of fractures associated with age-related bone loss is increasing rapidly. Regardless of the treating physician's level of experience, accurate detection and evaluation of musculoskeletal trauma is often problematic. Each year, the presence of many fractures is missed during x-ray diagnosis. For a trauma patient, a mis-diagnosis can lead to ineffective patient management, increased dissatisfaction, and expensive litigation. As a result, detection of long-bone fractures is an important orthopaedic and radiologic problem, and it is proposed that a novel CAD system could help lower the miss rate. This thesis examines the development of such a system, for the detection of long-bone fractures.
A number of image processing software algorithms useful for automating the fracture detection process have been created. The first algorithm is a non-linear scale-space smoothing technique that allows edge information to be extracted from the x-ray image. The degree of smoothing is controlled by the scale parameter, and allows the amount of image detail that should be retained to be adjusted for each stage of the analysis. The result is demonstrated to be superior to the Canny edge detection algorithm. The second utilises the edge information to determine a set of parameters that approximate the shaft of the long-bone. This is achieved using a modified Hough Transform, and specially designed peak and line endpoint detectors.
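The edge-preserving behaviour of non-linear scale-space smoothing can be sketched in one dimension. This is a hedged, Perona-Malik-style illustration under assumed parameters, not the thesis's actual algorithm:

```python
import math

def nonlinear_smooth(signal, steps=50, dt=0.2, k=10.0):
    """One-dimensional Perona-Malik-style diffusion (illustrative).

    The diffusivity g = exp(-(grad/k)^2) shrinks where the gradient is
    large, so smoothing flattens small-amplitude noise but stalls at
    strong edges -- the behaviour a scale parameter controls in
    scale-space analysis.
    """
    u = list(signal)
    for _ in range(steps):
        grads = [u[i + 1] - u[i] for i in range(len(u) - 1)]
        flux = [math.exp(-(g / k) ** 2) * g for g in grads]
        u = [u[i] + dt * ((flux[i] if i < len(flux) else 0.0)
                          - (flux[i - 1] if i > 0 else 0.0))
             for i in range(len(u))]
    return u

# A noisy step: diffusion removes the small ripples on each side yet
# leaves the large 0 -> 100 jump essentially intact.
noisy = [0, 2, -1, 1, 0, 100, 101, 99, 100, 100]
smooth = nonlinear_smooth(noisy)
print(round(smooth[4]), round(smooth[5]))  # the step survives smoothing
```

In two dimensions the same flux limiting applies along both axes, which is why edge maps extracted after such smoothing can outperform a fixed linear filter followed by Canny detection.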
The third stage uses the shaft approximation data to locate the bone centre-lines and then perform diaphysis segmentation to separate the diaphysis from the epiphyses. Two segmentation algorithms are presented and one is shown to not only produce better results, but also be suitable for application to all long-bone images. The final stage applies a gradient based fracture detection algorithm to the segmented regions. This algorithm utilises a tool called the gradient composite measure to identify abnormal regions, including fractures, within the image. These regions are then identified and highlighted if they are deemed to be part of a fracture.
A database of fracture images from trauma patients was collected from the emergency department at the Flinders Medical Centre. From this complete set of images, a development set and test set were created. Experiments on the test set show that diaphysis segmentation and fracture detection are both performed with an accuracy of 83%. Therefore these tools can consistently identify the boundaries between the bone segments, and then accurately highlight midshaft long-bone fractures within the marked diaphysis.
Two of the algorithms---the non-linear smoothing and Hough Transform---are relatively slow to compute. Methods of decreasing the diagnosis time were investigated, and a set of parallelised algorithms were designed. These algorithms significantly reduced the total calculation time, making use of the algorithm much more feasible.
The thesis concludes with an outline of future research and proposed techniques that---along with the methods and results presented---will improve CAD systems for fracture detection, resulting in more accurate diagnosis of fractures, and a reduction of the fracture miss rate.
|