11

Modelling and optimization of Safety Instrumented Systems based on dependability and cost measures

Torres-Echeverria, Alejandro C. January 2009 (has links)
This thesis is centred on modelling and multi-objective optimization of Safety Instrumented Systems (SIS) in compliance with the standard IEC 61508. SIS monitor a plant to ensure that its operating conditions remain within safe limits and free of hazards. Their performance is, therefore, critical for the safety of people around the plant, the environment, assets and production. A large part of this work is devoted to modelling of SIS. Safety integrity and reliability measures, used as optimization objectives, are quantified by the Average Probability of Failure on Demand (PFDavg) and the Spurious Trip Rate (STR). The third objective is the Lifecycle Cost (LCC), which ensures system cost-effectiveness. The optimization strategies include design and testing policies. This encompasses optimization of design by redundancy and reliability allocation, use of diverse redundancy, inclusion of MooN voting systems, and optimization of testing frequency and strategies. The project implements truly multi-objective optimization using Genetic Algorithms. A comprehensive analysis is presented and diverse applications to optimization of SIS are developed. Graphical techniques for presentation of results that aid the analysis are also presented. A practical approach is intended: the modelling and optimization algorithms provide the level of modelling detail needed to meet the requirements of IEC 61508. The focus is on systems working in low-demand mode. The work is largely based on the requirements of the process industry but is applicable to a wide range of other processes. Novel contributions include a model for quantification of time-dependent Probability of Failure on Demand; an approximation for STR; implementation of modelling by Fault Trees with flexibility for evaluation of multiple solutions; and the integration of system modelling with optimization by Genetic Algorithms. Thus, this work intends to advance the state of the art in modelling of Probability of Failure on Demand and Spurious Trip Rate, and in the multi-objective optimization of the design and testing of safety systems with Genetic Algorithms.
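As a loose editorial illustration of the quantities being traded off in this kind of study (not the thesis's own model), the Python sketch below evaluates a few hypothetical SIS channel architectures with the textbook simplified low-demand PFDavg approximations, pairs them with assumed spurious trip rates and lifecycle costs, and filters the candidates to a Pareto front. All failure rates, trip rates and costs are invented; the thesis itself uses Fault Tree models and Genetic Algorithms rather than this exhaustive evaluation.

```python
# Illustrative sketch (not the thesis's model): simplified IEC 61508-style
# PFDavg approximations for a single subsystem, ignoring common-cause
# failures, diagnostics and repair, plus a naive Pareto filter over
# (PFDavg, spurious trip rate, cost). All numeric values are assumptions.

def pfd_avg(arch, lam_du, ti):
    """Textbook low-demand approximations for identical channels."""
    if arch == "1oo1":
        return lam_du * ti / 2.0
    if arch == "1oo2":
        return (lam_du * ti) ** 2 / 3.0
    if arch == "2oo3":
        return (lam_du * ti) ** 2
    raise ValueError(arch)

# Candidate designs: (architecture, proof-test interval [h],
# assumed spurious trip rate [1/h], assumed lifecycle cost).
candidates = [
    ("1oo1", 8760, 1.0e-5, 10_000),
    ("1oo2", 8760, 2.0e-5, 18_000),
    ("2oo3", 8760, 1.2e-5, 26_000),
    ("1oo2", 4380, 2.0e-5, 21_000),   # more frequent proof testing
]

LAM_DU = 2.0e-6  # dangerous undetected failure rate per hour (assumed)

def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

evaluated = [((pfd_avg(arch, LAM_DU, ti), str_, cost), (arch, ti))
             for arch, ti, str_, cost in candidates]

pareto = [e for e in evaluated
          if not any(dominates(o[0], e[0]) for o in evaluated if o is not e)]

for (pfd, str_, cost), (arch, ti) in pareto:
    print(f"{arch} TI={ti}h  PFDavg={pfd:.2e}  STR={str_:.1e}/h  cost={cost}")
```

In a realistic design space the number of redundancy, diversity, voting and test-policy combinations grows combinatorially, which is why the thesis turns to Genetic Algorithms rather than enumerating candidates as done here.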
12

The effects of task complexity on performance in constraint satisfaction design

Smy, Victoria January 2013 (has links)
The aim of this thesis is to examine the importance of constraints in design activity, and more specifically, in constraint satisfaction tasks. Constraints are involved in all design tasks and denote criteria on what constitutes a good design outcome. Within the present research, a constraint stipulates a restriction on how a design element may be assimilated into a design. For instance, given a spatial office layout design, the positioning of an employee may be restricted by a constraint stipulating that they should be in close proximity to a particular area. This thesis attempts to address a gap in the design literature by examining the effect of such constraints on design performance using experimental methodology. Two research threads are addressed: the effects of constraints on performance, and how constraint satisfaction performance can be improved by training. In the first research thread, the theoretical framework adopted concerns Newell and Simon's (1972) problem space theory and more recent suggestions by Halford, Wilson and Phillips (1998) concerning relational complexity. These are used to predict the effects of increasing the number of design constraints (Experiments 1-2) and the number of types of constraint (Experiments 3-4). Both of these factors, together with a reduction in the degree of cognitive fit (Vessey & Galletta, 1991) between the constraints and the external representation (Experiment 5), were found to reduce design performance. The second research thread examines whether training can improve design performance. Practice alone was found to improve performance on a near-transfer task relative to a control group, but not on a dissimilar, far-transfer task (Experiment 6). The subsequent experiments examined the effect of what has been labelled 'metacognitive' training on performance. Findings indicated that training interventions aimed either at encouraging reflective self-explanation (Experiment 7) or at improving planning strategy (Experiment 8) improved performance in comparison to both the control and practice-only groups. The implications of these results are discussed together with future research directions.
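To make the office-layout example above concrete, the fragment below sketches a tiny constraint-satisfaction task: three employees are placed along a row of desks and a brute-force search scores candidate layouts by the number of proximity constraints they satisfy. The desks, names and constraints are invented for illustration and are not the experimental materials used in the thesis.

```python
# Hypothetical office-layout constraint-satisfaction sketch: place three
# employees on a 1-D row of desks so that stated proximity constraints hold.
# Purely illustrative; not the experimental task used in the thesis.
from itertools import permutations

desks = range(4)                       # desk positions 0..3; position 0 is next to the printer
employees = ["Ana", "Ben", "Cam"]

# Each constraint returns True when satisfied by a layout {name: desk}.
constraints = [
    lambda L: abs(L["Ana"] - L["Ben"]) == 1,   # Ana must sit next to Ben
    lambda L: L["Cam"] == 0,                   # Cam must be by the printer
    lambda L: abs(L["Ana"] - L["Cam"]) >= 2,   # Ana must be far from the printer
]

best = max(
    ({n: d for n, d in zip(employees, p)} for p in permutations(desks, len(employees))),
    key=lambda L: sum(c(L) for c in constraints),
)
print(best, sum(c(best) for c in constraints), "of", len(constraints), "constraints satisfied")
```

Adding more constraints, or mixing constraint types, enlarges the search that a designer must hold in mind, which is the kind of load manipulated in the experiments described above.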
13

Microbial quality control management in industry : approaches for assessing bioburden and community composition

Davies, Adrienne January 2015 (has links)
Microbial contamination of industrial products and processes can endanger the health and safety of consumers and workers and negatively affect businesses by causing product spoilage and decreases in process efficiency. Major microbial contamination events can result in financial losses from product recalls, unplanned production downtime, loss of sales and litigation. Effective microbial monitoring is essential for detecting the presence of microorganisms and helps businesses control and maintain product quality. This thesis first examines the challenges in achieving accurate assessments of microbial load, which is defined as the quantity of viable microorganisms within a given sample matrix (Chapter 2). Classical culture techniques remain the most widely utilized and approved methods, but their major shortcomings have limited their applicability in the Quality Management systems employed to control contamination throughout product processing, handling, and distribution. Issues such as the long incubation times to achieve a result, the selective bias of culture conditions on species recovery, and the inability to detect damaged or injured cells are discussed. Rapid microbiological methods (RMMs) offer increased sensitivity and accuracy, can deliver results in near real-time, and could enable more effective and proactive management of microbial contamination problems. Nevertheless, their uptake in industry has been hindered by higher operating costs and by the time-consuming proof-of-equivalency studies against the reference culture methods that are required for approval by the relevant regulatory bodies. An overview of commercially available RMMs is provided; the suitability of each method is application- and industry-dependent. An industrial case study of microbial monitoring methods was performed with Omya AG (Switzerland), the top global producer of calcium carbonate (CC) slurries (Chapter 3). CC slurries are used as manufacturing fillers in a wide range of industries including paper, paints, construction and food. High microbial loads have been shown to adversely affect slurry quality and performance and to negatively impact customers' finished products (Schwarzentruber, 2003). Quality Assurance (QA) managers rely on culture dip slides for detecting microbial load and directing biocide treatment to manage excessive population growth. RMMs offer Omya the potential for cost savings from effective biocide management and could help QA managers uncover and remedy microbial-related product spoilage sooner. The viability-based RMM CellFacts II (CFII) (CellFacts2014 Ltd., UK) combines electrical flow impedance and viable-cell-specific fluorescent staining to produce an accurate, real-time measurement of the contaminating population. To improve its usability and desirability to Omya, the technology was optimized: sample preparation steps were streamlined, the cost per test was reduced, and data presentation and interpretation were simplified. During this study CFII showed increased sensitivity and reliability compared to dip slides and highlighted the differences in preservation efficiency and bioburden levels in slurries preserved with biocide blends or by pH stabilization. Although pH stabilizers are added to slurries during production, Omya still relies upon biocides for managing any uncontrolled microbial growth detected during product storage or upon delivery to customers. The biocide-preserved slurries that were analyzed were maintained at pH 8.5-9.0 while the pH-stabilized slurry was at pH 9.5-10.0. It was hypothesized that this 1-log difference in pH impacted upon microbial community composition, and it was necessary to ensure that these populations were still susceptible to post-production biocide treatments. A multifaceted approach was used to characterize the communities and assess the coverage and limitations of each method (Chapter 4). Samples were collected and microbial load was determined by plate culture and CFII, and population diversity was elucidated by the recovery of species in culture and by 16S rRNA gene analysis using Terminal-Restriction Fragment Length Polymorphism (T-RFLP) and sequencing on the Illumina MiSeq (USA). A comparison of direct and indirect DNA extraction methods was performed to observe the efficiency and differences in DNA recovery from microbial species. In addition, DNA from the viable population was selectively amplified by treating isolated microbial cells with propidium monoazide (PMA), a dye that enters non-viable cells, binds DNA and inhibits PCR amplification. Biocide-preserved slurry showed lower diversity in culture compared to pH-stabilized slurry but displayed a higher number of operational taxonomic units (OTUs) by T-RFLP and sequence analysis in the microbiome analysis platform Quantitative Insights Into Microbial Ecology (QIIME). The microbial communities were significantly different, with Pseudomonas spp. associated with biocide-preserved slurry and alkaliphilic Bacillus spp. dominating in pH-stabilized products. The culture media conditions were inadequate for recovering the highly abundant alkaliphilic population in pH-stabilized slurry, and future culture-based work must be optimized using alkaliphilic media. 16S rRNA gene analyses produced a more faithful representation of the community, and DNA extracted from PMA-treated cells revealed changes in population structure after 48 hours of incubation and in response to biocide addition. Slurry populations are dynamic; regular sample collection at various time points is necessary to achieve accurate detection and monitoring of microbial communities.
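As a minimal illustration of the kind of community comparison described above, the snippet below computes OTU richness and Shannon diversity for two invented OTU count tables standing in for a biocide-preserved and a pH-stabilized slurry. The counts are fabricated for illustration only and do not reproduce the thesis's data or its QIIME workflow.

```python
# Hypothetical OTU tables (counts per OTU) standing in for the two slurry
# types; values are invented for illustration, not the thesis's data.
import math

samples = {
    "biocide-preserved": {"OTU1": 950, "OTU2": 30, "OTU3": 20},
    "pH-stabilized":     {"OTU1": 400, "OTU2": 350, "OTU3": 150, "OTU4": 100},
}

def shannon(counts):
    """Shannon diversity index H' = -sum(p_i * ln p_i)."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

for name, otus in samples.items():
    counts = list(otus.values())
    print(f"{name}: richness={len(counts)} OTUs, Shannon H'={shannon(counts):.2f}")
```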
14

Scalable allocation of safety integrity levels in automotive systems

Azevedo, Luís Pedro da Silva January 2015 (has links)
The allocation of safety integrity requirements is an important problem in modern safety engineering. It is necessary to find an allocation that meets system-level safety integrity targets and that is simultaneously cost-effective. As safety-critical systems grow in size and complexity, the problem becomes too difficult to solve through a manual process. Although this thesis addresses the generic problem of safety integrity requirements allocation, the automotive industry is taken as an application example. Recently, the problem has been partially addressed with the use of model-based safety analysis techniques and exact optimisation methods. However, allocation cost impacts are usually either not taken directly into account or only simple, linear cost models are considered; furthermore, given the combinatorial nature of the problem, the applicability of exact techniques to large problems is not guaranteed. This thesis argues that it is possible to effectively and relatively efficiently solve the allocation problem using a mixture of model-based safety analysis and metaheuristic optimisation techniques. Since suitable model-based safety analysis techniques were already known at the start of this project (e.g. HiP-HOPS), the research focuses on the optimisation task. The thesis reviews the process of safety integrity requirements allocation and presents relevant related work. Then, the state of the art of metaheuristic optimisation is analysed and a series of techniques based on Genetic Algorithms, the Particle Swarm Optimiser and Tabu Search is developed. These techniques are applied to a set of problems based on complex engineering systems, considering the use of different cost functions. The most promising method is selected for investigation of performance improvements and usability enhancements. Overall, the results show the feasibility of the approach and suggest good scalability whilst also pointing towards areas for improvement.
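To give a rough flavour of the allocation problem (not the thesis's method or the HiP-HOPS algorithm), the sketch below exhaustively assigns an integrity level to each of three hypothetical components so that, under one commonly used decomposition rule, the levels of the components in every minimal cut set of a hazard sum to at least the hazard's target level, at minimum cost. The cut sets, components and linear cost model are assumptions; the thesis explores non-linear cost functions and metaheuristics precisely because exhaustive search like this does not scale.

```python
# Illustrative exhaustive SIL-allocation sketch (not the thesis's method):
# assign an integrity level 0..4 to each component so that, for every
# minimal cut set of the hazard, the levels of its members sum to at least
# the hazard's target level, at minimum total cost.
from itertools import product

components = ["sensor", "logic", "actuator"]
cut_sets = [{"sensor", "logic"}, {"actuator"}]    # hypothetical minimal cut sets
target = 3                                        # required integrity level of the hazard
cost_per_level = {"sensor": 2.0, "logic": 1.0, "actuator": 3.0}   # assumed linear costs

def feasible(alloc):
    """Decomposition rule: each cut set's levels must sum to at least the target."""
    return all(sum(alloc[c] for c in cs) >= target for cs in cut_sets)

best = min(
    (dict(zip(components, levels)) for levels in product(range(5), repeat=len(components))),
    key=lambda a: sum(cost_per_level[c] * a[c] for c in components) if feasible(a) else float("inf"),
)
print(best, "total cost:", sum(cost_per_level[c] * best[c] for c in components))
```

With only three components the space has 125 candidates; real vehicle architectures have enough failure modes that the space explodes combinatorially, motivating the Genetic Algorithm, Particle Swarm and Tabu Search approaches investigated in the thesis.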
15

Fighter pilot's performance and mental workload

Mansikka, H. P. January 2016 (has links)
Human information processing draws on multiple, limited resources; some of them are shared while others are separate and non-interchangeable. High pilot mental workload (PMWL) - and the subsequent decline in performance - results from an imbalance between the mental resources available to perform the task and the amount of resources needed to perform it. When the pilot's proficiency is evaluated, s/he should deliver an acceptable performance while being able to reserve enough mental capacity for unexpected, additional resource demands. The task demands and cognitive stressors of air combat have the potential to degrade pilot performance to an unacceptable level. Therefore, it is important to understand the amount of mental workload the pilots are experiencing and how much spare capacity they have available to cope with possible additional resource demands. This thesis was aimed at understanding the relationship between PMWL and performance. The approach presented in this thesis was expected to support the development of reliable metrics for predicting pilot performance under the stress of combat. In terms of practical applications, this thesis contributed to the development of methodological principles that could help assure pilots' ability to cope with task demands higher than those experienced during training or proficiency checks. Heart rate (HR) and heart rate variability (HRV) were used as indices of PMWL. This selection was made for several reasons. HR and HRV measures were accepted by the pilots as they were non-intrusive and they appeared to be objective. In addition, the implementation requirements were by no means excessive. Considering the aims of this thesis, the low diagnosticity of HR/HRV was not an issue. Finally, HR and HRV proved to be sensitive measures of varying task demands – especially when measured together with the pilots' awareness of the mission requirements. Simulated fighter missions were used to manipulate the pilots' task demand and to measure their performance and HR/HRV. The thesis is constructed around three studies. In the first study, the subjects were required to fly instrument approaches in a high-fidelity simulator under various levels of task demand. The task demand was manipulated by increasing the load on the subjects by reducing the range at which they commenced the approach. HR and the time-domain components of HRV were used as measures of PMWL. The findings indicated that HR and HRV were sensitive to varying task demands. HR and HRV were able to distinguish the level of PMWL after which the subjects were no longer able to cope with the increasing task demands and their performance fell to a sub-standard level. The major finding of the first study was HR/HRV's ability to differentiate the sub-standard performance approaches from the high-performance approaches. In the second study, fighter pilots' performance and PMWL were both measured during a real instrument flight rules proficiency check in an F/A-18 simulator. PMWL was measured using HR and HRV. Performance was rated using the Finnish Air Force's official rating scales. Results indicated that HR and HRV were able to differentiate varying task demands in situations where variations in performance were insignificant. It was concluded that during a proficiency check, PMWL should be measured alongside task performance. In the third study, fighter pilots' HRV and performance were examined during instrument approaches and air combat. The subjects' performance was rated by a weapons instructor. In addition, the subjects' HRV was measured and used as an indicator of PMWL. During the instrument approaches, low performance was associated with high PMWL, as expected. However, during the combat phases of the mission, low performance was associated with low PMWL. When the subjects' awareness of the mission requirements was studied, it was found that the combination of low performance and low PMWL was associated with the subjects' low awareness of the mission requirements. The major finding was that unless the subjects' awareness of the mission requirements is examined, the relationship between mental workload and performance during a complex combat mission may be difficult to explain. It is concluded that HR and HRV are sensitive measures of PMWL in a simulated fighter aviation environment. HR and HRV proved to be associated with changes in task demands and pilots' performance during simulated instrument approaches and air combat. However, the results of this thesis suggest that measuring just PMWL and performance is not sufficient – especially if the task of interest is complex and dynamic. To fully understand pilot performance in such an environment, the relationship between awareness of the mission requirements, workload and performance needs to be untangled. While this thesis provides encouraging results towards understanding this phenomenon, further research is still needed before awareness of the situation requirements (or, more broadly, situation awareness), performance and mental workload can be measured simultaneously, objectively and in real time.
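For readers unfamiliar with the time-domain HRV measures referred to above, the sketch below computes two standard ones (SDNN and RMSSD), together with mean heart rate, from a short invented series of R-R intervals. It is a generic illustration of the metrics, not the thesis's analysis pipeline or its data; as a broad rule of thumb, higher workload tends to be reflected in higher HR and lower HRV.

```python
# Minimal sketch of standard time-domain HRV measures (SDNN and RMSSD),
# computed from a short, invented series of R-R intervals in milliseconds.
# Generic illustration only; not the thesis's analysis pipeline or data.
import statistics

rr_ms = [812, 790, 805, 778, 760, 795, 810, 782, 768, 801]  # hypothetical R-R intervals

mean_rr = statistics.mean(rr_ms)
hr_bpm = 60_000 / mean_rr                                 # mean heart rate in beats per minute
sdnn = statistics.stdev(rr_ms)                            # standard deviation of R-R intervals
diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
rmssd = (sum(d * d for d in diffs) / len(diffs)) ** 0.5   # root mean square of successive differences

print(f"HR = {hr_bpm:.1f} bpm, SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
```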
16

An integrated user-centric service platform to promote sustainable behaviour

Alsulaiman, Abdulaziz January 2014 (has links)
In this thesis, the focus is on addressing the gaps that exist between (a) utility companies' efforts to implement and diffuse Internet-based services and (b) their goal of meeting users' expectations in terms of managing their domestic utility services online. The research highlights the importance of shifting the focus towards a user-centric and integrated virtual enterprise service delivery platform that factors in user demands and functions as a medium for interaction between all utility parties, enabling them to cooperate in achieving efficient resource usage while promoting sustainability and environmental welfare. The research involves a critical investigation of users' perspectives on the creation of the proposed platform in the utility domains, with particular emphasis on its potential influence on user perceptions in terms of benefits, technology fit, acceptance and service adoption, in addition to the potential role of such a platform in promoting individual pro-environmental sustainability. The research methodology design involves the collection of information (n=1142) about individual opinions, views, beliefs and expectations towards the newly conceptualised innovation platform. The methodology involves four main stages. The initial stage was devoted to defining the research problem and conceptual model, developing a theoretical validation framework, and formulating the research questions and the related hypotheses. This stage of the research design is critical for identifying the key underpinning aspects of the research topic and determining the main variables that have an impact on the research problem. The second stage involves an empirical investigation of Internet users' perceptions of task-technology fit, new technology innovation adoption, and the barriers, trends and difficulties for both existing and future technology innovation, as well as an electronic service delivery approach, using a quantitative questionnaire survey. In the third stage, a mock-up prototype system is implemented to illustrate and validate the proposed research concepts. The last stage aimed to analytically validate users' perspectives of the new technology innovation platform, as formulated in the research model, and to predict their willingness to adopt the platform and its impact on their individual sustainable behaviour. The resulting validated model and service platform provide a means for utility companies to deliver user-accepted electronic utility management services, while promoting environmentally friendly behaviour.
17

Collaborative adaptive accessibility and human capabilities

Atkinson, Matthew T. January 2012 (has links)
This thesis discusses the challenges and opportunities facing the field of accessibility, particularly as computing becomes ubiquitous. It is argued that a new approach is needed that centres on adaptations (specific, atomic changes) to user interfaces and content in order to improve their accessibility for a wider range of people than is targeted by present Assistive Technologies (ATs). Further, the approach must take into consideration the capabilities of people at the human level and facilitate collaboration, in planned and ad-hoc environments. There are two main areas of focus: (1) helping people experiencing minor-to-moderate, transient and potentially overlapping impairments, such as may be brought about by the ageing process, and (2) supporting collaboration between people by reasoning about the consequences, from different users' perspectives, of the adaptations they may require. A theoretical basis for describing these problems and a reasoning process for the semi-automatic application of adaptations are developed. Impairments caused by the environment in which a device is being used are considered. Adaptations are drawn from other research and industry artefacts. Mechanical testing is carried out on key areas of the reasoning process, demonstrating fitness for purpose. Several fundamental techniques to extend the reasoning process in order to take temporal factors (such as fluctuating user and device capabilities) into account are broadly described. These are proposed to be feasible, though they inherently bring compromises (which are defined) in interaction stability and in the needs of different actors (user, device, target level of accessibility). This technical work forms the basis of the contribution of one work-package of the Sustaining ICT use to promote autonomy (Sus-IT) project, under the New Dynamics of Ageing (NDA) programme of research in the UK. Test designs for larger-scale assessment of the system with real-world participants are given. The wider Sus-IT project provides social motivations and informed design decisions for this work and is carrying out longitudinal acceptance testing of the processes developed here.
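The fragment below gives a loose, editorial flavour of reasoning from human-level capabilities (including environmentally induced impairments) to candidate adaptations. The capability scales, thresholds and adaptations are invented for illustration and are not the vocabulary or reasoning process defined in the thesis.

```python
# Loose, invented illustration of capability-driven adaptation selection;
# capability names, thresholds and adaptations are hypothetical and are not
# the reasoning process developed in the thesis.
user_capabilities = {"visual_acuity": 0.4, "fine_motor": 0.8, "hearing": 0.9}   # 0 = none, 1 = unimpaired
environment = {"ambient_noise": 0.7}                                            # 0 = quiet, 1 = very noisy

# Each rule: (condition on capabilities/environment, proposed adaptation).
rules = [
    (lambda u, e: u["visual_acuity"] < 0.6, "increase text size and contrast"),
    (lambda u, e: u["fine_motor"] < 0.5,    "enlarge click/tap targets"),
    (lambda u, e: u["hearing"] < 0.5 or e["ambient_noise"] > 0.6, "show captions instead of audio alerts"),
]

adaptations = [adapt for cond, adapt in rules if cond(user_capabilities, environment)]
print("Proposed adaptations:", adaptations)
```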
18

Investigating the effects of physical and cognitive demands on quality of performance and subjective responses under pacing/Takt time

Shaikh, Shakil Ahmed January 2015 (has links)
Assembly work requires high levels of performance and quality but may involve complex cognitive and physical tasks. There is evidence that physical and cognitive workloads are not separate, but may interact. Work in exercise and simple physical tasks suggests that physical load may lead to changes in cognitive performance and in perceived workload. The aim of this thesis is to examine physical and cognitive interactions that might affect assembly work. First, observation was undertaken in industry to identify the physical and cognitive factors relevant to examples of assembly lines. From this, a task analysis of a simulated assembly task was developed. Three experimental studies were conducted, based upon the simulated assembly task, in order to investigate three main assembly variables: working height, memory load and pacing. The first study showed that the number of completed assemblies was reduced at higher pacing and when working above shoulder height. The number of components dropped was higher when working above shoulder height. When the task was performed at elbow height, 'wait' time increased, because the time remaining before the pacing beep was longer at elbow height than above shoulder height. Subjective measures (NASA TLX) showed that temporal demand and effort were reported as higher during high pacing. Perceived physical and temporal demand increased when working above shoulder height. An interaction between pacing and working height was identified in the subjective measures: NASA TLX ratings indicated higher perceived workload when performing the assembly operation at high pacing/Takt and above shoulder height compared with high pacing/Takt at elbow height. In the second study the experimental design was modified by changing the assembly order to variable assembly and consistent assembly, representing a mixed-model assembly line (where different types of product are processed) and a single-model assembly line (where only one type of assembly is processed) respectively. Study 2 was found to be more mentally demanding due to task complexity. However, it was also found that the number of completed assemblies was higher for the consistent assembly task. Subjective measures reported stress as being higher for higher pacing and variable assembly. The final study combined the variables from the first two studies as well as investigating different levels of memory load. Performance times for variable assembly were longer and resulted in fewer correct code responses. A higher memory load resulted in longer performance times and fewer correct code responses, as well as fewer completed assemblies. An interaction between working height and perceived mental workload was found: perceived temporal demand and perceived effort (NASA TLX) were higher when performing the assembly operation at elbow height with high memory load than at elbow height with low memory load. It was also found that memory load affects perceived physical demand. For industry, the findings suggest that in variable (mixed-model) assembly, different levels of pacing, working height and cognitive demands may affect workers' performance both physically and mentally. Demands will be higher when working on variable assembly, and performance will also vary where variable and consistent assembly are used together. The research also discusses theories that might be most useful for describing these effects.
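For reference, the snippet below shows the standard weighted-scoring procedure behind the NASA TLX ratings mentioned above: six subscale ratings are combined using weights obtained from 15 pairwise comparisons. The ratings and weights are invented for illustration and are not the thesis's data or experimental conditions.

```python
# Generic NASA-TLX weighted workload computation (standard procedure, not the
# thesis's data): six subscale ratings (0-100) are combined using weights
# from the 15 pairwise comparisons; all values below are invented.
ratings = {  # hypothetical ratings for one participant in one condition
    "mental": 55, "physical": 70, "temporal": 80,
    "performance": 40, "effort": 75, "frustration": 35,
}
weights = {  # number of times each dimension was chosen in the 15 pairwise comparisons
    "mental": 2, "physical": 3, "temporal": 4,
    "performance": 1, "effort": 4, "frustration": 1,
}
assert sum(weights.values()) == 15

overall = sum(ratings[d] * weights[d] for d in ratings) / 15
print(f"Weighted NASA-TLX workload: {overall:.1f} / 100")
```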
19

Experimental characterisation of jumping and bobbing actions for individuals and small groups

McDonald, Madison G. January 2015 (has links)
Stadia and structures which host crowd events often experience dynamic crowd loading. Of greatest concern is the loading from the actions of jumping and bobbing, especially if the action is regulated by an external stimulus. Gathered crowds of individuals will often synchronise their actions with one another, exacerbating the jumping and bobbing load. Crowd and individual jumping and bobbing models are used to predict the dynamic forces experienced by a structure; however, there is a lack of in-situ force measurements and experimental data from groups. To further the advancement of current crowd and individual models, this thesis provides an in-depth study of the actions of jumping and bobbing. Experiments with eight test subjects (TS) and a range of activity frequencies were first conducted to study the loads generated by individuals. The properties of both actions were characterised, and the variation within each TS's ground reaction forces (GRFs), known as intra-subject variability, and the inter-subject variability (between TSs) were quantified. Weight, gender and height affected the jumping GRFs. For jumping, a significant portion of the properties were found to have the largest inter-subject variation at a frequency of 2 Hz, suggesting high diversity of jumping properties between TSs. Overall there was more inter- and intra-subject variation in the activity of bobbing than jumping. A novel indirect force measurement method was sought to aid the collection of in-situ individual and group jumping and bobbing GRFs, by monitoring a single point on individuals' bodies. It was found that the best monitoring point was the C7 vertebra, which provided reliable force data for the 1st and 2nd harmonics of jumping and bobbing. Having verified the single-body-point methodology for force measurement, group experiments with 2, 4 and 8 TSs were conducted. Metronome, music and visual stimuli were used to dictate the target frequency, which ranged between 1.5 and 3.5 Hz. A large database comprising 4,794 individual GRFs was collected. The degree of individual and group synchronisation with the beat and with one another was quantified for the first two harmonics. Group size, stimulus, and stimulus frequency were found to affect synchronisation. Charts detailing the expected levels of synchronisation were produced. The responses of simulated single degree of freedom (SDOF) systems to the group GRFs were examined and compared to responses from a periodic signal. For resonance responses there is potential for crowd loads to be modelled as a half-sine periodic force. Structural responses from forces measured on rigid surfaces were compared to responses from forces measured on flexible surfaces and found to be larger, especially at resonance. Charts are presented detailing the likely levels of structural resonance response for each stimulus and group size.
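To illustrate the half-sine periodic force idea discussed above, the sketch below drives a damped single-degree-of-freedom oscillator with a periodic half-sine jumping pulse (scaled so the mean force equals body weight) and time-steps the response. The structural and load parameters are assumed values chosen to put the jumping frequency at resonance; this is an editorial illustration, not the thesis's model or data.

```python
# Rough sketch of an SDOF response to a half-sine periodic jumping force.
# Structure and load parameters are invented; not the thesis's model or data.
import math

# SDOF structure (assumed values)
f_n, zeta, m = 2.0, 0.02, 5.0e4          # natural frequency [Hz], damping ratio, modal mass [kg]
k = m * (2 * math.pi * f_n) ** 2
c = 2 * zeta * math.sqrt(k * m)

# Half-sine periodic jumping force (assumed values)
G, f_j, alpha = 750.0, 2.0, 0.5           # weight [N], jumping frequency [Hz], contact ratio
T_j = 1.0 / f_j
t_contact = alpha * T_j
k_p = math.pi / (2 * alpha)               # impact factor so the mean force equals body weight

def force(t):
    phase = t % T_j
    return k_p * G * math.sin(math.pi * phase / t_contact) if phase < t_contact else 0.0

# Explicit time stepping (semi-implicit Euler) with a small step for stability
dt, x, v, peak = 1.0e-4, 0.0, 0.0, 0.0
t = 0.0
while t < 30.0:                            # 30 s of sustained jumping
    a = (force(t) - c * v - k * x) / m
    v += a * dt
    x += v * dt
    peak = max(peak, abs(x))
    t += dt

print(f"Peak displacement ≈ {peak * 1000:.2f} mm (resonant case: f_j = f_n)")
```

Replacing the idealised half-sine train with measured group GRFs, as done in the thesis, allows the effect of imperfect synchronisation on the resonant response to be quantified.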
20

Managing knowledge for through life capability

Ahlberg Pilfold, Sofia January 2016 (has links)
In 2005 the UK Ministry of Defence (MoD) published a White Paper in which it detailed its Defence Industrial Strategy (DIS) (UK MoD, 2005). The strategy involved a rapid transformation of UK defence towards a product-service, business-like paradigm through the adoption of Through Life Capability Management (TLCM). TLCM has since been succeeded by other initiatives. However, for organisations involved in the management of capability through life, the associated principles of operation as well as the challenges remain, including that of the management of knowledge. The confederated capability enterprise is a distributed knowledge system. Knowledge of the systems for which a particular organisation has through-life management responsibility may be distributed throughout an enterprise that comprises several commercial organisations as well as the customer. The bringing together of different components of capability and perspectives makes managing knowledge difficult. This is complicated further by the observation that, within a decade, a significant proportion of the manpower involved in a capability can be expected to have changed. Success in this type of environment requires a clear understanding of the value of particular knowledge within the organisation as well as effective knowledge management in the wider enterprise. Dstl and EPSRC have jointly funded this research, which addresses the management of knowledge for through-life capability through modelling of the capability enterprise, a workshop on TLCM benefits and behaviours, a comparative case study of a commercial service company and the UK MoD (including Dstl), and knowledge mapping within a specific exemplar capability. The results of the modelling illustrated the Systems of Systems (SoS) nature of the enterprise and the need to align capability and management processes across the enterprise. How well this can be achieved depends on the extent to which both the UK MoD and industry are willing to share, access and process information and knowledge. This would require trust between the individuals and organisations involved. The need for trust was emphasised in an international workshop where the participants discussed the behaviours that were required for the perceived benefits of TLCM to be realised. The workshop members highlighted trust in long-term planning as industry seeks to manage skills and knowledge over time. ServiceCo provides communication and media services to customers globally. It comprises four customer-facing divisions and two operational units. The case study, which was based on interviews in one customer-facing division and one operational unit, revealed the following:
• Focus on corporate values supports knowledge management behaviours across the organisation.
• Succession planning is needed for all skills and knowledge that are critical or essential to the business.
• Once the continual renewal of knowledge slows down and/or stops in an organisation, the knowledge is lost.
The second case of the study was the Royal Navy Command Headquarters and Dstl. Dstl is a trading fund that provides the UK MoD and the wider UK government with specialist Science & Technology services and operates and manages the Chief Scientific Advisor's research programme. The case study revealed:
• Security regulations and considerations impact significantly on effective management of knowledge.
• Knowledge retrieval can be 'hit and miss', as complicated filing structures and indexing practices are applied inconsistently, leading to individuals adopting a number of strategies to share knowledge.
• Succession planning for people with rare skills is an issue that impacts business continuation.
Comparison between the two cases showed that the two organisations experienced different problems but that the knowledge behaviours adopted by the individuals involved were essentially the same. This pointed to the need to address the issues associated with the management of knowledge as cultural and organisational in nature. Personal strategies to manage and share knowledge included retaining copies of files on desktop hard drives and keeping paper copies in drawers; emailing documents to ensure the intended audience would receive or be able to access them; and asking colleagues for advice on where to find things. An important difference in knowledge management between the two organisations was that the UK MoD relied on processes, owing to the rapid turnover of personnel, whereas the service company relied on personal relationships, as people remained in their roles for longer. The knowledge mapping of 'moving personnel and materiel using vehicles' revealed that each Line of Development (LoD) has its own constituent LoDs, indicating the requirement to manage organisational capability in order to deliver capability to customers. It also illustrated all the active knowledge that is required in order for the capability to be delivered. The research's main contributions are:
• Theoretical models for exploring the use of knowledge in acquisition projects over time
• Comparing two organisations at separate ends of the organisational spectrum and identifying common organisational factors that influence the management of knowledge for through-life capability
• Recognising that the enterprise is a capability SoS. In order to deliver capability successfully, knowledge about and within the components needs to be managed.
Other findings include:
• Management of knowledge for TLCM puts the focus on managing knowledge for future capability requirements rather than on retention of knowledge products, bringing in aspects such as business continuation planning and consequently impacting on the organisation's future development.
• There is a strong relationship between knowledge conservation, human resource management and company policies.
• Managing changes in design and/or function requires a good understanding of the different processes used within the various disciplines involved across the capability components and how they contribute to the final product and to each other.
• An organisation's goals and the manner in which it organises itself to achieve them with regard to the management of knowledge do not appear to be linked. Instead, focus falls on the organisational architecture and the human resource policies that it implies.
• 'Knowing' is an individual capability and also a social one; communities of practice and networking are necessary components of an organisation's knowledge base.
• Knowing whom to ask and where to look is, from a knowledge retrieval perspective, nearly as important as knowing what to look for.
• 'Individuals know while documents, processes and tools support knowing.' This emphasises the need for a close connection between humans and IT-based knowledge repositories.
• The role of IT in knowledge management can be either to correlate knowledge in people's heads to relevant projects or to correlate individuals and knowledge in relevant projects, depending on the key questions asked in the management of knowledge within the organisation.
• The role of IT in determining issues related to the relevance and location of documentation differs depending on the organisation's reliance on face-to-face interactions between employees as a means of communicating this information.
• The capability end user is in some instances hard to define. How the end user is defined determines where the SoS boundaries are drawn. It is probably better to define the boundary as a broad, fuzzy border. The indeterminacy implied by this view becomes a complexity issue for the management of knowledge.
• The impetus to manage knowledge, and how to do so, is influenced by legal requirements and by the organisation's relationships with its stakeholders, including the extent to which it is subject to external scrutiny.
Based on the research, a number of recommendations are made.
