About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Wireless Sensor Network for Controlling the Varroasis Spread within Bee colonies across a Geographical Region

Dasyam, Venkat Sai Akhil, Pokuri, Saketh January 2024 (has links)
Background: With the global decline of honey bee populations, safeguarding these vital pollinators has become crucial. Varroa destructor mites are a primary threat, weakening bees and facilitating the spread of diseases, which can decimate colonies and disrupt ecosystems. This thesis investigates the application of a wireless sensor network (WSN) for the monitoring and control of varroasis spread within bee colonies across large geographical areas. Objectives: The main objective of this research is to develop an integrated method that combines biological insights into varroasis with WSN functionalities for real-time disease monitoring and control. By doing so, the study aims to contribute to the development of a scalable and sustainable approach to apiculture and disease management. Methods: A multi-phase methodological approach was employed, encompassing the modelling of biological phenomena, formulation of WSN functionalities, and the design of a scalable WSN architecture. Simulation studies were conducted, followed by the development of a theoretical framework to support the practical application of the proposed WSN system. A key aspect of the methodology is the introduction of energy estimation models to evaluate the operational feasibility of the WSN. Results: The results indicate that the WSN is capable of dynamically adjusting its monitoring rate in response to changes in infection dynamics, effectively identifying and managing varroa mite populations. The system demonstrated adaptability to various infection rates, with the potential to improve the timely and targeted treatment of infested colonies. Energy consumption data further affirm the operational viability of the WSN. Conclusions: The study concludes that integrating WSNs with biological models is a viable solution for the real-time monitoring and management of varroasis.
The proposed WSN system holds promise for enhancing the health and productivity of bee colonies on a broad scale, offering a novel contribution to the fields of apiculture and environmental monitoring.
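The abstract describes a network that dynamically adjusts its monitoring rate as infection dynamics change, but does not state the rule it uses. As a minimal illustrative sketch of such an adaptive sampling policy (all names, thresholds and interval values below are assumptions for demonstration, not taken from the thesis), a hive node might shorten its sensing interval as the observed infestation rate rises:

```python
def next_sampling_interval(infestation_rate, base_hours=24.0,
                           min_hours=1.0, threshold=0.03):
    """Shorten a hive node's sampling interval as the observed
    mite infestation rate rises above a treatment threshold.

    infestation_rate: fraction of sampled bees carrying mites.
    Returns the hours to wait before the next sensor reading.
    All parameter names and values are illustrative assumptions.
    """
    if infestation_rate <= 0:
        return base_hours
    # Scale the interval down in proportion to how far the
    # observed rate exceeds the treatment threshold.
    factor = infestation_rate / threshold
    return max(base_hours / max(factor, 1.0), min_hours)

# A healthy colony is polled daily; a heavily infested one much
# more often, down to the hourly floor.
healthy = next_sampling_interval(0.0)    # 24.0 hours
infested = next_sampling_interval(0.30)  # 2.4 hours
```

Sampling less often when colonies are healthy is also what makes the energy estimation models in the abstract matter: the sensing rate directly drives node power draw.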
12

EFFICIENT GRID COMPUTING BASED ALGORITHMS FOR POWER SYSTEM DATA ANALYSIS

Mohsin Ali Unknown Date (has links)
The role of electric power systems has grown steadily in both scope and importance over time, making electricity increasingly recognized as a key to social and economic progress in many developing countries. In a sense, reliable power systems constitute the foundation of all prospering societies. The constant expansion of electric power systems, along with increased energy demand, requires that power systems become more and more complex. Such complexity results in considerable uncertainty, which demands comprehensive reliability and security assessment to ensure reliable energy supply. Power industries in many countries are facing these challenges and are trying to increase their computational capability to handle the ever-increasing data and analytical needs of operations and planning. Moreover, deregulated electricity markets have been in operation in a number of countries since the 1990s. During the deregulation process, vertically integrated power utilities have been reformed into competitive markets, with the initial goals of improving market efficiency, minimizing production costs and reducing the electricity price. Alongside the benefits achieved by deregulation, several new challenges have also been observed in the market. Due to fundamental changes in the electric power industry, traditional management and analysis methods cannot deal with these new challenges. Deterministic reliability assessment criteria still exist, but they do not reflect the probabilistic nature of power systems, and their worst-case analysis results in excess operating costs. Probabilistic methods, on the other hand, are now widely accepted. The analytical method uses a mathematical formulation for reliability evaluation and generates results quickly, but it relies on many restrictive assumptions and is not suitable for large and complex systems. Simulation-based techniques capture much of this uncertainty by modelling the random behavior of the system.
However, they require substantial computing power, memory and other computing resources. Power engineers have to run thousands of time-domain simulations to determine stability for a set of credible disturbances before dispatching. For example, security analysis is concerned with the steady-state and dynamic response of the power system to various disturbances, and real-time security assessment is highly desirable, especially in the market environment. Therefore, novel analysis methods that can provide comprehensive results are required for power system reliability and security in the deregulated environment, together with high performance computing (HPC) power to carry out such analysis within a limited time. Further, with deregulation in the power industry, operational control has been distributed among many organizations. The power grid is a complex network involving a range of energy resources, including nuclear, fossil and renewable resources, with many operational levels and layers, including control centers, power plants, and transmission and distribution systems. These energy resources are managed by different organizations in the electricity market, and all of these participants (including producers, consumers and operators) can affect the operational state of the power grid at any time. Moreover, adequacy analysis is an important task in power system planning and can be regarded as a collaborative task, demanding collaboration among the electricity market participants for reliable energy supply. Grid computing is gaining attention from power engineering experts as an ideal solution to the computational difficulties facing the power industry. A grid computing infrastructure involves the integrated and collaborative use of computers, networks, databases and scientific instruments owned and managed by multiple organizations.
Grid computing technology offers feasible support for the design and development of grid computing based infrastructure for power system reliability and security analysis. It can help in building an infrastructure that provides a high performance computing and collaborative environment and offers an optimal trade-off between cost and efficiency. While power system analysis is a vast topic, only a limited amount of research has been initiated to investigate the applications of grid computing in power systems. This thesis investigates probabilistic reliability and security analysis of complex power systems in order to develop new techniques for providing comprehensive results with high efficiency. A review of existing techniques was conducted to determine the computational needs in the area of power systems. The main objective of this research is to propose and develop a general framework of a computing grid and special grid services for probabilistic power system reliability and security assessment in the electricity market. As a result of this research, grid computing based techniques are proposed for probabilistic load flow analysis, probabilistic small signal analysis, probabilistic transient stability analysis and probabilistic contingency analysis. Moreover, a grid computing based system is designed and developed for the monitoring and control of distributed generation systems. As a part of this research, a detailed review is presented of the possible applications of this technology in other aspects of power systems. It is proposed that these grid based techniques will provide comprehensive results with great efficiency, and ultimately enhance the existing computing capabilities of power companies in a cost-effective manner.
As part of this research, a small-scale computing grid consisting of grid services for probabilistic reliability and security assessment techniques is developed. A significant outcome of this research is improved performance, accuracy, and security of data sharing and collaboration. More importantly, grid based computing will improve the capability of power system analysis in a deregulated environment, where complex and large amounts of data would otherwise be impossible to analyze without huge investments in computing facilities.
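The abstract contrasts analytical reliability evaluation with simulation-based techniques that model the random behavior of the system at high computational cost, which is what motivates distributing the work across a computing grid. As an invented illustration of the kind of probabilistic computation involved (the toy system, unit ratings and outage rates below are assumptions, not a model from the thesis), a Monte Carlo estimate of loss-of-load probability under random generator outages can be sketched as:

```python
import random

def lolp_monte_carlo(unit_capacities, outage_rates, load,
                     trials=20000, seed=42):
    """Estimate loss-of-load probability (LOLP) by Monte Carlo:
    each trial draws a random outage state for every generating
    unit and checks whether the surviving capacity covers the load.
    The system used below is a toy example, not from the thesis."""
    rng = random.Random(seed)
    shortfalls = 0
    for _ in range(trials):
        available = sum(cap
                        for cap, q in zip(unit_capacities, outage_rates)
                        if rng.random() >= q)  # unit survives with prob 1 - q
        if available < load:
            shortfalls += 1
    return shortfalls / trials

# Three 100 MW units with a 5% forced-outage rate each, serving a
# 150 MW load: load is lost only when two or more units are out,
# which analytically has probability 3(0.05**2)(0.95) + 0.05**3 ≈ 0.0073.
p = lolp_monte_carlo([100, 100, 100], [0.05, 0.05, 0.05], 150)
```

Each trial is independent, which is why such simulations parallelize naturally on a computing grid: trials can be partitioned among nodes and the shortfall counts summed afterwards.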
16

DESIGN, FABRICATION AND INSTALLATION OF A MICROPROCESSOR CONTROLLED AGRICULTURAL TELECOMMUNICATIONS REPEATER WITH DIAGNOSTICS

Seymour, Donald Bruce, 1955- January 1987 (has links)
No description available.
17

Non-destructive testing of thin strip material: Implementation of the 3MA technique at a steel producing company

Lizarralde, Jon Mikel January 2017 (has links)
This study is an initial attempt to investigate the possibility of substituting conventional destructive laboratory testing techniques at Sandvik's strip steel production facilities with the 3MA (Micro-magnetic Multi-parameter Microstructure and Stress Analysis) non-destructive testing (NDT) technique. The interest in this research stems from various problems with the current destructive testing method. Sandvik manufactures thin strip steel (among other products) and controls the quality of its product by taking samples from the ends of the strip and measuring the samples' material properties in a separate laboratory. This sample preparation process is time- and material-consuming, and the results obtained from the laboratory measurements are not always representative of the real values along the whole length of the strip (usually several kilometers). The present project therefore investigates the correlation between three material properties (Vickers hardness, tensile strength and carbide density) and a selection of micro-magnetic parameters measured with the 3MA-II equipment manufactured by the Fraunhofer IZFP institute. The 3MA-II system is based on four measuring techniques (harmonic analysis, magnetic Barkhausen noise, incremental permeability and eddy current testing) and is capable of recording up to 41 micro-magnetic parameters. Samples of two different steel grades (compositions) were used in the study. The results for hardness and tensile strength (average relative errors of 1.04% and 0.78%, respectively) corroborated the applicability of the 3MA technique to steel strip inspection. Thus, implementation of this technique would lead to an improvement in the company's energy efficiency and sustainability. However, finding a good correlation between micro-magnetic parameters and material properties is not always possible and, in the case of carbide density, no reliable correlation was achieved. Further experiments on carbide density and other material properties are therefore proposed for future studies.
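The reported average relative errors of 1.04% and 0.78% come from correlating measured micro-magnetic parameters with laboratory reference values. As a sketch of what such a calibration might look like (the data points and the choice of a single-parameter linear fit are assumptions for demonstration; the study had up to 41 parameters available and its actual model is not given in the abstract), a least-squares fit and its average relative error can be computed as:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit: y ≈ slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def avg_relative_error_pct(xs, ys, slope, intercept):
    """Mean |predicted - measured| / measured, in percent."""
    return sum(abs(slope * x + intercept - y) / y
               for x, y in zip(xs, ys)) / len(ys) * 100

# Invented example: one micro-magnetic parameter vs. Vickers hardness.
param    = [12.1, 14.8, 17.3, 19.9, 22.4]
hardness = [410, 452, 495, 540, 581]   # HV
m, b = fit_line(param, hardness)
err = avg_relative_error_pct(param, hardness, m, b)  # well under 1%
```

A relative-error figure of this kind is what decides whether the non-destructive reading can stand in for the destructive laboratory measurement; for carbide density the study found no fit with acceptably low error.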
18

Gestão de projetos: o monitoramento e controle nos processos de desenvolvimento de software (Project management: monitoring and control in software development processes)

Ramos, Rommel Gabriel Gonçalves 07 March 2014 (has links)
The purpose of this research is to investigate the difficulty of applying monitoring and control activities in software development processes, presenting tools and indicators that can support their consistent use. Although software development processes prescribe activities related to monitoring and control, project management still lacks effective use of these activities. The case study addresses monitoring and control of software development processes, highlighting the use of a company's productivity indicators and measuring the performance of deliveries in software production activities.
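The abstract mentions productivity indicators that measure the performance of deliveries without specifying them. As a hypothetical example of one such indicator (the definition and the data are illustrative, not taken from the dissertation):

```python
def on_time_delivery_rate(deliveries):
    """Fraction of deliveries finished within their planned effort.
    Each delivery is a (planned_hours, actual_hours) pair.
    Indicator and data are illustrative assumptions."""
    on_time = sum(1 for planned, actual in deliveries if actual <= planned)
    return on_time / len(deliveries)

# Four deliveries in one reporting period; two met their plan.
period = [(8, 7), (16, 20), (4, 4), (12, 15)]
rate = on_time_delivery_rate(period)   # 0.5
```

Tracking an indicator like this per period is one simple way monitoring and control can be made a routine part of the development process rather than an afterthought.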
19

Multivariable And Sensor Feedback Based Real-Time Monitoring And Control Of Microalgae Production System

Jia, Fei January 2015 (has links)
A multi-wavelength laser diode based optical sensor was designed, developed and evaluated for monitoring and controlling microalgae growth in real time. The sensor measures the optical density of a microalgae suspension at three wavelengths: 650 nm, 685 nm and 780 nm, which are commonly used for estimating microalgae biomass concentration and chlorophyll content. The sensor showed the capability of measuring cell concentrations up to 1.05 g L⁻¹ without sample dilution or preparation. Its performance was evaluated using both indoor photobioreactors and outdoor paddle wheel reactors, and it proved capable of monitoring the dynamics of a microalgae culture in real time with high accuracy and durability. The specific growth rate (μ) and ratios of optical densities (OD ratios) at different wavelengths were calculated and served as good indicators of the health of the microalgae culture. A series of experiments was conducted to evaluate the sensor's capability of detecting environmental disturbances in microalgae systems, for instance those induced by dust or by Vampirovibrio chlorellavorus, a bacterium found to cause crashes of microalgae cultures. The optical densities measured by the sensor were insensitive to an amount of dust equal to 59.7% of the dry weight of microalgae in the system; however, the sensor was able to detect multiple dust-introduction events in a timely manner through μ and the OD ratios. The sensor was also capable of detecting subtle changes in culture color that lead to a total crash of the culture before they can be distinguished by the naked eye. The sensor was further integrated into an existing outdoor raceway to demonstrate its potential as a core component of a controlled microalgae production system. A real-time monitoring and control program, along with a graphical user interface (GUI), was developed for a central control station aimed at improving resource use efficiency for biomass production.
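The specific growth rate μ mentioned in the abstract is conventionally derived from two optical density readings under the assumption of exponential growth, μ = ln(OD₂/OD₁)/(t₂ − t₁). A minimal sketch using the sensor's wavelengths (the OD-ratio health indicator below is an illustrative use of those wavelengths, not necessarily the thesis's exact formula):

```python
import math

def specific_growth_rate(od1, od2, t1, t2):
    """μ = ln(OD2 / OD1) / (t2 - t1), assuming exponential growth
    between the two readings. Times in hours give μ in 1/h."""
    return math.log(od2 / od1) / (t2 - t1)

def od_ratio(od_685, od_780):
    """Ratio of the chlorophyll-sensitive reading (685 nm) to the
    biomass-sensitive reading (780 nm); a falling ratio can flag a
    culture losing pigment. Illustrative indicator only."""
    return od_685 / od_780

# A culture whose OD at 780 nm doubles in 24 h:
mu = specific_growth_rate(0.40, 0.80, 0.0, 24.0)  # ln(2)/24 ≈ 0.0289 1/h
```

Because dust raises all three optical densities roughly proportionally while leaving μ and the OD ratios nearly unchanged, indicators of this form are plausibly why the sensor could flag dust events that raw OD readings missed.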
20

Pommes: A Tool For Quantitative Project Management

Bozkurt, Candas 01 May 2005 (has links)
Metric collection and project management activities cannot be performed in an integrated fashion on most software projects. In the software engineering world, there are project management tools that have embedded project metrics, and there are various metric collection tools that collect specific metrics to satisfy the requirements of different software life cycle phase activities (configuration management, requirements management, application development tools, etc.). These tools, however, do not communicate with each other through any interface or common database. This thesis focuses on the development of a tool to define, export, collect and use metrics for software project planning, tracking and oversight processes. To satisfy these objectives, POMMES, with functionality for generic metric definition, collection and analysis, and for the import, update and export of project metrics from third-party project management tools, was developed and implemented in a software organization during this thesis work.
