211. Complying with the GDPR in the context of continuous integration. Li, Ze Shi (08 April 2020)
The full enforcement of the General Data Protection Regulation (GDPR) that began on May 25, 2018 forced any organization that collects and/or processes personal data from European Union citizens to comply with a series of stringent and comprehensive privacy regulations. Many software organizations struggled to comply with the entirety of the GDPR's regulations both leading up to and even after the GDPR deadline. Previous studies on the subject of the GDPR have primarily focused on finding implications for users and organizations using surveys or interviews. However, there is a dearth of in-depth studies that investigate compliance practices and compliance challenges in software organizations. In particular, small and medium enterprises are often neglected in these previous studies, despite small and medium enterprises representing the majority of organizations in the EU. Furthermore, organizations that practice continuous integration have largely been ignored in studies on GDPR compliance. Using design science methodology, we conducted an in-depth study over the span of 20 months regarding GDPR compliance practices and challenges in collaboration with a small startup organization. Our first step helped identify our collaborator's business problems. Subsequently, we iteratively developed two artifacts to address those business problems: a set of privacy requirements operationalized from GDPR principles, and an automated GDPR tool that tests these GDPR-derived privacy requirements. This design science approach resulted in five implications for research and practice about ongoing challenges to compliance. For instance, our research reveals that GDPR regulations can be partially operationalized and tested through automated means, which is advantageous for achieving long-term compliance. In contrast, more research is needed to create more efficient and effective means to disseminate and manage GDPR knowledge among software developers.
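To illustrate what automated testing of GDPR-derived privacy requirements could look like in a continuous integration pipeline, here is a minimal Python sketch; it is not the tool developed in the thesis, and the UserStore class, its methods, and the one-year retention policy are assumptions made for this example.

```python
# Hypothetical sketch: CI-style tests for two GDPR-derived privacy requirements.
# UserStore and its methods are placeholders, not the thesis's actual tool.
import datetime
import unittest


class UserStore:
    """Toy in-memory stand-in for the system under test."""

    def __init__(self):
        self._users = {}

    def create_user(self, user_id, email, created_at):
        self._users[user_id] = {"email": email, "created_at": created_at}

    def erase_user(self, user_id):  # right to erasure (Art. 17)
        self._users.pop(user_id, None)

    def personal_data_of(self, user_id):
        return self._users.get(user_id)

    def all_records(self):
        return self._users.values()


class GdprRequirementTests(unittest.TestCase):
    RETENTION_DAYS = 365  # assumed organizational retention policy

    def test_erasure_removes_all_personal_data(self):
        store = UserStore()
        store.create_user("u1", "alice@example.org", datetime.date(2020, 1, 1))
        store.erase_user("u1")
        self.assertIsNone(store.personal_data_of("u1"))

    def test_no_record_exceeds_retention_period(self):
        store = UserStore()
        store.create_user("u2", "bob@example.org", datetime.date(2020, 1, 1))
        today = datetime.date(2020, 4, 8)
        for record in store.all_records():
            age_days = (today - record["created_at"]).days
            self.assertLessEqual(age_days, self.RETENTION_DAYS,
                                 "personal data kept longer than policy allows")


if __name__ == "__main__":
    unittest.main()
```

Checks of this kind can run on every build, which is what makes partial operationalization of GDPR requirements attractive for long-term compliance.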
212. Design and Implementation of a New Architecture for TESSI (Entwurf und Implementierung einer neuen Architektur für TESSI). Toschev, Jöran (27 August 2003)
The TESSI project is concerned with developing a method and a CASE tool for analysing requirements texts and, in parallel, designing an object-oriented model. Changed requirements and planned extensions made a revision of version 1.1 of the TESSI program necessary. This diploma thesis describes the reengineering of the legacy software and the design and implementation of the new version 2.0 of TESSI.
213. Contributions To Ontology-Driven Requirements Engineering. Siegemund, Katja (29 April 2014)
Today, it is well known that missing, incomplete or inconsistent requirements lead to faulty software designs, implementations and tests, resulting in software of improper quality or with safety risks. Thus, improved Requirements Engineering contributes to safer and better-quality software, reduces the risk of overrunning time and budget and, most of all, decreases or even eliminates the risk of project failure.
One significant problem requirements engineers have to cope with is inconsistencies in the Software Requirements Specification. Such inconsistencies result from the acquisition, specification, and evolution of goals and requirements from multiple stakeholders and sources. In order to regain consistency, requirements information is removed from the specification, which often leads to incompleteness. Due to this causal relationship between consistency, completeness and correctness, we can formally improve the correctness of requirements knowledge by increasing its completeness and consistency. Furthermore, the poor quality of individual requirements is a primary reason why so many projects continue to fail, and it needs to be addressed in order to improve the Software Requirements Specification.
These flaws in the Software Requirements Specification are hard to identify with current methods and thus usually remain unrecognised. While the validation of requirements ensures that they are correct, complete, consistent and meet the customer and user intents, the requirements engineer is hardly supported by automated validation methods.
In this thesis, a novel approach to the automated validation and measurement of requirements knowledge is presented, which automatically identifies incomplete or inconsistent requirements and quality flaws. Furthermore, the requirements engineer is guided by knowledge-specific suggestions on how to resolve them. For this purpose, a requirements metamodel, the Requirements Ontology, has been developed that provides the basis for the validation and measurement support. This requirements ontology is suited for Goal-oriented Requirements Engineering and allows for the conceptualisation of requirements knowledge, facilitated by ontologies. It provides a large set of predefined requirements metadata, requirements artefacts and various relations among them. Thus, the Requirements Ontology enables the documentation of structured, reusable, unambiguous, traceable, complete and consistent requirements as demanded by the IEEE specification for Software Requirements Specifications. We demonstrate our approach with a prototypical implementation called OntoReq. OntoReq allows for the specification of requirements knowledge while keeping the ontology invisible to the requirements engineer, and it enables the validation of the knowledge captured within.
The validation approach presented in this thesis can be applied to any domain ontology. Therefore, we formulate various guidelines and use a continuous example to demonstrate the transfer to the domain of medical drugs. The Requirements Ontology as well as OntoReq have been evaluated by different methods. The Requirements Ontology has been shown to be capable of capturing the requirements knowledge of a real Software Requirements Specification, and OntoReq has proven feasible to be used by a requirements engineering tool to highlight inconsistencies, incompleteness and quality flaws during real-time requirements modelling.
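As a rough illustration of the kind of validation an ontology-based requirements metamodel enables, the sketch below encodes a few goals and requirements as RDF triples and runs two SPARQL checks, one for incompleteness (a requirement that refines no goal) and one simple consistency rule. The req: vocabulary is invented for this example and does not reproduce the thesis's Requirements Ontology or OntoReq.

```python
# Hypothetical sketch of ontology-based requirements validation with rdflib.
# The req: vocabulary is invented for illustration only.
from rdflib import Graph, Namespace, RDF

REQ = Namespace("http://example.org/req#")
g = Graph()
g.bind("req", REQ)

# Two goals and three requirements; R3 deliberately refines no goal,
# so the completeness check below should flag it.
g.add((REQ.G1, RDF.type, REQ.Goal))
g.add((REQ.G2, RDF.type, REQ.Goal))
for r in ("R1", "R2", "R3"):
    g.add((REQ[r], RDF.type, REQ.Requirement))
g.add((REQ.R1, REQ.refines, REQ.G1))
g.add((REQ.R2, REQ.refines, REQ.G1))
g.add((REQ.R1, REQ.conflictsWith, REQ.R2))  # recorded stakeholder conflict

# Completeness rule: every requirement must refine at least one goal.
incomplete = g.query("""
    PREFIX req: <http://example.org/req#>
    SELECT ?r WHERE {
        ?r a req:Requirement .
        FILTER NOT EXISTS { ?r req:refines ?goal }
    }""")
print("Requirements without a goal:", [str(row.r) for row in incomplete])

# Consistency rule: conflicting requirements must not refine the same goal.
inconsistent = g.query("""
    PREFIX req: <http://example.org/req#>
    SELECT ?a ?b WHERE {
        ?a req:conflictsWith ?b .
        ?a req:refines ?goal .
        ?b req:refines ?goal .
    }""")
for a, b in inconsistent:
    print("Conflict on a shared goal:", a, "vs", b)
```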
214. Understanding the Challenges and Needs of Requirements Engineering for Data-centric Systems. Poolla, Venkata Sai Abhishek; Mandava, Bhargav Krishna (January 2021)
Background: As technology advances day by day, people produce enormous volumes of data. This exceptional growth in data is leading to an increase in the development of intelligent systems that make use of the huge amount of data available. We group the development of such intelligent software systems under the term "Data-Centric Systems (DCS)". Such systems include AI/ML components in self-driving cars, recommender systems, and many more. Developing DCS is complex within the software development life cycle; one of the main reasons behind this complexity is the ineffective handling of requirements. Moreover, the literature suggests that a large percentage (48%) of development problems begin during the requirements phase, and fixing requirements-related problems consumes a high cost of rework in later stages of software development. To design DCS effectively, RE techniques are considered one of the essential factors, since RE is required to reconcile a system's functional and implementation expectations from two entirely different perspectives, those of customers and developers. Although RE frequently plays a critical role in DCS, little is known about how RE can effectively be incorporated into developing such systems. Objectives: This thesis aims to understand industry experiences in the development of DCS with the main emphasis on RE, to investigate the techniques/approaches used in designing DCS during the RE process, and to identify the challenges practitioners face during the development process. Methods: Two workshop-style interviews were conducted to identify the RE design process and the challenges practitioners face during DCS development. To complement the results from the workshops and to scale up the target population, an online survey was conducted. Results: From the workshops, we identified that no explicit stakeholder is involved during the RE phase of DCS; instead, decisions are taken collectively when developing in an agile manner, and roles vary depending on the type of project a stakeholder is involved in. Four categories of requirements were identified, namely regulatory, infrastructure, data, and safety and security requirements. Techniques/approaches used to elicit, document, analyse and validate the requirements were identified. Based on the data received, we identified ten challenges faced by practitioners during DCS development. The requirement categories and the RE techniques/approaches were prioritised according to the number of survey responses received. A total of 15 themes were generated for the challenges based on the responses received from participants. Conclusions: To conclude, a specific RE architecture needs to be designed to help practitioners during the development of DCS. We believe that the analysis of these insights provides the industry with a structured overview of the DCS development process in RE. Besides, this thesis allows the academic community to steer future research based on an understanding of industry needs in RE for DCS.
215. Requirements Engineering and Software Development Process of an A-SMGCS Earth Magnetic Field Sensor Data Playback and Basic Analysis Tool. Panditpautra, Rishi Ashwin (28 February 2017)
Advanced Surface Movement Guidance and Control Systems (A-SMGCS) help to further improve the safety and efficiency of traffic on the aerodrome surface. The current A-SMGCS sensor technologies have certain operational and functional limitations. A new and unprecedented sensor technology is being tested as a pilot project. This unique sensor is called MagSense®. It works on the principle of detecting the influence of ferromagnetic materials on the Earth's magnetic field. For applications in the aviation environment, learning processes are necessary; these are generally based on the graphical depiction of stored sensor data and on features for analyzing the resulting graphs. For this purpose, a visualization and analysis tool is needed.
In order to create an adequate tool that allows for depicting stored sensor data and the peaks caused by ferromagnetic objects in aircraft and vehicles, a requirements engineering process will be conducted wherein the requirements of the various stakeholders are identified and harmonized. In general, the appropriate RE approach will ensure mutual agreement among the stakeholders and a contradiction-free set of requirements for the first edition of the tool. The harmonized package of requirements will then be used as the starting point for a software development process, after which the tool will be produced as specified and validated as part of this Master's Thesis.
This Master's Thesis puts a special focus on the choice of a suitable method in Requirements Engineering and Requirements Management, adequately adapted to the project size and its quality. The selection of appropriate elements from the methodology, as well as the outcomes of applying them to a specific software production project, are at its core.
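As a hypothetical sketch of the basic analysis such a playback tool might offer, the following detects and plots the peaks that ferromagnetic objects produce in stored magnetometer data; the synthetic data, field values, and detection thresholds are assumptions made for the example, not MagSense® specifics.

```python
# Hypothetical sketch: mark the peaks that ferromagnetic objects (aircraft,
# vehicles) cause in stored magnetometer samples, then show a playback plot.
import numpy as np
from scipy.signal import find_peaks
import matplotlib.pyplot as plt

# Synthetic stand-in for stored sensor data: ambient field plus two passes.
t = np.linspace(0.0, 60.0, 6000)               # seconds
signal = 48_000.0 + 5.0 * np.random.randn(t.size)   # nT, assumed baseline
for centre in (15.0, 42.0):                    # two simulated vehicle passes
    signal += 400.0 * np.exp(-((t - centre) ** 2) / 2.0)

# Detect peaks that rise clearly above the noise floor.
peaks, _ = find_peaks(signal, prominence=100.0, distance=200)
print("Detections at t =", np.round(t[peaks], 1), "s")

# Playback-style plot with detections marked.
plt.plot(t, signal, label="measured field (nT)")
plt.plot(t[peaks], signal[peaks], "rx", label="detected objects")
plt.xlabel("time [s]")
plt.ylabel("magnetic field [nT]")
plt.legend()
plt.show()
```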
216. Requirements Validation Techniques: Factors Influencing Them. Peddireddy, Santosh Kumar Reddy; Nidamanuri, Sri Ram (January 2021)
Context: Requirements validation is a phase of the software development life cycle in which requirements are validated in order to get rid of inconsistency and incompleteness. Stakeholders are involved in the validation process to ensure that the requirements are suitable for the product. Requirements validation techniques are used for validating the requirements, and selecting a validation technique with regard to the factors that need to be considered while validating requirements makes the validation process better. This paper is about the factors that influence the selection of requirements validation techniques and about analysing the most critical of these factors. Objectives: Our research aims to find the factors influencing the selection of requirements validation techniques and to evaluate the critical factors among them. To achieve this goal, we follow these objectives: to compile a list of validation techniques currently used by organizations, and to enlist the factors that influence the choice of requirements validation technique. Methods: To identify the factors influencing the selection of requirements validation techniques and to evaluate the critical factors, we conducted both a literature review and a survey. Results: From the literature review, two articles were considered as our start set, and through snowball sampling a total of fifty-four articles were found relevant to the study. From the results of the literature review, we formulated a questionnaire and conducted a survey. A total of thirty-three responses were gathered from the survey. The survey yields the factors influencing the choice of requirements validation techniques. Conclusions: The factors we obtained from the survey present a mixed view, as each factor is critical in different aspects of validation. Selecting one single critical factor is not possible during the selection of a requirements validation technique, so we shortlisted the critical factors that have the most influence on the selection of requirements validation techniques. Keywords: factors, requirements validation techniques.
217. Development of New Model-based Methods in ASIC Requirements Engineering. Onuoha, Chukwuma Onuoha (25 January 2022)
Requirements in the development of application-specific integrated circuits (ASICs) continue to increase. This leads to more complexity in handling and processing the requirements, which often causes inconsistencies in the requirements. To better manage the resulting complexity, ASIC development is evolving into a model-based process. This thesis is part of continuing research into the application and evolution of a model-based process for ASIC development at Robert Bosch GmbH. It focuses on providing methodologies that enable tracing of ASIC requirements and specifications as part of a model-based development process in order to eliminate inconsistencies in the requirements. The question of what requirements are and what their traceability means is defined and analysed in the context of their relationships to models.
This thesis applies requirements engineering (RE) practices to the processing of ASIC requirements in a development environment. This environment is defined by the availability of tools which are compliant with certain standards and technologies. Relying on semi-formal interviews to understand the process in this environment and what stakeholders expect, this thesis applies the standards and technologies with which these tools are compliant to provide methodologies that ensure requirements traceability. Matrices and tables proved to be effective traceability methods, but for cases with fewer requirements (ten or below), requirement diagrams are also efficient and effective. Furthermore, the development process as a collaborative effort was shown to be enhanced by using the resulting tool-chain when the defined methodologies are properly followed. This solution was tested on an ASIC concept development project as a case study.
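As a simple illustration of the matrix-based traceability the thesis found effective, the sketch below builds a small requirements-to-design traceability matrix and flags untraced requirements; the requirement identifiers, design blocks, and trace links are invented for the example and do not come from the Bosch project.

```python
# Hypothetical sketch: a minimal requirements-to-design traceability matrix.
# Requirement IDs, design blocks, and links are invented for illustration.
requirements = ["REQ-001", "REQ-002", "REQ-003", "REQ-004"]
design_blocks = ["ADC", "DAC", "SPI_IF", "POWER_MGMT"]

# Trace links captured during specification (requirement -> design blocks).
trace_links = {
    "REQ-001": ["ADC"],
    "REQ-002": ["SPI_IF", "POWER_MGMT"],
    "REQ-003": [],          # deliberately untraced, should be flagged
    "REQ-004": ["DAC"],
}

# Print the traceability matrix.
print("            " + "  ".join(f"{b:>11}" for b in design_blocks))
for req in requirements:
    row = "  ".join(f"{'X' if b in trace_links.get(req, []) else '.':>11}"
                    for b in design_blocks)
    print(f"{req:<12}{row}")

# Consistency check: every requirement must trace to at least one block.
untraced = [r for r in requirements if not trace_links.get(r)]
if untraced:
    print("\nUntraced requirements:", ", ".join(untraced))
```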
218. Designing an application to improve cloning of BIM drawings - a feasibility study. Fors, Alexander (January 2022)
Construction drawings have become considerably more sophisticated over the years. Most companies today use Building Information Modeling (BIM), which is the process of creating a digital representation of physical places. These models save time and reduce the risk of errors during the construction process, but are time-consuming to create. Companies are therefore interested in improving this process using software solutions. Tekla Structures is BIM software that enables the creation of 3D models. These models can in turn be used to generate 2D drawings using a built-in cloning algorithm. This method saves a considerable amount of time when creating construction drawings, but the algorithm does not perform well when used for more complex tasks. This thesis is written in collaboration with the architecture and engineering consultancy Sweco. The study aims to analyse the feasibility of solutions for improving or replacing the Tekla Structures cloning algorithm. An elementary application was developed to answer the research question. During development, methods such as architectural design and prototyping were employed as construction tools and used to identify constraints. In the later stages of development, human-computer interaction (HCI) guidelines were followed to create more user-friendly software. The Tekla Open API has been used as an interface to interact with Tekla Structures, providing useful methods and classes. This study emphasises the complexity of creating these kinds of applications. While similar tools can and should be developed in the future, it is suggested that the focus should be on creating custom solutions using the built-in Tekla cloning algorithm.
219. Goal-oriented Process Mining. Ghasemi, Mahdi (05 January 2022)
Context: Process mining is an approach that exploits event logs to discover real processes executed in organizations, enabling them to (re)design and improve process models. Goal modelling, on the other hand, is a requirements engineering (RE) approach mainly used to analyze what-if situations and support decision making.
Problem: Common problems with process mining include the complexity of discovered “spaghetti” processes and a lack of goal-process alignment. Current process mining practices mainly focus on activities and do not benefit from considering stakeholder goals and requirements to manage complexity and alignment. The critical artifact that process mining practices rely on is the event log. However, using a raw version of real-life event logs will typically result in process models being too complex, unstructured, difficult to understand and, above all, not aligned with stakeholders’ goals.
Method: Involving goal-related factors can augment the precision and interpretability of mined models and help discover better opportunities to satisfy stakeholders. This thesis proposes three algorithms for goal-oriented process enhancement and discovery (GoPED) that show synergetic effects achievable by combining process mining and goal-oriented modelling. With GoPED, good historical experiences will be found within the event log to be used as a basis for inferring good process models, and bad experiences will be found to discover models to avoid. Goodness is defined in terms of alignment with regard to three categories of goal-related criteria:
• Case perspective: satisfaction of individual cases (e.g., patient, customer) in terms of some goals;
• Goal perspective: overall satisfaction of some goals (e.g., to decrease waiting time) rather than individual cases; and
• Organization perspective: a comprehensive satisfaction level for all goals over all cases.
GoPED first adds goal-related attributes to conventional event characteristics (case identifier, activities, and timestamps), then selects a subset of cases with respect to goal-related criteria, and finally discovers a process model from that subset. For each criterion, an algorithm is developed that selects the best subset of cases for which the criterion holds. The resulting process models are expected to reproduce the desired level of satisfaction. The three GoPED algorithms were implemented in a Python tool. In addition, three other tools were implemented to complete a line of actions whose input is a raw event log and whose output is a subset of the event log selected with respect to the goal-related criteria. GoPED was used on real healthcare event logs (an illustrative example and a case study) to discover processes, and the performance of the tools was also assessed.
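As a hypothetical sketch of the case-selection step described above (not the thesis's GoPED implementation), the following filters an event log, enriched with a goal-related attribute, down to the cases that satisfy a waiting-time goal; the reduced log could then be handed to any conventional discovery algorithm (e.g., in pm4py). Column names and the 30-minute goal are assumptions made for the example.

```python
# Hypothetical sketch of GoPED-style case selection (goal perspective):
# keep only cases whose waiting time satisfies a goal, then hand the
# reduced log to a conventional discovery algorithm.
import pandas as pd

log = pd.DataFrame({
    "case_id":   ["c1", "c1", "c2", "c2", "c3", "c3"],
    "activity":  ["register", "treat", "register", "treat", "register", "treat"],
    "timestamp": pd.to_datetime([
        "2021-01-01 08:00", "2021-01-01 08:20",
        "2021-01-01 09:00", "2021-01-01 10:15",
        "2021-01-01 11:00", "2021-01-01 11:25",
    ]),
})

# Goal-related attribute per case: waiting time between registration and treatment.
def waiting_minutes(events: pd.DataFrame) -> float:
    start = events.loc[events["activity"] == "register", "timestamp"].min()
    end = events.loc[events["activity"] == "treat", "timestamp"].min()
    return (end - start).total_seconds() / 60.0

goal_limit = 30.0  # assumed goal: waiting time under 30 minutes
satisfaction = log.groupby("case_id").apply(waiting_minutes)
good_cases = satisfaction[satisfaction <= goal_limit].index

# Subset of the event log used for goal-aligned discovery.
good_log = log[log["case_id"].isin(good_cases)]
print("Cases kept:", sorted(good_cases))  # expected: c1, c3
# good_log could now be converted and passed to a discovery algorithm.
```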
Results: The performance of the GoPED toolset for various sizes and configurations of event logs was assessed through extensive experiments. The results show that the three GoPED algorithms are practical and scalable for application to event logs with realistic sizes and types of configurations.
The GoPED method was also applied to the discovery of processes from the raw event log of the trajectories of patients with sepsis in a Dutch hospital, from their registration in the emergency room until their discharge. Although the raw data does not explicitly include goal-related information, some reasonable goals were derived from the data and a related research paper in consultation with a healthcare expert. The method was applied, and the resulting models were i) substantially simpler than the model discovered from the whole event log, ii) free from the drawbacks that using the whole event log causes, and iii) aligned with the predefined goals.
Conclusion: GoPED demonstrates the benefits of exploiting goal modelling capabilities to enhance event logs and select a subset of events to discover goal-aligned and simplified process models. The resulting process model can also be compared to a model discovered from the original event log to reveal new insights about the ability of different forms of process models to satisfy the stakeholders’ goals. Learning from good behaviours that satisfy goals and detecting bad behaviours that hurt them is an opportunity to redesign models, so they are simpler, better aligned with goals, and free from drawbacks that using the whole event log may cause.
220. Identifying Security Requirements using Meta-Data and Dependency Heuristics. Mahakala, Kavya Reddy (January 2018)
No description available.