  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
721

Automating regression testing on fat clients

Österberg, Emil January 2020 (has links)
Regression testing is important but time consuming. Automating the testing has many benefits. It will save money for companies because they will not have to pay testers to manually test their applications, and it will produce better software with fewer bugs, as testing can be done more frequently and bugs will therefore be found faster. This thesis has compared two techniques used to automate regression testing: the more traditional component-level record and replay tools, and visual GUI testing tools that use image recognition. Eight tools in total were tested and compared, four from each technique. The system under test for this project was a fat-client application used by Trafikverket. After automating a test suite using all the tools, it could be concluded that the component-level record and replay tools had some advantages over the visual GUI testing tools, especially when it comes to verifying the state of the system under test. The benefits of visual GUI testing tools come from their independence from the system under test and from the fact that the technique more closely mimics how a real user interacts with the GUI.
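The two techniques contrast roughly as in the sketch below. pyautogui (image recognition) and pywinauto (component-level access) merely stand in for the commercial tools evaluated in the thesis, and the window title, reference image, and automation id are hypothetical.

    # Minimal sketch, assuming a Windows fat client with a Save button.
    import pyautogui
    from pywinauto.application import Application

    def click_save_visually():
        # Visual GUI testing: locate the button by matching a screenshot
        # fragment, independent of which toolkit the application is built with.
        location = pyautogui.locateCenterOnScreen("save_button.png")  # hypothetical image
        if location is None:
            raise RuntimeError("Save button not visible on screen")
        pyautogui.click(location)

    def click_save_by_component():
        # Component-level record and replay: address the button through the
        # UI Automation tree, which also allows reading widget state back.
        app = Application(backend="uia").connect(title="Fat Client")  # hypothetical title
        main = app.window(title="Fat Client")
        button = main.child_window(auto_id="SaveButton", control_type="Button")  # hypothetical id
        button.click_input()
        assert button.is_enabled()

The component-level variant can verify the state of the system under test directly (the assert above), which matches the advantage noted in the abstract, while the image-based variant only knows what is currently visible on screen.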
722

CYBER-PHYSICAL SYSTEMS: BUILDING A SECURITY REFERENCE ARCHITECTURE FOR CARGO PORTS

Unknown Date (has links)
Cyber-Physical Systems (CPS) are physical entities whose operations are monitored, coordinated, and controlled by a computing and communication core. These systems are highly heterogeneous and complex. Their numerous components and cross-domain complexity make attacks easy to propagate and security difficult to implement. Consequently, to secure these systems, they need to be built in a systematic and holistic way, where security is an integral part of the development lifecycle and not just an activity after development. These systems present a multitude of implementation details in their component units, so it is fundamental to use abstraction in the analysis and construction of their architecture. In particular, we can apply abstraction through the use of patterns. Pattern-based architectural modeling is a powerful way to describe the system and analyze its security and the other non-functional aspects. Patterns also have the potential to unify the design of their computational, communication, and control aspects. Architectural modeling can be performed through UML diagrams to show the interactions and dependencies between different components and their stakeholders. Also, it can be used to analyze security threats and describe the possible countermeasures to mitigate these threats. An important type of CPS is a maritime container terminal, a facility where cargo containers are transported between ships and land vehicles (for example, trains or trucks) for onward transportation, and vice versa. Every cargo port performs four basic functions: receiving, storing, staging, and loading, for both import and export containers. We present here a set of patterns that describe the elements and functions of a cargo port system, and a Reference Architecture (RA) built using these patterns. We analyze and systematically enumerate the possible security threats to a container terminal in a cargo port using activity diagrams derived from selected use cases of the system. We describe these threats using misuse patterns, and from them select security patterns as defenses. The RA provides a framework to determine where to add these security mechanisms to stop or mitigate these threats and build a Security Reference Architecture (SRA) for CPS. An SRA is an abstract architecture describing a conceptual model of security that provides a way to specify security requirements for a wide range of concrete architectures. The analysis and design are given using a cargo port as our example, but the approach can be used in other domains as well. This is the first work we know of where patterns and RAs are used to represent cargo ports and analyze their security. / Includes bibliography. / Dissertation (PhD)--Florida Atlantic University, 2021. / FAU Electronic Theses and Dissertations Collection
723

Förbättring av WLAN-kvaliteten i Skellefteå kommuns verksamheter / Improving WLAN quality in the operations of Skellefteå municipality

Boqvist, Anna, Aryal, Elisha January 2021 (has links)
No description available.
724

The Challenges in Leveraging Cyber Threat Intelligence / Utmaningarna med att bemöta cyberhot motunderrättelseinformation

Gupta, Shikha, Joseph, Shijo, Sasidharan, Deepu January 2021 (has links)
Today, cyber attacks, incidents, threats, and breaches continue to rise in scale and number, as sophisticated attackers break through conventional safeguards every day. Whether strategic, operational, or tactical, threat intelligence can be defined as aggregated information and analytics that feed the different pillars of any given company's cybersecurity infrastructure. It provides numerous benefits: it enables improved prediction and detection of threats, empowers and informs organizations to make better decisions during and after any cyber attack, and helps them develop a proactive cyber security posture. It provides actionable intelligence, which equips senior management to take timely actions and decisions that affect the company's ability to keep ahead of and defend against this growing sea of threats. Driving momentum in this area also helps reduce reaction times, enabling organizations to become more proactive than reactive. Perimeter defenses no longer seem to suffice as threats become more complex and escalate, and there are no established best practices or guidelines for companies to follow before, during, or after a threat, partly because of the multiple components involved, including the various standards and platforms. Sharing and analyzing threat data effectively requires standard formats and protocols and a shared understanding of the relevant terminology, purpose, and representation. Threat intelligence and its analysis are seen as a vital component of cyber security and a tool that many companies cannot leverage and utilize fully. Securing today's organizations and businesses will therefore require a new approach. In our study with security executives working across multiple industries, we identified the challenges that prevent the successful adoption of threat intelligence and of the growing number of platforms, including issues related to data quality, the absence of a universal standard format and protocol, difficulty enforcing data sharing based on CTI data attributes, lack of authentication and confidentiality preventing data sharing, missing API integration with multi-vendor tools, lack of identification of tactical IOCs, failure to define TTL values, and lack of deep automation, analytics, and visualization capabilities. Ensuring the right expertise and capabilities in these identified areas will help organizations leverage threat intelligence effectively, sharpen their focus, and provide the needed competitive edge.
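One of the listed gaps, failure to define TTL values, lends itself to a short illustration. The sketch below assumes a simple in-house indicator record; the field names and feed entries are made up and are not tied to STIX, TAXII, or any particular CTI platform.

    # Expire indicators of compromise (IOCs) once their TTL has elapsed, so
    # downstream blocklists do not accumulate stale entries.
    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class Indicator:
        value: str             # e.g. an IP address or a file hash
        first_seen: datetime
        ttl: timedelta         # how long the indicator stays actionable

        def is_expired(self, now: datetime) -> bool:
            return now - self.first_seen > self.ttl

    def active_indicators(feed, now=None):
        now = now or datetime.utcnow()
        return [ioc for ioc in feed if not ioc.is_expired(now)]

    feed = [
        Indicator("203.0.113.7", datetime(2021, 1, 1), timedelta(days=30)),
        Indicator("198.51.100.9", datetime(2021, 2, 15), timedelta(days=90)),
    ]
    print([ioc.value for ioc in active_indicators(feed, datetime(2021, 3, 1))])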
725

UAN (User Action Notation) Tutor

Bhattarai, Hare Ram 17 February 2010 (has links)
Development strategies of user-interfaces have been changing rapidly. User-interfaces are no longer the byproducts of the traditional software development process. Interface designers are now more concerned with the usability of the product rather than its pure technical optimization. It has been recognized that higher usability can only be achieved if interfaces are designed by human factors specialists and implemented by software engineers. Clearly, there exists a need for an effective and unambiguous (i.e. non-prose) form of communication between the designers and implementers of user-interfaces. / Master of Information Systems
726

AI-based Age Estimation using X-ray Hand Images : A comparison of Object Detection and Deep Learning models

Westerberg, Erik January 2020 (has links)
Bone age assessment can be useful in a variety of ways. It can help pediatricians predict growth and puberty entrance, identify diseases, and assess if a person lacking proper identification is a minor or not. It is a time-consuming process that is also prone to intra-observer variation, which can cause problems in many ways. This thesis attempts to improve and speed up bone age assessments by using different object detection methods to detect and segment bones anatomically important for the assessment and using these segmented bones to train deep learning models to predict bone age. A dataset consisting of 12811 X-ray hand images of persons ranging from infant age to 19 years of age was used. In the first research question, we compared the performance of three state-of-the-art object detection models: Mask R-CNN, Yolo, and RetinaNet. We chose the best performing model, Yolo, to segment all the growth plates in the phalanges of the dataset. We proceeded to train four different pre-trained models: Xception, InceptionV3, VGG19, and ResNet152, using both the segmented and unsegmented dataset, and compared the performance. We achieved good results using both the unsegmented and segmented dataset, although the performance was slightly better using the unsegmented dataset. The analysis suggests that we might be able to achieve a higher accuracy using the segmented dataset by adding the detection of growth plates from the carpal bones, epiphysis, and the diaphysis. The best performing model was Xception, which achieved a mean average error of 1.007 years using the unsegmented dataset and 1.193 years using the segmented dataset. / The presentation was given online via Zoom.
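As a rough illustration of the transfer-learning setup the abstract describes, a regression head can be placed on a pre-trained Xception backbone as sketched below. The input size, frozen backbone, and optimizer are illustrative assumptions; the thesis's actual preprocessing, hyperparameters, and training procedure are not given in the abstract.

    # Xception backbone with a single-output regression head that predicts
    # bone age in years from a hand radiograph (full image or segmented crop).
    import tensorflow as tf

    def build_bone_age_model(input_shape=(299, 299, 3)):
        backbone = tf.keras.applications.Xception(
            include_top=False, weights="imagenet",
            input_shape=input_shape, pooling="avg")
        backbone.trainable = False  # start by training only the new head
        age = tf.keras.layers.Dense(1, name="age_years")(backbone.output)
        model = tf.keras.Model(backbone.input, age)
        # Mean absolute error in years, in line with how the results are reported.
        model.compile(optimizer="adam", loss="mae", metrics=["mae"])
        return model

    model = build_bone_age_model()
    model.summary()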
727

Impact evaluation of an automatic identification technology on inventory management : A simulation approach with the focus on RFID

Petersson, Martin January 2020 (has links)
Automatic identification systems are a prominent technology used in warehouses to give managers real-time information about their products and to assist warehouse employees in keeping an accurate inventory record. This kind of assistance is needed, as an inaccurate inventory leads to profit loss due to misplacement or other mistakes. This project cooperated with Stora Enso, specifically one of their forest nurseries, to find a solution that improves their inventory management system. Their current inventory system is a manual process, which leads to mistakes that affect the inventory accuracy. This thesis project evaluates automatic identification systems to determine whether the technology is a possible solution, and aims to answer the research question "What are the significant impacts an automatic identification system has on an inventory management system?". From this evaluation, one system, radio frequency identification (RFID), is selected for further study because of its advantages. To evaluate RFID in a warehouse setting, a discrete-event simulation of the forest nursery's warehouse was created. The simulation is then used to evaluate the impact of different RFID implementations and their respective costs. The simulation results show that even a simple RFID implementation can improve inventory accuracy and remove some of the mistakes of a manual system at a relatively low direct cost. They also show that a full RFID implementation, which gives full visibility of the warehouse, can almost eliminate inventory mistakes; however, the cost analysis shows that it requires a large investment.
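The comparison described above could be prototyped with a discrete-event library such as SimPy, as in the sketch below. The arrival rate, manual error probability, and number of movements are made-up illustration values, not figures from the thesis.

    # Compare inventory-record accuracy for a manual process (occasional
    # recording mistakes) against an RFID gate that captures every movement.
    import random
    import simpy

    RECORD_ERROR_PROB = 0.05   # illustrative probability of a manual mistake
    N_MOVEMENTS = 1000

    def warehouse(env, use_rfid, state):
        for _ in range(N_MOVEMENTS):
            yield env.timeout(random.expovariate(1.0))  # next pallet movement
            state["physical"] += 1
            if use_rfid or random.random() > RECORD_ERROR_PROB:
                state["recorded"] += 1  # movement captured in the system

    def run(use_rfid, seed=42):
        random.seed(seed)
        env = simpy.Environment()
        state = {"physical": 0, "recorded": 0}
        env.process(warehouse(env, use_rfid, state))
        env.run()
        return state["recorded"] / state["physical"]

    print(f"manual accuracy: {run(False):.3f}, RFID accuracy: {run(True):.3f}")

A cost model for tags, readers, and installation would be layered on top of such a simulation to reproduce the kind of cost analysis the abstract mentions.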
728

Att använda Azure som IoT plattform / Using Azure as an IoT platform

Arvidsson, Moa January 2020 (has links)
Portability of Internet of Things solutions is important in today's society. This is due to the rapid growth of smart devices. More and more companies are choosing to use cloud services for data storage and processing. Halmstad Stadsnät AB is developing a municipality-wide IoT platform and currently uses Nokia IMPACT. The problem with IMPACT is that it does not fully support end-to-end solutions. Therefore, Halmstad Stadsnät AB is exploring other possibilities for IoT solutions when it comes to software. Microsoft AZURE is a platform that Halmstad Kommun uses for IT solutions. The overall goal of this project is to test the portability of Microsoft AZURE's IoT solutions on the IMPACT platform. The method used was to create an end-to-end solution for AZURE and then, step by step, test transferring it to IMPACT. The project concludes that portability between these two platforms is good, but certain adjustments are required when transferring.
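For context, the device-to-cloud half of an AZURE end-to-end solution can be as small as the sketch below, using the azure-iot-device SDK. The connection string and telemetry payload are placeholders; the abstract does not describe the project's actual solution or its IMPACT counterpart.

    # Send one JSON telemetry message from a device to Azure IoT Hub.
    import json
    from azure.iot.device import IoTHubDeviceClient, Message

    CONNECTION_STRING = "HostName=<hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>"

    def send_temperature(value_celsius: float) -> None:
        client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
        try:
            client.connect()
            message = Message(json.dumps({"temperature": value_celsius}))
            message.content_type = "application/json"
            message.content_encoding = "utf-8"
            client.send_message(message)
        finally:
            client.shutdown()

    send_temperature(21.5)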
729

Number Recognition of Real-world Images in the Forest Industry : a study of segmentation and recognition of numbers on images of logs with color-stamped numbers

Munter, Johan January 2020 (has links)
Analytics such as machine learning are of big interest in many types of industries. Optical character recognition is essentially a solved problem, whereas number recognition in real-world images, which can be approached as a machine learning problem, is a more challenging task. The purpose of this study was to implement a system that can detect and read numbers in a given dataset originating from the forest industry: images of color-stamped logs. This study evaluated the accuracy of segmentation and number recognition on images of color-stamped logs when using a model pre-trained on the street view house numbers dataset. The general approach to preprocessing was based on car number plate segmentation because of the similar problem of identifying an object and then locating the individual digits. Color segmentation was the biggest asset for the preprocessing because of the distinct red color of the digits compared to the rest of the image. The accuracy of number recognition was significantly lower when using the pre-trained model on color-stamped logs, 26%, than on street view house numbers, 95%, but it could still reach an over 80% per-digit accuracy rate for some image classes when excluding the accuracy of segmentation. The highest segmentation accuracy among the classes was 93% and the lowest was 32%. From the results it was concluded that unclear digits in the images lowered the number recognition accuracy the most. There is much to consider for future work, but the most obvious and impactful change would be to train a more accurate model by basing it on the dataset of color-stamped logs.
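The red-color segmentation step could look roughly like the sketch below, using OpenCV HSV thresholding. The threshold values, minimum contour area, and file name are illustrative guesses, not the ones used in the study.

    # Segment red-stamped digits on a log image by thresholding in HSV space;
    # red wraps around the hue axis, so two hue ranges are combined.
    import cv2
    import numpy as np

    def red_digit_boxes(image_bgr, min_area=50):
        hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
        low_reds = cv2.inRange(hsv, np.array([0, 70, 50]), np.array([10, 255, 255]))
        high_reds = cv2.inRange(hsv, np.array([170, 70, 50]), np.array([180, 255, 255]))
        mask = cv2.bitwise_or(low_reds, high_reds)
        # Close small gaps inside the stamped digits before finding contours.
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((3, 3), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        boxes = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
        return sorted(boxes, key=lambda b: b[0])  # left-to-right reading order

    image = cv2.imread("log.jpg")  # hypothetical input image
    print(red_digit_boxes(image))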
730

Deep Neural Networks Based Disaggregation of Swedish Household Energy Consumption

Bhupathiraju, Praneeth Varma January 2020 (has links)
Context: In recent years, households have been increasing their energy consumption to levels that are no longer sustainable. There has been a dire need to find ways to use energy more sustainably due to this increase in consumption. One of the main causes of this unsustainable consumption is that users are not well acquainted with the energy consumed by the smart appliances (dishwasher, refrigerator, washing machine, etc.) in their households. To let household users know how much energy their smart appliances use, energy analytics companies must analyze the energy consumed by the smart appliances present in a house. To achieve this, Kelly et al. [7] performed energy disaggregation using deep neural networks and produced good results, and Zhang et al. [8] went a step further by improving the deep neural networks proposed by Kelly et al. The task was performed with the non-intrusive load monitoring (NILM) technique. Objectives: The thesis aims to assess the performance of the deep neural networks proposed by Kelly et al. [7] and Zhang et al. [8]. We use deep neural networks for disaggregation of the dishwasher energy consumption, in the presence of vampire loads such as electric heaters, in a Swedish household setting. We also try to identify the training time of the proposed deep neural networks. Methods: An intensive literature review was done to identify state-of-the-art deep neural network techniques used for energy disaggregation. All the experiments were performed on the dataset provided by the energy analytics company Eliq AB. The data was collected from 4 households in Sweden. All the households contain a vampire load, an electric heater, whose power consumption can be seen in the main power sensor. A separate smart plug was used to collect the dishwasher power consumption data. Each algorithm was trained on data from two of the houses, with the remaining two houses used for testing. The metrics used for analyzing the algorithms are accuracy, recall, precision, root mean square error (RMSE), and F1 measure. These metrics help identify the algorithm best suited for the disaggregation of dishwasher energy in our case. Results: The results from our study show that the gated recurrent unit (GRU) performed best compared with the other neural networks in our study: the simple recurrent neural network (SRN), convolutional neural network (CNN), long short-term memory (LSTM), and recurrent convolutional neural network (RCNN). The accuracy, RMSE, and F1 score of the GRU algorithm are higher when compared with the other algorithms. Also, if the user does not consider the F1 score and RMSE as evaluation metrics and instead considers training time, then the simple recurrent neural network outperforms all the other neural nets, with an average training time of 19.34 minutes.
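A compact sketch of a GRU-based sequence-to-point disaggregator of the kind compared in the study is shown below. The window length, layer sizes, and loss are illustrative choices, not the configurations evaluated in the thesis.

    # A window of mains readings goes in; the dishwasher power at the window
    # midpoint comes out.
    import tensorflow as tf

    WINDOW = 99  # mains samples per input window (illustrative)

    def build_gru_disaggregator():
        model = tf.keras.Sequential([
            tf.keras.Input(shape=(WINDOW, 1)),                 # mains power window
            tf.keras.layers.Conv1D(16, 4, activation="relu", padding="same"),
            tf.keras.layers.GRU(64, return_sequences=True),
            tf.keras.layers.GRU(128),
            tf.keras.layers.Dense(64, activation="relu"),
            tf.keras.layers.Dense(1),                          # appliance power (W)
        ])
        model.compile(optimizer="adam", loss="mse", metrics=["mae"])
        return model

    model = build_gru_disaggregator()
    model.summary()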
