21. Program för schemadesign / Program for Designing Schedules
Yassin, Amin, January 2018
Database technology is actively used across many sectors and industries, and more new database management systems (DBMS) than ever are appearing to satisfy the varying needs of data management. This report presents the development of a scheduling application on Windows for the School of Science and Technology at Örebro University. The application is built with Microsoft's WPF framework, and an SQLite database is used to save and retrieve data in the form of schedule resources. The report also compares SQL and NoSQL databases, examining how they differ from each other and how they can be used in Windows applications. The conclusion is that SQL databases suit applications where the developer knows exactly which data may be needed and wants to ensure data consistency above all else, whereas NoSQL databases suit applications connected to distributed database systems, where the priorities among availability, consistency, and partition tolerance are set according to the CAP theorem.
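As a rough illustration of how such schedule resources can be saved and retrieved from an embedded SQL database, the sketch below uses SQLite from Python; the actual application is a WPF desktop program, and the `resources` table and its columns are assumptions rather than the schema used in the thesis.

```python
import sqlite3

# Minimal sketch: store and fetch schedule resources in an SQLite file.
# The table name and columns are illustrative only.
conn = sqlite3.connect("schedule.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS resources ("
    " id INTEGER PRIMARY KEY,"
    " name TEXT NOT NULL,"
    " kind TEXT NOT NULL)"  # e.g. 'room', 'teacher', 'course'
)
conn.execute("INSERT INTO resources (name, kind) VALUES (?, ?)", ("T211", "room"))
conn.commit()

for row in conn.execute("SELECT id, name, kind FROM resources ORDER BY name"):
    print(row)
conn.close()
```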
22. DICOM Second Generation RT: An Analysis of New Radiation Concepts by way of First-Second Generation Conversion
Holst, David, January 2019
The current DICOM communication standard for radiotherapy is outdated and has serious design issues. A new version of the standard, known as DICOM 2nd generation for Radiotherapy, has been introduced, and this thesis examines new concepts relating to radiation delivery. Firstly, some background on the practice of radiotherapy is given, as well as a description of the DICOM standard. Secondly, the thesis describes the design issues of the current standard and how the 2nd generation aims to solve these. Thirdly, the thesis explores the conversion of a first-generation C-Arm Photon/Electron treatment plan to the second-generation RT Radiation and RT Radiation Set IODs. A converter is implemented based on a model proposed in a previous work. With some simplifications, the conversion of Basic Static and Arc treatment plans is shown to be successful. Conversion of further dynamic plan types is judged to be fairly simple to implement following the same methodology. The conversion model's efficacy and testability are discussed, and while the model is flexible and facilitates extension to further modalities, some areas of improvement are suggested. Lastly, a GUI for the converter is demonstrated and the possibilities of user interaction during conversion are discussed.
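To make the conversion input concrete, the sketch below reads a first-generation RT Plan with pydicom and lists the per-beam attributes a converter has to map onto the new objects; the file name is an assumption, and the actual first-to-second generation mapping is not shown.

```python
import pydicom

# Read a first-generation RT Plan and list, for each beam, attributes that a
# converter to second-generation RT Radiation / RT Radiation Set objects
# would need to inspect. The file name is illustrative.
plan = pydicom.dcmread("first_gen_rtplan.dcm")
assert plan.Modality == "RTPLAN"

for beam in plan.BeamSequence:
    print(
        beam.BeamNumber,
        beam.RadiationType,              # e.g. PHOTON or ELECTRON
        beam.BeamType,                   # STATIC or DYNAMIC
        len(beam.ControlPointSequence),  # number of control points
    )
```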
23. Development and initial validation of a stochastic discrete event simulation to assess disaster preparedness / Utveckling och preliminär validering av ett stokastiskt simuleringsverktyg för katastrofmedicinsk förmågeanalys
Lantz Cronqvist, Mattias, January 2018
Assessing disaster preparedness in a given region is a complex problem. Current methods are often resource intensive and may lack generalizability beyond a specific scenario. Computer-based stochastic simulations may be an additional method but would require systems that are valid, flexible, and easy to use. Emergo Train System (ETS) is an analogue simulation system used for disaster preparedness assessments. This thesis aimed to digitalize the ETS model and develop stochastic simulation software for improved disaster preparedness assessments. The simulation software was developed in C# and its simulation model was based on ETS. Preliminary verification and validation (V&V) tests were performed, including unit and integration testing, trace validation, and a comparison to a prior analogue ETS disaster preparedness assessment exercise. The software contains medically validated patients from ETS and can automatically run disaster scenarios with stochastic variations in the injury panorama, available resources, geographical location, and other parameters. It consists of two main programs: an editor where scenarios can be constructed and a simulation system to evaluate the outcome. Initial V&V testing showed that the software is reliable and internally consistent. The comparison to the analogue exercise showed generally high agreement in terms of patient outcome. The analogue exercise featured a train derailment with 397 injured, of which 45 patients suffered preventable death. In comparison, the computer simulation ran 100 iterations of the same scenario and indicated that a median of 41 patients (IQR 31 to 44) would suffer a preventable death. Stochastic simulation methods can be a powerful complement to traditional capability assessment methods. The developed simulation software can be used both for assessing emergency preparedness with some validity and as a complement to analogue capability assessment exercises, both as input and to validate results. Future work includes comparing the simulation to real disaster outcomes.
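The evaluation procedure, running many stochastic iterations of one scenario and summarizing preventable deaths with a median and interquartile range, can be sketched as below; the outcome model is a placeholder, since the real patient, resource, and geography model lives in the ETS-based C# software described above.

```python
import random
import statistics

def run_scenario(n_injured=397, seed=None):
    """Placeholder outcome model returning the number of preventable deaths.

    In the real software this is the full ETS-based simulation; here each
    casualty simply has a small, randomly drawn per-iteration risk.
    """
    rng = random.Random(seed)
    p = rng.uniform(0.08, 0.12)
    return sum(rng.random() < p for _ in range(n_injured))

outcomes = sorted(run_scenario(seed=i) for i in range(100))  # 100 iterations
q1, median, q3 = statistics.quantiles(outcomes, n=4)         # quartile cut points
print(f"preventable deaths: median {median:.0f}, IQR {q1:.0f} to {q3:.0f}")
```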
24. On the Use of 5G for Smart Grid Inter-Substation Control Signaling / Användning av 5G för kontrollsignalering inom smarta elnät
Carlsson, Adrian, January 2019
In the energy domain today we are seeing an increasing amount of energy equipment in use, and with it new challenges such as network reliability, distributed renewable energy, increasing network complexity, and energy efficiency. The concept of smart grid control systems has recently been seen as an appropriate way to address these new challenges. Today, the IEC 61850 standard is one of the most common standards used for power system automation. One of the services it introduces is the so-called Generic Object Oriented Substation Event (GOOSE), a protocol for transferring time-critical messages between multiple devices in a substation. The 5th generation of mobile networks (5G) enables new services and applications requiring lower latency, improved energy efficiency, better reliability, and massive connection density. These promises of higher reliability and lower latency could potentially be used for future smart grid transmissions. The main goal of this work was to understand the importance of time-critical messages, such as GOOSE messages, in the IEC 61850 standard, and how these could be carried over the new 5th generation of mobile networks. An experimental setup is proposed that can be used for future research both on GOOSE messaging itself and on Open5GCore for emulated 5G mobile networks. The intention of the experimental study is to send GOOSE messages traversing a 5G network emulated in software by Open5GCore.
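For reference, the sketch below shows roughly what publishing a GOOSE frame looks like at layer 2 using scapy: GOOSE is carried directly over Ethernet with EtherType 0x88B8 and a multicast destination address. The payload here is a placeholder rather than a properly ASN.1/BER-encoded goosePdu, and the MAC addresses and interface name are assumptions.

```python
from scapy.all import Ether, Raw, sendp

# Placeholder GOOSE-like payload: APPID, length, then a dummy body. A real
# publisher would encode a goosePdu per IEC 61850-8-1 instead.
payload = b"\x00\x01" + b"\x00\x20" + b"\x00" * 28

frame = (
    Ether(
        dst="01:0c:cd:01:00:01",   # GOOSE multicast range 01-0C-CD-01-xx-xx
        src="00:11:22:33:44:55",   # illustrative source MAC
        type=0x88B8,               # EtherType reserved for GOOSE
    )
    / Raw(load=payload)
)
sendp(frame, iface="eth0", verbose=False)  # interface name is an assumption
```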
25. Cloudify our product configuration
Trivic, Göran; Azan, Mohammed, January 2019
No description available.
26. Usability evaluation of IPsec configuring components
Hiran, Vaishali Rahul, January 2015
The security protocol IPsec is used in the LTE network to achieve communication that is secure from prying eyes. However, the use of IPsec is optional in the LTE standard. Whether or not to use IPsec thus becomes a security decision that each operator has to make after having considered applicable risks and anticipated costs. It is also important to consider the Operational Expenditure (OPEX) for deploying, operating, and maintaining the IPsec installation. One important factor that can affect OPEX is usability. For this reason, understanding the usability properties of a system can help to identify improvements that can reduce OPEX. This study mainly focused on investigating the deployment challenges of IPsec in the LTE infrastructure and on whether poor usability was a contributing factor to those challenges. Additionally, this study also focused on the prerequisite knowledge an individual needs in order to ensure the correct deployment of IPsec in the LTE network. The Cognitive Walkthrough and Heuristic Evaluation usability methods were used in this study. By using these methods, several usability issues related to IPsec configuring components, such as the documentation, the MO structure, and a used tool, were identified. It was also identified that each component had room for improvement, especially the documentation, which can significantly aid in the deployment of IPsec. Moreover, in order to smoothly deploy IPsec in the LTE network, it is important to have beforehand knowledge of the configuring components used to deploy IPsec.
27. DESIGNING A HUMAN CENTERED INTERFACE FOR A NOVEL AGRICULTURAL MULTI-AGENT CONTROL SYSTEM
Arvidsson, Fredrik, January 2019
The subject of this report is the Command and Control System (CaCS), a component whose purpose is to simplify planning, scheduling, and surveying work done on a farm in a goal-oriented way. CaCS is part of a larger project, the Aggregate Farming in the Cloud platform (AFarCloud), whose purpose is to simplify the use of contemporary technology to increase the efficiency of farms. AFarCloud is an EU project running from 2018 to 2020 and, as such, the CaCS is in its infancy. Since the intended users of AFarCloud and CaCS are small to medium-sized agricultural businesses, the interface of the CaCS should be constructed in such a way that it is useful and easy to learn. In order to live up to those standards, a combination of live interviews, prototype evaluations, and a comparison with similar software was performed and then compared with the International Standard on Human-Centered Design for Interactive Systems (ISO 9241-210). The results indicate that a modular interface, where only the information relevant for the unique user's farm is displayed, is preferable in order to increase the usability of the CaCS. Furthermore, icons and explanatory text must be used with the mental models of the users in mind in order to improve learnability and avoid confusion.
28. Semantic segmentation of seabed sonar imagery using deep learning / Semantisk segmentering av sonarbilder från havsbotten med deep learning
Granli, Petter, January 2019
For investigating the large parts of the ocean which have yet to be mapped, there is a need for autonomous underwater vehicles. Current state-of-the-art underwater positioning often relies on external data from other vessels or beacons. Processing seabed image data could potentially improve autonomy for underwater vehicles. In this thesis, image data from a synthetic aperture sonar (SAS) was manually segmented into two classes: sand and gravel. Two different convolutional neural networks (CNNs) were trained using different loss functions, and the results were examined. The best performing network, U-Net trained with the IoU loss function, achieved Dice coefficient and IoU scores of 0.645 and 0.476, respectively. It was concluded that CNNs are a viable approach for segmenting SAS image data, but there is much room for improvement.
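Both reported metrics are standard overlap measures; a minimal sketch of how they are computed on binary masks is given below, where the class encoding and the small smoothing constant are assumptions rather than details from the thesis.

```python
import numpy as np

def dice_and_iou(pred, target, eps=1e-8):
    """Dice coefficient and IoU for two binary masks (e.g. 1 = gravel, 0 = sand)."""
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    dice = 2.0 * intersection / (pred.sum() + target.sum() + eps)
    iou = intersection / (union + eps)
    return dice, iou

# An IoU loss for training is commonly taken as 1 - IoU computed on soft
# (probabilistic) predictions so that it stays differentiable.
```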
29. Empirical studies of multiobjective evolutionary algorithm in classifying neural oscillations to motor imagery
Parkkila, Christoffer, January 2019
Brain-computer interfaces (BCIs) enable direct communication between a brain and a computer by recording and analyzing a subject's neural activity in real time. Research in BCI that classifies motor imagery (MI) activities is common in the literature due to its importance and applicability, e.g., in stroke rehabilitation. Electroencephalography (EEG) is often used as the recording technique because it is non-invasive, portable, and has a relatively low cost. However, an EEG recording returns a vast number of features, which must be reduced to decrease the computational time and complexity of the classifier. For this purpose, feature selection is often applied. In this study, a multiobjective evolutionary algorithm (MOEA) was used for feature selection in a high spatial and temporal feature set to (1) compare pairwise combinations of different objectives, (2) evaluate the relationship between the specific objective pair and its relation to model prediction accuracy, and (3) compare multiobjective optimization against a linear combination of the individual objectives. The results show that correlation feature selection (CFS) obtained the best performance among the evaluated objectives, and that the multiobjective solutions were better optimized than a linear combination of the individual objectives when classified with a support vector machine (SVM).
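One of the evaluated objectives can be illustrated with a small sketch of a CFS-style merit score for a candidate feature subset; the use of Pearson correlation as the association measure is an assumption for illustration, not necessarily the thesis implementation.

```python
import numpy as np

def cfs_merit(X, y, subset):
    """CFS merit of a feature subset: k * r_cf / sqrt(k + k*(k-1) * r_ff).

    r_cf is the mean feature-class correlation and r_ff the mean
    feature-feature correlation over the subset (Pearson, as a stand-in).
    """
    k = len(subset)
    if k == 0:
        return 0.0
    Xs = X[:, list(subset)]
    r_cf = np.mean([abs(np.corrcoef(Xs[:, i], y)[0, 1]) for i in range(k)])
    if k == 1:
        return r_cf
    r_ff = np.mean([abs(np.corrcoef(Xs[:, i], Xs[:, j])[0, 1])
                    for i in range(k) for j in range(i + 1, k)])
    return k * r_cf / np.sqrt(k + k * (k - 1) * r_ff)

# A multiobjective search would, for instance, maximize cfs_merit(X, y, s)
# while minimizing len(s); SVM accuracy on the selected features can then be
# used to evaluate the resulting trade-off front.
```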
30. Catalyst: A Cloud-based Data Catalog System for a Swedish Mining Company
Swain, Adyasha, January 2019
In today's digitization scenario, drivers such as the Internet of Things (IoT), cloud computing, and big data lead to initiatives such as Industry 4.0 and smart manufacturing. Large mining organizations are witnessing the emergence of big data not only through IoT but also through legacy systems and internal processes. Addressing big data is a challenging and time-demanding task that requires an extensive computational infrastructure to ensure successful data processing and analysis. Though most organizations have adopted a wide variety of powerful analytics, visualization tools, and storage options, efficient data usage and sharing remain taxing and may lead to data isolation. This thesis proposes, develops, and validates CATALYST, a cloud-based data catalog system for a Swedish mining company, to address the data isolation, access, and sharing challenges faced in a large organization. The prototype implementation and the evaluation of the system show that the average query time was reduced from 59.813 milliseconds to 11.009 milliseconds, and the average data count was reduced from 12,691 to 5,721.7, less than half of the original, solving data isolation challenges within Boliden, a large Swedish mining company. Finally, Boliden has confirmed the value of CATALYST in general and finds it beneficial for data management within their organization.
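For clarity, the reported figures correspond to a query-time reduction of roughly 82 per cent and a data-count reduction of roughly 55 per cent:

```python
# Quick check of the reductions reported in the abstract.
query_before, query_after = 59.813, 11.009   # milliseconds
count_before, count_after = 12_691, 5_721.7

print(f"query time reduced by {100 * (1 - query_after / query_before):.1f}%")  # ~81.6%
print(f"data count reduced by {100 * (1 - count_after / count_before):.1f}%")  # ~54.9%
```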