
Design in Telemedicine : Development and Implementation of Usable Computer Systems

Borälv, Erik January 2005
Designing computer systems that effectively support the user is the major goal within human-computer interaction. To achieve this, we must understand and master several tasks. This process must initially deal with the question of knowing what to develop and later with the question of knowing how to design and develop the system. This view might seem off-target at first, since it does not explicitly mention the goals or functions of the system. However, more often than not, there is no objective goal to aim for that can be formally specified and used as a target criterion that will signal when we have designed an appropriate system. Instead, there is a large set of vague goals – some of which may last through the entire project and some that will not. It is therefore somewhat confounding that most current methods of systems development require that these goals are explicitly laid out in order to steer development. For researchers in human-computer interaction, the existence of many varying – and possibly conflicting – goals presents a great challenge. The main constructive focus, producing usable systems, is a matter of understanding this complex situation and knowing how to proceed from there. There are many existing approaches that can be used to carry out this complex development process. This thesis presents one such approach, based on the notion that the elements that constitute a successful system are also part of the solution, and shows how it is applied to the development of systems for computer-supported work in health care. The proposed solution suggests that we need to focus more intently on active user involvement in iterative development that is significantly long-term. The traditional, rather narrow circle of focus that encompasses design, development and evaluation is not sufficient.

CryptoNET : Generic Security Framework for Cloud Computing Environments

Abbasi, Abdul Ghafoor January 2011
The area of this research is security in distributed environments, such as cloud computing and network applications. The specific focus was the design and implementation of a high-assurance network environment comprising various secure and security-enhanced applications. “High Assurance” means that:
- our system is guaranteed to be secure,
- it is verifiable to provide the complete set of security services,
- we prove that it always functions correctly, and
- we justify our claim that it cannot be compromised without user neglect and/or consent.

We do not know of any equivalent research results or even commercial security systems with such properties. Based on that, we claim several significant research and development contributions to the state of the art of computer network security.

In the last two decades there have been many activities and contributions to protect data, messages and other resources in computer networks, to provide privacy for users, reliability, availability and integrity of resources, and to provide other security properties for network environments and applications. Governments, international organizations, private companies and individuals are investing a great deal of time, effort and budget to install and use various security products and solutions. However, in spite of all these needs, activities, on-going efforts and current solutions, it is a general belief that security in today's networks and applications is not adequate.

At the moment there are two general approaches to network application security. One approach is to enforce isolation of users, network resources and applications. In this category we have solutions like firewalls, intrusion-detection systems, port scanners, spam filters, virus detection and elimination tools, etc. The goal is to protect resources and applications by isolation after their installation in the operational environment. The second approach is to apply methodology, tools and security solutions already in the process of creating network applications. This approach includes methodologies for secure software design, ready-made security modules and libraries, rules for the software development process, and formal and strict testing procedures. The goal is to create secure applications even before their operational deployment. Current experience clearly shows that both approaches have failed to provide an adequate level of security, where users would be guaranteed to deploy and use secure, reliable and trusted network applications. Therefore, in the current situation, it is obvious that a new approach and new thinking towards creating strongly protected and guaranteed secure network environments and applications are needed.

In our research we have therefore taken an approach completely different from the two mentioned above. Our first principle is to use cryptographic protection of all application resources. Based on this principle, in our system data in local files and database tables are encrypted, messages and control parameters are encrypted, and even software modules are encrypted. The principle is that if all resources of an application are always encrypted, i.e. “enveloped in a cryptographic shield”, then:
- its software modules are not vulnerable to malware and viruses,
- its data are not vulnerable to illegal reading and theft,
- all messages exchanged in a networking environment are strongly protected, and
- all other resources of the application are also strongly protected.

Thus, we strongly protect applications and their resources before they are installed, after they are deployed, and also all the time during their use.

Furthermore, our methodology for creating such systems and applying total cryptographic protection is based on the design of security components in the form of generic security objects. First, each of those objects – data object or functional object – is itself encrypted. If an object is a data object, representing a file, database table, communication message, etc., its encryption means that its data are protected all the time. If an object is a functional object, like cryptographic mechanisms, an encapsulation module, etc., this principle means that its code cannot be damaged by malware. Protected functional objects are decrypted only on the fly, before being loaded into main memory for execution. Each of our objects is complete in terms of its content (data objects) and its functionality (functional objects), each supports multiple functional alternatives, they all provide transparent handling of security credentials and management of security attributes, and they are easy to integrate with individual applications. In addition, each object is designed and implemented using well-established security standards and technologies, so the complete system, created as a combination of those objects, is itself compliant with security standards and therefore interoperable with existing security systems.

By applying our methodology, we first designed the enabling components of our security system. They are collections of simple and composite objects that also interact with each other in order to provide various security services. The enabling components of our system are: Security Provider, Security Protocols, Generic Security Server, Security SDKs, and Secure Execution Environment. They are mainly the engine components of our security system, and they provide the same set of cryptographic and network security services to all other security-enhanced applications.

Furthermore, for our individual security objects and also for larger security systems, in order to prove their structural and functional correctness, we applied a deductive scheme for the verification and validation of security systems. We used the following principle: “if individual objects are verified and proven to be secure, if their instantiation, combination and operations are secure, and if the protocols between them are secure, then the complete system, created from such objects, is also verifiably secure”. Data and attributes of each object are protected and secure, and they can only be accessed by authenticated and authorized users in a secure way. This means that the structural security properties of objects can be verified upon their installation. In addition, each object is maintained and manipulated within our secure environment, so each object is protected and secure in all its states, even after its closing state, because the original objects are encrypted and their data and states stored in a database or in files are also protected.

Formal validation of our approach and our methodology was performed using a threat model. We analyzed our generic security objects individually and identified various potential threats to their data, attributes, actions and states. We also evaluated the behavior of each object against potential threats and established that our approach provides better protection than some alternative solutions against the various threats mentioned. In addition, we applied the threat model to our composite generic security objects and secure network applications, and showed that the deductive approach provides a better methodology for designing and developing secure network applications. We also quantitatively evaluated the performance of our generic security objects and found that a system developed using our methodology performs cryptographic functions efficiently.

We have also solved some additional important aspects required for the full scope of security services for network applications and cloud environments: manipulation and management of cryptographic keys, execution of encrypted software, and even secure and controlled collaboration of our encrypted applications in cloud computing environments. During our research we have created a set of development tools and a development methodology which can be used to create cryptographically protected applications. The same resources and tools are also used as a run-time supporting environment for the execution of our secure applications. We call such a total cryptographic protection system for the design, development and run-time of secure network applications the CryptoNET system.

The CryptoNET security system is structured in the form of components categorized into three groups: Integrated Secure Workstation, Secure Application Servers, and Security Management Infrastructure Servers. Our enabling components provide the same set of security services to all components of the CryptoNET system.

The Integrated Secure Workstation is designed and implemented in the form of a collaborative secure environment for users. It protects local IT resources, messages and operations for multiple applications. It comprises the four most commonly used PC applications as client components: Secure Station Manager (equivalent to Windows Explorer), Secure E-Mail Client, Secure Web Browser, and Secure Documents Manager. For their security extensions, these four client components use functions and credentials of the enabling components in order to provide standard security services (authentication, confidentiality, integrity and access control) and also additional, extended security services, such as transparent handling of certificates, use of smart cards, a Strong Authentication protocol, a Security Assertion Markup Language (SAML) based Single Sign-On protocol, secure sessions, and other security functions.

Secure Application Servers are the components of our secure network applications: Secure E-Mail Server, Secure Web Server, Secure Library Server, and Secure Software Distribution Server. These servers provide application-specific services to client components. Some of the common security services provided by Secure Application Servers to client components are the Single Sign-On protocol, secure communication, and user authorization. In our system, application servers are installed in a domain, but they can also be installed in a cloud environment as services. Secure Application Servers are designed and implemented using the concept and implementation of the Generic Security Server, which provides extended security functions using our engine components. By adopting this approach, the same set of security services is available to each application server.

Security Management Infrastructure Servers provide domain-level and infrastructure-level services to the components of the CryptoNET architecture. They are standard security servers, known as cloud security infrastructure, deployed as services in our domain-level cloud environment.

The CryptoNET system is complete in terms of the functions and security services that it provides. It is internally integrated, so that the same cryptographic engines are used by all applications. Finally, it is completely transparent to users – it applies its security services without expecting any special interventions by users. In this thesis, we developed and evaluated the secure network applications of our CryptoNET system and applied the threat model to their validation and analysis. We found that the deductive scheme of using our generic security objects is effective for the verification and testing of secure, protected and verifiably secure network applications. Based on all these theoretical research and practical development results, we believe that our CryptoNET system is completely and verifiably secure and, therefore, represents a significant contribution to the current state of the art of computer network security. / QC 20110427
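The abstract above describes CryptoNET's central design element, the generic security object, only at a conceptual level. As a rough illustration of the underlying idea – a resource that is stored only in encrypted form and decrypted transiently when it is actually used – the following minimal Python sketch may help. It uses symmetric encryption from the third-party `cryptography` package; the class and method names are hypothetical and not taken from the CryptoNET implementation.

    # Hypothetical sketch of a "generic security object": the wrapped payload
    # (file contents, a message, or a serialized software module) is stored only
    # in encrypted form and is decrypted transiently when it is actually used.
    # Requires the third-party `cryptography` package.
    from cryptography.fernet import Fernet


    class SecureObject:
        def __init__(self, key: bytes, plaintext: bytes):
            self._cipher = Fernet(key)
            # The payload is kept only as ciphertext ("enveloped in a cryptographic shield").
            self._ciphertext = self._cipher.encrypt(plaintext)

        def use(self, action):
            # Decrypt "on the fly", hand the plaintext to the caller's action,
            # and drop the reference to it immediately afterwards.
            plaintext = self._cipher.decrypt(self._ciphertext)
            try:
                return action(plaintext)
            finally:
                del plaintext


    if __name__ == "__main__":
        key = Fernet.generate_key()
        record = SecureObject(key, b"patient record: ...")
        print(record.use(len))          # the data is readable only inside use()
        print(record._ciphertext[:16])  # what is stored at rest is opaque ciphertext

In CryptoNET the same envelope principle is applied not only to data objects but also to functional objects, whose code is decrypted only just before it is loaded into memory for execution.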

IT i Geografiundervisningen : Vad görs idag enligt lärare och elever / IT in geography teaching: what is being done today according to teachers and pupils

Näsström, Erik January 2014
Teachers today live and work in a society where the development of information technology is accelerating. More and more schools get broadband connections that make it possible to stream films and other material from the internet. But it is not only within technical development that changes occur. Countries change their borders or break apart into new ones, and in many Swedish classrooms there are maps that have long been obsolete. There are therefore great opportunities in this area to combine information technology and geography into a whole: instead of maps with incorrect information in the classroom, there is now access to new, updated maps via the internet. The research question was how information technology is used as support in geography teaching in Swedish schools. To investigate this, a quantitative study in the form of a questionnaire survey was used. Invitations were sent out to a number of teachers around Sweden; pupils were later also approached via an internet forum to further increase the response rate. The questionnaire consisted of a number of questions, one version for teachers and a similar one for pupils. It turned out that teachers and pupils answered the questions very similarly. Although the response rate was unacceptably low at 22%, the results still have good validity since the respondents are spread across several localities. To increase reliability, it would have been best to combine the quantitative study with a more qualitative one, such as interviews. What the results of this study show is that both teachers and pupils have good knowledge of various free software programs, all of which relate to the subject of geography. Both groups also believe, to varying degrees, that both teaching and the learning situation would improve if more information technology were used. In both cases, the pupils were more positive about this than the teachers.

IT-strategier i litteraturen och praktiken / IT strategies in the literature and in practice

Modén, Frida January 2008
The IT market changes extremely quickly and new technologies are constantly being developed. The same applies to an organization: it is dynamic and changes all the time. An IT strategy can help manage these changes and facilitate and improve the work with IT.

The purpose of the study is to examine how work with IT strategies should be carried out according to the literature and to practice, and to develop guidelines that can help organizations working with IT strategies. To answer the study's research question, a literature review and interviews with three IT managers were conducted. The result consists of a list of guidelines for how work on IT strategies should be carried out. It turned out that the literature and practice have roughly the same view of how this should be done, although they complement each other to some extent. The biggest difference was that not everyone in practice agrees that it is vital to have an IT strategy.

Informationsflöden inom mobila-stationära organisationer / Information flows within mobile-stationary organizations

Härgestam, Mikael, Eckeskog, Joel January 2011
A good flow of information is one of the key aspects of how we define a well-functioning organization. The information flow can be internal as well as external, between actors both within and outside the organization. In a logistics organization this flow of information is just as important as in any other kind of organization. In this paper we have explored information flows in logistics organizations by applying boundary spanning and the mobile-stationary divide perspective. In boundary spanning the main goal is to erase or relocate some of the boundaries within an organization, in our case to improve the flow of information both internally and externally. The mobile-stationary divide is well applicable to the logistics business, where one part of the organization is stationary and the other is mobile. With this in mind, two research questions emerged for us to answer: How does information flow across organizational boundaries within the logistics business? And how can IT support the flow of information across organizational boundaries in a mobile-stationary organization? This paper contributes with the identification of two mobile-stationary divides: between factories and the logistics business, as well as within the logistics business itself. In addition, we identify a mobile-mobile divide. Our research also shows that IT could support the flow of information between the organizations, through computers and applications as well as smartphones. All our interviewees view IT as something that enhances their work and that would be welcomed if implemented. We believe that with this trust in and acceptance of IT, it would not be impossible to decrease the mobile-stationary and mobile-mobile divides in the logistics business.

Framtidens boende ur ett IT-perspektiv / Housing of the future from an IT perspective

Johansson, Anders January 1999
This report studies the changes that our future housing in the knowledge society may undergo. The study focuses on the changes that follow in the wake of developments in information technology. The problems addressed in the report concern the plans and visions that futurologists have regarding information technology in the home. These plans and visions are compared with the plans of the housing companies. The report also examines whether there is preparedness for the demands that knowledge-society workers are likely to place on the home in the future. The results show that our homes may be subject to computerization in the near future. The results also show that the plans and visions put forward by futurologists correspond well with the plans of the housing companies. As regards the design of the home, the results show that there is good preparedness for architectural change. The report also gives a picture of what living in the knowledge society may look like.

Quantum Communication Networks

Rafiei, Nima January 2008
Quantum communication protocols invoke one of the most fundamental laws of quantum mechanics, namely the superposition principle, which leads to the no-cloning theorem. During the last three decades, quantum cryptography has gone from prospective theories to practical implementations scalable for real communication. Scientists from all over the world have contributed to this major progress, starting with Stephen Wiesner, Charles H. Bennett and Gilles Brassard, who developed the theory of Quantum Key Distribution (QKD). QKD lets two users share a key through a quantum channel (free space or fiber link) under unconditionally secure circumstances. They can use this key to encode a message which they thereafter share through a public channel (internet, telephone, ...). Research developments have gone from ordinary 2-User Quantum Key Distribution over very small free-space distances to distances over 200 km in optical fiber, and to Quantum Key Distribution Networks.

As great experimental achievements have been made regarding QKD protocols, a new quantum communication protocol has been developed, namely Quantum Secret Sharing. Quantum Secret Sharing is an extension of an old cryptography scheme called Secret Sharing. The aim of secret sharing is to split a secret amongst a set of users in such a way that the secret is only revealed if every user of this set is ready to collaborate and share their part of the secret with the other users.

We have developed a 5-User QKD Network through birefringent single-mode fiber in two configurations, one being a Tree configuration and the other a Star configuration. In both cases, the number of users, the distances between them and the stability of our setup are all well competitive with the current worldwide research involving similar work. We have also developed a Single Qubit Quantum Secret Sharing scheme with phase encoding through single-mode fiber with 3, 4 and 5 parties. The latter is, to the best of our knowledge, the first time a 5-Party Single Qubit Quantum Secret Sharing experiment has been realized.
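Quantum Secret Sharing extends the classical secret-sharing idea mentioned in the abstract above, in which a secret is revealed only when every party contributes its share. The classical (n, n) construction can be sketched in a few lines of Python using XOR; this is only an illustration of the classical scheme, and assumes nothing about the single-qubit, phase-encoded protocol realized in the thesis.

    # Classical (n, n) secret sharing by XOR: n - 1 shares are uniformly random
    # and the last share is chosen so that XOR-ing all n shares reproduces the
    # secret. Any subset of fewer than n shares is statistically independent of
    # the secret, so all parties must collaborate to reveal it.
    import secrets
    from functools import reduce


    def xor_bytes(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))


    def split(secret: bytes, n: int) -> list[bytes]:
        shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
        shares.append(reduce(xor_bytes, shares, secret))
        return shares


    def combine(shares: list[bytes]) -> bytes:
        return reduce(xor_bytes, shares)


    if __name__ == "__main__":
        secret = b"meet at dawn"
        shares = split(secret, 5)
        assert combine(shares) == secret   # all 5 parties together recover it
        print(combine(shares[:4]))         # any 4 shares alone look random

In the quantum protocols studied in the thesis, the secrecy of the distributed key or shares rests on quantum mechanics (the superposition principle and the no-cloning theorem mentioned above) rather than on computational assumptions.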

Attityder till IT bland historielärare på gymnasiet / Attitudes towards IT among upper-secondary school history teachers

Bjuväng, Niclas January 2007
This paper examines how upper-secondary school history teachers view IT as a pedagogical tool. Through a qualitative study with interviews and a quantitative survey with 103 respondents, I found that a large proportion of these teachers use IT in their teaching, although how often they use IT as a pedagogical tool varies. A large proportion of the respondents answered that they would like to take a short course to gain knowledge and tips on how to work with IT in history teaching. Access to computer rooms was good, and a large proportion of the respondents considered that there were advantages to using IT as a pedagogical tool. The conclusion is that most teachers use IT in their teaching today, but they thirst for more training in the area. Moreover, the use of IT was focused on communicating by e-mail and searching for information on the Internet. / This paper investigates attitudes towards IT amongst high school history teachers. By conducting interviews and a quantitative survey with 103 participants, I found that a large part of the participants use IT in their teaching. A large part of the participants also answered that they would like to take a shorter course to learn how to use IT in their teaching. The conclusion is that a lot of the history teachers are using IT in their teaching, but want more education in the IT area. The use of IT was also focused on e-mail communication and searching for information on the Internet.

Redovisnings- och revisionsbranschens påverkan av digitalisering / The impact of digitization in the accounting and auditing industry

Halvars, Viktoria, Svantorp, Petra January 2016
Previous research has shown that technological development has affected many industries. We have chosen to focus on one particular industry, and the purpose of this study is therefore to explain and understand how the accounting and auditing industry has been affected by the advance of digitization. The study builds on three research questions: the first examines how the accounting and auditing industry has developed and changed during the 2000s, the second examines important factors to consider when implementing digitization, and the third examines what change accounting consultants and auditors are facing. The study is based on empirical data collected from accounting consultants and auditors and answers the three research questions on the basis of the theoretical frame of reference. To achieve the purpose of the study, a qualitative approach was chosen, in which 11 semi-structured interviews were conducted with both accounting consultants and auditors. To gain a deeper understanding of the subject, relevant concepts and theories are discussed in a theoretical frame of reference. The analysis is then based on theory and on quotations from the informants. Based on the informants' perceptions, our conclusion is that the accounting and auditing industry has been affected by the advance of digitization, above all through changed work tasks and through the accessibility and mobility that digitized working methods bring, which have given the informants more freedom in their work. Unlike previous research on the subject, we have noticed that the auditing industry lags somewhat behind the accounting industry in its work to implement digitized working methods. / Previous research has shown that technological development has affected many industries. We have chosen to focus on a particular industry, and the main purpose of this study is to explain and understand how the accounting and auditing industry has been affected by digitization. The study consists of three research questions: the first explores how the accounting and auditing profession has changed during the 2000s, the second examines the key factors to consider in the implementation of digitization, and the third examines the change that accounting consultants and auditors are facing. The study is based on empirical data collected from accounting consultants and auditors, and the three research questions are grounded in our theoretical framework. In order to answer our research questions and main purpose, our study has a qualitative approach. To get a deeper understanding of our topics, we have collected relevant theories in a theoretical framework. We have conducted eleven semi-structured interviews with both accounting consultants and auditors. The analysis is based on our theoretical framework and our empirical data. Based on the informants' perceptions, our conclusion is that the accounting and auditing industry has been affected by digitization in many ways. Unlike previous research, we have noticed that the auditing industry is behind the accounting industry when it comes to digitization in daily work activities.

Hantering av riskfaktorer för små och medelstora företag vid IT outsourcing / Managing risk factors for small and medium-sized enterprises in IT outsourcing

Wallin, Patrik, Lycksén, Anna January 2010
Course: Mälardalen University, EIK024, Master's thesis in IT economics

Authors: Anna Lycksén & Patrik Wallin, Västerås

Title: Hantering av riskfaktorer för små och medelstora företag vid IT Outsourcing

Supervisor: Therese Hansen

Keywords: IT outsourcing, SME, risk factor, management

Problem: Outsourcing of business processes such as IT has existed for a long time, but the focus has previously been on large companies, for a number of reasons. First, outsourcing providers wanted to win large contracts in order to achieve economies of scale, so that their own costs could be kept as low as possible. Second, the smaller outsourcing providers that tried to enter the market for small and medium-sized enterprises (SMEs) did not succeed in offering services that were considered attractive enough for SMEs. Finally, most SMEs considered that they had nothing to gain from outsourcing. Times have changed, however. IT outsourcing for SMEs is becoming increasingly common, yet relatively little research is being done in the area. In view of the above, the authors consider the area to be both relevant and interesting to address.

Purpose: The purpose of the thesis is to describe and analyze, based on the three phases of IT outsourcing, the management of risk factors for small and medium-sized enterprises (SMEs) in IT outsourcing, and thereby highlight what SMEs should consider in order to succeed with IT outsourcing.

Theory: The theoretical frame of reference begins with a conceptual framework in which concepts such as IT outsourcing, SME, risk and risk factor are defined. The theory section also contains a compilation of the three phases of IT outsourcing, where risk factors and management measures are linked to each phase. Leading references are Chou & Chou, Aubert, Palvia and Bahli & Rivard.

Method: The study consists of on-site interviews and telephone interviews. Interviews were conducted at the companies Arvid Nordquist, Företag X, Gärdin & Persson and Byggbeslag.

Results: The results of the study show that it is important for SMEs to prepare carefully and then choose the right provider for IT outsourcing. The results also indicate that it is of great importance for SMEs to value and retain
