311 |
Cyber-security protection techniques to mitigate memory errors exploitation / Marco Gisbert, Héctor. 04 November 2016 (has links)
Thesis by compendium / [EN] Practical experience in software engineering has demonstrated that the goal of
building totally fault-free software systems, although desirable, is impossible
to achieve. Therefore, it is necessary to incorporate mitigation techniques in
the deployed software, in order to reduce the impact of latent faults.
This thesis makes contributions to three memory corruption mitigation
techniques: the stack smashing protector (SSP), address space layout
randomisation (ASLR) and automatic software diversification.
The SSP is a very effective protection technique used against stack buffer
overflows, but it is prone to brute force attacks, particularly the dangerous
byte-for-byte attack. A novel modification, named RenewSSP, has been proposed
which eliminates brute force attacks, can be used in a completely transparent
way with existing software and has negligible overheads. There are two
different kinds of application for which RenewSSP is especially beneficial:
networking servers (tested in Apache) and application launchers (tested on
Android).
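The danger of the byte-for-byte attack comes down to arithmetic: when a forking server reuses the same canary in every child process, the attacker can confirm one byte at a time. A back-of-the-envelope sketch of the expected costs; the 32-bit canary size is an illustrative assumption, not a figure from the thesis:

```python
# Expected-cost comparison: guessing a stack canary whole vs. byte-for-byte.
# Illustrative sketch; a forking server that keeps the same canary in every
# child lets an attacker confirm each byte independently (the child crashes
# only if the byte currently being probed is wrong).

CANARY_BYTES = 4          # 32-bit canary, for illustration
VALUES_PER_BYTE = 256

# Whole-canary guessing: one success among 256**4 possibilities,
# expected tries are about half the search space.
whole_expected = VALUES_PER_BYTE ** CANARY_BYTES / 2

# Byte-for-byte: expected tries simply add up per byte.
byte_for_byte_expected = CANARY_BYTES * VALUES_PER_BYTE / 2

print(f"whole-canary expected tries:  {whole_expected:.0f}")    # 2147483648
print(f"byte-for-byte expected tries: {byte_for_byte_expected:.0f}")  # 512
```

Renewing the canary on every fork, which is the RenewSSP idea, removes the byte-at-a-time confirmation and pushes the attacker back to the whole-canary cost.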
ASLR is a generic concept with multiple designs and implementations. In this
thesis, the two most relevant ASLR implementations of Linux have been analysed
(Vanilla Linux and PaX patch), and several weaknesses have been found. Taking
into account technological improvements in execution support (compilers and
libraries), a new ASLR design has been proposed, named ASLR-NG, which
maximises entropy, effectively addresses the fragmentation issue and removes a
number of identified weaknesses. Furthermore, ASLR-NG is transparent to
applications, in that it preserves binary code compatibility and does not add
overheads. ASLR-NG has been implemented as a patch to the Linux kernel 4.1.
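ASLR strength is commonly quantified as entropy in bits, the base-2 logarithm of the number of equally likely base addresses. A small sketch of that calculation; the zone size and alignment below are illustrative assumptions, not figures from the thesis or from Linux 4.1:

```python
import math

# Entropy of a randomized mapping: log2 of the number of equally likely,
# suitably aligned base addresses within the randomization zone.

PAGE = 4096  # 4 KiB pages: page alignment removes 12 bits of freedom

def entropy_bits(zone_size_bytes: int, alignment: int = PAGE) -> float:
    """Entropy of a base address drawn uniformly from a zone."""
    return math.log2(zone_size_bytes // alignment)

# A hypothetical 1 TiB zone with page-aligned bases gives 28 bits:
print(entropy_bits(1 << 40))  # -> 28.0

# Each extra bit doubles the expected cost of a brute-force probe,
# since guessing succeeds on average after half the candidates.
print((2 ** entropy_bits(1 << 40)) / 2)  # -> 134217728.0
```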
Software diversification is a technique that covers a wide range of faults,
including memory errors. The main problem is how to create variants,
i.e. programs which have identical behaviours on normal inputs but
where faults manifest differently. A novel form of automatic variant
generation has been proposed, using multiple cross-compiler suites and
processor emulators.
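The variant-comparison idea can be sketched as follows; the "variants" here are plain Python functions standing in for binaries built by different cross-compiler suites, and the injected fault is hypothetical:

```python
from collections import Counter

# Diversification sketch: variants must agree on normal inputs; a latent
# fault is detected the moment their outputs diverge, and majority voting
# can mask it.

def variant_a(x): return x * x
def variant_b(x): return x ** 2
def variant_c(x): return x * x if x < 100 else x * x + 1  # injected fault

def run_diversified(variants, x):
    """Run all variants, majority-vote the answer, report divergence."""
    outputs = [v(x) for v in variants]
    winner, votes = Counter(outputs).most_common(1)[0]
    diverged = votes < len(variants)
    return winner, diverged

variants = [variant_a, variant_b, variant_c]
print(run_diversified(variants, 7))     # (49, False): all variants agree
print(run_diversified(variants, 1000))  # (1000000, True): fault detected
```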
One of the main goals of this thesis is to create applicable results.
Therefore, I have placed particular emphasis on the development of real
prototypes in parallel with the theoretical study. The results of this thesis
are directly applicable to real systems; in fact, some of the results have
already been included in real-world products. / Marco Gisbert, H. (2015). Cyber-security protection techniques to mitigate memory errors exploitation [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/57806 / Compendio
|
312 |
Discovering Location Patterns in iOS Users Utilizing Machine Learning Methods For Purposes of Digital Forensics Investigations / Milos Stankovic (9741251). 06 August 2024 (has links)
<p dir="ltr">The proliferation of mobile devices and big data has put digital forensic investigators at a disadvantage. Despite all the technological advances, the tools and methods used during investigations have yet to catch up. With smartphones becoming integral to crime scenes, often in multiple instances, courts and law enforcement offices depend heavily on their data. In addition to traditional data on smartphones, such as call logs, text messages, and emails, sensor data can drastically increase the chances of resolving events and painting the complete picture required for a successful investigation. While sensor data are collected frequently, they often create a lot of noise because of the volume of entries accumulated over time. In attempting to decipher the data and link them to the relevant events, digital forensic investigators are prone to missing or simply disregarding the data extracted from smartphones. Interpreting sensor data such as location and various phone activities, already collected and extracted, can lead to the two main links required for an investigation: time and location. Knowing an individual's time and location can significantly improve the investigation process and its final outcome. Although smartphones are capable of collecting sensor data that reveal these two variables, the interpretation of and correlation between the data still need improvement. This is particularly true for smartphones running newer operating system versions. Because special forensic software is required to extract the data, and further expertise to interpret them, digital forensic investigators are either strained for time or unequipped to process them.</p><p dir="ltr">In order to mitigate this gap, it is necessary to automate a process capable of handling large amounts of data while classifying the time and location relevant to the investigation.
Reducing investigation times and increasing prediction accuracy will allow faster resolution while freeing up desperately needed resources for digital forensic investigators. Therefore, this study presents a novel approach to identifying and predicting user locations using machine learning based on various sensor data collected from multiple smartphones. As the first step towards this goal, a user study was conducted to collect real-world data for training and testing the machine learning models. The process includes engineering the procedures and methodologies required to extract raw data and process them for successful model training. The results showed that the models are capable of differentiating between three different locations, with XGBoost reaching a test accuracy above 0.88. Additionally, Random Forest Entropy and Random Forest Gini achieved accuracies above 0.85. For the setting where only two locations were predicted, Random Forest Entropy and Random Forest Gini each achieved a test accuracy above 0.97.</p>
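The two Random Forest variants named above differ only in the impurity measure minimized at each split. A minimal sketch of both criteria, computed over made-up class counts for a three-location setting like the study's:

```python
import math

# Split criteria behind "Random Forest Gini" and "Random Forest Entropy":
# both measure how mixed a node's class labels are; a split is chosen to
# reduce the impurity the most. Counts below are invented for illustration.

def gini(counts):
    """Gini impurity of a node with the given per-class counts."""
    total = sum(counts)
    return 1.0 - sum((c / total) ** 2 for c in counts)

def entropy(counts):
    """Shannon entropy (bits) of a node with the given per-class counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c)

pure = [30, 0, 0]     # all samples from one location: impurity is zero
mixed = [10, 10, 10]  # samples spread evenly across three locations

print(f"gini:    pure={gini(pure)}  mixed={gini(mixed):.3f}")  # 0.0 vs 0.667
print(f"entropy: mixed={entropy(mixed):.3f} bits")             # log2(3) ~ 1.585
```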
|
313 |
Social Exchange Theory in the Context of X (Twitter) and Facebook Social Media Platforms with a Focus on Privacy Concerns among Saudi Students / Alqahtani, Sameer Mohammed S. 12 1900 (has links)
The current research examines the use of social media and its security settings using Social Exchange Theory (SET) within a Saudi student environment. The dissertation comprises an introduction, literature review, methodology, results, and conclusion, with the results section presenting the findings from three essays. The first essay applies the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) methodology to the SET literature; PRISMA's systematic and exhaustive approach to literature evaluation increases the likelihood of obtaining high-quality, reproducible findings. The second essay, which focuses on awareness of X's (Twitter) security settings, uses a quantitative research approach: a sample of former and current Saudi students (graduate and undergraduate) at the University of North Texas participated, and statistical analysis of their responses provides an empirical examination of the use of X (Twitter) and its security features within this community. The third essay examines the use of Facebook's security settings with the same sample of Saudi students from the University of North Texas; keeping a consistent sample across both studies enables comparison and a deeper understanding of this group's security awareness and practices across social media platforms. Taken together, the findings extend our understanding of the role of culture in privacy and security concerns related to social media.
|
314 |
Open Source Security and Quality Assessment : Analysera beroenden i program / Open Source Security and Quality Assessment : Analyze dependencies in programs / Gibro, Edvin, Glansholm, Bacilika, Holta, Viktor, Karlsson, Simon, Kjellin, Jessica, Randow, Max, Simonson, Erik, Söderström, Jakob. January 2024 (has links)
This report covers the project Open Source Security and Quality Assessment (OSSQA), developed on behalf of the cybersecurity company Advenica AB. The project was carried out by eight students in the course TDDD96, Bachelor's Project in Software Engineering, at Linköping University during the spring of 2024. The report presents the project's execution, results, and conclusions, and goes into detail on, among other things, how value was created for the customer, the experiences the group members gathered during the project, and other relevant aspects. The result of the project is a product that analyses software projects and scores their dependencies, ultimately presenting an assessment of the security and quality of the whole project. The resulting product was judged to have good usability, measured at 77.5 points with the SUS (System Usability Scale) method. The customer thus received a delivered product that met their expectations and was easy both to use and to develop further. The project group also gathered many valuable experiences.
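As a rough illustration of the kind of aggregation such a tool might perform, the sketch below scores dependencies and combines them into a project-level result; the scoring scheme and the dependency names are invented, not the formula OSSQA actually uses:

```python
# Hypothetical dependency scoring: each dependency gets a 0-10 score and
# a usage weight; the project is summarized by a weighted mean and by the
# weakest-link minimum, since one bad dependency can sink the whole build.

deps = {
    "libfoo": {"score": 8.2, "weight": 3},  # heavily used, well maintained
    "libbar": {"score": 6.5, "weight": 1},
    "libbaz": {"score": 4.1, "weight": 2},  # the risky dependency
}

weighted_mean = sum(d["score"] * d["weight"] for d in deps.values()) \
              / sum(d["weight"] for d in deps.values())
weakest_link = min(d["score"] for d in deps.values())

print(f"weighted mean score: {weighted_mean:.2f}")  # 6.55
print(f"weakest link:        {weakest_link}")       # 4.1
```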
|
315 |
Styrning av cyberrisker i svensk offentlig sektor : En kvalitativ intervju och dokumentstudie om hur svenska offentliga organisationer styr avseende cyberrisker / Governance of cyber risks in the Swedish public sector : A qualitative interview and document study on how Swedish public organizations govern cyber risks / Giordano, Simon, Forsman, Frej. January 2024 (has links)
Background: Cyberattacks have increased significantly in recent years among Swedish public sector organizations, heightening the need for robust governance of cyber risks. Cyber risks are particularly complex and dynamic, requiring strong leadership support and strategic planning. Previously, cyber risks have often been addressed from an IT perspective, whereas this study approaches them from a governance perspective. Purpose: The purpose of the study is to map and increase knowledge about how authorities and regions govern cyber risks. The aim is to contribute a practical, usable conceptual model and to complement the theoretical literature on ERM (enterprise risk management). Methodology: The study was conducted using a qualitative approach. The empirical data were collected through a combination of document analysis and semi-structured interviews, with respondents selected for their high competence in cyber risks or governance. The theoretical material was gathered from previous research in articles and books related to the governance of cyber risks. Conclusion: Public organizations govern cyber risks through laws, policies, and internal models, but there are no unified requirements or frameworks; those in use must be adapted to each organization's specific needs. The study's conceptual model for cyber risk governance is proposed to be circular and continuously adaptable, focusing on strategy, identification, evaluation, and prioritization. Culture and communication are central governance elements, with revision and follow-up emphasized as critical steps. Collaboration between public organizations on joint data storage is recommended to facilitate risk management. Finally, the risk-reducing measures differ considerably depending on the governance tools used.
|
316 |
Secure Satellite Communication : A system design for cybersecurity in space / Wallin, Lucas. January 2024 (has links)
This thesis presents an in-depth exploration of designing a cybersecurity system for satellite communication, addressing cyber threats as the space industry transitions from security by obscurity in mission-specific designs to the use of mass-produced components. To counteract these threats, a comprehensive security system must be implemented, considering all facets of satellite communication, from key management and encryption to digital signatures, digital certificates, and hardware security modules (HSMs). The role of HSMs in securely storing cryptographic keys and performing cryptographic operations is emphasized, highlighting their importance in protecting sensitive data. A partial implementation of the digital signature component demonstrates the practical importance of using HSMs for key storage, underscoring the feasibility of the proposed system in real-world applications. The findings indicate that established protocols and algorithms, when combined effectively, can provide robust security solutions for satellite communication. This research contributes to the development of secure satellite communication systems by offering a detailed security design tailored to the specific needs and challenges of the space environment. It provides a framework for future implementations, ensuring that satellite systems can operate securely and efficiently in an increasingly interconnected and vulnerable digital landscape.
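The authenticate-then-act pattern at the heart of such a design can be sketched briefly. Python's standard library has no asymmetric primitives, so an HMAC stands in for the HSM-backed digital signatures the thesis describes; in a real system the key would be generated and held inside the HSM and never exposed to application code:

```python
import hmac
import hashlib
import secrets

# Sketch: every telecommand is signed on the ground and verified on board
# before execution. HMAC-SHA256 is a symmetric stand-in for the asymmetric
# signature scheme; the command string is a made-up example.

key = secrets.token_bytes(32)  # in practice: generated and held by the HSM

def sign(command: bytes) -> bytes:
    return hmac.new(key, command, hashlib.sha256).digest()

def verify(command: bytes, tag: bytes) -> bool:
    expected = hmac.new(key, command, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)  # constant-time comparison

cmd = b"SET_ATTITUDE roll=1.5 pitch=0.0"
tag = sign(cmd)
print(verify(cmd, tag))                     # True: genuine command accepted
print(verify(b"SET_ATTITUDE roll=99", tag)) # False: tampered command rejected
```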
|
317 |
British Library Unplugged : A Media Analysis of Institutional Pressures during a Cyber Attack on a National Library / Lindström, Emilie, Spirkina, Sasha. January 2024 (has links)
This thesis explores the legitimacy of national libraries by analysing the media's portrayal of the British Library during a major cyber attack by the Rhysida group in October 2023. Using diverse media sources, the research examines how media narratives reflect institutional pressures during prolonged disruption. The research employs a mixed-method approach, combining quantitative mapping of media coverage with qualitative thematic analysis. The mapping categorises news articles based on content type, publication section, and perspectives represented. Thematic analysis identifies key themes such as the disruption of library services, cybersecurity concerns, and critiques of digital fragility. The findings reveal a complex interplay between the library's historical role as a national institution and its modern digital vulnerabilities. Additionally, the study discusses the broader implications of digital practices for the institutional identity of libraries, and the perceived responsibilities of national libraries in safeguarding cultural and intellectual heritage against cyber threats.
|
318 |
MODELING RISK IN THE FRONT-END OF THE OSS DEBIAN SUPPLY-CHAIN USING MODELS OF NETWORK PROPAGATION / Sahithi Kasim (18859078). 24 June 2024 (has links)
<p dir="ltr">Our research revolves around the evolving landscape of Open-Source Software (OSS) supply chains, emphasizing their critical role in contemporary software development while investigating the escalating security concerns associated with their integration. As OSS continues to shape the software ecosystem, our research acknowledges the paradigm shift in the software supply chain, highlighting its complexity and the associated security challenges. Focusing on Debian packages, we employ advanced network-science methods to comprehensively assess the structural dynamics and vulnerabilities within the OSS supply chain. The study is motivated by the imperative to understand, model, and mitigate security risks arising from interconnected software components.</p><p dir="ltr">Our research questions address 1) identifying high-risk packages, 2) comparing risk profiles between source and build stages, and 3) predicting future vulnerabilities. Data collection covers the source code repositories, build-info metadata, and vulnerability data of Debian packages. Leveraging a multifaceted methodology, we perform graph construction, subsampling, metric creation, exploratory data analysis, and statistical investigation of the Debian package network. This statistical approach integrates the Wilcoxon test, the Chi-Square test, and advanced network-dynamics modeling with machine learning to explore evolving trends and correlations between different stages of the OSS supply chain.</p><p dir="ltr">Our goals include providing actionable insights for industry practitioners, policymakers, and developers to enhance risk management in the OSS supply chain. The expected outcomes encompass an enriched understanding of vulnerability propagation, the identification of high-risk packages, and the comparison of network-based risk metrics against traditional software engineering measures. 
Ultimately, our research contributes to the ongoing discourse on securing open-source ecosystems, offering practical strategies for risk mitigation and fostering a safer and more resilient OSS supply chain.</p>
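The propagation idea underlying this kind of study can be sketched as a reachability computation over an inverted dependency graph; the packages and edges below are a toy example, not Debian data:

```python
from collections import defaultdict

# Vulnerability propagation sketch: an edge a -> b means "a depends on b",
# so a vulnerability in b reaches every package that can walk back to it.

depends_on = {
    "app1": ["libssl", "libxml"],
    "app2": ["libxml"],
    "libxml": ["libc"],
    "libssl": ["libc"],
    "libc": [],
}

# Invert the edges: who is exposed if a given package is vulnerable?
exposed_by = defaultdict(set)
for pkg, deps in depends_on.items():
    for d in deps:
        exposed_by[d].add(pkg)

def blast_radius(vulnerable: str) -> set:
    """All packages that transitively depend on the vulnerable one."""
    hit, stack = set(), [vulnerable]
    while stack:
        for parent in exposed_by[stack.pop()]:
            if parent not in hit:
                hit.add(parent)
                stack.append(parent)
    return hit

print(sorted(blast_radius("libc")))   # every other package is exposed
print(sorted(blast_radius("libxml"))) # ['app1', 'app2']
```

Ranking packages by blast-radius size is one simple network-based proxy for the "high-risk package" question posed above.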
|
319 |
Cyberattacks in international relations / Edelman, Ross David. January 2013 (has links)
New methods of conflict and coercion can prompt tectonic shifts in the international system, reconfiguring power, institutions, and norms of state behavior. Cyberattacks, coercive acts that disrupt or destroy the digital infrastructure on which states increasingly rely, have the potential to be such a tool — but only if put into practice. This study examines which forces in the international system might restrain state use of cyberattacks, even when they are militarily advantageous. To do so I place this novel technology in the context of existing international regimes, employing an analogical approach that identifies the salient aspects of cyberattacks, and compares them to prior weapons and tactics that share those attributes. Specifically, this study considers three possible restraints on state behavior: rationalist deterrence, the jus ad bellum regime governing the resort to force, and incompatibility with the jus in bello canon of law defining just conduct in war. First, I demonstrate that cyberattacks frustrate conventional deterrence models, and invite, instead, a novel form of proto-competition I call ‘structural deterrence.’ Recognizing that states have not yet grounded their sweeping claims about the acceptability of cyberattacks in any formal analysis, I consider evidence from other prohibited uses of force or types of weaponry to define whether cyberattacks are ‘legal’ in peacetime or ‘usable’ in wartime. Whereas previous studies of cyberattacks have focused primarily on policy guidance for a single state or limited analysis of the letter of international law, this study explicitly relates international law to state decision-making and precedent. It draws together previously disparate literature across strategic studies, international law, and diplomatic history to offer conclusions applicable beyond any single technology, and of increasing importance as states' dependence on technology grows.
|
320 |
No protection, nu business : An event study on stock volatility reactions to cyberattacks between 2010 and 2015 for firms listed in the USA / Collin, Erik, Juntti, Gustav. January 2016 (has links)
With the surge of Internet-based corporate communication, organization, and information management, financial markets have undergone radical transformation. In the interconnected economy of today, market participants are forced to accept cyberattacks, data breaches, system failures, or security flaws as any other (varying) cost of doing business. While cyberspace encompasses practically any firm in developed economies and a large portion in developing ones, combatting such risks is deemed a question of firm-specific responsibility: the situation resembles an ‘every man for himself’ scenario. Consulting standard financial theory, rational utility-maximizing investors assume firm-specific (idiosyncratic) risk under expectations of additional compensation for shouldering such risk – they are economically incentivized. The omnipresence of cyberattacks challenges fundamental assumptions of the Capital Asset Pricing Model, Optimal Portfolio Theory, and the concept of diversifiability. The thesis problematizes underlying rationality notions by investigating the effect of a cyberattack on stock volatility. Explicitly, the use of stock volatility as a proxy for risk allows for linking increased volatility to higher risk premiums and increased cost of capital. In essence, we investigate the following research question: What is the effect of a disclosed cyberattack on stock volatility for firms listed in the USA? Using event study methodology, we compile a cyberattack database for events between 2010 and 2015 involving 115 firms listed on US stock exchanges. The specified time period covers prevailing research gaps; due to literature paucity the focus on volatility fits well. For a finalized sample of 189 events, stock return data is matched to S&P 500 index return data within a pre-event estimation window and a post-event window to calculate abnormal returns using the market model. The outputs are used to estimate abnormal return volatility before and after each event; testing pre- and post-event volatility against each other in significance tests then approximates the event-induced volatility. Identical procedures are performed for all subsamples based on time horizon, industry belonging, attack type, firm size, and perpetrator motivation. The principal hypothesis, that stock volatility is significantly higher after a cyberattack, is found to hold within both event windows. Evidence on firm-specific characteristics is more inconclusive. In the long run, inaccessibility and attacks on smaller firms seem to render significantly larger increases in volatility compared to intrusion and attacks on larger firms, supporting preexisting literature. Contrastingly, perpetrator motive appears irrelevant. Generally, stocks are more volatile immediately after an attack, attributable to information asymmetry. For most subsamples volatility seems to diminish with time, following the Efficient Market Hypothesis. Summing up, disparate results raise questions about the relative importance of contingency factors, and also about future developments within and outside academic research.
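The event-study mechanics summarized above (market model, abnormal returns, pre/post volatility comparison) can be sketched in a few lines; the return series are synthetic stand-ins, not data from the study's 189 events:

```python
import statistics

# Market model: r_stock = alpha + beta * r_market + error, fitted by OLS
# on an estimation window. Abnormal return AR = r_stock - (alpha + beta *
# r_market); the study compares AR volatility before and after the event.

def fit_market_model(stock, market):
    """Ordinary least squares estimates of (alpha, beta)."""
    mx, my = statistics.fmean(market), statistics.fmean(stock)
    beta = sum((m - mx) * (s - my) for m, s in zip(market, stock)) \
         / sum((m - mx) ** 2 for m in market)
    return my - beta * mx, beta

def abnormal_returns(stock, market, alpha, beta):
    return [s - (alpha + beta * m) for s, m in zip(stock, market)]

# Pre-event estimation window (synthetic daily returns):
market_est = [0.01, -0.02, 0.015, 0.005, -0.01]
stock_est  = [0.012, -0.025, 0.02, 0.004, -0.014]
alpha, beta = fit_market_model(stock_est, market_est)

# Post-event window: same market moves, noisier stock responses.
market_post = [0.01, -0.02, 0.015, 0.005, -0.01]
stock_post  = [0.03, -0.05, 0.04, -0.02, 0.01]

vol_pre  = statistics.stdev(abnormal_returns(stock_est,  market_est,  alpha, beta))
vol_post = statistics.stdev(abnormal_returns(stock_post, market_post, alpha, beta))
print(vol_post > vol_pre)  # True: the pattern the thesis tests for
```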
|