241
Interoperability Performance Among Campus Law Enforcement Agencies. Massirer, Tammie Ann, 01 January 2018.
The September 11, 2001, terrorist attacks exposed considerable breakdowns in communications interoperability and information sharing among first responders. Multijurisdictional responses to the active-shooter incidents at the University of Texas in 2010, at Sandy Hook Elementary School in Newtown, Connecticut, in 2012, and at Reynolds High School in Multnomah County, Oregon, in 2014 were replete with interoperability failures as well. Recent multijurisdictional response events continue to illuminate difficulties with first-responder interoperability, and minimal research exists to promote understanding of the interoperability challenges of university police departments. The purpose of this study was to explore the barriers that impede communications of campus-based law enforcement agencies during multiagency or multijurisdictional response. General systems theory and the unified theory of acceptance and use of technology model provided the conceptual framework for this qualitative case study. Face-to-face interviews were conducted with 10 leaders of university public safety agencies in California. Data were collected, inductively coded, and thematically analyzed. Key findings indicate that participants perceived barriers of funding, policy, inclusiveness, and training that affect communications interoperability performance. The positive social change implications of this study include recommendations of policy change for improved interoperability during multiagency or multijurisdictional response, which can contribute to increased first-responder safety, more efficient multijurisdictional response, and improved safety of students and society at large.
242
Aligning Social Media, Mobile, Analytics, and Cloud Computing Technologies and Disaster Response. Worthy, William Tuley, 01 January 2018.
After nearly 2 decades of advances in information and communications technologies (ICT), including social media, mobile, analytics, and cloud computing, disaster response agencies in the United States have not been able to improve alignment between ICT-based information and disaster response actions. This grounded theory study explored emergency response ICT managers' understanding of how social media, mobile, analytics, and cloud computing (SMAC) technologies are related to and can inform disaster response strategies. Sociotechnical theory served as the conceptual framework to ground the study. Data were collected from document reviews and semistructured interviews with 9 ICT managers from emergency management agencies in the state of Hawaii who had experience in responding to major disasters. The data were analyzed using open, axial, and selective coding. Three elements of a theory emerged from the findings: (a) the ICT managers were hesitant about SMAC technologies replacing first responders' radios for interoperation between emergency response agencies during major disasters, (b) the ICT managers were receptive to converging conventional ICT with SMAC technologies, and (c) the ICT managers were receptive to joining legacy information sharing strategies with new information sharing strategies based on SMAC technologies. The emergent theory offers a framework for aligning SMAC technologies and disaster response strategies. The implications for positive social change include reduced interoperability failures between disaster agencies during major catastrophes, which may lower the risk of casualties and deaths among emergency responders and disaster victims, thus benefiting them and their communities.
243
Implementation of Interoperability in the Emergency Center: A DNP Project. Silka, Christina R., 09 April 2020.
No description available.
244
New Methodologies for Optimal Location of Synchronized Measurements and Interoperability Testing for Wide-Area Applications. Madani, Vahid, 11 May 2013.
Large-scale outages have occurred worldwide in recent decades, some impacting 15-25% of a nation's population. The complexity of blackouts has been extensively studied, but many questions remain. As there are no perfect solutions to prevent blackouts, which are usually caused by a complex sequence of cascading events, a number of different measures need to be undertaken to minimize the impact of future disturbances. The increased deployment of phasor measurement units (PMUs) across the grid has given the power industry an unprecedented technology for studying the dynamic behavior of the system in real time. Integration of large-scale synchronized measurements with the SCADA system requires a careful roadmap and methodology. When properly engineered, tested, and implemented, information extracted from synchrophasor data streams provides real-time observability of the transmission system. Synchrophasor data can give operators quick insight into precursors of blackout (e.g., angular divergence) that are unavailable in traditional SCADA systems. Current visualization tools and state estimation (SE) functions, supported by SCADA, provide some basic monitoring. Inaccuracies in measurements and system models, absence of redundancy in measured parameters or breaker statuses in most cases, and lack of synchronization and time resolution in SCADA data result in limited functionality and precision for a typical EMS, which falls short in today's operating environment of tighter margins requiring more frequent and more precise data. The addition of synchrophasor data, typically with several orders of magnitude higher temporal resolution (i.e., 60 to 120 measurements per second as opposed to one measurement every 4 to 8 seconds), can help detect higher-speed phenomena and system oscillations. Also, time synchronization to one microsecond allows accurate comparison of phase angles across the grid and identification of major disturbances and islanding.
This dissertation proposes a more comprehensive, holistic set of criteria for optimizing PMU placement, with consideration for the diverse factors that can influence the PMU siting decision-making process, and incorporates several practical implementation aspects. An innovative approach to interoperability testing is presented, and solutions are offered to address the challenges. The proposed methodology is tested to prove the concept and address real-life implementation challenges, such as interoperability among PMUs located across a large area.
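PMU placement is commonly introduced as an observability cover problem: a PMU at a bus observes that bus and every adjacent bus, and the goal is full observability with as few PMUs as possible. A minimal greedy sketch of that baseline formulation (illustrative only; the dissertation's holistic criteria weigh many additional practical siting factors, and the network here is hypothetical):

```python
def greedy_pmu_placement(adjacency):
    """Greedy observability cover: repeatedly place a PMU at the bus
    that observes the largest number of still-unobserved buses
    (a PMU observes its own bus and every adjacent bus)."""
    unobserved = set(adjacency)
    placed = []
    while unobserved:
        best = max(adjacency,
                   key=lambda b: len(({b} | set(adjacency[b])) & unobserved))
        placed.append(best)
        unobserved -= {best} | set(adjacency[best])
    return placed

# Small hypothetical 7-bus network (bus: list of neighbouring buses)
network = {
    1: [2], 2: [1, 3, 6], 3: [2, 4, 6], 4: [3, 5],
    5: [4, 6], 6: [2, 3, 5, 7], 7: [6],
}
pmus = greedy_pmu_placement(network)
```

Greedy set cover is a heuristic; exact formulations are usually solved as integer programs, which is where the additional siting criteria enter as constraints or cost terms.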
245
Data quality and governance in a UK social housing initiative: Implications for smart sustainable cities. Duvier, Caroline; Anand, Prathivadi B.; Oltean-Dumbrava, Crina, 03 March 2018.
Smart Sustainable Cities (SSC) involve multiple stakeholders, who must cooperate in order for SSCs to be successful. Housing is an important challenge in many cities; social housing organisations are therefore key stakeholders. This paper introduces a qualitative case study of a social housing provider in the UK that implemented a business intelligence project (a method to assess data networks within an organisation) to increase data quality and data interoperability. Our analysis suggests that creating pathways for different information systems within an organisation to 'talk to' each other is the first step. Issues during the project implementation included a lack of training and development, organisational reluctance to change, and the lack of a project plan. The challenges the organisation faced during this project can be instructive for those implementing SSCs. Many SSC frameworks and models currently exist, yet most seem to neglect the localised challenges faced by the different stakeholders. This paper hopes to help bridge this gap in the SSC research agenda.
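Getting information systems to "talk to" each other in practice starts with reconciling the records they each hold. A minimal sketch of such a cross-system data-quality check (the system names, field names, and records are hypothetical, not taken from the case study):

```python
def reconcile(system_a, system_b, key="tenant_id"):
    """Join records from two information systems on a shared key and
    report records missing on either side plus conflicting field values."""
    a = {r[key]: r for r in system_a}
    b = {r[key]: r for r in system_b}
    report = {
        "only_in_a": sorted(a.keys() - b.keys()),
        "only_in_b": sorted(b.keys() - a.keys()),
        "conflicts": [],
    }
    for k in sorted(a.keys() & b.keys()):
        # compare every field both systems share, except the join key
        for field in sorted((a[k].keys() & b[k].keys()) - {key}):
            if a[k][field] != b[k][field]:
                report["conflicts"].append((k, field, a[k][field], b[k][field]))
    return report

# Hypothetical records from a housing-management and a repairs system
housing = [{"tenant_id": 1, "postcode": "BD7 1DP"},
           {"tenant_id": 2, "postcode": "BD1 1AA"}]
repairs = [{"tenant_id": 2, "postcode": "BD1 1AB"},
           {"tenant_id": 3, "postcode": "BD5 0QQ"}]
report = reconcile(housing, repairs)
```

A report like this makes data-quality problems visible before any integration work: every conflict is a record someone must correct in one of the source systems.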
246
Integrating IFC Models and Virtual Reality for Indoor Lighting Design / Integrering av IFC-modeller och virtuell verklighet för inomhusbelysning. Wisén, André, January 2019.
Previous research has studied the use of Building Performance Simulation (BPS) tools with Building Information Modeling (BIM). BPS tools can be used to visualize and evaluate building designs, and Virtual Reality (VR) can serve as a BPS tool for indoor lighting design. The problem is that previous research still struggles with data interoperability, that is, the integration of VR with BIM. Many studies suggest using the IFC file format to mitigate this problem. The aim of this study is to investigate how data interoperability can be increased by using the IFC file format; a prototype system was developed to test this. The research questions are whether open BIM formats can improve the quality of design solutions, how VR can be integrated with BIM, and how VR can be used for indoor lighting design. The result of this thesis is in part a prototype system called FooBar, which shows how VR can be integrated with BIM. However, the IFC file format was not used throughout the whole design process; instead, IFC is used only at the beginning and the end of the process. The study also shows how VR can be used as an alternative BPS tool: users can manipulate lights in the building model, and these changes are then updated in the original BIM model. This means that VR can be used to improve the quality of design solutions. In other words, FooBar can help to cope with multidisciplinary design processes. Users can immerse themselves in the virtual environment and see different design alternatives for themselves, and different design alternatives can easily be rendered in VR. With a system like FooBar, users can easily define, propose, and analyze different design ideas to reach design goals.
247
A Framework for Grid-Enabling Scientific Workflow Systems: Architecture and application case studies on interoperability and heterogeneity in support of Grid workflow automation. Azam, Nabeel A., January 2010.
Since the early 2000s, Service Oriented Architectures (SOAs) have played a key role in the development of complex applications within a virtual organization (VO) context. Grids and workflows have emerged as vital technologies for realizing the SOA paradigm. Given the variety of Grid middleware, scientific workflow systems, and Grid workflows available, bringing the two technologies together in a flexible, reusable, and generalized way has been largely overlooked, particularly from a scientific end-user perspective. The lack of domain focus in this area has led to a slow uptake of Grid technologies.
This thesis aims to design a framework for Grid-enabling workflows, which identifies the essential technological components, how these components fit together in a layered architecture, and the interactions between them. To produce such a framework, this thesis first investigates the definition of a Grid-workflow architecture and the mapping of Grid functionality to workflow nodes, focusing on striking a balance between performance, usability, and the Grid functionality supported. Next, it presents an examination of framework extensions for supporting various forms of Grid heterogeneity, essential for VO-based collaboration. Given the complex nature of Grid technologies, the work presented here investigates abstracting Grid-based workflows through high-level definitions and resolution using semantic technologies. Finally, this thesis presents a way to resolve abstract Grid workflows using semantic technologies and intelligent, autonomous agents.
The frameworks presented in this thesis are tested and evaluated within the context of domain-based case studies defined in the SIMDAT, BRIDGE and ARGUGRID EU funded research projects.
248
State Validation of Ethash-based Blockchains using a zk-SNARK-based Chain Relay. Stutzer, Leonard, January 2022.
We present an Ethash-based blockchain relay that uses Off-Chain Computation (OCC) to validate block headers on-chain. Current approaches compromise on fundamental ideas of the blockchain concept: they either require a centralized entity or a centralized Trusted Third Party (TTP), or are built on economic assumptions; in this way they try to circumvent the cost-heavy on-chain Ethash computation. We use Zero-Knowledge Proofs (ZKPs) to outsource the Ethash validation to an Off-Chain Computation Framework (OCCF) and verify only the validity of the OCC on-chain. The dataset required for Ethash validation is inserted into a Merkle tree for computational feasibility. Additionally, we validate multiple block headers in batches to further minimize on-chain costs. The on-chain costs of our batch validation mechanism are minimal and constant, since only the proof of an OCC is verified on-chain. Through Merkle proofs we enable the efficient inclusion of intermediary block headers for any submitted batch. The OCC is feasible on average consumer hardware. Our prototype verifies 5 block headers in a single proof using the ZoKrates framework. Compared to current approaches, we use only 3.3% of the gas costs, resulting in a highly scalable alternative that is trustless, distributed, and free of economic assumptions. For future work, we propose distributing the computational overhead of computing Ethash inside a ZKP through an off-chain distribution module, because the relay relies on the concurrent execution of the OCC by at least 36 active participants to keep up with the current state of the relay's blockchain.
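Inserting the validation dataset into a Merkle tree lets a verifier check membership of any element against a single root, with proofs logarithmic in the dataset size; the same mechanism underlies the efficient inclusion of intermediary block headers. A minimal generic sketch of Merkle root construction and inclusion-proof verification (an illustration of the data structure, not the thesis's actual circuit or contract code):

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def merkle_root(leaves):
    """Root of a binary Merkle tree; an odd node is paired with itself."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes, each flagged with whether it sits on the left,
    proving leaves[index] is in the tree."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1                       # neighbour in this level
        proof.append((level[sibling], sibling < index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    """Recompute the path from leaf to root using the sibling hashes."""
    node = h(leaf)
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root

leaves = [b"header-%d" % i for i in range(5)]
root = merkle_root(leaves)
ok = verify(leaves[2], merkle_proof(leaves, 2), root)
```

The on-chain side only ever stores and compares the 32-byte root, which is why the batch-verification cost stays constant regardless of dataset size.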
249
Informationsöverföring till energisimulering / Information exchange to energy simulation: An evaluative study of methods used for information exchange from BIM programs to energy simulation programs. Andersson, Jimmy; Hansen, Edvin, January 2023.
This thesis investigates which methods are used in Sweden for transferring information from building modeling software to energy simulation software. The advantages and disadvantages of each identified method were established and used as the basis for proposed improvements. The problem with information exchange from BIM models to energy models is that information is lost and must then be entered manually in the energy simulation software; this manual work is time-consuming and costly, so fewer energy simulations are performed. The conclusions of the study are intended to promote an energy-efficient modeling process and thereby reduce the energy use of the construction and real estate industry. The study began with a literature review to build an understanding of information exchange. The main research method was then a semistructured interview study in which five companies were interviewed; the respondents were energy specialists who answered questions based on an interview guide. A further literature review was conducted to broaden the results. The interview study showed that all respondents work with the energy simulation software IDA ICE, and that the most common method of information exchange uses DWG files. This method imports only a floor plan, which is used as a template for modeling the building in IDA ICE. Its advantage is that the energy expert gets to know the building and can identify its strengths and weaknesses from an energy perspective; its disadvantages are that it requires much time-consuming manual work and increases the risk of human error. The DWG-import method is a siloed way of working that counteracts automation and collaboration between disciplines. Another identified method of information exchange to IDA ICE uses IFC files, which import geometric data and orientation into IDA ICE; the literature review shows that a semiautomatic import of thermal zones can also be achieved. For geometric data and thermal zones to be imported correctly, the original BIM model must be set up both so that the IFC file can capture the BIM model and so that IDA ICE can read the IFC file. The disadvantages of IFC import are that the energy model must be checked against the original BIM model, which can be time-consuming if the IFC file contains many errors; if the file contains too many errors, no information may be importable at all, and DWG import must be used instead. Speckle is a method in an early phase of development that allows information to be transferred between programs without a file format; however, IDA ICE does not support a direct Speckle data stream, so some intermediate steps are required. The conclusion is that DWG import should be replaced by IFC import, since this can save time, lower costs, and enable more energy simulations. For an IFC import to work, the architect must model and export the file from the BIM software in a particular way; a manual describing how the architect should model and export could solve this problem. It is suggested that EQUA develop such a manual, since they have the most knowledge of what is required for an IFC file to be importable into IDA ICE.
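An IFC file is, under the hood, a STEP (SPF) text file of numbered entity instances, and this encoding is what both the BIM export and the IDA ICE import must agree on. A toy sketch of picking entity instances of one type out of such a file (the snippet is hand-written and heavily abbreviated, not a valid complete IFC model, and real projects use a proper IFC toolkit such as IfcOpenShell rather than regular expressions):

```python
import re

# Abbreviated, hand-written fragment in the IFC STEP style (attribute
# lists shortened; not schema-valid)
IFC_SAMPLE = """\
#1=IFCPROJECT('2O2Fr$t4X7Zf8NOew3FL9r',$,'Office building',$,$,$,$,$,$);
#2=IFCSPACE('0BTBFw6f90Nfh9rP1dl_3P',$,'Meeting room',$,$,$,$,$,$,$,$);
#3=IFCSPACE('1XyKfw6f90Nfh9rP1dl_4Q',$,'Corridor',$,$,$,$,$,$,$,$);
#4=IFCLIGHTFIXTURE('2ZzLfw6f90Nfh9rP1dl_5R',$,'Downlight A',$,$,$,$,$);
"""

# instance id, entity type, then GlobalId and Name as the first attributes
ENTITY_RE = re.compile(r"#(\d+)=([A-Z0-9]+)\('([^']*)',\$,'([^']*)'")

def entities_by_type(step_text, ifc_type):
    """Return (instance id, GlobalId, Name) for every instance of one type."""
    out = []
    for m in ENTITY_RE.finditer(step_text):
        if m.group(2) == ifc_type.upper():
            out.append((int(m.group(1)), m.group(3), m.group(4)))
    return out

spaces = entities_by_type(IFC_SAMPLE, "IfcSpace")
```

Spaces are what a semiautomatic thermal-zone import would traverse; when the exporter fails to write them (or writes them in a form the importer cannot read), the fallback is the manual DWG workflow the thesis describes.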
250
An Integration Architecture of AIaaS: Interoperability and Functional Suitability. Musabimana Boneza, Benedicte, January 2023.
This thesis explores the integration of Artificial Intelligence as a Service (AIaaS) into existing systems, focusing on handling challenges related to unclear data processing and complex integration. The study examined existing research to understand current integration practices and to ensure alignment with established standards. Based on this research, an integration architecture was designed that emphasizes two key factors: ensuring the system works as expected (functional suitability) and ensuring different parts of the system can communicate smoothly (interoperability). The architecture was designed to simplify communication between different parts of the system, making sure they all work together effectively, and to reduce the complications that often come with integration. This mutual reinforcement between functional suitability and interoperability implies coherent outcomes and establishes an environment that fosters smooth communication among system components. The practical implications of this research are exemplified through the implementation of the proposed architecture within the Gokind platform, with positive outcomes: the transition from manual receipt verification to automated receipt recognition using the Google Vision Application Programming Interface (API) showcases accelerated processing times, scalability, and efficient resource allocation. Despite achieving a 90% accuracy rate, the study identifies areas for potential improvement and advocates ongoing refinement. While the study successfully navigates the challenges of AIaaS integration, it acknowledges certain limitations, such as the potential for exploring varied AIaaS providers and environments and the essential consideration of security aspects. Future research avenues are suggested, including variance analysis across AIaaS classes, comparative studies among providers, fortified security measures, and comprehensive exploration of the impact of architectural attributes.
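An accuracy rate like the reported 90% typically comes from an exact-match evaluation of the automated extraction against manually verified values. A minimal sketch of such a harness (the receipt identifiers, values, and the exact-match criterion are hypothetical illustrations, not the thesis's evaluation protocol):

```python
def accuracy(ground_truth, predictions):
    """Fraction of receipts where the automated extraction matches
    the manually verified value exactly."""
    assert ground_truth.keys() == predictions.keys()
    hits = sum(1 for k in ground_truth if ground_truth[k] == predictions[k])
    return hits / len(ground_truth)

# Hypothetical sample: receipt id -> manually verified total
manual = {"r1": "12.50", "r2": "7.99", "r3": "130.00", "r4": "5.00",
          "r5": "88.20", "r6": "3.10", "r7": "42.00", "r8": "19.99",
          "r9": "6.75", "r10": "54.30"}
# Automated extraction with one misread digit on receipt r7
auto = dict(manual, r7="42.08")
rate = accuracy(manual, auto)
```

Tracking this rate per field (total, date, merchant) rather than per receipt is one way the "areas for potential improvement" could be localized during ongoing refinement.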