61
Data sharing across research and public communities. He, Yurong, 27 January 2017 (has links)
<p> For several decades, the intensifying belief among researchers that sharing research data is “good” has overshadowed the belief that sharing data is “bad.” In practice, however, sharing data remains difficult. An impressive effort has been made to solve data sharing issues within the research community, but relatively little is known about data sharing beyond the research community. This dissertation aims to address this gap by investigating <i><b>how data are shared effectively across research and public communities</b></i>.</p><p> The practices of sharing data with both researchers and non-professionals were examined in two comparative case studies, Encyclopedia of Life and CyberSEES, by triangulating multiple qualitative data sources (i.e., artifacts, documentation, participant observation, and interviews). The two cases represent the creation of biodiversity data, the beginning of the data sharing process in a home repository, and the end of the data sharing process in an aggregator repository. Three research questions were asked in each case:</p><p> • Who are the data providers?</p><p> • Who are the data sharing mediators?</p><p> • What are the data sharing processes?</p><p> The findings reveal the data sharing contexts and processes across research and public communities. Data sharing contexts are reflected in the cross-level data providers and human mediators rooted in different groups, whereas data sharing processes are reflected in the dynamic, sustained collaborative efforts made by human mediators at different levels with the support of technology mediators.</p><p> This dissertation makes both theoretical and practical contributions. Its findings refine existing models and develop a new knowledge-infrastructure framework for data sharing at different levels and across different communities, in which both human and technology infrastructure are made visible. The findings also provide insight for data sharing practitioners (i.e., data providers, data mediators, data managers, and data contributors) and for information system developers and designers seeking to better conduct and support open, sustainable data sharing across research and public communities.</p>
62
Data governance: The missing approach to improving data quality. Barker, James M., 01 February 2017 (has links)
<p> In an environment where individuals use applications to drive activities ranging from what book to purchase and what film to view to what temperature to heat a home, data is the critical element. To make things work, data must be correct, complete, and accurate. Many firms view data governance as a panacea for the ills of their systems and organizational challenges, while other firms struggle to realize the value of these programs. This paper documents a study executed to understand what firms are doing in the data governance space and why. The conceptual framework established from the literature comprises six areas that a data governance program should address: data governance councils; data quality; master data management; data security; policies and procedures; and data architecture. There is a wide range of experiences and approaches to addressing data quality, and the focus needs to be on execution. This explanatory case study examined the experiences of 100 professionals at 41 firms to understand what is being done and why professionals undertake such an endeavor. The outcome is that firms need to address data quality, data security, and operational standards in a manner organized around business value, including strong business leader sponsorship and a documented, dynamic business case. The study provides a foundation for data governance program success and a guide to getting started.</p>
63
Evaluating Electronic Health Records Interoperability Symbiotic Relationship to Information Management Governance Security Risks. Thomas, Maurice A., 03 April 2019 (has links)
<p> A major initiative in the U.S. healthcare industry is to establish a nationwide health information network that secures the sharing of information among all U.S. healthcare stakeholders. However, implementing an interoperability solution is a massive, complex, and enduring effort with significant challenges, such as inconsistent technology and data standards as well as complex privacy and security issues. The purpose of this qualitative case study is to examine the impacts of interoperability initiatives involving the U.S. government and to provide an understanding of information governance and security risks in relation to standards that are vendor-neutral and trustworthy. The study was conducted with federal participants who are health information management (HIM) and health information technology (HIT) professionals working in the Washington, DC metropolitan area. The participants' interview data revealed nine major themes: patient identification matching, payment claims and auditing, information sharing, data stewardship, regulatory compliance, technology enhancements, training and certification, standards optimization, and value-based care. These themes indicate that interoperability is beneficial to the healthcare industry, but that there is a greater need for technology and data standardization, information governance, data stewardship, and a deeper understanding of federal and state data privacy and security laws. Recommendations for practice include policy and regulatory adjustments to enhance auditing and compliance, establishing a healthcare data ecosystem to improve data and information governance, and technology alternatives such as master data management and white space data. Recommendations for further research include expanding the sample population to compare other federal organizations or the United Kingdom's HIT interoperability initiative.</p>
64
Development of an e-Textile Debugging Module to Increase Computational Thinking among Graduate Education Students. Kim, Victoria Herbst, 03 May 2019 (has links)
<p> The increased presence of technology in all aspects of daily life makes computational thinking a necessary skill. Projections suggest that the supply of computer science graduates will not meet the rising need for computational thinkers. An e-textile learning module, based on principles of constructionism, was designed as a method to develop computational thinking skills and to encourage interest and confidence in the computing fields among both male and female graduate education students. The module leveraged the affordances of the LilyPad Arduino, a technology that allows for the creation of projects integrating textiles and electronics without soldering. The creation of the learning module relied on design-based research methodologies and followed the use-modify-create principle for the included activities. Multiple data sources were analyzed using the Computational Thinking Rubric for Examining Students' Project Work to examine artifacts and interactions for indications of computational thinking concepts, practices, and perspectives. Students participated in debugging activities and created their own projects as part of the learning module. Analysis of the learning module activities showed students using computational thinking concepts, engaging in computational thinking practices, and exhibiting computational thinking perspectives. During the coding process, several new computational thinking concepts, practices, and perspectives emerged. There was evidence of both increases and decreases in confidence among the student participants. Improvements for the next iteration of the learning module are presented, and the implications for the study of computational thinking are explored. The study helps counter the shrinking-pipeline metaphor by showing that it is possible to encourage interest in computation among university students, not just middle-school students.</p>
65
Interpretative Phenomenological Analysis of Accessibility Awareness Among Faculty in Online Learning Environments. Sessler Trinkowsky, Rachael, 27 August 2015 (has links)
<p>Although all organizations and institutions should consider accessibility when developing online content, inaccessibility is a recurring issue in recent literature pertaining to online learning environments (OLEs) and faculty accessibility awareness. The goal of this study was to describe how online faculty gain knowledge regarding accessibility, to explore the lived experiences of online faculty who have worked with students with disabilities (SWDs), and to gain a better understanding of how faculty experience the process of accessibility implementation. The following research questions guided the study: How do faculty in OLEs experience encounters regarding accessibility for students with print-related disabilities? How do faculty in OLEs experience the journey of developing the skills needed to provide accessibility for students with print-related disabilities? What aspects of accessibility and Universal Design for Learning (UDL) do faculty members practice in OLEs, and what meaning do they ascribe to the lived experience of providing these accommodations?</p><p>An interview guide was used to address the research questions. Participants were recruited from the Online Learning Consortium and the Assistive Technology Industry Association for phenomenological interviews, which were recorded and then transcribed verbatim. The transcripts of these interviews were analyzed to determine eight super-ordinate themes: accessibility and usability awareness of online faculty; interactions and relationships among faculty, students, various departments, and outside organizations relating to SWDs and accessibility; the different perspectives and experiences of faculty who teach courses within programs that emphasize accessibility, assistive technology (AT), or working with people with disabilities; faculty experiences and perspectives of working with SWDs and providing accessible materials in OLEs; faculty training and experience with accessibility and people with disabilities; faculty autonomy within OLEs as it relates to creating accessible content; accommodations and accessibility features used in OLEs; and learning management system (LMS) accessibility and usability. The results of this study led to several implications regarding training and support services for faculty, students, other staff, and administration within online programs, best practices for implementing accessibility, and recommendations for future studies.</p>
66
Asset Reuse of Images From a Repository. Herman, Deirdre, 06 March 2014 (has links)
<p> According to Markus's theory of reuse, when digital repositories are deployed to collect and distribute organizational assets, they supposedly help ensure accountability, extend information exchange, and improve productivity. Such repositories require a large investment due to the continuing costs of hardware, software, user licenses, training, and technical support. The problem addressed in this study was the lack of evidence in the literature on whether users in fact reuse enough digital assets in repositories to justify the investment. The objective of the study was to investigate the organizational value of repositories to better inform the architectural, construction, software, and other industries whether repositories are worth the investment. The study examined asset reuse of medical images at a health information publisher. The research question focused on the amount of asset reuse over time, which was determined from existing repository transaction logs generated over an 8-year period by all users. A longitudinal census analysis was performed on the entire archival dataset of 85,250 transaction logs. The results showed that 42 users downloaded assets from the repository, including 11,059 images, indicating that the repository was used by a sufficient number of users at this publisher of about 80 employees. Of those images, 1,443 medical images were reused for new product development, showing an asset reuse rate of at least 13%. Assistants (42%), writers (20%), and librarians (16%) were the primary users of the repository. Collectively, these results demonstrated the value of repositories in improving organizational productivity, through reuse of existing digital assets such as medical images to avoid unnecessary duplication costs, for social change and economic transformation.</p>
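<p> For illustration, a minimal sketch of how a reuse rate like this might be derived from repository transaction logs is given below. The log layout, column names, and the proxy used to flag a reused asset are assumptions made for the example, not the publisher's actual log format or the study's coding of reuse.</p>
<pre>
# Illustrative sketch: deriving usage and reuse figures from repository
# transaction logs. Column names and the reuse proxy are assumptions.
import csv
from collections import defaultdict

def reuse_stats(log_path):
    users = set()
    downloads_per_asset = defaultdict(int)
    with open(log_path, newline="") as f:
        # Assumed columns: user_id, asset_id, action
        for row in csv.DictReader(f):
            if row["action"] == "download":
                users.add(row["user_id"])
                downloads_per_asset[row["asset_id"]] += 1
    downloaded = len(downloads_per_asset)
    # Proxy: an asset downloaded more than once is counted as reused.
    reused = sum(1 for n in downloads_per_asset.values() if n > 1)
    rate = reused / downloaded if downloaded else 0.0
    return len(users), downloaded, reused, rate

# Example: users, downloaded, reused, rate = reuse_stats("transactions.csv")
# In the study itself, 1,443 reused images out of 11,059 downloaded images
# corresponds to a reuse rate of roughly 13% (1443 / 11059 = 0.13).
</pre>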
67
Efficiently managing the Computer Engineering and Computer Science labs. Pickrell, Nathan, 03 May 2013 (has links)
<p> University lab environments are handled differently than corporate, government, and commercial Information Technology (IT) environments. While all environments share the common issues of scalability and cross-platform interoperability, educational lab environments must additionally handle student permissions, student files, student printing, and special education labs. The emphasis is on uniformity across lab machines to support a uniform course curriculum.</p><p> This thesis describes how a specific set of Computer Science labs is maintained. It describes how documentation is maintained, how the lab infrastructure is set up, how the technicians managing the lab build master lab images, how all of the workstations in the lab are cloned, and how a portion of the maintenance is handled. Additionally, the paper describes some of the specialty labs provided for courses with functional topics.</p>
68
Data-Driven Decision Making as a Tool to Improve Software Development Productivity. Brown, Mary Erin, 02 October 2013 (has links)
<p> The worldwide software project failure rate, based on a survey of information technology software managers' views of user satisfaction, product quality, and staff productivity, is estimated to be between 24% and 36%, and software project success has not kept pace with advances in hardware. The problem addressed by this study was the limited information about software managers' experiences with data-driven decision making (DDD) in agile software organizations as a tool to improve software development productivity. The purpose of this phenomenological study was to explore how agile software managers view DDD as a tool to improve software development productivity and to understand how agile software development organizations may use DDD now and in the future to improve it. Research questions asked about software managers', project managers', and agile coaches' lived experiences with DDD via a set of interview questions. The conceptual framework for the research was based on the three critical dimensions of software organization productivity improvement (people, process, and tools) defined by the Software Engineering Institute's Capability Maturity Model Integration (CMMI), published in 2010. Organizations focus on processes to align the people, procedures and methods, and tools and equipment to improve productivity. Positive social change could result from a better understanding of DDD in an agile software development environment; this increased understanding could enable organizations to create more products, offer more jobs, and better compete in a global economy.</p>
69
Information and Communications Technology Based Solution to Rank Emergency Hospitals. Jenkins, Taneaka Anesha, 16 November 2013 (has links)
<p> With the advent of smart phone technologies, the healthcare industry finds it challenging to keep up with technology demands. In the medical domain, patients are experiencing longer wait times for medical treatment. A common basis of patient dissatisfaction with healthcare is the amount of time spent waiting during a visit, and these delays are greater within medical emergency facilities. Current medical wait time applications may help patients be seen quickly but do not necessarily reflect quality of care or other aspects of the visit. The amount of time a patient spends in an emergency facility could influence the patient's perception of it and could be contingent upon other qualities. We sought to investigate the association between patient perception of the hospital, time to reach the hospital, patient wait time, patient reviews, and average service time for various North Carolina hospitals using product-moment correlation analysis. Analyses of the various hospitals were performed for each parameter. In this thesis, we propose a smart phone based service to optimize travel time to a medical facility, utilizing patient wait time, service time, time to reach the hospital, patient reviews, patient perception of the facility, and Global Positioning System (GPS) data. Hospitals were compared and ranked according to each parameter individually, relative to other hospitals in neighboring counties and cities. Each constraint is assigned a weight to be used in the overall ranking of the hospital, and correlations between each pair of parameters were assessed to establish relationships among them.</p>
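<p> As a rough illustration of the weighted-ranking idea described above, the sketch below ranks a few hospitals by a weighted combination of the parameters and computes pairwise product-moment correlations between them. The hospital names, parameter values, and weights are assumptions made for the example, not the study's actual data or formula.</p>
<pre>
# Illustrative sketch of a weighted hospital ranking and pairwise
# product-moment correlations. All names, values, and weights below
# are assumptions for the example, not the study's data.
from itertools import combinations
from statistics import correlation  # Pearson product-moment correlation (Python 3.10+)

# Hypothetical per-hospital parameters: times in minutes, reviews on a 1-5 scale.
hospitals = {
    "Hospital A": {"wait_time": 45, "service_time": 30, "travel_time": 15, "review_score": 4.2},
    "Hospital B": {"wait_time": 30, "service_time": 40, "travel_time": 25, "review_score": 3.8},
    "Hospital C": {"wait_time": 60, "service_time": 25, "travel_time": 10, "review_score": 4.5},
}

# Assumed weights: times count against a hospital, reviews count in its favor.
weights = {"wait_time": -0.4, "service_time": -0.2, "travel_time": -0.2, "review_score": 2.0}

def score(params):
    """Weighted sum used to rank hospitals (illustrative only)."""
    return sum(weights[name] * value for name, value in params.items())

# Rank hospitals by weighted score, best (highest) first.
for name, params in sorted(hospitals.items(), key=lambda kv: score(kv[1]), reverse=True):
    print(f"{name}: score = {score(params):.1f}")

# Pairwise correlations between parameters across hospitals.
for a, b in combinations(weights, 2):
    xs = [h[a] for h in hospitals.values()]
    ys = [h[b] for h in hospitals.values()]
    print(f"corr({a}, {b}) = {correlation(xs, ys):.2f}")
</pre>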
70
Darknets, cybercrime & the onion router: Anonymity & security in cyberspace. Yetter, Richard B., 01 May 2015 (has links)
<p> Anonymizing Internet technologies present unique challenges to law enforcement and intelligence organizations. Tor is an anonymity technology that received extensive media coverage after a virtual black market hidden within its network was seized by the FBI in 2013. Anonymizing technologies have legitimate purposes, and as states employ increasingly sophisticated Internet censorship and surveillance techniques, these technologies are becoming ever more relevant. This study examines the creation of the modern Internet, explores the drastic changes that have taken place, and takes an in-depth look at intended and unintended uses of anonymizing technologies.</p>