461 |
Interactive videodisc technology in public school settings: an analytic review and Delphi study. Lowenstein, Ronnie B. January 1988 (has links)
This study examined the nature and potential of interactive videodisc technology in public schools through an analytic review of the current literature and a modified Delphi study. The analytic review synthesized some of the writings related to classroom applications, technology policy in education, cognition and learning, and dissemination and diffusion of innovation in schools. The Delphi component of the study managed an interaction among 32 national videodisc and education leaders. It consisted of an in-depth interview and two subsequent rounds of questioning of a panel chosen from the fields of education, government, private industry, and the military. The initial interviews asked panelists to respond to four questions: (a) the potential of interactive videodisc technology for education, (b) the measures/policies needed to achieve visions of potential, (c) the barriers inhibiting the potential, and (d) future scenarios.
Analysis of the interview data informed the design of two subsequent research questionnaires that were limited to visions of potential and measures needed to achieve those visions. The questionnaires, by providing anonymous feedback of group judgment and individual comments, enabled panelists to reassess their original positions and beliefs.
A review of the findings revealed nine domains of issues panelists considered important to understanding the relationship of interactive videodisc technology and schooling. They are: (1) technological capabilities; (2) legitimate descriptors; (3) potential benefits; (4) goals and rationale for use; (5) production and design issues; (6) marketing issues; (7) research issues; (8) funding and responsibility issues; and (9) applications in different locations.
A systematic search for commonality within those domains disclosed four recurring themes: (1) the complexity and interrelatedness of issues; (2) the importance of the context to technology applications; (3) the gap between potential and reality; and (4) historical parallels between public school applications of interactive videodiscs and other media technology.
The concluding chapter presents a discussion of those themes and their implications, along with recommendations for further research and for ways that the various stakeholders of technology in education might promote thoughtful applications of interactive videodisc technology. / Ed. D.
|
462 |
Analysis of handover decision making in downlink Long Term Evolution networks. Elujide, Israel Oludayo 15 January 2015 (has links)
Submitted in fulfillment of the requirements of the Master of Technology Degree in Information Technology, Durban University of Technology, Durban, South Africa, 2014. / This dissertation reports on handover in downlink Long Term Evolution (LTE) networks. LTE is seen as the technology that will bring about the Fourth Generation (4G) mobile broadband experience. The necessity to maintain quality of service for delay-sensitive data services and applications used by mobile users makes mobility and handover between base stations in the downlink LTE very critical. Unfortunately, several handover schemes in LTE are based on Reference Signal Received Power (RSRP), which includes measurement error due to the limited number of reference symbols in downlink packets. However, prompt and precise handover decisions cannot be based on inaccurate measurements. Therefore, the downlink LTE intra-system handover is studied with a focus on the user measurement report.
The study centers on the preparation stage of the LTE handover procedure. Among the techniques investigated, two types of physical-layer filtering are focused upon: linear averaging and local averaging. The performance of the conventional LTE physical-layer filtering technique, linear averaging, is compared with an alternative technique called local averaging. The output of each physical-layer filter is then used for the LTE standardized radio resource layer filtering (otherwise called L3 filtering). The analysis of handover decisions is based on simulations performed in an LTE system-level simulator. The performance metrics are evaluated in terms of overall system and mobility-related performance.
The system performance is based on spectral efficiency and throughput, while mobility-related performance is based on handover failures. The comparison shows that the local averaging technique improves system performance by about 51.2% in spectral efficiency and 42.8% in cell-edge throughput for high-speed users. Local averaging also produces a reduction of about 26.95% in the average number of handover failures when L3 filtering is applied for low-speed mobile terminals. These results confirm that both averaging techniques are suitable for LTE networks. Moreover, in the high-mobility case, local averaging tends to be better than linear averaging.
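The standardized L3 filter mentioned above is an exponential smoother applied to the physical-layer filter output. A minimal Python sketch under stated assumptions: the physical-layer linear averaging is modeled as a simple sliding-window mean, and the L3 recursion follows the 3GPP form F_n = (1 − a)·F_{n−1} + a·M_n with a = 1/2^(k/4); the thesis's "local averaging" variant is not reproduced here.

```python
def linear_average(samples, window):
    """Physical-layer linear filtering: sliding-window mean of RSRP samples (dBm)."""
    return [sum(samples[max(0, i - window + 1):i + 1]) /
            len(samples[max(0, i - window + 1):i + 1])
            for i in range(len(samples))]

def l3_filter(measurements, k=4):
    """Radio resource layer (L3) filtering: F_n = (1 - a) * F_{n-1} + a * M_n,
    with forgetting factor a = 1 / 2**(k/4) as standardized by 3GPP."""
    a = 1.0 / (2 ** (k / 4.0))
    filtered = [measurements[0]]          # initialize with the first measurement
    for m in measurements[1:]:
        filtered.append((1 - a) * filtered[-1] + a * m)
    return filtered
```

Feeding the physical-layer average into `l3_filter` mirrors the two-stage pipeline studied in the dissertation: the L3 stage suppresses the fast-fading noise that a raw RSRP report would otherwise carry into the handover decision.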
|
463 |
Customer experience with smartphones : a university student perspective. Mupamhanga, Musiyiwa January 2016 (has links)
Dissertation submitted in fulfillment of the requirements for the Degree of Master of Technology: Marketing, Durban University of Technology, Durban, South Africa, 2016. / The classical view that an industry is a customer-satisfying process and not a goods-producing process is vital for all businesses to understand. Today the mobile industry has produced the smartphone, which represents a dramatic departure from traditional computing platforms, as it no longer embodies a static notion of context in which changes are absent, small or predictable. Therefore, today's industries need to begin the production process with customers' needs and not with patents, raw materials, products or selling skills. On this view, an organisation can only create the environment and the circumstances in which the consumer could have an experience; it cannot grant an experience to the consumer in isolation. In seeking to expand an understanding of the above classical view, this study inquired into the customer experience derived from owning and using the most celebrated product of the era, the smartphone.
The essence of this study was to investigate customer experience by studying smartphone usage from the students' perspective. It studied the gap between students' expectations and their subsequent experiences in order to determine satisfaction levels. Furthermore, cognitive dissonance was investigated to determine if there were any feelings of remorse towards the smartphone. A descriptive study was employed with a quantitative inquiry, and the survey used the convenience sampling method. A questionnaire was administered to students within the Durban University of Technology (DUT) fraternity. The Statistical Package for the Social Sciences (SPSS), version 21, was used to analyse and interpret the data.
The key findings of the study indicate that South African university students (at DUT) have positive experiences with their smartphones. Although the findings indicate positive experiences, a minimal presence of cognitive dissonance is also evident. The presence of dissonance highlights that an idea cannot have a single measure which is universally meaningful. Therefore, the study shows that every product will always be exposed to suggestions of change, no matter how smart it is deemed to be. / M
|
464 |
Assessment of a group decision support system in a field setting. Heminger, Alan Ray. January 1988 (has links)
There has been increasing research interest in recent years in using the power of computers to support group work. There have been two main areas of research: experimental research into GDSS-supported group work in laboratory settings, and research designed to develop GDSSs which are effective, efficient and acceptable to their users. However, there have been some contradictory findings from these two areas of research. The developmental effort has shown great promise in relatively controlled developmental settings. At the same time, experimental research has indicated that GDSSs may not provide the hoped-for increases in effectiveness and efficiency while being accepted by their users. This study has attempted to clarify this situation by using a field study to assess the implementation of a GDSS in an operational environment. The setting for this study was an engineering and manufacturing site of a large electronics company. A GDSS which had been developed at the University of Arizona was installed at the host company's site, and it was assessed for the first nine months of its use. Results indicate that the system was perceived to be effective, efficient and acceptable for use by its intended users.
|
465 |
Channel Assignment in Cognitive Radio Wireless Networks. Unknown Date (has links)
Cognitive radio technology that enables dynamic spectrum access has been a promising solution for the spectrum scarcity problem. Cognitive radio networks enable communication on both licensed and unlicensed channels, having the potential to better solve the interference and collision issues. Channel assignment is of great importance in cognitive radio networks. When operating on licensed channels, the objective is to exploit spectrum holes through cognitive communication, giving priority to the primary users. In this dissertation, we focus on the development of efficient channel assignment algorithms and protocols to improve network performance for cognitive radio wireless networks. The first contribution is on channel assignment for cognitive radio wireless sensor networks, aiming to provide robust topology control as well as to increase network throughput and data delivery rate. The approach is then extended to specific cognitive radio network applications, achieving improved performance. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2016. / FAU Electronic Theses and Dissertations Collection
|
466 |
Implementation of a mobile data collector in wireless sensor networks for energy conservation. Unknown Date (has links)
A Wireless Sensor Network (WSN) is composed of low-cost, battery-powered electronic devices with sensing, data storage and transmitting capabilities. There are extensive studies in the field of WSNs investigating different algorithms and protocols for data collection. A data collector can be static or mobile. Using a mobile data collector can extend network lifetime and can be used to collect sensor data in hard-to-reach locations, partitioned networks, and delay-tolerant networks. The implementation of the mobile data collector in our study consists of combining two different platforms: the Crossbow sensor hardware and the LEGO NXT. We developed an application for data collection and sensor querying support. Another important contribution is the design of a semi-autonomous robot control. This hardware prototype implementation shows the benefits of using a mobile data collector in WSNs. It also serves as a reference in developing future applications for mobile WSNs. / by Pedro L. Heshike. / Thesis (M.S.C.S.)--Florida Atlantic University, 2011. / Includes bibliography. / Electronic reproduction. Boca Raton, Fla., 2011. Mode of access: World Wide Web.
|
467 |
Adaptive hierarchical weighted fair queuing scheduling in WiMAX networks. Unknown Date (has links)
The growing demand for faster connection to Internet services and wireless multimedia applications has motivated the development of broadband wireless access technologies in recent years. WiMAX has enabled convergence of mobile and fixed broadband networks through a common wide-area radio-access technology and flexible network architecture. Scheduling is a fundamental component of resource management in WiMAX networks and plays the main role in meeting QoS requirements such as delay, throughput and packet loss for the different classes of service. In this dissertation, the performance of uplink schedulers at the fixed WiMAX MAC layer is considered, and an Adaptive Hierarchical Weighted Fair Queuing scheduling algorithm is proposed. The new algorithm adapts to changes in traffic and, at the same time, is able to heuristically enhance the performance of the WiMAX network under most circumstances. The heuristic nature of this scheduling algorithm enables the MAC layer to meet the QoS requirements of the users. The performance of this adaptive WiMAX uplink algorithm has been evaluated by simulation using MATLAB. Results indicate that the algorithm is efficient in scheduling the base stations' traffic loads and improves QoS. The utilization of relay stations is also studied, and simulation results are compared with the case without relay stations. The results show that the proposed scheduling algorithm improves the quality of service of the WiMAX system. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2014. / FAU Electronic Theses and Dissertations Collection
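The weighted fair queuing core of such a scheduler can be sketched with virtual finish times: each packet of a flow with weight w and size s is stamped F = max(V, F_prev) + s/w, and packets are served in increasing F. This is a generic single-level WFQ sketch, not the dissertation's adaptive hierarchical algorithm; the class name and structure are illustrative assumptions.

```python
import heapq

class WeightedFairQueue:
    """Minimal WFQ sketch: serve packets in order of virtual finish time."""

    def __init__(self):
        self.heap = []           # entries: (finish_time, seq, flow, size)
        self.last_finish = {}    # per-flow virtual finish time of last packet
        self.virtual_time = 0.0  # advances to the finish time of each served packet
        self.seq = 0             # tie-breaker for equal finish times

    def enqueue(self, flow, size, weight):
        # A packet cannot start before the system's virtual time, nor before
        # the previous packet of its own flow has (virtually) finished.
        start = max(self.virtual_time, self.last_finish.get(flow, 0.0))
        finish = start + size / weight   # heavier weight => earlier finish
        self.last_finish[flow] = finish
        heapq.heappush(self.heap, (finish, self.seq, flow, size))
        self.seq += 1

    def dequeue(self):
        finish, _, flow, size = heapq.heappop(self.heap)
        self.virtual_time = finish
        return flow, size
```

With equal packet sizes, a flow with weight 2 is served twice as often as a flow with weight 1, which is the bandwidth-sharing property the uplink scheduler exploits per service class.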
|
468 |
Low complexity H.264 video encoder design using machine learning techniques. Unknown Date (has links)
H.264/AVC encoder complexity is mainly due to the variable block sizes used in Intra and Inter frames. This makes H.264/AVC very difficult to implement, especially for real-time applications and mobile devices. The current technological challenge is to conserve the compression capacity and quality that H.264 offers while reducing the encoding time and, therefore, the processing complexity. This thesis applies machine learning techniques to video encoding mode decisions and investigates ways to improve the process of generating more general low-complexity H.264/AVC video encoders. The proposed H.264 encoding method decreases the complexity of the mode decision inside Inter frames. Results show at least a 150% average reduction in complexity and at most a 0.6 average increase in PSNR for different kinds of videos and formats. / by Paula Carrillo. / Thesis (M.S.C.S.)--Florida Atlantic University, 2008. / Includes bibliography. / Electronic reproduction. Boca Raton, Fla., 2008. Mode of access: World Wide Web.
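The kind of learned mode decision described above can be illustrated with a hand-rolled rule: compute cheap block features and use a threshold, standing in for a trained classifier's decision boundary, to bypass the exhaustive partition search. The feature choice, threshold value and mode names here are illustrative assumptions, not the thesis's actual model.

```python
def block_features(block):
    """Mean and variance of a macroblock's luma samples (illustrative features)."""
    n = len(block)
    mean = sum(block) / n
    var = sum((x - mean) ** 2 for x in block) / n
    return mean, var

def fast_mode_decision(block, var_threshold=100.0):
    """Hypothetical learned rule: low-variance (flat) blocks are coded with a
    large partition immediately; high-variance blocks fall back to evaluating
    all partition sizes. The threshold plays the role of a decision boundary
    that a trained tree or classifier would supply."""
    _, var = block_features(block)
    if var < var_threshold:
        return "INTER_16x16"   # cheap path: no sub-partition search
    return "FULL_SEARCH"       # expensive path: try all block sizes
```

The complexity saving comes from how often the cheap path is taken: flat background regions, which dominate many sequences, never enter the exhaustive search.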
|
469 |
System sizing and resource allocation for video-on-demand systems. January 1997 (has links)
by Mary Y.Y. Leung. / Thesis (M.Phil.)--Chinese University of Hong Kong, 1997. / Includes bibliographical references (leaves 64-66). / Abstract --- p.i / Acknowledgments --- p.iii / Chapter 1 --- Introduction --- p.1 / Chapter 1.1 --- Video-On-Demand Environment --- p.1 / Chapter 1.2 --- Problem Definition --- p.3 / Chapter 2 --- Related Work --- p.7 / Chapter 2.1 --- Data Sharing Techniques --- p.7 / Chapter 2.1.1 --- Batching --- p.7 / Chapter 2.1.2 --- Buffering --- p.9 / Chapter 2.1.3 --- Static Partitioning --- p.10 / Chapter 2.1.4 --- Adaptive Piggybacking --- p.10 / Chapter 2.2 --- Providing VCR Functionalities --- p.12 / Chapter 3 --- System Model --- p.15 / Chapter 3.1 --- Operations involved in VCR Control --- p.15 / Chapter 3.2 --- Normal Playback Model --- p.17 / Chapter 3.3 --- VCR Model --- p.18 / Chapter 4 --- Resource Allocation for Normal Playback --- p.21 / Chapter 4.1 --- Mathematical Model --- p.22 / Chapter 4.1.1 --- Hits occurring within the same partition (hit_w) --- p.24 / Chapter 4.1.2 --- Jump to other partitions (hit_o) --- p.27 / Chapter 4.1.3 --- Fast-forwarding to the end of a movie --- p.30 / Chapter 4.1.4 --- The expected hit probability P(hit) --- p.31 / Chapter 4.2 --- Model Verification --- p.32 / Chapter 5 --- Resource Allocation for VCR mode --- p.35 / Chapter 5.1 --- Scheme 1: No merging --- p.36 / Chapter 5.2 --- Scheme 2: Merging by adaptive piggybacking and buffering --- p.36 / Chapter 5.2.1 --- Resuming within the threshold (Δ ≤ k) --- p.38 / Chapter 5.2.2 --- Resuming beyond the threshold (Δ > k) --- p.39 / Chapter 5.3 --- Verification --- p.42 / Chapter 6 --- Applications to System sizing --- p.45 / Chapter 6.1 --- Cost of Resources for Normal Playback --- p.46 / Chapter 6.2 --- Cost of Resources for VCR functions --- p.48 / Chapter 6.3 --- Overall system cost --- p.49 / Chapter 6.4 --- Comparison --- p.50 / Chapter 6.4.1 --- Scheme 1 vs. Scheme 2 --- p.51 / Chapter 6.4.2 --- Scheme 2 vs. pure I/O & pure buffer --- p.54 / Chapter 6.4.3 --- Different values of k --- p.58 / Chapter 6.4.4 --- Different values of ψ --- p.60 / Chapter 7 --- Conclusions --- p.62 / Bibliography --- p.64 / Chapter A --- Appendix --- p.67 / Chapter A.1 --- Rewind --- p.67 / Chapter A.1.1 --- Hits occurring within the same partition (hit_w) --- p.67 / Chapter A.1.2 --- Jump to other partitions (hit_o) --- p.68 / Chapter A.2 --- Pause --- p.70
|
470 |
Dynamic Characterization of Aluminum Softball Bats. Lee, Danny V. 09 May 2001 (has links)
On January 1, 2000, the Amateur Softball Association of America (ASA) imposed maximum bat performance limitations on commercial softball bats. The ASA adopted a testing standard defined by the American Society for Testing and Materials (ASTM) to determine the bat performance factor (BPF), a normalized coefficient of restitution that must be less than 1.2 for the bat to be eligible for ASA sanctioned events.
The ASTM standard requires that the softball strike the bat, which is free to rotate in the horizontal plane, at 26.8 m/s ± 0.3 m/s (88 ft/s ± 1 ft/s) with little or no spin. The central project goal was to develop the ASTM test apparatus, which consisted of a precision ball launcher, a pivoting stage for the bat, and instrumentation for velocity measurements. The key feature of the testing apparatus developed in this project was the ability to measure the rebound velocity of the ball directly; the ASTM method derives the ball rebound velocity by assuming the bat behaves as a rigid body and applying conservation of angular momentum.
Tests revealed a discrepancy in the BPF between the ASTM method and an alternative method, termed the direct method, which uses the direct measurement of the ball rebound velocity. Furthermore, the ASTM method proved to be very sensitive to parameter errors, demonstrated by magnification factors between 2.0 and 3.0. The direct method was insensitive to parameter variation, with magnification factors between 0 and 1.0.
The ball rebound velocity discrepancy was also analyzed with mechanism simulation software. A three-degree-of-freedom model of the bat was used to test the effects of elasticity and pivot friction. The analysis determined that applying conservation of angular momentum on an elastic body caused transient errors in the derivation of the ball rebound velocity; and pivot friction significantly affected the motion of the bat and thus, the derived ball rebound velocity.
The experimental results show that the direct method was more accurate than the ASTM method in calculating the BPF; and the conclusion of the analytical model shows that the ASTM method can be corrected by precisely identifying external moments in the system.
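The two derivations compared above can be sketched as follows. The ASTM-style calculation assumes a rigid bat and conserves angular momentum about the pivot to infer the ball's rebound speed from the bat's post-impact angular velocity; the direct method simply measures that speed. The BPF normalization shown is a simplified form (the full ASTM procedure accounts for bat recoil geometry), and all numeric parameters below are illustrative assumptions.

```python
def rebound_velocity_astm(m_ball, v_in, r_impact, i_bat, omega_bat):
    """Infer ball rebound speed from bat angular velocity, assuming a rigid bat.

    Angular momentum about the pivot, with the ball reversing direction:
        m * v_in * r = I * omega - m * v_out * r
    so:
        v_out = (I * omega - m * v_in * r) / (m * r)
    """
    return (i_bat * omega_bat - m_ball * v_in * r_impact) / (m_ball * r_impact)

def bpf(v_in, v_out, e_ball):
    """Simplified bat performance factor: the collision's coefficient of
    restitution normalized by the ball's own COR (illustrative form only)."""
    return (v_out / v_in) / e_ball
```

With illustrative values (a 0.18 kg ball at 26.8 m/s, impact 0.6 m from the pivot, bat inertia 0.25 kg·m², post-impact angular velocity 20 rad/s), the rigid-body derivation gives a rebound speed near 19.5 m/s; any elasticity or pivot friction shifts this derived figure away from the directly measured one, which is exactly the discrepancy the study reports.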
|