91. Development of User Interface for Multibeam Echo Sounder Quality Control. Hu, Shing-wen (23 July 2007)
Multibeam echo sounder systems have been around now for some 40 years, and have been used in shallow waters for the last 14 years. With modern shallow-water systems running at up to 9,600 soundings/second, data collection at a rate of approximately 250 million soundings per day per system is possible. Processing of multibeam echo sounder (MBES) data is a challenging task from both hydrographic and technological perspectives. We recognize that a completely automatic system is improbable, but propose that significant benefits can still be had if we can automatically process good-quality data and highlight areas that probably need further attention.
We propose an algorithm that takes uncleaned MBES data and attempts to pick out as many outliers as possible. The traditional method that is still in use today by numerous software applications is based on a line-by-line processing approach. This approach has been sped up in a number of ways: automatic filtering by a depth window, by beam number, by slope between points, by quality flags, and more recently by whether a beam's error falls outside the IHO order for the survey. The fundamental differences between our method and previous methods are that our algorithm does not actually delete any soundings, and that it transforms the original one-dimensional information into two dimensions. Finally, we use hierarchical clustering to classify MBES data into outliers and normal soundings.
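The clustering step described above can be illustrated with a minimal sketch. This is not the thesis's implementation: the single-linkage merge rule, the two-cluster cut, and the assumption that the smaller cluster holds the outliers are all simplifications chosen for demonstration.

```python
# Illustrative sketch: flag MBES outliers by agglomerative (hierarchical)
# clustering of (position, depth) soundings. Soundings are never deleted,
# only flagged -- matching the approach described in the abstract.

def single_linkage_clusters(points, n_clusters):
    """Agglomerative single-linkage clustering down to n_clusters groups."""
    clusters = [[i] for i in range(len(points))]

    def dist(a, b):  # Euclidean distance between two points
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    while len(clusters) > n_clusters:
        best = None
        # Find the two clusters whose closest members are nearest.
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i].extend(clusters.pop(j))
    return clusters

def flag_outliers(soundings):
    """Return a parallel list of flags; the smaller cluster is assumed
    to contain the outliers (an assumption for this sketch)."""
    clusters = single_linkage_clusters(soundings, 2)
    small = min(clusters, key=len)
    flags = ["normal"] * len(soundings)
    for idx in small:
        flags[idx] = "outlier"
    return flags

# (along-track position, depth in metres); one spike in a smooth seabed
soundings = [(0, 20.1), (1, 20.3), (2, 20.2), (3, 35.0), (4, 20.4)]
print(flag_outliers(soundings))
```

A production version would use an optimized linkage routine (e.g. from a scientific library) rather than this quadratic loop, but the flag-not-delete workflow is the same.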
We developed a user interface for multibeam echo sounder quality control. It provides almost all the tools and modules necessary to perform a survey. Standard modules are survey planning (track guidance lines, waypoints), channel design and 3D modeling, data acquisition, data QC, and data processing/flagging. In addition, it visualizes the soundings to aid the decision-making process.

92. A new graphical user interface for a 3D topological mesh modeler. Morris, David Victor (10 October 2008)
In this thesis, I present a new platform-independent, open source, intuitive graphical user interface for TopMod, an application designed for interacting with 3-dimensional manifold meshes represented by a Doubly Linked Face List (DLFL). This new interface, created using the Trolltech Qt user interface library, enables users to construct and interact with complex manifold meshes much faster and more easily than was previously possible. I also present a method for the rapid creation of a successful online community of users and developers, by integrating a variety of open source web-based software packages. The new website, which includes a discussion forum, a news blog, a collaborative user and developer wiki, and a source code repository and release manager, received an average of 250 unique visits per day during the first two months of its existence, and it continues to be utilized by a variety of users and developers worldwide.

93. MDA: A Methodology for Web-based UI Transformation. Liu, Wen-Chin (08 August 2009)
This study presents a systematic methodology that integrates model-driven architecture with object-oriented techniques to transform a platform-independent model (PIM) into a Web-based user interface (UI) platform-specific model (PSM), and then into a code model. A real-world case using the integrated techniques is presented, and Rational Rose is used to illustrate the concepts, application, and advantages of the proposed methodology. With this approach, system developers can transform a PIM into a Web-based UI PSM and code automatically, thereby enhancing the efficiency of system development.
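The PIM-to-PSM-to-code pipeline can be sketched in miniature. The dictionary schema, type mapping, and HTML target below are assumptions chosen to make the transformation chain concrete; the thesis itself works with UML models in Rational Rose.

```python
# Illustrative sketch of the MDA transformation chain: a platform-independent
# model (PIM) is mechanically mapped to a Web-UI platform-specific model
# (PSM), which is then mapped to a code model (here, an HTML form).

# PIM: a technology-neutral description of one domain class (assumed schema)
pim = {"class": "Customer",
       "attributes": [("name", "string"), ("age", "integer")]}

TYPE_MAP = {"string": "text", "integer": "number"}  # PIM type -> HTML input type

def pim_to_psm(model):
    """Transformation 1: map the PIM onto a Web-form PSM, one field per attribute."""
    return {"form": model["class"],
            "fields": [{"name": n, "input_type": TYPE_MAP[t]}
                       for n, t in model["attributes"]]}

def psm_to_code(psm):
    """Transformation 2: generate the final code model (HTML) from the PSM."""
    lines = ['<form id="%s">' % psm["form"]]
    for f in psm["fields"]:
        lines.append('  <input name="%s" type="%s"/>' % (f["name"], f["input_type"]))
    lines.append("</form>")
    return "\n".join(lines)

print(psm_to_code(pim_to_psm(pim)))
```

The point of the staged design is that each mapping is mechanical: changing the target platform means swapping only the PSM-to-code step.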

94. Automated testing of a web-based user interface. Kastegård, Sandra (January 2015)
Testing is a vital part of software development, and test automation is an increasingly common practice. Performing automated testing on web-based applications is more complicated than on desktop applications, which is particularly clear when it comes to testing a web-based user interface, as such interfaces are becoming more complex and dynamic. Depending on the goals and the required complexity of the testing, a variety of frameworks and tools are available to help implement it. This thesis investigates how automated testing of a web-based user interface can be implemented. Testing methods and a selection of relevant testing frameworks/tools are presented and evaluated against given requirements. Out of the selected frameworks/tools, the Selenium WebDriver framework is chosen and used for implementation. The implementation results in automated test cases for regression testing of the functionality of a user interface created by Infor AB.
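Selenium-style UI regression tests are commonly structured around page objects, which hide element lookups behind intention-revealing methods. The sketch below shows that structure only; a real run needs a browser, so `StubDriver` stands in for a `selenium.webdriver` instance, and the page and names are assumptions, not Infor AB's actual suite.

```python
# Sketch of the page-object pattern used in automated UI regression testing.
# StubDriver replaces a real Selenium WebDriver (which requires a browser).

class StubDriver:
    """Stand-in for a WebDriver: the page is just a dict of element texts."""
    def __init__(self, page):
        self.page = page

    def find_element_text(self, element_id):
        return self.page[element_id]

class LoginPage:
    """Page object: tests talk to this, never to raw element lookups, so a
    UI change requires updating one class rather than every test."""
    def __init__(self, driver):
        self.driver = driver

    def title(self):
        return self.driver.find_element_text("title")

    def error_message(self):
        return self.driver.find_element_text("error")

def test_login_page_regression():
    driver = StubDriver({"title": "Sign in", "error": ""})
    page = LoginPage(driver)
    assert page.title() == "Sign in"   # visible text did not regress
    assert page.error_message() == ""  # no stale error on a fresh page

test_login_page_regression()
print("regression checks passed")
```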

95. A Modular and Extensible User Interface for the Telemetry and Control of a Remotely Operated Vehicle. Morrow, Tyler (2014)
ITC/USA 2014 Conference Proceedings, The Fiftieth Annual International Telemetering Conference and Technical Exhibition, October 20-23, 2014, Town and Country Resort & Convention Center, San Diego, CA.
This paper discusses the rover engagement display (RED), an application that integrates network communication, control systems, numerical and visual analysis of telemetry, and a graphical user interface for communicating with the embedded systems of a remote vehicle. The target vehicle is a wheeled rover participating in the University Rover Challenge, a competition that observes the performance of rovers in an environment similar to that of the planet Mars. Communication with the rover occurs via a TCP connection, and messages adhere to a simple protocol. The RED user interface is visually modular in an attempt to provide additional scalability and extensibility. Control algorithms, user interface design concepts, and code architecture (C#) are discussed.
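A "simple protocol over TCP" of the kind the paper mentions often means self-delimiting text messages. The field layout below (`<topic>:<key>=<value>;...`) is an assumption for illustration; the actual RED protocol is not specified in the abstract.

```python
# Sketch of a simple line-based message protocol suitable for a TCP stream:
# each message is one newline-terminated line, so a receiver can read
# messages off the socket with ordinary line buffering.

def encode(topic, fields):
    """Serialise one telemetry/command message as a single line."""
    body = ";".join("%s=%s" % (k, v) for k, v in sorted(fields.items()))
    return "%s:%s\n" % (topic, body)

def decode(line):
    """Parse a message line back into (topic, fields)."""
    topic, _, body = line.strip().partition(":")
    fields = dict(pair.split("=", 1) for pair in body.split(";") if pair)
    return topic, fields

# A drive command as it would travel over the wire:
msg = encode("drive", {"left": "0.5", "right": "0.4"})
print(repr(msg))
print(decode(msg))
```

Because every message is one line, the same functions serve both directions: telemetry from the rover and commands from the ground station.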

96. Supporting Children's Self-esteem Development Using Mobile Application: Theoretical background and design of the user interface. Lehtimäki, Vera (January 2015)
No description available.

97. Development of a graphical user interface for creating simulation macro files (macrofiles) in the Gate software package. Boukis, Spyridon A. (11 December 2008)
No description available.

98. Spatial Tactile Feedback Support for Mobile Touch-screen Devices. Yatani, Koji (12 January 2012)
Mobile touch-screen devices have the capability to accept flexible touch input, and can provide a larger screen than mobile devices with physical buttons. However, many of the user interfaces found in mobile touch-screen devices require visual feedback. This raises a number of user interface challenges. For instance, visually demanding user interfaces make it difficult for the user to interact with mobile touch-screen devices without looking at the screen, a task the user sometimes wishes to perform, particularly in a mobile setting. In addition, user interfaces on mobile touch-screen devices are not generally accessible to visually impaired users. Basic tactile feedback (e.g., feedback produced by a single vibration source) can be used to enhance the user experience on mobile touch-screen devices. Unfortunately, this basic tactile feedback often lacks the expressiveness for generating vibration patterns that can convey specific information about the application to the user. Richer information accessible through the tactile channel, however, would minimize the visual demand of an application. For example, if the user could perceive which button she is touching on the screen through tactile feedback, she would not need to view the screen, and could instead focus her visual attention on the primary task (e.g., walking).
In this dissertation, I address the high visual demand found in existing user interfaces on mobile touch-screen devices by using spatial tactile feedback: tactile feedback patterns generated at different points on the user's body (the user's fingers and palm in this work). I developed tactile feedback hardware employing multiple vibration motors on the backside of a mobile touch-screen device. These multiple vibration motors can produce various spatial vibration patterns on the user's fingers and palm. I then validated the effects of spatial tactile feedback through three different applications: eyes-free interaction, a map application for visually impaired users, and collaboration support. Findings gained through this series of application-oriented investigations indicate that spatial tactile feedback is a beneficial output modality for mobile touch-screen devices, and can mitigate some visual demand issues.
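One way such spatial patterns can be driven is to weight each motor's intensity by the touch position. The four-corner layout and the bilinear weighting below are assumptions for illustration, not the dissertation's actual hardware mapping.

```python
# Sketch: map a normalised touch position to intensities of four vibration
# motors at the back corners of the device (bilinear weighting). Touching
# near a corner drives mostly that corner's motor, giving the user a
# spatial cue about where on the screen the finger is.

def motor_intensities(x, y):
    """x, y in [0, 1]: touch position (origin at top-left).
    Returns an intensity in [0, 1] for each corner motor; they sum to 1."""
    return {
        "top_left":     (1 - x) * (1 - y),
        "top_right":    x * (1 - y),
        "bottom_left":  (1 - x) * y,
        "bottom_right": x * y,
    }

# Touching the exact top-right corner drives only the top-right motor:
print(motor_intensities(1.0, 0.0))
# A central touch spreads the vibration evenly across all four motors:
print(motor_intensities(0.5, 0.5))
```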

100. Program usage research with a framework for registration and analysis of user actions. Abromaitis, Jonas (05 November 2013)
To fully study the usability of a program aimed at an international market, it must be tried with representatives of different cultures. Since direct communication with users all over the world is rarely possible, a framework is needed for registering user actions remotely. The framework being developed not only registers user actions but also allows them to be analysed. In this way, user interface shortcomings characteristic of a particular part of the world can be detected and corrected, and the effectiveness of the new version can then be evaluated.
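The register-then-analyse cycle described above can be sketched as follows. The class and field names are assumptions for illustration, not the thesis's API.

```python
# Sketch of a minimal action logger with an analysis step: actions are
# registered with a timestamp and later aggregated, which is the kind of
# summary that lets one spot UI elements that are overused (or never
# found) by users in a particular locale.
import time
from collections import Counter

class ActionLog:
    """Registers user actions and summarises them for analysis."""
    def __init__(self):
        self.events = []

    def register(self, user, action, target):
        self.events.append({"t": time.time(), "user": user,
                            "action": action, "target": target})

    def action_counts(self):
        """How often each (action, target) pair occurred."""
        return Counter((e["action"], e["target"]) for e in self.events)

log = ActionLog()
log.register("u1", "click", "save_button")
log.register("u2", "click", "save_button")
log.register("u1", "hover", "help_icon")
print(log.action_counts().most_common(1))
```

In a remote setting, `register` would additionally ship each event over the network to a collection server; the analysis side stays the same.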