Human–computer interaction (HCI) is research in the design and the use of computer technology, which focuses on the interfaces between people (users) and computers. HCI researchers observe the ways humans interact with computers and design technologies that allow humans to interact with computers in novel ways. A device that allows interaction between a human being and a computer is known as a "human–computer interface".
As a field of research, human–computer interaction is situated at the intersection of computer science, behavioral sciences, design, media studies, and several other fields of study. The term was popularized by Stuart K. Card, Allen Newell, and Thomas P. Moran in their 1983 book, The Psychology of Human–Computer Interaction, although the first known use was in 1975 by Carlisle. The term is intended to convey that, unlike other tools with specific and limited uses, computers have many uses which often involve an open-ended dialogue between the user and the computer. The notion of dialogue likens human–computer interaction to human-to-human interaction, an analogy that is crucial to theoretical considerations in the field.
In the industrial design field of human–computer interaction, a user interface (UI) is the space where interactions between humans and machines occur. The goal of this interaction is to allow effective operation and control of the machine from the human end, while the machine simultaneously feeds back information that aids the operators' decision-making process. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls and process controls. The design considerations applicable when creating user interfaces are related to, or involve such disciplines as, ergonomics and psychology.
Generally, the goal of user interface design is to produce a user interface that makes it easy, efficient, and enjoyable (user-friendly) to operate a machine in the way that produces the desired result (i.e. maximum usability). This generally means that the operator needs to provide minimal input to achieve the desired output, and that the machine minimizes undesired outputs to the user.
Image 2: A computer monitor provides a visual interface between the machine and the user. (from Human–computer interaction)
Image 3: A child's hand location and movement being detected by a gesture recognition algorithm (from Gesture recognition)
Image 4: A VPL Research DataSuit, a full-body outfit with sensors for measuring the movement of arms, legs, and trunk. Developed c. 1989. Displayed at the Nissho Iwai showroom in Tokyo (from Virtual reality)
Image 6: A CAVE system at IDL's Center for Advanced Energy Studies in 2010 (from Virtual reality)
Image 7: In theory, VR represents a participant's field of view (yellow area). (from Virtual reality)
Image 8: These binary silhouette (left) or contour (right) images represent typical input for appearance-based algorithms. They are compared with different hand templates, and if they match, the corresponding gesture is inferred. (from Gesture recognition)
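The caption above describes appearance-based recognition as matching a binary silhouette against stored hand templates. The following is a minimal sketch of that idea, assuming silhouettes and templates arrive as equally sized binary NumPy arrays; the intersection-over-union score, the 0.7 threshold, and all function and label names are illustrative assumptions rather than the method of any particular system pictured here.

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two binary masks of equal shape."""
    intersection = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return intersection / union if union else 0.0

def classify_silhouette(silhouette, templates, threshold=0.7):
    """Return the label of the best-matching template, or None if no
    template is similar enough to the observed silhouette."""
    best_label, best_score = None, 0.0
    for label, template in templates.items():
        score = iou(silhouette, template)
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= threshold else None

# Toy example: a 3x3 "open hand" template matched against an identical input.
templates = {"open_hand": np.array([[0, 1, 0], [1, 1, 1], [0, 1, 0]])}
observed = np.array([[0, 1, 0], [1, 1, 1], [0, 1, 0]])
print(classify_silhouette(observed, templates))  # -> open_hand
```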
Image 10: A high frame rate and low latency are paramount for the sensation of immersion in virtual reality.
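As a rough worked example of the frame-rate side of this constraint (the 90 Hz refresh rate is only an assumed, commonly cited figure, not one taken from the article), the per-frame time budget follows directly from the refresh rate:

$$t_{\text{frame}} = \frac{1}{f} = \frac{1}{90\,\text{Hz}} \approx 11.1\,\text{ms},$$

so all tracking, simulation, and rendering work must fit into roughly 11 ms if the display is not to drop frames.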
Image 11: The skeletal version (right) effectively models the hand (left). It has fewer parameters than the volumetric version and is easier to compute, making it suitable for real-time gesture analysis systems. (from Gesture recognition)
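To make the parameter-count comparison in the caption concrete, here is a minimal sketch in which a skeletal hand is described by a few joint angles while a volumetric model must estimate a dense occupancy grid. The 15 joint angles, the wrist pose, and the 64-cell grid resolution are illustrative assumptions, not figures from the article.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SkeletalHand:
    """Skeletal hand pose: a handful of joint angles plus a wrist rotation."""
    finger_joint_angles: List[float] = field(default_factory=lambda: [0.0] * 15)
    wrist_rotation: Tuple[float, float, float] = (0.0, 0.0, 0.0)

    def parameter_count(self) -> int:
        return len(self.finger_joint_angles) + len(self.wrist_rotation)

# A volumetric model over a 64x64x64 occupancy grid, by contrast, has
# 64**3 cells to estimate every frame.
print(SkeletalHand().parameter_count())  # 18
print(64 ** 3)                           # 262144
```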
Image 12: Middleware usually processes gesture recognition, then sends the results to the user. (from Gesture recognition)
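A minimal sketch of that middleware role: sit between the sensor stream and the application, run a recognizer on each frame, and forward only the recognized gestures. The callback-style API, the function names, and the stand-in recognizer are assumptions made for illustration.

```python
from typing import Callable, Iterable, Optional

def gesture_middleware(frames: Iterable,
                       recognize: Callable[[object], Optional[str]],
                       on_gesture: Callable[[str], None]) -> None:
    """Feed raw sensor frames to a recognizer and notify the application
    only when a gesture is detected."""
    for frame in frames:
        gesture = recognize(frame)   # e.g. the template matcher sketched above
        if gesture is not None:
            on_gesture(gesture)      # the application sees labels, not pixels

# Example wiring with stand-in components.
frames = ["frame0", "frame1", "frame2"]
recognize = lambda frame: "swipe_left" if frame == "frame1" else None
gesture_middleware(frames, recognize, on_gesture=print)  # prints: swipe_left
```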
Image 13: An operator controlling the Virtual Interface Environment Workstation (VIEW) at NASA Ames around 1990 (from Virtual reality)
Image 17: View-Master, a stereoscopic visual simulator, was introduced in 1939.
Image 18: The user interacts directly with the human input and output hardware, such as displays, e.g. through a graphical user interface. The user interacts with the computer over this software interface using the given input and output (I/O) hardware. Software and hardware are matched so that the processing of the user input is fast enough and the latency of the computer output is not disruptive to the workflow. (from Human–computer interaction)
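That matching of software to hardware can be pictured as an interaction loop that must handle each input quickly enough not to disrupt the workflow. Below is a minimal sketch; the 100 ms budget is an assumed illustrative threshold, not a value given in the article.

```python
import time
from typing import Callable, Iterable

LATENCY_BUDGET_S = 0.100  # assumed per-event budget for illustration

def interaction_loop(events: Iterable, handle: Callable[[object], None]) -> None:
    """Process user input events and flag any response that takes too long."""
    for event in events:
        start = time.perf_counter()
        handle(event)                              # update the display, etc.
        elapsed = time.perf_counter() - start
        if elapsed > LATENCY_BUDGET_S:
            print(f"warning: {elapsed * 1000:.1f} ms exceeds the latency budget")

# Example with a trivial handler that responds instantly.
interaction_loop(["click", "keypress"], handle=lambda event: None)
```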
Image 19: An Omni treadmill being used at a VR convention (from Virtual reality)
Image 24: The Virtual Fixtures immersive AR system, developed in 1992. The picture features Dr. Louis Rosenberg interacting freely in 3D with overlaid virtual objects called 'fixtures'. (from Virtual reality)
Image 25: A real hand (left) is interpreted as a collection of vertices and lines in the 3D mesh version (right), and the software uses their relative position and interaction in order to infer the gesture. (from Gesture recognition)