
HUMAN COMPUTER INTERACTION TUTORIAL

Q1. What do you mean by Cognitive Psychology ?


Cognitive psychology studies and analyses mental processes. This includes how we think,
remember, learn and perceive.

Cognitive psychology is a system of theories which states that human mental and physical
activity can be explained in terms of information processing, by analogy with a computer, and
it attempts to investigate how the mind works by focusing on how organisms perceive,
remember, think and learn. As an emerging discipline, it integrates ideas from fields such as
artificial intelligence, epistemology, linguistics, mathematics, neuroscience, and philosophy.
It maintains that:

• behavior can only be understood by studying the underlying mental processes;
• the interaction between an organism and its environment influences both its behavior and
its knowledge of the environment, which in turn affects its future responses to that
environment;
• how animals behave may not apply directly to the study of human behavior, but how
machines learn may;
• understanding comes from the development of learning strategies and the structuring of
the learning environment;
• knowledge is not something that is acquired but something that is created by a learner
based on what he or she already knows;
• an instructor should focus on encouraging exploration and knowledge formation, the
development of judgment, and the acquisition and organization of information by the learner.

EXAMPLES

 Attention - Sometimes our cognitive processing systems get overloaded and we have to
select information to process further. This deals with how and why performance improves
with attention.
 Formation of concepts - This aspect studies the human ability to organize experiences into
categories. The response to a stimulus is determined by the relevant category and the
knowledge associated with that particular category.
 Judgment and decision - This is the study of decision making. Any behavior, implicit or
explicit, requires judgment and then a decision or choice.
 Language processing - This is the study of how language is acquired, comprehended and
produced. It also focuses on the psychology of reading. This includes processing words,
sentences, concepts, inferences and semantic assumptions.
 Learning - This is the study of how new cognitive or conceptual information is taken in and
how that process occurs. It includes implicit learning, which takes into account the effect of
previous experience on performance.
 Memory - Studying human memory is a large part of cognitive psychology. It covers the
process of acquiring, storing and retrieving memory, including facts, skills and capacity.
 Perception - This includes the senses and the processing of what we sense. This also
includes what we sense and how it interacts with what we already know.
 Problem solving - Solving problems is a way that humans achieve goals.
 Achieving goals - Moving to a goal can include different kinds of reasoning, perception,
memory, attention and other brain functions.
 Reasoning - This is the process of formulating logical arguments. It involves making
deductions and inferences and why some people value certain deductions over others. This
can be affected by educated intuitive guesses, fallacies or stereotypes.

Q2. What is the difference between cognition & perception?

Cognition can simply be defined as the mental processes which assist us to remember, think,
know, judge, solve problems, etc. It basically assists an individual to understand the world
around him or her and to gain knowledge. All human actions are a result of cognitive
procedures. These cognitive abilities can range from being very simple to extremely complex in
nature. Cognition can include both conscious and unconscious processes. Attention,
memory, visual and spatial processing, motor control and perception are some of these mental
processes. This highlights that perception can itself be considered one such cognitive ability.
In many disciplines, cognition is an area of interest for academics as well as scientists, mainly
because the capacities and functions of cognition are vast and apply to many fields.

Perception is the process by which we interpret the things around us through sensory stimuli.
This can be through sight, sound, taste, smell, and touch. When we receive sensory
information, we not only identify it but also respond to the environment accordingly. In daily
life, we rely greatly on this sensory information for even the most minute tasks. Let us
understand this through an example. Before crossing the road at a pedestrian crossing, we
usually look both ways. In such an instance, it is the sensory information
gained through sight and sound that gives the signal for us to cross the road. This can be
considered an instance where people respond to the environment according to the received
information. This highlights that perception can be considered as an essential cognitive skill,
which allows people to function effectively. This skill or ability does not require much effort
from the side of the individual as it is one of the simplest processes of cognition.

Before crossing the road, we gather information through sensory stimuli.

Difference between Cognition and Perception

• Cognition includes a number of mental processes such as attention, memory, reasoning,
problem solving, etc.

• Perception is the process that allows us to use our senses to make sense of the information
around us through organization, identification and interpretation.

• The main difference is that while cognition encompasses a variety of skills and processes,
perception is one such cognitive skill or ability, and one which assists in enhancing the
quality of the other cognitive abilities.

Q3. What is perceptual motor behavior?

Perceptual motor behaviour research has implications for the design of robotics, technology,
and novel rehabilitation programs. Findings from experiments in perceptual motor behaviour
increase our understanding of the underlying neural processes for motor control and learning,
and test new and innovative approaches to the assessment and treatment of new or challenging
motor skills. The advancements made will contribute to developing interventions that are cost
effective and that track changes in motor performance accurately.

Currently there are three main areas of research:

1. Multisensory-motor integration in the typically developing population

2. Sensorimotor integration in individuals with an Autism Spectrum Disorder


3. Assessment and treatment of individuals with neurological disorders

By studying how people move, we can:

1) Strive to better measure a patient’s clinical progression throughout a course of treatment

2) Quantify the delivery of manual therapies by clinicians, and perceptual factors that influence
performance

Our lab explores the two themes above using three approaches to research:

1) Basic Science

2) Applied Clinical Science

3) Clinical Intervention Studies.

Q4. Discuss Ergonomics in detail?

Ergonomics (from the Greek word ergon meaning work, and nomoi meaning natural laws), is
the science of refining the design of products to optimize them for human use. Human
characteristics, such as height, weight, and proportions are considered, as well as information
about human hearing, sight, temperature preferences, and so on. Ergonomics is sometimes
known as human factors engineering.

Computers and related products, such as computer desks and chairs, are frequently the focus
of ergonomic design. A great number of people use these products for extended periods of
time -- such as the typical work day. If these products are poorly designed or improperly
adjusted for human use, the person using them may suffer unnecessary fatigue, stress, and
even injury.

The term “ergonomics” can simply be defined as the study of work. It is the science of fitting
jobs to the people who work in them. Adapting the job to fit the worker can help reduce
ergonomic stress and eliminate many potential ergonomic disorders (e.g., carpal tunnel
syndrome, trigger finger, tendonitis). Ergonomics focuses on the work environment and items
such as the design and function of workstations, controls, displays, safety devices, tools and
lighting to fit the employee’s physical requirements, capabilities and limitations to ensure
his/her health and well-being.

Q5. Define heuristics? Explain with example?

The word "heuristic" comes from the Greek meaning "to discover." A heuristic is an approach
to problem solving that takes one's personal experience into account.

Ways to Use Heuristics In Everyday Life


Here are some examples of real-life heuristics that people use as a way to solve a problem or to
learn something:

 "Consistency heuristic" is a heuristic where a person responds to a situation in a way that
allows them to remain consistent.
 "Educated guess" is a heuristic that allows a person to reach a conclusion without
exhaustive research. With an educated guess a person considers what they have observed
in the past, and applies that history to a situation where a more definite answer has not yet
been decided.
 "Absurdity heuristic" is an approach to a situation that is very atypical and unlikely – in
other words, a situation that is absurd. This particular heuristic is applied when a claim or a
belief seems silly, or seems to defy common sense.
 "Common sense" is a heuristic that is applied to a problem based on an individual’s
observation of a situation. It is a practical and prudent approach that is applied to a
decision where the right and wrong answers seem relatively clear cut.
 "Contagion heuristic" causes an individual to avoid something that is thought to be bad or
contaminated. For example, when eggs are recalled due to a salmonella outbreak,
someone might apply this simple solution and decide to avoid eggs altogether to prevent
sickness.
 "Availability heuristic" allows a person to judge a situation on the basis of the examples of
similar situations that come to mind, allowing a person to extrapolate to the situation in
which they find themselves.
 "Working backward" allows a person to solve a problem by assuming that they have
already solved it, and working backward in their minds to see how such a solution might
have been reached.
 "Familiarity heuristic" allows someone to approach an issue or problem based on the fact
that the situation is one with which the individual is familiar, and so one should act the
same way they acted in the same situation before.
 "Scarcity heuristic" is used when a particular object becomes rare or scarce. This approach
suggests that if something is scarce, then it is more desirable to obtain.
 "Rule of thumb" applies a broad approach to problem solving. It is a simple heuristic that
allows an individual to make an approximation without having to do exhaustive research.
 "Affect heuristic" is when you make a snap judgment based on a quick impression. This
heuristic views a situation quickly and decides without further research whether a thing is
good or bad. Naturally, this heuristic can be both helpful and hurtful when applied in the
wrong situation.
 "Authority heuristic" occurs when someone believes the opinion of a person of authority
on a subject just because the individual is an authority figure. People apply this heuristic all
the time in matters such as science, politics, and education.

Q6. Explain Human-Computer Interaction in the Software Process?


 Software engineering provides a means of understanding the structure of the design
process, and that process can be assessed for its effectiveness in interactive system
design. n
 Usability engineering promotes the use of explicit criteria to judge the success of a
product in terms of its usability.
 Iterative design practices work to incorporate crucial customer feedback early in the
design process to inform critical decisions which affect usability.
 Design involves making many decisions among numerous alternatives. Design rationale
provides an explicit means of recording those design decisions and the context in which
the decisions were made.

The software engineering life cycle aims to structure design in order to increase the reliability
of the design process. For interactive system design, this would equate to a reliable and
reproducible means of designing predictably usable systems. Because of the special needs of
interactive systems, it is essential to augment the standard life cycle in order to address issues
of HCI. Usability engineering encourages incorporating explicit usability goals within the design
process, providing a means by which the product’s usability can be judged. Iterative design
practices admit that principled design of interactive systems alone cannot maximize product
usability, so the designer must be able to evaluate early prototypes and rapidly correct features
of the prototype which detract from the product usability. The design process is composed of a
series of decisions, which pare down the vast set of potential systems to the one that is actually
delivered to the customer. Design rationale, in its many forms, is aimed at allowing the designer
to manage the information about the decision-making process, in terms of when and why
design decisions were made and what consequences those decisions had for the user in
accomplishing his work.
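As a rough sketch of the prototype-evaluate-correct loop described above (the error-rate
metric, the usability goal, and the per-iteration improvement rate are all invented for
illustration, not taken from the text):

```python
# Hypothetical sketch of iterative design: build a prototype, evaluate it
# against an explicit usability goal, and redesign until the goal is met.
# All numbers here are made-up illustrations.

def iterate_design(initial_error_rate: float, usability_goal: float) -> int:
    """Return how many prototype cycles were needed to meet the usability goal."""
    error_rate, iterations = initial_error_rate, 0
    while error_rate > usability_goal:
        error_rate *= 0.5  # assume each evaluation halves the observed error rate
        iterations += 1    # one prototype-evaluate-redesign cycle
    return iterations

print(iterate_design(0.40, 0.05))  # three cycles: 0.40 -> 0.20 -> 0.10 -> 0.05
```

The point of the sketch is the loop structure itself: evaluation of an early prototype feeds
back into the next design decision, rather than usability being assessed only at delivery.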

Q7. Define Cognitive neuroscience?

The field of Cognitive Neuroscience concerns the scientific study of the neural mechanisms
underlying cognition and is a branch of neuroscience.
Cognitive neuroscience overlaps with cognitive psychology, and focuses on the neural
substrates of mental processes and their behavioral manifestations.

The boundaries between psychology, psychiatry and neuroscience have become quite blurred.

Cognitive neuroscientists tend to have a background in experimental psychology, neurobiology,
neurology, physics, and mathematics.

Methods employed in cognitive neuroscience include psychophysical experiments, functional
neuroimaging, electrophysiological studies of neural systems and, increasingly, cognitive
genomics and behavioral genetics.

Clinical studies of psychopathology in patients with cognitive deficits constitute an important
aspect of cognitive neuroscience.

The main theoretical approaches are computational neuroscience and the more traditional,
descriptive cognitive psychology theories such as psychometrics.

Q9. What do you mean by negative and positive appraisal in HCI? Why is it important
in interface design?

According to the Appraisal Theory of emotions, how we feel about a certain situation is
determined by our Appraisal or evaluation of the event.

Let's say you went on a job interview and you believe that the interview went well: you gave
good answers to the recruiter's questions, and you have all the qualifications needed for the
job. Then you will have a positive emotion about it. You might feel happy and excited. But if you
perceive that it did not go well, then you will feel negatively about the event. You might feel
sad, dejected, or helpless.

Appraisal is the cognitive evaluation and interpretation of a phenomenon or event. In theories
of emotion, cognitive appraisals are seen as determinants of emotional experience, since they
influence the perception of the event. See cognitive theory.

APPRAISAL: "The person with a negative appraisal of the class felt unhappy, whereas the
person with a positive appraisal of the same class felt satisfied."

"Appraisal" theories provide much greater predictive power than category- or hierarchy-based
schemes by specifying the critical properties of antecedent events that lead to particular
emotions. Ellsworth (1994), for example, described a set of "abstract elicitors" of emotion.
In addition to novelty and valence, Ellsworth contended that the level of
certainty/uncertainty in an event has a significant impact on the emotion experienced. For
instance, "uncertainty about probably positive events leads to interest and curiosity, or to
hope," while "uncertainty about probably negative events leads to anxiety and fear."
Certainty, on the other hand, can lead to relief in the positive case and despair in the
negative case. Because slow, unclear, or unusual responses from an interface generally reflect
a problem, one of the most common interface design mistakes, from an affective standpoint, is
to leave the user in a state of uncertainty. Users tend to fear the worst when, for example,
an application is at a standstill, the hourglass remains up longer than usual, or the hard
drive simply starts grinding away unexpectedly. Such uncertainty leads to a state of anxiety
that can be easily avoided with a well-placed, informative message or state indicator.
Providing users with immediate feedback on their actions reduces uncertainty, promoting a more
positive affective state (see Norman, 1990, on visibility and feedback). When an error has
actually occurred, the best approach is to make the user aware of the problem and its possible
consequences, but frame the uncertainty in as positive a light as possible (i.e., "this
application has experienced a problem, but the document should be recoverable").

According to Ellsworth (1994), obstacles and control also play an important role in eliciting
emotion. High control can lead to a sense of challenge in positive situations, but stress in
negative situations. Lack of control, on the other hand, often results in frustration, which
if sustained can lead to desperation and resignation. In an HCI context, providing an
appropriate level of controllability, given a user's abilities and the task at hand, is thus
critical for avoiding negative affective consequences. Control must not only be perceived to
exist but must be understandable and visible; otherwise the interface itself is an obstacle
(Norman, 1990).

Agency is yet another crucial factor determining emotional response. When oneself is the cause
of the situation, shame (negative) and pride (positive) are likely emotions. When another
person or entity is the cause, anger (negative) and love (positive) are more likely. However,
if fate is the agent, one is more likely to experience sorrow (negative) and joy (positive).
An interface often has the opportunity to direct a user's perception of agency. In any
anomalous situation, for example, be it an error in reading a file, inability to recognize
speech input, or simply a crash, if the user is put in a position encouraging blame of oneself
or fate, the negative emotional repercussions may be more difficult to defuse than if the
computer explicitly assumes blame (and is apologetic). For example, a voice interface
encountering a recognition error can say, "This system failed to understand your command"
(blaming itself), "The command was not understood" (blaming no one), or "You did not speak
clearly enough for your command to be understood" (blaming the user).

Appraisal theories of emotion are useful not only in understanding the potential affective
impacts of design decisions, but also in creating computer agents that exhibit emotion.
Although in some cases scripted emotional responses are sufficient, in more dynamic or
interactive contexts, an agent's affective state must be simulated to be believable.
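The three blame framings quoted above can be sketched as a small lookup. The function name and
the dictionary are hypothetical; the message strings are the ones given in the example.

```python
# Illustrative sketch of agency-aware error framing from appraisal theory.
# Function and key names are invented; the messages come from the text above.

def recognition_error_message(blame: str) -> str:
    """Return a speech-recognition error message framed for a given agent of blame."""
    messages = {
        "system": "This system failed to understand your command.",  # system assumes blame
        "none": "The command was not understood.",                   # neutral framing
        "user": "You did not speak clearly enough for your command to be understood.",
    }
    return messages[blame]

# The discussion above suggests the system-blame framing is easiest to defuse.
print(recognition_error_message("system"))
```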

Q10. What do you know about Haptic Interface?

A haptics interface is a system that allows a human to interact with a computer through bodily
sensations and movements. Haptics refers to a type of human-computer interaction technology
that encompasses tactile feedback or other bodily sensations to perform actions or processes
on a computing device.

A haptics interface is primarily implemented and applied in virtual reality environments, where
an individual can interact with virtual objects and elements. A haptics interface relies on
purpose-built sensors that send an electrical signal to the computer based on different sensory
movements or interactions. Each electrical signal is interpreted by the computer to execute a
process or action. In turn, the haptic interface also sends a signal to the human organ or body.
For example, when playing a racing game using a data glove powered by a haptic interface, a
user can use his or her hand to steer the car. When the car hits a wall or another car, the
haptics interface will send a signal that imitates the impact on the user's hands in the form
of a vibration or rapid movement.
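The racing-game example can be sketched as a minimal event-to-feedback mapping. The event
names, the command format, and the values are invented, since real haptic APIs are vendor
specific.

```python
# Minimal sketch of the haptic feedback loop described above. The event names,
# the command dictionary, and all values are hypothetical stand-ins for a
# real device API.

def haptic_response(event: str) -> dict:
    """Map a simulation event to a haptic actuator command."""
    if event == "collision":
        # imitate the impact on the user's hand: a strong, short vibration
        return {"actuator": "vibration", "intensity": 0.9, "duration_ms": 120}
    # nothing of interest happened: keep the actuator idle
    return {"actuator": "vibration", "intensity": 0.0, "duration_ms": 0}

print(haptic_response("collision"))
```
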

Q11. Discuss how gain or C:D ratio is used to compute the effectiveness of an
input device?

Gain, also known as control-to-display (CD) gain or C:D ratio, is defined as the distance
moved by an input device divided by the distance moved on the display. Gain confounds what
should be two measurements, (a) device size and (b) display size, with one arbitrary metric
and is therefore suspect as a factor for experimental study. In experiments, gain typically
has little or no effect on the time to perform pointing movements, but variable gain functions
may provide some benefit by reducing the required footprint (physical movement space) of a
device.

Indirect tablets report the absolute position of a pointer on a sensing surface. Touch tablets
sense a bare finger, whereas graphics tablets or digitizing tablets typically sense a stylus
or other physical intermediary. Tablets can operate in absolute mode, with a fixed CD gain
between the tablet surface and the display, or in relative mode, in which the tablet responds
only to motion of the stylus. If the user touches the stylus to the tablet in relative mode,
the cursor resumes motion from its previous position; in absolute mode, it would jump to the
new position. Absolute mode is generally preferable for tasks such as drawing, handwriting,
tracing, or digitizing, but relative mode may be preferable for typical desktop interaction
tasks such as selecting graphical icons or navigating through menus. Tablets thus allow
coverage of many tasks, whereas mice only operate in relative mode.
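The definition above can be turned into a one-line calculation. This is an illustrative
sketch; the function name and the sample distances are invented.

```python
# Hypothetical sketch of computing control-to-display (C:D) gain.
# Sample distances below are made-up illustrations.

def cd_gain(device_distance_mm: float, display_distance_mm: float) -> float:
    """C:D gain = distance moved by the input device / distance moved on the display."""
    if display_distance_mm == 0:
        raise ValueError("display movement must be non-zero")
    return device_distance_mm / display_distance_mm

# A gain below 1.0 means the pointer travels farther than the device (smaller
# physical footprint); above 1.0, the device travels farther than the pointer.
print(cd_gain(20.0, 100.0))  # mouse moved 20 mm, cursor moved 100 mm -> 0.2
```
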

Q12. What are wearable computers? Explain with an example?

Wearable computing is the study or practice of inventing, designing, building, or using
miniature body-borne computational and sensory devices. Wearable computers may be worn
under, over, or in clothing, or may also be themselves clothes.

Wearable computing as a reciprocal relationship between man and machine

An important distinction between wearable computers and portable computers (handheld and
laptop computers for example) is that the goal of wearable computing is to position or
contextualize the computer in such a way that the human and computer are inextricably
intertwined, so as to achieve Humanistic Intelligence – i.e. intelligence that arises by having the
human being in the feedback loop of the computational process, e.g. Mann 1998.

An example of Humanistic Intelligence is the wearable face recognizer (Mann 1996) in which
the computer takes the form of electric eyeglasses that "see" everything the wearer sees, and
therefore the computer can interact serendipitously. A handheld or laptop computer would not
provide the same serendipitous or unexpected interaction, whereas the wearable computer
can pop up virtual nametags if it ever "sees" someone its owner knows or ought to know.

In this sense, wearable computing can be defined as an embodiment of, or an attempt to
embody, Humanistic Intelligence. This definition also allows for the possibility of some or
all of the technology to be implanted inside the body, thus broadening "wearable computing"
to "bearable computing" (i.e. body-borne computing).

One of the main features of Humanistic Intelligence is constancy of interaction: the human
and computer are inextricably intertwined, and there is no need to turn the device on prior
to engaging it (thus, serendipity).

Another feature of Humanistic Intelligence is the ability to multi-task. It is not necessary for a
person to stop what they are doing to use a wearable computer because it is always running in
the background, so as to augment or mediate the human's interactions. Wearable computers
can be incorporated by the user to act like a prosthetic, thus forming a true extension of the
user's mind and body.

It is common in the field of Human-Computer Interaction (HCI) to think of the human and
computer as separate entities. The term "Human-Computer Interaction" emphasizes this
separateness by treating the human and computer as different entities that interact. However,
Humanistic Intelligence theory thinks of the wearer and the computer with its associated input
and output facilities not as separate entities, but regards the computer as a second brain and its
sensory modalities as additional senses, in which synthetic synesthesia merges with the
wearer's senses. When a wearable computer functions as a successful embodiment of
Humanistic Intelligence, the computer uses the human's mind and body as one of its
peripherals, just as the human uses the computer as a peripheral. This reciprocal relationship is
at the heart of Humanistic Intelligence.

Concrete examples of wearable computing

Example: Augmented Reality

Augmented reality means to superimpose an extra layer on a real-world environment, thereby
augmenting it. An "augmented reality" is thus a view of a physical, real-world environment
whose elements are augmented by computer-generated sensory input such as sound, video,
graphics or GPS data. One example is the Wikitude application for the iPhone, which lets you
point your iPhone's camera at something, which is then "augmented" with information from
Wikipedia (strictly speaking this is a mediated reality, because the iPhone actually modifies
vision in some ways, even if nothing more than the fact we're seeing with a camera).
Augmented Reality prototype

Photograph of the Head-Up Display taken by a pilot on a McDonnell Douglas F/A-18 Hornet

Q13. Discuss the catastrophic effect of human error in HCI?

Human capability for interpreting and manipulating information is quite impressive. However,
we do make mistakes. Some are trivial, resulting in no more than temporary inconvenience or
annoyance. Others may be more serious, requiring substantial effort to correct. Occasionally
an error may have catastrophic effects, as we see when 'human error' results in a plane crash
or nuclear plant leak.

Why do we make mistakes, and can we avoid them? In order to answer the latter part of the
question we must first look at what is going on when we make an error. There are several
different types of error. As we saw in the last section, some errors result from changes in
the context of skilled behavior. If a pattern of behavior has become automatic and we change
some aspect of it, the more familiar pattern may break through and cause an error. A familiar
example of this is where we intend to stop at the shop on the way home from work but in fact
drive past. Here, the activity of driving home is the more familiar and overrides the less
familiar intention.

Other errors result from an incorrect understanding, or model, of a situation or system.
People build their own theories to understand the causal behavior of systems. These have been
termed mental models. They have a number of characteristics. Mental models are often partial:
the person does not have a full understanding of the working of the whole system. They are
unstable and are subject to change. They can be internally inconsistent, since the person may
not have worked through the logical consequences of their beliefs. They are often unscientific
and may be based on superstition rather than evidence. Often they are based on an incorrect
interpretation of the evidence.

Q14. Define User Interface Management System?

A User Interface Management System (UIMS) should not be thought of as a system but rather a
software architecture (a UIMS is also called a User Interface Architecture) "in which the
implementation of an application's user interface is clearly separated from that of the
application's underlying functionality". A large number of software architectures are based on
the assumption that the functionality and the user interface of a software application are two
separate concerns that can be dealt with in isolation. The objective of such a separation is to
increase the ease of maintainability and adaptability of the software. Also, by abstracting the
code generating the user interface from the rest of the application's logic or semantics,
customisation of the interface is better supported. Some examples of such architectures are
Model-View-Controller (fundamental to modern Object Orientation, e.g. used in Java (Swing)),
the linguistic model (Foley 1990), the Seeheim model (first introduced in Green 1985), the
Higgins UIMS (described in Hudson and King 1988), and the Arch model (a specialisation of the
Seeheim model; see Coutaz et al. 1995, Coutaz 1987, and Coutaz 1997).
Such user interface architectures have been proven useful but also introduce problems. In
systems with a high degree of interaction and semantic feedback (e.g. in direct manipulation
interfaces) the boundary between application and user interface is difficult or impossible to
maintain. In direct manipulation interfaces, the user interface displays the 'intestines' or the very
semantics of the application, with which the user interacts in a direct and immediate way. It
thus becomes very problematic to decide if these intestines should be handled by the User
Interface or in the application itself.
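The separation of concerns that a UIMS advocates can be sketched as a toy Model-View-Controller
program. The class names and the counter example are invented for illustration; this is not
the API of any real toolkit.

```python
# Toy sketch of the UIMS/MVC separation described above: the Model knows
# nothing about presentation, so the View can be swapped without touching it.
# All names here are hypothetical.

class CounterModel:
    """Application functionality: state plus logic, no UI code."""
    def __init__(self) -> None:
        self.value = 0

    def increment(self) -> None:
        self.value += 1

class TextView:
    """One possible presentation of the model, replaceable in isolation."""
    def render(self, model: CounterModel) -> str:
        return f"count = {model.value}"

class Controller:
    """Routes user input to the model, then asks the view to redraw."""
    def __init__(self, model: CounterModel, view: TextView) -> None:
        self.model, self.view = model, view

    def on_click(self) -> str:
        self.model.increment()
        return self.view.render(self.model)

ui = Controller(CounterModel(), TextView())
print(ui.on_click())  # count = 1
```

Because `CounterModel` never imports or references the view, customising the interface (the
goal cited above) only requires replacing `TextView`.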

Q15. Discuss non-speech sound in HCI?
