
Monthly Archives: December 2000

User Sovereignty
This was posted to the london_usability eGroup this week – and the author, Michael Andrews
has kindly given me permission to repost it here. It’s an interesting
read, and Michael really wants to open up debate on these issues – so
mail him if you want.

Some food for thought.

Is usability really about the “user’s experience” when companies use it to
increase “customer loyalty”? More and more, interface designers are
resorting to so-called push technology to force “choices” on people. I
scanned a book today in this vein: “Eternal E-Customer: How Emotionally
Intelligent Interfaces Can Create Long-Lasting Customer Relations” by Bryan
Bergeron (McGraw-Hill, January 2001, hardcover, 268 pages, ISBN 007136479X).

From the book jacket: “Emotionally Intelligent Interfaces (EIIs) act as
portals into customer wants and needs. The interface is driven by data from
previous interactions, explicit customer preferences, and based on customer
profiles”… “this unique guide is essential for turning customer data
collected by intelligent agents into a strong competitive advantage”.

There is a problem with this approach: I am being told what I want, rather
than deciding it myself. Sure, I may be free to ignore recommendations
forced on me, but I am less able to find my own choices when they are buried
behind the “intelligent agent” choices. Why is the computer presumed the
intelligent party, while the user is presumed stupid?

Push technology can easily become manipulation. Jay Newman, a Canadian
philosopher, has written a wonderful book called “Inauthentic Culture and
its Philosophical Critics” (McGill-Queens Press). In it he says: “a person
cannot bring authenticity to cultural products if she lacks a concept of
personal autonomy that enables her to believe that she is somehow able,
through self-definition and self-direction, to transmute to some extent, in
a personal and creative way, the material given to her as a result of
determining factors… Manipulation robs us to some extent of our freedom,
which is one reason why it offends us.”

In his book “Coercion”, Douglas Rushkoff asserts: “Microsoft has an entire
department dedicated to ‘Decision Theory and Adaptive Systems’–the study of
how human beings relate to data and interfaces. Although much of the
department’s work is geared toward creating more user-friendly interfaces,
my contacts at the company claim the much-shrouded division’s true purpose
is to determine the decision points in online behavior and how to manipulate
them effectively.” I don’t know if the statement is accurate, but it does
sound plausible.

Push technology by definition cannot be user-friendly, because it arrogantly
presumes to know what is in the user’s best interest without the user’s
informed consent. I see it as the present-day equivalent of the social
engineering schemes and disasters of the 20th century. Designers need to
speak out against “usability” being hijacked by well-financed companies
whose idea of “effectiveness” relates only to their own bottom line, and not
the user’s interest.

Comments welcome.
