So you want to be on the desktop?
A Linux programming call to arms.
By William A. Housley
Foreword:
How would you like to see Linux PCs for sale at your local general
retailer? Impossible, you say? Linus Torvalds (in a chat interview) seems
to think that it will take about three years for Linux to become
commonplace. Three years!?! In this industry, three years may just as well
be a hundred. At LinuxWorld he called for an effort on the desktop. Lately
I have been studying Linux and the open-source model, and I can tell you
that consumer Linux PCs will be in stores by the year 2000, 2001 at the
latest. Linux has all the signs of an upcoming fad, lacking only one thing
more, and the closed-source software community knows what that is and is
working fast even as we speak. Wait! Closed source? If closed-source
developers capture the user base, then who will have control of Linux? I
submit to you that there is a golden opportunity here to show the world how
fast the open-source model can get things done. It is all up to you, the
open-source programmers. At the moment I am writing this, old WinTel
closed-source developers like Corel, Novell, and even (ack!) Microsoft are
said to be developing application code for the Linux platform! Well, the
only thing they have that you don't is experience building end-user
products for the traditional Mac/WinTel consumer. Many hardware vendors are
preparing to build and market PCs with Linux as the OS, and those vendors
will prefer to package software that is open-source because it actually
serves their needs better, but only if good, popular, open-source desktop
application software is available.
For the past 18 years I have been learning about computers, programming
computers, designing and installing computer software and networks, and
teaching others about computers. I have worked with some Ultrix, Harris,
and VMS systems; however, most of my work has been with small businesses on
WinTel machines. I have done a little system programming and worked on
canned vertical-market application products, all of it closed-source. But
most of my programming and analysis career has been spent working closely
with the end user on in-house application development, building their NT
and Novell clients and servers, and adapting their manual procedures to the
information age. I have worked very closely with the average desktop
computer user for virtually my entire career, building their apps,
designing their networks, solving their problems, and teaching them how to
use their machines. Much of the time I was the only computer-literate
person in a building full of users with only software that I wrote as their
computer interface. I have learned what they like, what they do not like,
what they buy and what they do not buy. What they forget, and what they
remember.
Back when programming was most of what I did, my mother asked me why
someone as skilled with people as I am would make a career out of
programming computers. I told her that the programs I write are not for
computers, but for people. This paper is an attempt to share with you what
I do and what I have learned as a programmer, software consultant, and
overall industry watcher in the WinTel marketplace. Not what I have learned
about the computers, mind you, but what goes on in the heads of most PC and
Macintosh computer users. In short, it is a paper on how to write programs
for people. If you consider yourself a computer novice, then this paper is
not for you; not because you won't understand it, but because you already
know what it says.
Recently, at a local Linux users group meeting, I had the opportunity to
hear Eric Raymond speak. Not only did I learn for the first time how and
why the open-source model works, but I also came to understand why no
closed-source developer can compete with open-source solutions (except
maybe in the vertical market) if those solutions are available in a
competitive time frame.
In his presentation, Eric spoke about how to explain the open-source model
to big-name executives: get inside their heads and speak their language. I
like an analogy I read from Dale Carnegie, in which he explains that to
catch a fish one should not use strawberries as bait. Fish do not eat
strawberries, but worms. So to catch a fish, we use a worm as bait. Well,
to make Linux common in the home and desktop marketplace and keep Microsoft
off of it, Linux must "catch" the attention of the average desktop user,
and to do that you must present them with the right bait. Linux must become
what Eric calls a "category killer" on the average desktop. We must
completely replace everything that Windows does, we must go for the throat,
and we must do it quickly. I will start this paper by referring to some
points from Eric's writings that I will use later on. Then, to adapt those
same principles to the WinTel market, I will need to redefine a few of the
terms Eric uses to describe the open-source model. After that I will
present you with a lengthy "man page" on the average desktop user's needs,
based on my experiences with them. I will share my theories as to what
needs to be done by you and what I think Microsoft and other closed-source
developers are already doing, and why you, the open-source developer, must
act quickly. I will conclude with my thoughts on why I think Microsoft will
win a battle between closed-source solutions, and how you, the open-source
Unix community, are very close to snatching defeat from the very jaws of
victory.
Chapter 1: Eric Raymond's writings
Chapter 2: Definitions
Chapter 3: The big myths
Chapter 4: What to do about it
Chapter 5: How Microsoft does it
Conclusion
Chapter 1: Eric Raymond's writings
In Eric's "The Cathedral and the Bazaar", he lays out some fundamental
issues in the open-source software development model which he found
contributed to the success of the "fetchmail" project. There were a few
points that struck a chord with me, because I have seen other examples of
success in the WinTel closed-source arena using variations of some of those
important issues. Here are a few:
Release early, release often.
Listen to your users.
Praise your users for work that they do.
Let your users find the bugs.
The program's interface should be well structured to its intended purpose.
If the product becomes useful enough, it could become a "category killer".
To avoid redundancy, and because I am lazy, I will not explain what these
principles are or why they are important to the open-source model. If you
need that, go back and read Eric's writings. What I will do is reapply them
to some areas that I think are crucial to understanding why certain
closed-source products in the WinTel and Macintosh environments have
enjoyed success while others have failed. I will also explain how to
redirect the open-source energies, using these concepts, toward defeating
the powerful closed-source developers who even now are gearing up to
capture control of the Linux desktop.
Chapter 2: Definitions
Eric has clearly laid out and defined how to mimic the Linux open-source
development cycle for system software projects like fetchmail. The
principles he describes are broad in scope the way he explains them, and
seen from the necessary perspective they can be applied to any software
project. I intend to specialize them here just a bit by adjusting some of
the definitions to the purposes of this paper, most importantly the term
"user".
The term "User" as applied in "The Cathedral
and the Bazaar" refers to the users of Fetch Mail which
seems to refer mostly to fellow programmers, or at least system
admins. To design and build software for use by the average
desktop user the term "user" must be altered to refer
to the average consumer of WinTel and Macintosh products. In
building this average you should of course include any user of
the WinTel or Mac computer of any level of skill, but it also
wouldnt hurt to include users of fax machines, VCRs,
stereos, palmtop computers and the various kiosks as well. This
is because these users are beginning to view all of those devices
as being pretty much in the same category. It is important to
mention that over half of these users would not be considered
computer "literate" even by WinTel standards. By most
of your standards only WinTel admins and programmers like myself
would be considered at least computer knowledgeable, and we are
well under 10% of the WinTel user base.
The code one develops for these users of course moves up one layer, from
system applications and utilities (fetchmail) to desktop applications (mail
clients, accounting software, spreadsheets, word processors, general- and
specific-purpose database clients, etc.).
Priorities get shifted also. Good code is still needed, but size and speed
must be sacrificed (sometimes brutally) at the altar of interface
convenience. So the definition of "code quality" shifts from size and speed
of execution to the user's "ramping up time" and the user's speed.
Speaking of interfaces: for our purposes here, the definition of the term
"interface" as used by Eric in his writings needs to be expanded beyond
just program-to-program connections; it must by necessity also include the
user interface. This interface must be simplified and tuned to the special
abilities and needs of the ultimate of programming achievements: the human
brain.
But wait! There are those in the programming community who have not yet
"Read the Freaking Manual" on the human-to-computer interface. Fear not!
Here it is.
Chapter 3: The big myths.
There are many myths that technical people often have about the desktop
computer marketplace and those who run around in it. For the most part
these myths reflect a very noble idealism on the part of the computer
experts and their view of what computing "nirvana" would be. Now, my
wording on these things is going to seem a little cruel (I call it
anecdotal humor; my wife calls it sarcasm), but understand that I am a tech
first and agree with you on how things should be. I am only going to play
devil's advocate for a minute to try to show you how the users think. Here
are a couple of the most common myths that we must get rid of before we
proceed.
Myth #1: We must evangelize the "All information should be free" concept.
I now largely agree with you on information needing to be free, but I need
to reflect on what the average desktop user thinks when they hear the
phrases "free information" and "free code". I do this because you must know
how to present it to them in a way that attracts instead of repels them.
The truth is that many people in the real world give things away for free
for no other reason than that no one would buy them, or to persuade someone
to buy something else. Traditional information sources charge for content
and treat their works as intellectual property. Does that make the
publisher greedy? Perhaps it does, or perhaps the printing and binding
equipment that they use to make books doesn't grow on trees. Neither do the
fuel and family needs of the truck driver who drives the books to the
bookstore.
This is why the common computer user does not understand free information
quite the way that you and I do: our application of it is not what they are
accustomed to. Now, I published this paper the way I did for my own
reasons. Mostly because it is the best way to reach you, the target
audience, but also because I now believe in the open-source (i.e. free
information) model, and this paper is kind of my own little experiment to
see for myself how best to apply it. However, the end user does not have a
vision about this.
Some parts of the concept are not completely lost on the common folk,
though. We have a somewhat more limited model, called "freeware" or "public
domain", that has been around for a while. Now, while these are not
accurate examples of open source, they are closer than most in the WinTel
marketplace ever get to giving things away. We also have software for
WinTel and for the Mac that we call "shareware" (usually closed-source): a
free version of the software with some of the functionality removed. When
you pay for the software, a built-in patch is applied that activates the
disabled code. Many PC users like it when you give something to them
without charge, so they understand and agree with your generosity; they
call it a "freebie". But their expectations of it are not very high. You
will even find some who will still try to give you money, feeling guilty
about benefiting from something that they haven't paid for. Others distrust
free things as being tainted somehow, without value, or coming with some
kind of "catch". The users have to be shown by demonstration that they can
get better software service from open-source solutions, because up to now
they have been indoctrinated otherwise. So they must be exposed to good
open-source products in order to catch our vision of it. You have already
seen how quickly the users accept an apparently free product like IE that
appears to them to be nearly equal in quality to a paid product.
Myth #2: Computer users should read the manual.
Unix techs are famous for this one. My dad (who incidentally hates Unix)
always used to say, "If at first you don't succeed, try reading the
instructions". That is good advice, and makes perfect sense. Nonetheless,
you wouldn't need to read the manual for a common hammer, and many feel
that something as common as the computer should be likewise.
Another reason is based on the "new toy" attitude that users have regarding
neat things like computers. When you were a kid, and you got up on
Christmas morning and ran into the living room to see what cool STUFF you
got, I'll bet that the instruction manual and safety warning were not the
first things you grabbed, now were they? Be honest! Of course not; you
grabbed the new toy and did everything to it that you thought you should be
able to do (and if you were like me, you took it apart to see how it
worked). When you could not get it to do certain things, your parents (who
themselves may or may not have read the instructions) helped you. In fact,
do you even remember if the toys you opened had written instructions of
some kind with them? They did, you know. We should remember that the next
time we laugh at a computer user who threw the manual (and the driver
disks) away with the packaging. Just cheerfully tell them "Merry Christmas"
as you charge them for your time while you hit the manufacturer's web site
to download whatever it was they threw away.
The programmer must understand that a new computer product is a cool
toy/thing in the eyes of the more productive category of user, and that an
adult with a new computer component becomes a child at Christmas time.
Furthermore, the computer manuals (especially if you or I wrote them) are
just not interesting or understandable to the average adult, who is more
accustomed to mysteries and romance novels (if they read much at all). I
mean, face it: when you write docs, do you write them for what I call the
"poodle groomers" of society, or for your fellow computer experts? Don't
answer that; I've read some of your docs, and most of what I have read was
written for us geeks.
By the way, the less productive category of user is quite a large group of
people, and is intimidated by and afraid of the computer and the docs
together, believing that the computer is actually superior to them, both in
design and intellect. My father swears that computer manuals are written in
Greek! These people absolutely must be put at ease immediately with the
product through an overwhelmingly helpful and friendly first experience, or
they will never touch it again. Others hate computers because they resent
the fact that they can't figure them out on their own. I have a sister who
hates computers precisely for this reason; she used to call them FREDs, for
"F*****g Retarded Electronic Device".
Lastly, both the WinTel and Mac users have very long memories, so after
their first impression of a product or feature, it is hard to change most
of their minds. They will be unlikely to easily give you a second chance,
saying, "I've been there". An example of this was the first release of the
PowerPC Mac, which was heavily marketed to run "all of your favorite
Windows software". The truth is that the release version of SoftWindows for
the Mac only ran older (obsolete) Windows software that functioned on 80286
machines (no 386 enhanced mode). WinTel users never forgot it, and even
though later versions of SoftWindows ran enhanced-mode software, the users
ignored it and cost Apple millions in lost sales of PowerPC computers.
Myth #3: Users are stupid.
One of my favorite Dilbert comics ran while Dogbert was running for
President. He said to Dilbert, "From now on I am not going to try to reason
with the idiots that I encounter, I will just shake my paw at them and say
Bah". "Dogbert," Dilbert scolds, "just because people disagree with you
doesn't make them idiots". "Bah," said Dogbert, shaking his paw.
Users and (gasp) even marketers are not idiots. They are very often every
bit as skilled at what they do as you and I are. This is one of the myths
that techs like us frequently seem to have about the user base.
Do you really believe that those all around you in the office area of your
workplace are stupid and that you (and your fellow hackers) are the only
really smart ones? You and I see self-righteousness in others as arrogance
most of the time. Then we turn right around and interpret a disagreement
with a user as being just stupidity on their part. Think about it: what's
the difference? Most of the time it is just a matter of perspective based
on a different expectation of the computing experience. If the users saw
computers exactly the way we do, it would be because they know the same
things about them that we know, and then what would they need us for?
Dale Carnegie said in "How to Win Friends and Influence People" that every
person you bump into on the street has something that they do better than
you. I had a Macintosh user say to me once, "(We) do not believe that it
should be necessary for us to be experts at whatever we do for a living AND
be experts with computers just to make use of them in what we do". He was
trying to complain to me about what he saw as a fundamental difference
between Apple users and WinTel users. But he actually hit upon the best
fundamental description of the average desktop computer user in general; it
is just that Mac users are probably the more dramatic example of this. What
he describes is a very reasonable expectation: should you have to know how
to build a car just so you can drive one? Should you know how to fly an
airplane before buying a ticket to travel in one? Of course not.
Myth #4: Marketing is slime.
Actually, this is not so much a myth as it is a misdirection of purposes. I
have never met a sales person who was hired for their technical knowledge.
Some have it, but it doesn't serve their needs or make them their money.
Sales and marketing folks are judged and paid directly by how much $$$
worth of merchandise they move out of the door. They have few preferences
at all for selling one thing or another on that thing's merits alone. For
the most part, the only thing that excites them about a product is the
likelihood that someone will give them money to take that product away.
This is not so unusual, really, since teachers are hired more for their
teaching skill than for what they know. In fact, I had some difficulty
making it in the teaching world because I am a tech first.
I guess it is true, though, that marketing IS slime, and everybody admits
it. But like lawyers (who are also slime) and teachers, they serve a
necessary purpose and specialize within that purpose. Here's a thought: how
would you like to have some of that slime bubbling on your side for once?
You will have to do a little fantasy role-playing and think like them for a
moment (ick!). There is a lot of complicated personal confidence and
psychology stuff involved, but I will not sicken you with the details of
that. The bottom line is that they are sales folks not just because they
love to sell things; it is also a potentially very lucrative profession,
and most sales and marketing folks spend much of their time roaming from
product to product looking for the money. Sales and marketing turn
persuasive skill into hard cash in the same way you and I turn programming
skill into good solutions. I will also tell you that I have tried sales,
and it is no fun at all if it doesn't work. It is true that a really good
sales person should be able to sell anything, but it is also true that they
follow the path of least resistance. If they don't think they can sell it,
then they will spend their time and resources selling something else. They
want to make the most money in the least amount of actual time possible, so
they will spend their time working what will sell.
Now, I would not try to persuade you to sell out your principles of quality
code for marketing advantage; what would be the gain? It would be
pointless. But if Linux PCs are going to take the place of Microsoft PCs
and Mac PCs on the shelves, then they have to outsell them. They have to be
made to look good next to the Barbie dolls, VCRs, and lawn mowers where
WinTel machines are currently being sold.
Even if the internals of Linux are so much better than Windows, that
interior quality has to be packaged in such a way as to fool all those
slimy sales folks into thinking that it is just another stupid toaster. Do
that, and it becomes the NEW fad, the toy that I spoke of earlier, the "pet
rock" of the millennium. You see, WinTel sales have been stagnating a bit,
so if you provide a new product that ordinary people like and the slime can
sell, then the sales and marketing folks will suddenly love Linux (but for
their own slimy reasons). They will salivate. They will get all excited
about the open-source model and start to view your favorite OS as the next
new popular (i.e. money-making) computer product/market and the vehicle to
their own success. They will pay to advertise it in some stupid half-time
commercial during the Super Bowl. They will put it on their shelves next to
those lame home gaming systems. They will hang their stupid red ribbons and
balloons all over it, and blow stupid little whistles at every 1000 units
sold. They will use their pathetic knowledge of the cutesier parts of the
new Linux GUI you are going to write and froth up the customer into a state
of artificial euphoria over how that one box contains the answer to all of
their problems. The consumer will then dig deep, and go home and put a
Linux box on their desk instead of a Mac.
Myth #5: Size and speed.
The desktop user doesn't care about size or speed. Well, they do care, but
they will never get to see how fast the program is if they never figure out
how to start it up and make it do the things that they want to do the way
they want to do them. They do not care about fast features that they don't
use, and, here is the most important part, they do not care about speed
that they do not notice. In short, they do not care about speed for the
sake of speed itself.
First, they care about interface conveniences on the first layer. Those are
the mechanical things that they do with great frequency: opening files,
typing on the keyboard, clicking with the mouse, pushing the buttons, using
the scroll bars... etc.
Second, they care about similar "handles" that activate similar features in
different types of software, so that things are consistent and they don't
have to relearn a different mechanism to do the same thing in each program.
Example: the key-press combination for a non-destructive copy to the
"clipboard" in virtually all software packages on both Windows and the Mac
is Ctrl-C. You can call it a de facto standard.
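A toolkit can hand an application this consistency for free. As a minimal
sketch (in C with GTK 2, my own choice of toolkit for illustration; nothing
in this paper prescribes it), the stock Copy menu item below arrives with
the standard label, icon, and Ctrl-C accelerator already attached:

    /* stock_copy.c: a sketch of letting the toolkit supply the de facto
     * standard Edit/Copy "handle". The stock item comes with the usual
     * label, icon, and Ctrl-C key binding built in. */
    #include <gtk/gtk.h>

    static void on_copy(GtkWidget *item, gpointer data)
    {
        g_print("Copy activated (menu click or Ctrl-C)\n");
    }

    int main(int argc, char *argv[])
    {
        GtkWidget *win, *menubar, *edit, *menu, *copy;
        GtkAccelGroup *accels;

        gtk_init(&argc, &argv);
        win = gtk_window_new(GTK_WINDOW_TOPLEVEL);
        accels = gtk_accel_group_new();
        gtk_window_add_accel_group(GTK_WINDOW(win), accels);

        menubar = gtk_menu_bar_new();
        edit = gtk_menu_item_new_with_mnemonic("_Edit");
        menu = gtk_menu_new();
        gtk_menu_set_accel_group(GTK_MENU(menu), accels);
        /* One call: standard label, icon, and accelerator, for free. */
        copy = gtk_image_menu_item_new_from_stock(GTK_STOCK_COPY, accels);
        g_signal_connect(copy, "activate", G_CALLBACK(on_copy), NULL);
        gtk_menu_shell_append(GTK_MENU_SHELL(menu), copy);
        gtk_menu_item_set_submenu(GTK_MENU_ITEM(edit), menu);
        gtk_menu_shell_append(GTK_MENU_SHELL(menubar), edit);

        gtk_container_add(GTK_CONTAINER(win), menubar);
        g_signal_connect(win, "destroy", G_CALLBACK(gtk_main_quit), NULL);
        gtk_widget_show_all(win);
        gtk_main();
        return 0;
    }

Build with something like gcc stock_copy.c `pkg-config --cflags --libs
gtk+-2.0`. The point is not the toolkit but the principle: the standard
"handle" should cost the programmer one line, not a design meeting.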
Third they care about the availability of features they like.
This can change somewhat rapidly from time to time as new
inventions open new possibilities.
Fourth, they care about interface convenience on the second layer. These
are the things that are implemented through buttons and drop-down menus and
the like: features that they use only occasionally but do not want to have
to relearn each time. For this reason, complicated options must somehow be
featured in the most informative and intuitive way possible. The most
common combination of options must be available in the most brainless way
you can implement, while still being "open" enough to conveniently alter
those options. For example, the find command in the bash shell would most
commonly be used by the average desktop user thus: "find / -name 'file*'
-print". In the slightly more intuitive DOS, the same command would be "dir
\file*.* /s". The "dir" command in DOS is the equivalent of "ls" in Unix;
it is case-insensitive automatically, and the "/s" switch means to scan the
subdirectories of the given path. The DOS version also has many other more
advanced options, though of course not as many as the bash "find". The
users still dislike even this simple DOS command. In the GUI, they invoke
the "find" command from a drop-down menu somewhere and fill out a simple
dialog box. The more advanced options are hidden from the default screen by
an option tab, like a rolodex tab. The dialog window itself is made just
large enough to contain the most common and popular choices, while covering
as little of the background screen objects as possible.
If the user makes a choice they do not like in a step-by-step process (like
an install, Xconfigurator, or kernel configuration), they should be able to
re-run the easy interface to the feature and alter only the option that
they want changed. The default options should then be their most recent
choices for that object or session, not the original defaults of a new
object or session. Otherwise they will be afraid of changing one option for
fear of messing up the stuff that works the way they like it.
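As a minimal sketch of that behavior in C (the file name, the option, and
the single-option scope are all hypothetical simplifications of a real
configuration tool), a program can persist the last answers and offer them
back as the defaults, so that pressing ENTER simply repeats the previous
run:

    /* remember_opts.c: a sketch of "your last choices become the new
     * defaults". A real tool would track many options, not one. */
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char last[64] = "800x600";          /* first-run fallback default */
        char buf[64];
        FILE *f = fopen(".tool-defaults", "r");

        if (f) {                            /* reuse the previous answer */
            if (fgets(last, sizeof last, f))
                last[strcspn(last, "\n")] = '\0';
            fclose(f);
        }

        printf("Resolution [%s]: ", last);  /* plain ENTER keeps it */
        if (fgets(buf, sizeof buf, stdin) && buf[0] != '\n') {
            buf[strcspn(buf, "\n")] = '\0';
            strcpy(last, buf);
        }

        f = fopen(".tool-defaults", "w");   /* persist for the next run */
        if (f) {
            fprintf(f, "%s\n", last);
            fclose(f);
        }
        printf("Using %s\n", last);
        return 0;
    }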
Fifth, they care about the availability of cool features that they have
never seen before, have wished that they had, and that seem to solve new
problems. Microsoft used to serve this user need better than they have
lately.
Somewhere in there they care about having fun. So the product
usually needs to be entertaining to some degree.
Then, when all of the above are pretty much equal, they begin to compare
products based on speed and code size.
You doubt? How many politicians do you know who actually get elected for
their stand on the issues alone? Even those politicians who care about the
issues can't get elected until they hold their noses and do the slimy
marketing things necessary to get elected, if they want to ever have the
power to affect those issues.
Myth #6: Bugs.
To the user, anything they expect to find and don't is a missing feature,
and every missing feature is a bug. Therefore, from the perspective of the
average WinTel or Mac user, Linux currently is a very backward environment
that reminds them of (don't hit me) DOS! They see it as being full of bugs
and lacking in real-world usefulness. They say this because most of the
usefulness is hidden beneath the bottom two layers of the interface, where
the typical desktop user never sees it! They are also very picky and
narrow-minded about what constitutes a useful GUI. This is often based on
familiarity alone.
So what needs to be done?
We need to do for the user with Linux what Windows is
currently doing for them. Not convinced? Read on.
Chapter 4: What to do about it.
Again, I am not saying that it is ok to get sloppy with your code. What I
am saying is that the biggest difference between Linux and Windows remains
that Windows and the Mac have the more intuitive, interactive UI and the
cross-product familiarity that the users want, and Linux still doesn't. At
this writing, Linux and BSD seem to have solved all of the most important
problems to getting onto the average user's desktop except the one that is
the most important to the user. Linux needs a very strong, very obvious,
open-source, graphical front-end. I am not talking about changing or
replacing the X server. I am talking about an open X client interface and a
suite of open-source application software which compare closely to, or
beat, Windows and the Mac on the issues the average desktop user cares
about the most.
After that, the benefits of the open-source model, the speed of the code,
and the strength Unix has at the third, fourth, and fifth layers will win
the game.
At this writing, numerous developers of closed-source WinTel software,
including Microsoft and Corel, are said to be working hard to fill this
need. Even if you in the Linux world have not seen this, those of us in the
WinTel world who are open-minded enough to accept Linux see it as clear and
obvious. The closed-source WinTel community also views you as being too
arrogant, too ignorant, and too closed-minded, too caught up in a "small
and fast" code philosophy and an RTFM attitude, to fill this need quickly.
They see this as an opportunity to exploit you and your platform for slimy
marketing reasons. They think that they must show you what the users need
and make some bucks off of YOUR Linux before you will see what I am telling
you here today. If they are right, then Microsoft (or some other
closed-source developer who may be worse) will build their own loyal user
base with Linux. They will then take control of the Linux platform in the
same way that you feared Microsoft would take control of the Internet when
they released an HTML client and made it part of Windows. That will keep
your influence off the desktop for another 3 years, just as Linus Torvalds
predicted.
I know that most of you hate writing UIs. So do I. Think of it as doing the
dishes: it needs to be done, and the quicker you get started, the quicker
it will be over. If you don't do it, then someone you don't like will.
I have spoken of intuitiveness before. This is defined as presenting a
required sequence of user actions that pretty much teaches each step to the
user as it goes along, or does things in a way that the user automatically
expects. Familiarity is one important issue here: a ball, whatever its size
and color, is still round and still bounces in almost exactly the same way.
That shape and bounce must be similar to that of other balls for people to
recognize it as a ball. But with the complexity of computers, that
consistency of interface should go even further. When teaching MCSE
students about Windows NT's Macintosh network gateway product, I have to
make sure to warn them about a particular user interface problem. While the
same files stored on the NT machine are equally accessible from both
platforms (NT/95 and the Mac) over the network, the "look and feel" of
those platforms on the desktop are different enough to create confusion for
some Mac users. Seeing the icons for their files on someone's Windows 95
file manager (Explorer) screen, they will not recognize them. They will
think that there is something wrong with the system, that the files have
been lost or something, because to them those are Mac files and should
still look like Mac files even on a Windows screen! Seem stupid? Well, if
you looked out the window of your home tomorrow morning and your brown car
had turned into a green car overnight, you might wonder if something were
up. Mac and PC users both see their computer desktop as such a literal
extension of their physical universe as to make this sort of thing quite
common, though it is usually just a certain category of user that takes it
to the extreme described above. Personally, I see little difference at all
between the two interfaces (except that the Mac lacks the ever-useful
command shell prompt). But the fact remains that the regular users don't
like to see their toasters change shape.
The Linux interface (at this writing I am using Red Hat 5.2; your mileage
may vary) out of the box does not even seem to have a GUI. There is nothing
after startup and logon that tells the user that they need to type "startx"
or "init 5" to get into the GUI; they have to read it in a book somewhere.
But in order to make it on the desktop, you have to assume that all the
user has access to is the machine, and no docs whatsoever, because they
rarely read the docs anyway. When they do read the docs, it is only to look
up something specific and then get back to work. That is why there are
worldwide computer training organizations making millions offering
application-level training to other corporations' computer users. Did you
know that "startx" is not even that easy to look up in the documentation?
There is no "man X". Without someone telling them what they need to do,
some will wander the directory structure, find X, and run it (ok, stop
laughing). It would do no harm to have a logon welcome screen configured
from install that says simply, "If you want to enter the Windows-like
environment, type startx below".
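A small sketch of that welcome screen, assuming a conventional Unix console
layout (paths and wording vary by distribution): the installer could simply
append the hint to /etc/motd, which login prints just before the user's
first shell prompt:

    If you want to enter the Windows-like environment,
    type startx below.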
Once in X, there seems to be no apparent copy/cut/paste ability between
different applications, which is one of the most popular features of the
Windows and Mac GUIs. Scroll bars in different programs work differently,
sometimes requiring experimentation or doc searching to figure them out.
Context-sensitive (directly feature-specific) help is weak or non-existent;
there is no file browsing with pre-configured associations; the list goes
on. Now, to you these are not serious problems, but to the
unskilled/non-hacker users these things are crippling.
Now for the user's perspective of speed. The measure of speed for them is
similar to yours: how fast can they get done what they want done. Except
that for them this is more a product of their speed on the interface and a
shallow learning curve than of the raw processing power of the code. In the
first 2 or 3 minutes of interactive time after startup, and right after
install, the new user must be able to wander each part of the system that
stands between them and printing a test file from their favorite program.
They need to be led from scene to scene like the steps in the plot of a
good movie, or the chapters of a good book, with each phase setting the
stage for navigating the next. I say interactive time because the user
fully understands that the Internet dialer and the printer require a
specific period of time to perform their functions, but they become
increasingly frustrated and intimidated as the machine sits and waits for
them to figure out what to do next. If they have to spend a minute or two
researching each step, they will wonder if they will remember all those
steps the next time, or if they will have to work this way every day. I
have seen them do it: they eventually throw up their hands saying something
about "(not having) time for this crap", box up the machine, and take it
back to the store. The goal when building for these people is Immediate
Personal Productivity. Sales slime just wants to sell the machine, and
desktop users just want to start doing their work. For the user, an
inefficient user interface is a strike against it on speed, because they
will say they could get their work done faster with some other system.
It has been said that disappointment is the frustration of unrealized
expectations. Well, here is a small sample of key things that users of PCs
and Macs expect out of an OS, and are accustomed to getting in abundance
from Windows and Mac software.
-An automated, step-by-step install process. The install program should be
called "install" or "setup". It should also optionally have a completely
brainless "minimum questions asked" install mode.
-The user should be able to implement most if not all of the crucial
functionality, with the most common set of options, by simply closing their
eyes and pressing ENTER at every step throughout nearly all of the install
process.
-The user must be able to rerun the entire install and keep all
pre-selected settings except for the one they want to change, with the
choices they selected in the most recent install being the defaults.
Sometimes the user does want to go back to a clean, new default install, so
they do still need that as an option as well, but as the exception rather
than the rule.
-All prompts for anything anywhere should list and explain each of the more
popular choices, in some instantly and brainlessly available form.
-All documentation must be available online at 2 or 3 mouse clicks, and
should be context-sensitive whenever possible (use a special "help key"
method for each object for which context-sensitive help seems reasonable,
using parameter passing to a subroutine to customize the lookup to the docs
for that object). The user should be able, in two or three clicks, to find
exactly what they need to know, reading no more than 300 words or so (and
preferably a lot less) to solve the most common problems. The time span
between the decision to go to the docs and finding the solution they seek
should not exceed that of a typical television commercial break. This is no
coincidence.
-It is best if professional technical writers (i.e. less technical skill,
more writing skill) write the docs, but I do not know how to do this under
the open-source model.
-Scroll bars and other common on-screen productivity devices should look
and act in nearly the same way in all apps, or explain their different
functionality clearly in docs that the user does not have to go find.
-Take every opportunity to tell users what to do with things they find on
the screen. At the bare minimum there should be on-screen links to good
docs; sentence-long descriptions should appear in the status bar when
command objects are selected but before their default methods are
activated; and object labels should appear in screen pop-ups, resembling
what HTML does with ALT tags (see the sketch just after this list).
-While having a "welded-on GUI" like the one that Windows uses is clearly
not necessary, the default X client should be powerful in terms of user
features and consistent in most details. The radical differences between
the "old" Windows and the Windows 95 desktops created significant stress on
the market and nearly killed Windows 95. It was all the money that
Microsoft spent on hype that made the difference. Even today there are
users who still hate it, despite the fact that it is superior to and more
productive than the older Windows interface.
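As promised above, here is a sketch of those pop-up object labels, again in
C with GTK 2 (my choice for illustration; the button and its hint text are
made up). One call attaches hover text to a screen object, much like an ALT
tag in HTML:

    /* tooltip.c: a sketch of on-screen hints; a label pops up when the
     * mouse rests on the object, before its default method fires. */
    #include <gtk/gtk.h>

    int main(int argc, char *argv[])
    {
        GtkWidget *win, *button;

        gtk_init(&argc, &argv);
        win = gtk_window_new(GTK_WINDOW_TOPLEVEL);
        button = gtk_button_new_with_label("Print");
        /* One call gives the object its pop-up label. */
        gtk_widget_set_tooltip_text(button,
            "Send the current document to the default printer");
        gtk_container_add(GTK_CONTAINER(win), button);
        g_signal_connect(win, "destroy", G_CALLBACK(gtk_main_quit), NULL);
        gtk_widget_show_all(win);
        gtk_main();
        return 0;
    }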
If you in the open-source community do less than this with Linux, some
closed-source developer like Microsoft will do it for you and will endanger
your control over Linux.
Now, is this hard to do? For you people, certainly not. But in the next
section I will tell you why Microsoft has trounced its competitors every
time its products have had similar exposure to the user. What you may have
heard on this is only partly true, and is not complete. The old "well, they
own the environment so they have the control" argument is only half right;
the rest is even more frightening, and uses a couple of the more successful
components of the open-source model in some interesting ways. No, you say?
Read on.
Chapter 5: How Microsoft does it.
You are about to learn what many closed-source WinTel software developers
never learned and died trying to fight. They were simply too closed-minded
to succeed.
Ever wonder why Microsoft application software is so fat and slow? Some of
you have alleged that it is a lack of attention to such things in their
programming priorities, but what you may not know is that the mechanism of
those priorities is Microsoft's favored GUI development environment, Visual
Basic.
Users like features. Following is an example of how Microsoft
implements "category killer" features. It is not an
isolated example:
A long time ago I was writing application software for DOS, and I did some
debating with some other programmers about what to do with that right mouse
button (looking at some application software under Linux, I can see that
you have struggled with this question as well). With the PC, the race has
been to continue to provide new features, but at the same time not
complicate the interface too much, because the average PC user is typically
very unskilled and results-centric. Well, somebody came up with programming
the right-mouse-click event to provide a small menu, in the place clicked,
containing things to do directly to the object clicked. Not only that, but
an option to edit the object properties (that is what Visual Basic calls
"instance variables") directly could be presented as one of the menu
choices. Then they would build a GUI screen that organized (somewhat) the
object properties to be changed. The menu choices that were implemented
were limited to the very most common second-level operations. It saved the
user from having to wander around the regular drop-down menus to try to
remember how to invoke those features. Key among those choices were
cut/copy/paste. This is because the average user does not remember that the
common quick keys for those things are Ctrl-X, Ctrl-C, and Ctrl-V
respectively, and using the regular drop-down menus for this frankly just
takes too long and limits the feature's usefulness. Anyway, the "right
click on object for quick menu" very shortly became a popular feature among
the users. Microsoft noticed this and immediately incorporated it into the
next release of all of their office products (truncating the beta process
to do it), beating most everyone else (who also put it in their new
releases) to the punch. This is because Microsoft office products are
written largely in Visual Basic, and even I (a relative novice at Visual
Basic programming) can write a quick menu routine for the right-click mouse
event in about 17 minutes. What they probably really did was to add a
quick-menu building tool to VB first, then utilize that tool in building
quick-menu methods onto all of the screen objects in each office app in
just a matter of days. The quick menu choices consisted mostly of features
for which there were already menu-driven routines; the programmer just
copy/pasted that menu entry object, with its attached method, from the
drop-down menu code to the new quick-menu object.
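For comparison, here is roughly the same trick outside Visual Basic: a
minimal sketch in C with GTK 2 (the menu entries are placeholders, and this
is my illustration, not Microsoft's implementation). The right mouse button
pops up a quick menu at the place clicked:

    /* quick_menu.c: a sketch of "right click on object for quick menu". */
    #include <gtk/gtk.h>

    static gboolean on_press(GtkWidget *w, GdkEventButton *ev, gpointer menu)
    {
        if (ev->button == 3) {   /* right mouse button */
            gtk_menu_popup(GTK_MENU(menu), NULL, NULL, NULL, NULL,
                           ev->button, ev->time);
            return TRUE;         /* event handled */
        }
        return FALSE;
    }

    int main(int argc, char *argv[])
    {
        const char *labels[] = { "Cut", "Copy", "Paste", "Properties" };
        GtkWidget *win, *menu, *item;
        int i;

        gtk_init(&argc, &argv);
        win = gtk_window_new(GTK_WINDOW_TOPLEVEL);
        menu = gtk_menu_new();
        /* The most common second-level operations go in the quick menu. */
        for (i = 0; i < 4; i++) {
            item = gtk_menu_item_new_with_label(labels[i]);
            gtk_menu_shell_append(GTK_MENU_SHELL(menu), item);
            gtk_widget_show(item);
        }
        gtk_widget_add_events(win, GDK_BUTTON_PRESS_MASK);
        g_signal_connect(win, "button-press-event",
                         G_CALLBACK(on_press), menu);
        g_signal_connect(win, "destroy", G_CALLBACK(gtk_main_quit), NULL);
        gtk_widget_show_all(win);
        gtk_main();
        return 0;
    }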
The next step was to market the quick-menu "feature" as their very own
great idea. I don't think that it had been called a "quick-menu" previous
to MS's implementation of it in their products; that was the name that
Microsoft gave it. The code wasn't ready for release by good programming
standards, but the users did not care about the bugs, because those bugs
were not all that intolerable, especially when balanced against all the
cool new features and the newly simplified user interface. Besides, the
users who call Microsoft tech support with bugs get their call and help for
free and often get private praise from Microsoft. Ordinary people often
don't need public praise; just the private praise from someone as big and
well known as Microsoft is adequate, and often they get the first copy of
the patch as well.
As for the speed, who cares? Heavily interactive user application features
like the one described here spend most of their time waiting for the user
anyway. Heavily interactive features under Windows that the user needs to
see updated right away, like mouse activity and quick menus, are
dramatically optimized over the other things in the multitasking interface.
The user sees what needs to be done, reaches for the mouse, and by the time
they are finished with that agonizingly slow thought reaction and motor
process, most of the other on-screen things have usually had time to
happen. Strictly background things that have no on-screen presence can take
all the time in the world, because the user is given things to do, so they
don't end up waiting for the background processes to finish. Sometimes
things like disk activity are masked behind a "user agent" animation on the
screen which, while it and the sounds it makes eat up time slices, also has
the effect of distracting the user so that the operation does not seem to
take so long. For the interactive activities on the desktop, it is not the
actual speed but the perceived speed that affects the user's actual opinion
of the speed of the product.
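A minimal sketch of that perceived-speed trick, in C with GTK 2 (the timer
interval and the pretend workload are made up): the long job runs in small
slices on a timer, so the interface keeps repainting and the user watches a
progress bar instead of a frozen screen:

    /* perceived_speed.c: keep the UI responsive while background work
     * runs, in the spirit described above. */
    #include <gtk/gtk.h>

    static GtkWidget *bar;
    static int done = 0;

    static gboolean work_slice(gpointer data)
    {
        /* Pretend to do one slice of a long job, then update the bar. */
        done++;
        gtk_progress_bar_set_fraction(GTK_PROGRESS_BAR(bar), done / 100.0);
        return done < 100;   /* FALSE removes the timer when finished */
    }

    int main(int argc, char *argv[])
    {
        GtkWidget *win;

        gtk_init(&argc, &argv);
        win = gtk_window_new(GTK_WINDOW_TOPLEVEL);
        bar = gtk_progress_bar_new();
        gtk_container_add(GTK_CONTAINER(win), bar);
        g_signal_connect(win, "destroy", G_CALLBACK(gtk_main_quit), NULL);
        /* One slice every 50 ms; between slices the UI stays live. */
        g_timeout_add(50, work_slice, NULL);
        gtk_widget_show_all(win);
        gtk_main();
        return 0;
    }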
So what happens to the other closed-source competitors writing their code
strictly in C++ for super-fast, super-small code? What happens to the
competitor who first invented the right-click mouse event menu thing but
did not realize what they had done, or didn't market the feature very well?
Or perhaps they didn't apply it as accurately to the user's needs? Or first
released it in a less visible product? What happens to competitors that
prefer to wait so they can find and smooth out a few more bugs before they
release? What happens to products like early versions of WordPerfect, where
the source code was not as well structured as it needed to be, slowing the
development process and making upgrades more expensive to develop?
Microsoft also strategically times their product releases and announcements
around released advances in hardware speed and around announcements and
releases by their competitors. They are very good at this one. It is true
that MS uses some marketing behaviors that leverage their monopoly on the
desktop and make it hard on competitors, but I have noticed that the users
actually do like to buy a new product feature from whoever releases it
first. Those other "unfair" things that MS does seem mostly used to break
down marketing barriers for themselves and to maximize the damage already
done to their competitors by product development and release strategy. You
think you hate Microsoft? Stand in line.
In the above (closed-source) example, did you see some of the things that
Eric says make open-source so effective? Microsoft listens to users, though
not like they used to. Microsoft releases early and often (if you count bug
patches), though not like the open-source model. They let the users do much
of the bug discovery, and the users sometimes are VB programmers
themselves, willing to offer suggestions and help so that Microsoft might
"notice" them. MS often praises others for helping them, though usually not
publicly. They make use of a rapid-development, high-level programming
environment (VB), though I think that you have some better ones, and the
open-source approach itself should speed development of projects. MS makes
an effort to make their products rich in features, though they filter those
things through the marketing slime and often leave out features that the
users don't know they need and are not aware are possible.
Conclusion:
Now comes the scary part. Before the IBM-PC, Microsoft made their money
programming functionality into new environments/hardware. That was what
they did with CP/M and the Intel 8086-based IBM-PC, but they had done it
many times before with other small computers that others were inventing at
that time. Usually the first step was to develop an assembler for the
platform. Then Gates would write an implementation of interpreted BASIC for
the platform using that assembler. Then they built the "user" software for
that hardware platform using BASIC. Microsoft BASIC was the first
higher-level programming language available for the IBM-PC, and some
version of their BASIC interpreter has always been packaged with DOS, along
with an assembler. I even remember a time when the IBM-PC would boot into a
ROM-based BASIC if there was no bootable OS on any of the drives. I am told
that a version of IE 4 is available for Solaris; if that is true, then they
probably have a version of Visual Basic that they used for developing the
UI for the IE 4.0 port to Solaris. They love using the same source code on
multiple platforms and tying them together with a common compiler that has
been customized on the back-end for the different platforms. NT uses the
same source code for Intel and all of the RISC platforms it runs on. NT
also uses the same source code for NT Workstation and NT Server. If the
rumors are true that MS is writing office products for Linux (they deny it,
but it would certainly fit their pattern), then they must have, or are
working on, a VB for Linux. Once the VB is completed for a platform, pretty
much all that is left to do is recompile already-available proprietary
source code into a product port and release it at whatever time it would do
the most damage to its competitors. I would not be a bit surprised if they
wait for the success of Linux to get a little further and then somehow
release a version of Windows 2000 for it as an X client or window manager,
complete with IE 5 and the Office 2000 suite! Then they will be there, and
will take control of your platform.
I hope that you believe me, and that some of you act now to find the best
currently available open-source X client and GUI interface products and
improve them in whatever way needs to be done. The front-end of Linux needs
to stay dominated by open-source projects and have the same power,
flexibility, and capabilities that you folks have spent so much effort
putting into the back-end.
Please forward this anywhere you think it would be helpful.
This document may be freely distributed in whole or in part, so long as I
am acknowledged in print as its author on any distributed copy, printed or
otherwise. As a favor, though, I would request that you also tell me if you
use it, just so I can know what effect it is having.