Guidelines for future work
I am sending the following guidelines for discussion and comments.
=== BEGIN =====
Linux will remain a niche operating system as long as only a minority
is competent enough to use it. Let's face it: the obstacles are
formidable: hardware support is spotty, software for the common man is
scarce, especially free software, and most programs are not
internationalized or their docs have not been translated. These
problems will solve themselves when Linux reaches critical mass. For
instance manufacturers will provide drivers because not supporting
Linux would translate into too many lost sales.
A good distribution can hasten this by accelerating Linux's growth,
but it will not create application software out of thin air, nor
drivers, nor translations.
Let's see what we can do.
I) Forget about Unix tradition: Linux is NOT Unix.
a) Two different ways of life: Unix and Linux.
Unix was used in universities and in medium or big sized companies
that were big enough to buy the expensive Unix computers. It was not
used for mundane tasks because it was not cost effective against Macs
or Wintel machines. It was only handed to highly trained and educated
people. These people had had teachers to give them an introduction to
Unix and there was a system administrator to care for the box. Unix
could afford a difficult user interface because it was handed to
"elite" people who had learned it in a comfortable environment. But
Unix remained a niche system, and if Linux follows in its footsteps it
will go to the same place as Unix: obscurity.
Linux is not expensive. Sure, it can do about everything proprietary
Unixes can do, but the fact that it is not expensive allows it to go
where Unix never went: into homes, into small companies and onto the
desktops of average personnel. The needs, constraints and training in
these contexts are vastly different.
b) Linux at home
If you are at home that means you will be confronted with system
administration from minute one. Configure your box first and learn
later how to copy a file. In addition you won't have the luxury of
someone teaching you lesson one, then lesson two. Problem solving
will force you to tackle everything at the same time. Of course, you
should have a book, but a book is not a teacher: you cannot ask it for
a better explanation of something you don't understand. Compound this
with networking that is different from classical Unix, and with the
fact that the home user powers off his computer when he is not using
it, so cron never runs the cleaning tasks that should take place at
3am. Don't forget that the user still doesn't know about cron.
This is bad enough if you are a hacker, but spreading Linux everywhere
means that we have to consider people like Mac users. Sure, the Mac UI
is not as powerful as the Unix command line, but it allows Mac users to
do real work from day one without spending weeks in training, and many
Mac users are not interested in computers and don't _need_ to become
power users for their jobs. Obviously the Unix shell is not the tool
for attracting people like them to Linux, and that means we have to
provide an alternative.
c) Linux in small companies
Linux could take small companies by storm in the server role: here the
guy making the decisions is under the direct supervision of the boss
and the money he spends is the boss's money, so there is higher
pressure to keep costs down than in the bureaucratic big companies.
But there are three features of a small company we have to keep in
mind.
First: A small company is... small. That means it will use only one
computer as a server. It will buy only one license, and that means
that if a Linux solution is significantly harder to set up then the
small company will find NT is cheaper: an employee RTFMing costs $200
a day.
Second: There are small companies that cannot afford a full time guru:
they either outsource their system administration or have an employee
working part time on simple tasks. They clearly need a simple system.
Third: Classic Unix servers like Sendmail and INN are overkill for
them. They were designed with the needs of big organisations and
their complex requirements in mind.
d) Linux on the desktop
Just because Unix never made significant inroads on the desktop does
not mean Linux has to be restricted to a server and programmer's
workstation role. A secretary could write memos on a cheap Wintel
machine instead of an expensive Unix workstation, so she was handed a
Wintel. But this no longer holds against Linux. Linux is cheaper than
Windows, can be administered from a distance, is not subject to virus
attacks and doesn't crash five times a day. The choice is clear. But
we have to include a good GUI and get rid of the "real men use TeX"
mindset. TeX is great in the hands of certain people and for certain
documents. What is needed here is X-based WYSIWYG word processors,
spreadsheets and presentation software. When we cannot find a good
free tool we should ensure that the user knows about commercial ones.
e) Separating Linux from Unix.
It is now clear that Linux can go where no Unix has gone before, but
not by blindly following Unix tradition. Unix has not been designed
for the mass market. Before someone burns me for heresy, let me remind
you that this is in fact the real Unix spirit. Ken Thompson never said
Unix had to be user hostile. In fact he separated the shell from the
kernel in order to allow different users to get different user
interfaces. Linux must keep the classic Unix interface for power
users, but there is no reason other people cannot get a different one.
II) Design guidelines.
a) Mindset.
A feature that is useful in 10% of cases but makes no difference for
the remainder is a good feature. A feature that helps 90% of users and
annoys the remaining 10% is a good feature (provided the importance of
the benefits and losses are comparable). A feature that annoys a
hacker and helps a beginner is to be added: the hacker can easily find
out how to remove it. You have to forget about yourself and your
tastes: you must be able to think like someone who knows nothing, is
not a hacker and is learning Linux without assistance.
b) A superb installation is... relatively unimportant
The best installation we could do is quite simply to get PC
manufacturers to do the job. They will do it if Linux market share
reaches critical mass. But if the user only gets unfriendly programs,
software that was not designed for the job or cryptic docs, then even
that ideal install will have achieved nothing. RedHat's 5.2 install is
not so bad: it detects hardware, can handle partitioning without user
intervention and package selection is pretty easy. Its two main
drawbacks are the scarcity of online help and the fact that it acts as
if PPP didn't exist, the latter meaning that after install the PPP
user has to configure his networking without the same kind of
handholding the LAN user gets during install. Another point of
improvement would be to boot directly into XDM if X is configured (we
can allow ourselves to do this now that LILO tells the user how to
reboot without X in case of problems).
c) Docs: the "Now what?" syndrome
After install many users find themselves completely lost. Read the
newsgroups and you will see postings like "I typed X and only got a
grey screen". In addition there is a risk that many of the nice
programs we include will never be used due to bad advice picked up in
books or elsewhere, like "Use VI, elm, pine". To avoid our efforts
being wasted it is important to have a small guide pointing the user
to the right tools and detailing how to get out of some common
pitfalls.
We must make the docs attractive, readable and easy to search for
information. If you look at a HOWTO collection you will find that it
is a reference rather than a pedagogical work; in addition there is no
hierarchy: important docs and obscure ones, docs for advanced users
and for beginners are all at the same level, and that makes it hard to
find what you are looking for. I have seen people using Windows or NT
because they didn't know about a Linux program: we will have to ship
one of the databases of Linux software like the LSM. Woven Goods could
be a good choice: it includes all the LDP documents, an LSM and plenty
of info about Linux software, all in very nice HTML. Caldera used to
include it but no longer does, and I think that is due to its size and
the fact it has not been translated AFAIK.
Speaking of translations, Linux could afford to have docs available
only in English as long as its users were computer nerds, but normal
people are not good at foreign languages. Man pages and HOWTOs have
been translated, but only a few programs have been internationalized,
and the same goes for special manuals like the GIMP manual.
d) Learning from Microsoft.
"Learn from your enemy and you will ever be victorious" Sun Tzu
"Microsoft makes crappy operating systems but they make good user
interfaces" Linus.
Look at the last versions of DOS before Windows 95. They made a file
manager (Dosshell) the default and forced experienced users to edit
AUTOEXEC.BAT. In Linux we let the power user get his favorite
interface out of the box, while the defenceless new user is supposed
to discover "mc" on his own and then figure out how to make it his
default shell. This illustrates what is wrong in Linux designers'
thinking: designing for Linux the same way as for university Unix. It
would be a good idea to provide an option at user creation, "advanced
user" or "beginner", with the latter being dropped into mc instead of
the shell. Of course in most cases this would be a moot point because
we hope to make XDM the default run level, but it is still worthwhile
to implement.
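A minimal sketch of the "beginner" option, assuming mc is installed as
/usr/bin/mc (the account name is only an example):

    # register mc as a valid login shell, then create the beginner account with it
    echo /usr/bin/mc >> /etc/shells
    useradd -s /usr/bin/mc newuser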
In Microsoft products like Windows 95 and MS Word the user gets a tip
each time he starts them. This ensures the user learns the main tricks
and ways to get out of problems. Microsoft engineers have been very
careful to make their programs crash often so the user learns
fast. :-) This is a trick (also used in the GIMP) we should take
advantage of: display a tip each time the user logs in.
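A minimal sketch of the idea, assuming we ship the fortune program and
a tips file (the tips file path is hypothetical and would need the
usual fortune index):

    # /etc/profile.d/tips.sh -- print one tip at every console login
    if [ -x /usr/games/fortune -a -f /usr/share/tips/linux-tips ]; then
        /usr/games/fortune /usr/share/tips/linux-tips
    fi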
e) Have you ever tried to fly a jet by RTFMing?
There was a time when the Matrox Mystique was the most popular card.
Too bad it wasn't supported by XFree, and for months after it became
supported there were people with old versions asking why it didn't
work. Of course they were mercilessly flamed. Nobody pointed out that
the X configurator could have said flatly "This card is unsupported".
Nobody pointed out that the X configurator could have said: "This
version is 11 months old, you should look for an upgrade".
"RTFM in case of problem" is the answer you give to a user who knows
were is the doc, is able to find its way through it, has only one
problem to solve (like when you try to add a service to a working box)
instead of half a dozen (like when you have just finished
installation) and isn't forced to look at the book to know how to
display a file. It is atonishing how often this simple principle "Put
the info under the user's nose" is forgotten in Linux world.
When there is a FAQ in newsgroups that doesn't mean users are lazy,
that means _we_ did something wrong.
f) Networking: the road to help
Commercial Unix users have hotlines, university users can ask friends
or teachers, but Linux needs to spread at home, and the Internet is
very often the only way for a home user to get help. That means that
we should introduce PPP configuration into the installation. We also
need a good curses-based PPP configurator in case the user does not
configure it at install time. KDE has a very good configurator but we
can't rely on it: what happens if the user needs help with X?
In many countries phone time is billed at extortionate rates.
Therefore mail and news clients should allow offline reading. Some
small organizations may have net access only through PPP, but for them
small mail and news servers would be better than offline readers, and
that means we have to ensure that traffic is moved when the PPP link
comes up instead of at regular intervals as on a box with permanent
access to the net.
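A minimal sketch of such a hook, assuming a RedHat-style pppd setup
where /etc/ppp/ip-up runs (or can be made to run) a local script when
the link comes up:

    #!/bin/sh
    # /etc/ppp/ip-up.local -- executed when the PPP link comes up
    # flush the outgoing mail queue now instead of waiting for a timed run
    /usr/sbin/sendmail -q
    # a fetch for the local news server could be triggered here as well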
Small mail and news servers have the problem that they don't have
Linuxconf modules the way sendmail does. For mail, IBM's "postfix"
could be a good compromise because it is able to use either sendmail
configuration files or a native (and simpler) mode, but we have to
check its copyright carefully.
A proxy server can allow substantially faster (and thus cheaper) web
surfing; however, IMHO the batch retrieval allowed by wwwoffle is more
important for small organizations and home users than the proxy
interconnection allowed by Squid (the proxy shipped by RedHat). Again
the problem is that wwwoffle has no Linuxconf module while Squid has
one.
Another interesting proxy is "junkbuster", which removes the ads from
web pages. If we ship two proxies we have to design a clean and
transparent mechanism for chaining them.
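As a hedged sketch of the user-visible end of such a chain: point the
clients that honour it at the first proxy through the standard
environment variable, and configure that proxy to forward to the
second (the port number below is only a common default, not something
to rely on):

    # /etc/profile.d/proxy.sh -- send web traffic through junkbuster,
    # which is itself configured to forward requests to wwwoffle
    http_proxy=http://localhost:8000/
    export http_proxy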
Finally, UUCP is sorely neglected in Linux distributions. It may be
dead in America but it is still useful in Europe, the ex-communist
countries and the Third World.
g) Replace cron
We can no longer live with the absurd paradigm that the user will
keep his box powered on 24 hours a day. The replacement for cron
should accept regular crontabs for compatibility, be self-tuning (use
cron mechanisms if the machine is powered on when the task is
scheduled and alternative mechanisms if a task was not run at its
scheduled time) and unobtrusive (try to run tasks when load is low).
AFAIK "anacron" falls short on the unobtrusive part.
h) Get a better booter than LILO
LILO is a hacker's booter. You have to perform a special operation
when installing a new kernel, you don't get menus at boot time, support
for national keyboards is tricky, and you can do little at boot time.
Other booters should be investigated. The one I like most is the
booter you got in the now defunct Linux Universe. You got menus, could
add a kernel at boot time, explore the filesystem in case you didn't
remember where it was, change booter parms at boot time and change
the keyboard on the fly. It seems there will not be copyright problems,
but there are parts that need a Microsoft or Borland assembler unless
someone translates them into as86 syntax.
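For the record, the "special operation" LILO demands after installing
a new kernel looks like this, and forgetting the second step leaves
the machine booting the old kernel or not booting at all:

    # add an image= section for the new kernel, then rewrite the boot map
    $EDITOR /etc/lilo.conf
    /sbin/lilo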
i) Forcing the "mother of all Unix haters" to fade away
Every time I find a Unix hater I ask him why. So far _none_ has told
me about the shell or the cryptic commands as the main reason. _ALL_
of them pointed to VI as the culprit. (Emacs does not seem to cause
this kind of allergy. :-) In fact this hate-on-sight feeling is common
to all modal editors, not only VI. "VI is the editor you will find in
all Unixes" is a weak argument because in most jobs you can install
another one, and in addition we couldn't care less about other Unixes:
Linux will replace them sooner or later, so why should we bother about
their editor? Sorry, but VI is NOT an adequate tool for spreading
Linux.
There is no question of removing VI, but:
-Never confront an unwilling user with it. The EDITOR environment
variable must be set so that programs needing an editor don't call VI.
It is simple to set it differently according to whether we are using X
or not (see the sketch after this list).
-Never place the user in a situation where VI is the only editor:
ensure there is another editor in /bin with all needed shared libs in
/lib so the user can resort to it in case he cannot mount /usr. In the
same way, place an alternative on rescue disks.
-Never have sentences like "use your favorite editor, for example VI"
in docs specifically written for this project. Notice that I am not
proposing censoring LDP docs.
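A minimal sketch of the EDITOR setting mentioned in the first point
(the editor names are only placeholders for whatever friendly editors
we end up shipping):

    # /etc/profile.d/editor.sh -- pick a non-modal default editor
    if [ -n "$DISPLAY" ]; then
        EDITOR=/usr/bin/nedit      # an X editor
    else
        EDITOR=/usr/bin/pico       # a console editor
    fi
    VISUAL=$EDITOR
    export EDITOR VISUAL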
j) The user should not have to recompile the kernel.
It is our duty to ship kernels complete and good enough that the user
never needs to recompile them, except for sport and very experimental
features. In 2.0 the performance increase you get by recompiling the
kernel (in a half-decent distribution) is nearly nil, despite what
popular lore says. See my analysis in the Independence-features RPM.
About the only case where you need to do it is if you are using a
multiprocessor box. This could change in 2.2, especially when using
better compilers than gcc 2.7, so perhaps we will be forced to ship
several kernels and have the installation choose one according to
processor type and chipset.
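A hedged sketch of how the install could pick a kernel package from
the processor type (the package names are invented for the example,
and the exact /proc/cpuinfo field name differs between kernel
versions):

    # decide which of several pre-built kernels to install
    case "`grep '^model name' /proc/cpuinfo`" in
        *Pentium\ II*|*Pentium\ Pro*) KERNELPKG=kernel-686 ;;
        *Pentium*)                    KERNELPKG=kernel-586 ;;
        *)                            KERNELPKG=kernel-386 ;;
    esac
    rpm -i $KERNELPKG-*.rpm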
k) X should be the default mode.
Nowadays the memory and CPU frugality of curses-based applications is
not important. People used to windows will dislike that kind of app,
and in addition they find themselves clueless in front of the command
line. We should make XDM the default boot mode (with LILO telling the
user how to boot without X).
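On a RedHat-style /etc/inittab this is a one-line change, and the
escape route stays simple: type the label plus a runlevel at the LILO
prompt.

    # /etc/inittab: boot into the X login (runlevel 5) by default
    id:5:initdefault:
    # at the LILO prompt the user can still type "linux 3" for a text console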
l) Consistent GUIs
Traditionally in X nearly every application has had its own look and
feel. This is disconcerting for the user. Fortunately Linux now has
complete GUIs where every application shares the same look and feel
and in addition has richer intercommunication protocols than when
using different toolkits. About the KDE-Gnome rivalry my position is
neutrality, i.e. ship both.
m) Selecting application programs
Programs must be user-friendly (of course) and good looking (to an
inexperienced user, ugly programs give an instinctive feeling of being
buggy and feature poor). If they need resources set to make them
attractive, then it is up to us and not to the user to do the job.
Having easy programs is not enough; they need to cover a need of the
user. For that reason we should look for programs that are fun, allow
artistic expression (I would like to ship a GIMP with every plugin
available and the programs for using TV cards) or are useful in real
life. I don't want the user to need to boot DOS to manage his check
book. Agreed, some of the free programs we can include are not as
attractive as their commercial Windows counterparts, but by providing
them there is a chance the user will find it annoying to boot DOS in
order to use the others, and be still more reluctant to buy them.
n) Games
I don't favor SVGAlib games because each one is a security risk and a
number of cards are not supported by SVGAlib. We can make exceptions
from time to time because unfortunately X is not adequate for action
games (not without GLX, which is not supported in XFree). As for
networked games, they can be played in America and in universities,
but in Europe phone time is too expensive and this restricts their use
to people having several computers in the same home.
o) Delivering ready to use applications
We must try to get installations requiring as little user intervention
as possible. X apps must find their way into the menus, resource files
must be tweaked for good looks, parms that can be deduced automatically
must find their way into config files (think of a networking client
that needs the machine's hostname) and so on. Whenever possible we
should avoid automatic editing of files because it is relatively
dangerous, especially in case the user has manually edited the file.
Instead we should use file inclusion or directory scanning (look at
/etc/profile.d for an example) if at all possible.
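The /etc/profile.d example in practice: instead of every package
editing /etc/profile, the profile ends with a loop like the one below
and packages simply drop a file into the directory.

    # fragment of /etc/profile: pick up whatever the packages installed
    for i in /etc/profile.d/*.sh ; do
        if [ -r "$i" ]; then
            . "$i"
        fi
    done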
p) Plug and play cards.
Using the isapnp tools is very difficult, and this makes using most
sound cards a daunting ordeal. The Pacific High Tech distribution has
a GPLed tool whose main drawback is being curses based. There is also
a Gnome front end, but it is unfinished work and in addition it leaves
KDE or classic users out.
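To see why this is an ordeal, this is the manual procedure the tools
currently expect (and the dump still has to be edited by hand to
choose the resources):

    # dump the possible settings of the ISA PnP cards, then hand-edit the result
    /sbin/pnpdump > /etc/isapnp.conf
    $EDITOR /etc/isapnp.conf       # uncomment one resource setting per card
    /sbin/isapnp /etc/isapnp.conf  # program the cards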
q) Samba
Having Linux share disk space with Windows boxes is not uncommon, be
it in the server or the client role. Linuxconf allows configuring
Samba, so unless there is a significantly easier configurator I think
we can leave things the way they are. What is needed is a tool for
on-the-fly mounting of Windows shares from Linux. There is a tool,
called TkSmb I think, from a Russian programmer, but last time I
checked it had a very primitive setup. There have been new versions
since then, though, and it should be checked again.
r) Printing
We ship ghostscript 5.10, and we should add to the printer
configuration database the additional printers it supports relative to
the ghostscript 4 shipped in plain RedHat.
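Checking which printers the new ghostscript adds is easy; the
"Available devices" section of its help output is what the database
has to reflect:

    # list the printer devices compiled into the ghostscript we ship
    gs -h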
====== END ===
--
Jean Francois Martinez
Project Independence: Linux for the Masses
http://www.independence.seul.org