A Modern, Low-Resource Linux Distribution and All the Real Reasons Why We Need It

by Marco Fioretti

GNU/Linux distributions keep improving at a very fast pace. Every release adds support for new hardware, new features and security improvements, both for server and desktop applications.

Unfortunately, this triumphant march has a pretty big downside. Although excellent software can be obtained freely, hardware never will be. Up-to-date, full-featured Linux applications, especially desktop ones, require almost as many hardware resources as proprietary ones.

Of course, anyone expecting to do serious video editing on a computer more than two years old should think again, regardless of the operating system it happens to run. The problem is that even system administrators and experienced desktop users often find that the same things they were doing yesterday become slower after an upgrade. If your CPU is 20 times faster than it was ten years ago, why does it often take the same amount of time, or even more, to get from powering up to reading e-mail? Worse, people who have only obsolete computers and limited programming knowledge are all but locked out of the "Free" Software world.

Want the real reasons why this is a serious problem? Read on.

Economics

The standard attitude about unnecessarily heavy programs is "why should we care when desktop hardware is as cheap as it is today?". An example is this strategy letter, which, among other things, says, "I don't think anyone will deny that on today's overpowered, under-priced computers, loading a huge program is still faster than loading a small program was even 5 years ago. So what's the problem?"

The problem is that (even when it's true) this is a very limited and selfish attitude: today's computers are "under-priced" only for 20% of the world's population. The rest still has to work for many months or years to buy the hardware that makes KDE or GNOME look fast.

Even those with enough income to throw away a perfectly good, working PC every two years should not be forced to do so if their needs have not changed. Unfortunately, the two most frequent answers that one hears when raising this issue are:

  • "become a programmer and recompile/write from scratch yourself": snob answer, impossible in most cases.

  • "use an old distro": why? Why should anyone use a kernel with limited firewalling capabilities, compiled with obsolete libraries? Why should anyone run the open door that Sendmail was some years ago?

Schools, families, developing countries and public and private offices with almost no budget (a pretty big segment nowadays) must save on all costs, no matter how low those costs already are. Often, the only PCs they can afford are donated and really old, and Free Software cannot abandon them. Besides, homework, word processing and spreadsheets don't need multimedia capabilities.

Long-Term Survival of Free Software

Domination of all desktops, whether wired or wireless, is crucial in the long run. Whoever controls the majority of clients and inexperienced users eventually enslaves all servers too, regardless of quality.

New Technologies (or, again, long term survival of...)

Don't think that reducing the hardware requirements of Free Software confines it to die in some (big) obsolete hardware graveyard. Actually, the opposite is true.

Think of all the new, low-cost internet appliances that are restricted to being only that, because the ability to run a current distribution would double their price. Even more important, think mobile computing. We are all supposed to surf, compute and produce wirelessly "real soon now" with really tiny boxes: watch-sized PDAs, third-generation cell phones, whatever. If mainstream Linux quickly cleans up its many existing desktop applications, it will dominate this market before the others finish saying "Hardware improves rapidly, let's just wait until they make the Pentium IV as small as a StrongARM."

Ecology

Computers are useful, cool and among the most polluting kinds of domestic waste. They should be dumped (separately) only when they physically break, not because Super OS 2002 is free but won't run on anything slower than one gigahertz.

Culture and Freedom

Basic desktop computing is quickly placing itself next to the alphabet in the list of tools necessary to fully express oneself and build one's destiny. As such, it must be free not only from patents and licenses, as free as Free Speech, but it should also cost (hardware included) as close to zero as possible.

Equal opportunity is what Free Software is really about. I feel bad whenever I hear Free Software programmers still saying, "as long as I have the source and can program as I like, learn to compile by yourself and don't bother me"--even to grade-school kids without money.

A Low-Resource Distribution Option

Of course, I'm not asking anybody to give up his multi-everything desktop. The freedom to choose has always been our strength. However, I really feel that we have many reasons to stop for a moment, look at all the cool free software developed lately, and make it "free, even if you don't need all of it or even if you can't afford a pretty fast machine".

I'm not even suggesting that we need to start yet another GNU/Linux distribution. As a matter of fact, advanced and very capable projects that do this already exist, but they are all practically limited to users with a lot of competence and spare time to install, maintain or add packages.

What we should do instead is make the base installs of mainstream distributions less resource-hungry. That way, nobody would be forced to change distros or buy more RAM because the new version is twice as big. And any newbie with an old machine could get started and find a lot of extant packages and documentation.

To make all this happen, I believe that some changes in the current attitude of all parties involved (users, developers and distribution packagers) are necessary.

Users must learn to use computers productively and (be helped to) start discriminating between real features and eye candy. Developers should start to realize the importance of what they are doing and make it easier to repackage their programs for low-resource environments.

Above all, some changes of focus among packagers and maintainers of installation programs are required. To begin with, the install process should be as light as possible. I go nuts when I think that I could build my very own kernel that will run in 8MB of RAM, but only if I have 32MB for the installer that puts the code and compiler onto the PC.

The second thing that would really help is to have two packaged versions of all the most common applications: one to be used standalone, and another to be used inside its original desktop environment. For example, one standard Konqueror for KDE, and a stripped, statically linked version with all plug-ins for those who have no other use for Qt and KDE.

The same applies to graphical configuration tools. They can be very helpful for a newbie, but many of them require at least GTK, Qt and often some other GNOME/KDE piece. The reason is that, without those toolkits and libraries, building such tools would have been much more work.

I completely agree with that trade-off, but I can't help noticing that the current packaging means that to reconfigure your firewall, for example, you have to add many megabytes of stuff in order to avoid tons of failed-dependency messages.

Compiling and packaging all these things as monolithic binaries would give back to the user almost 50% of the capacity of an old hard disk, without any real performance loss (for occasional use on low RAM computers, at least). Note that such space savings are decisive for the success of an installation on obsolete hardware.
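
To get a rough feeling for how much shared-library baggage a single GUI configuration tool drags in, a small shell sketch like the one below can be used; the tool name is only a placeholder, and the resulting numbers will obviously vary with the distribution and toolkit versions. It simply adds up the sizes of all the libraries the binary links against:

    #!/bin/sh
    # Rough estimate of the shared libraries a GUI tool pulls in.
    # "some-gtk-config-tool" is a placeholder; substitute any binary you like.
    BIN=$(which some-gtk-config-tool)
    ldd "$BIN" | awk '$2 == "=>" && $3 ~ /^\// {print $3}' \
        | xargs ls -lL 2>/dev/null \
        | awk '{total += $5} END {printf "%.1f MB of libraries\n", total/1024/1024}'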

Once these prerequisites are satisfied, we would have all the building blocks for a predefined "Basic Desktop/Server" install option or, more exactly, three sub-options:

  • bare system, where bare means kernel, package manager and networking support. It should take no more than one hundred megabytes, possibly less.

  • text-only desktop: bare plus SMTP server, GPG, ssh, e-mail client, browser and text editor.

  • full GUI desktop: all of the above, plus one or two graphical applications for desktop use, possibly not depending on GNOME/KDE, or at least packaged as explained above.

Choosing the second or third sub-option should give you, without further effort, one or two tools for each normal desktop task.

What follows is a partial list, in no particular order, of the applications that should be included in the low-resource distribution, based partly on my personal experience and partly on what I've read around the Net.

Please keep in mind that I am not a C/C++ programmer, so look first at the functionalities I define as necessary, and only then at the specific solutions I suggest. It is entirely possible that some of them are not practically feasible or are just plain wrong. Please let me know when this is the case, and suggest alternatives.

The common denominator behind everything that follows is my belief that all users, including newbies, who need to run GNU/Linux software on an obsolete PC should go mostly for character-based applications and not feel sorry for that, as long as they can still do quickly everything they need. There is nothing intrinsically bad in full GUI programs, but, at least in this context, they might just make the problem worse.

System:

  • a kernel capable of iptables filtering and of journaling filesystems, but without support for exotic filesystems or all possible types of protocols, and with swap usage and other relevant settings tuned specifically for limited RAM;

  • xinetd/iptables set up to allow only web browsing (not serving), ssh, e-mail, FTP and Telnet. Of course, by FTP and Telnet we mean the client applications only, and by e-mail we mean simply sending outgoing messages to one's ISP servers and retrieving messages from them.

    In other words, an iptables firewall script written for typical SOHO use should be provided: dial-up on ppp0, trusted interface on eth0, unlimited traffic on the internal network, and ppp0 accepting only the replies needed by HTTP, FTP, e-mail and a few other clients inside the PC (a minimal sketch of such a script follows this list).

  • XAsk to add pop-up menus to shell scripts, so that full GUI-based interaction with the system doesn't necessarily require a full desktop environment

  • CDBKup, to back up whole directories on CD-ROM

  • No font server

  • the TinyX X server instead of full X11. An alternative way to look at multiple windows simultaneously at high resolution could be the framebuffer, or SVGAText coupled with the character-mode window manager TWIN

  • GPG (of course, at least for digital signatures)

  • AIDE (GPL alternative to Tripwire)
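
To make the SOHO firewall item above concrete, here is a minimal, hedged sketch of the kind of script that could be shipped. It assumes a dial-up link on ppp0 and a trusted LAN on eth0, drops everything unsolicited arriving from the outside and lets replies to the usual client traffic (web, FTP, e-mail, ssh, DNS) back in; a real distribution script would of course need to be more thorough:

    #!/bin/sh
    # Minimal SOHO firewall sketch: trusted LAN on eth0, dial-up on ppp0.
    IPT=/sbin/iptables
    LAN=eth0
    NET=ppp0

    # Start from a clean slate and drop incoming traffic by default.
    $IPT -F
    $IPT -P INPUT DROP
    $IPT -P FORWARD DROP
    $IPT -P OUTPUT ACCEPT

    # Local and internal-network traffic is trusted.
    $IPT -A INPUT -i lo -j ACCEPT
    $IPT -A INPUT -i $LAN -j ACCEPT

    # On the dial-up interface, accept only replies to connections we
    # started ourselves (browser, FTP, e-mail, ssh and DNS clients).
    $IPT -A INPUT -i $NET -m state --state ESTABLISHED,RELATED -j ACCEPT

    # Anything else arriving on ppp0 falls through to the DROP policy.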

Window Manager: Blackbox with bbtools, all preconfigured to be fully usable without the mouse.

Internet Connection: shell scripts attached to the root menu, instead of graphical or distro-specific tools.
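
As a hedged illustration, on a Red Hat-style system such a script could be as simple as the sketch below, which toggles a preconfigured ppp0 link with the standard ifup/ifdown scripts; the menu entry in the comment uses Blackbox's root-menu syntax, and all file paths are assumptions:

    #!/bin/sh
    # net-toggle: bring the dial-up link up or down from the root menu.
    # Assumes ppp0 is already configured (e.g., an ifcfg-ppp0 file exists).
    # A Blackbox root-menu entry for it could look like:
    #   [exec] (Dial up / hang up) {xterm -e /usr/local/bin/net-toggle}
    if /sbin/ifconfig ppp0 >/dev/null 2>&1; then
        echo "ppp0 is up, hanging up..."
        /sbin/ifdown ppp0
    else
        echo "Dialing..."
        /sbin/ifup ppp0
    fi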

E-Mail: Mutt, light (text-based, but e-mail is text) and more capable than you will ever need, configured by default for digital signatures. To retrieve e-mail, procmail with the Tk-RED recipe generator, fetchmail and one of the several tools that can be driven from Mutt to delete unwanted e-mail on the server instead of downloading it: popfilter, popsneaker or animail. To send e-mail, ssmtp (or Postfix), configured at install time to just relay to your ISP and to accept the messages delivered by fetchmail.
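
A hedged sketch of what such install-time configuration might generate is shown below: a .fetchmailrc that polls the ISP's POP3 server and delivers through procmail, plus an ssmtp.conf that relays outgoing mail through the ISP. All host names and the account name are placeholders, and the Mutt option is the one used by Mutt releases of that era (newer versions call it crypt_autosign):

    #!/bin/sh
    # Post-install sketch: write a minimal mail setup for one user.
    # pop.example-isp.net, mail.example-isp.net and "jdoe" are placeholders.

    {
        echo 'poll pop.example-isp.net protocol pop3'
        echo '    username "jdoe" password "changeme"'
        echo '    mda "/usr/bin/procmail -d %T"'
    } > "$HOME/.fetchmailrc"
    chmod 600 "$HOME/.fetchmailrc"

    # Relay every outgoing message through the ISP's SMTP server.
    echo 'mailhub=mail.example-isp.net' > /etc/ssmtp/ssmtp.conf

    # Sign outgoing messages by default.
    echo 'set pgp_autosign=yes' >> "$HOME/.muttrc"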

Browsing: wget, w3m and a choice among Galeon, Skipstone or Konqueror, packaged as above. Actually, it would be interesting to port the embedded version of Konqueror back to old Intel processors and see what happens. The Dillo browser also sounds very attractive in this context.

Printing: pdq instead of lpd; full PostScript support, but not through TeX/LaTeX, and packaged in such a way that it doesn't force you to install every possible font on the planet.

Fax: a shell script based on fax/efax, with cover-page support and a Tcl/Tk front end to enter the file location (disk or scanner) and the phone number.
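
A minimal, hedged sketch of the sending half of such a script is shown below. It uses the fax wrapper shipped with the efax package and plain terminal prompts instead of a Tcl/Tk dialog, and it assumes the modem has already been configured for efax:

    #!/bin/sh
    # faxfile: bare-bones front end for sending a fax with efax's wrapper.
    printf "File to send (PostScript or plain text): "
    read FILE
    printf "Fax number: "
    read NUMBER
    if [ ! -f "$FILE" ]; then
        echo "No such file: $FILE" >&2
        exit 1
    fi
    # "fax send" converts the file and dials out; it is the wrapper script
    # installed by the efax package.
    fax send "$NUMBER" "$FILE"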

Scanning: SANE, of course, packaged with a script attached to the root menu for when you just need to dump something to your home directory, or with its XSane graphical interface.
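
For the "just dump it to your home directory" case, the menu script could be as small as the hedged sketch below, which relies only on SANE's scanimage command writing PNM output to standard output (the default scanner device and output format are assumed to be acceptable):

    #!/bin/sh
    # quickscan: dump a scan into the home directory with a timestamped name.
    OUT="$HOME/scan-$(date +%Y%m%d-%H%M%S).pnm"
    if scanimage > "$OUT"; then
        echo "Saved $OUT"
    else
        echo "Scan failed" >&2
        rm -f "$OUT"
        exit 1
    fi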

File Manager: Midnight Commander and ROX; yes, I know that real men do it with just the prompt and the find command, but in order to sort the photos of all your vacations you do need a GUI file manager. ROX, with its thumbnail option and all the other real features, does the job well.

Text Editing: basic vi for system administration when everything else fails, and a choice between GNU Emacs (repackaged to occupy less disk space and with the chapter-numbering and HTML-generation capabilities provided by Jari Aalto's tinytf package) and Vim for more sophisticated use. Of course, both Emacs and Vim should have full colour support even in console mode.

Productivity: for normal needs, AbiWord and SIAG, plus MagicPoint for simple presentations. If opening *that kind of file* is necessary, OpenOffice--not lightweight by any means (not yet, at least) but the best solution around; in RPM or DEB form, installed in a system directory according to the LSB.

Small Business Tools: let's help small business owners too, with packages for accounting, inventory, medical records, pizza-ordering systems and so on. Of course, these should be available and listed for the user to choose from, not all installed by default.

We Are Working on It

After many discussions on the Red Hat Enigma mailing list (and on several others), others and I have just started the RULE project (Run Up2date Linux Everywhere), which aims to add an installation option modeled on these guidelines to standard Red Hat Linux.

The RULE home page (this is the URL while I'm writing, but it will change soon) is always linked from the project site. The FAQ also explains why the project is Red Hat-based and why and how even users of other distributions can benefit from (and help) RULE. Please visit us, and subscribe to the RULE mailing list if interested.

Conclusion

Advanced, text-based configuration with vi is a user-space task, and I think everybody should (be helped to) learn it; programming from scratch is different. Users deserve more than the forced choice between becoming programmers from the very first moment and spending unnecessary money to use "Free" Software. I encourage you to try all the tools I mention, if you don't know them yet. The home pages of some of the least-known ones are listed below. Run them for a while and, if needed and you know how to do it, put them in your distribution's packaging format and make them available. Even on more powerful machines, doing more, faster, with less can be lots of fun.
