Ruminations on the Digital Realm

Jan Stedehouder

Archive for the month “April, 2007”

8 reasons why Linux won't make it to the desktop

The Linux Proliferation Agreement is intended as a means to promote the use of Linux on the desktop and asks endusers to become structurally involved in making Linux visible in the public domain. Apart from the supportive responses there came a wide range of counterarguments: things Linux would need to change before it would ever become a viable choice. Let’s look into these counterarguments, summarized in the 8 reasons why Linux won’t ever make it to the desktop. But do they stop the Linux Proliferation Agreement as well?

(1) Some programs are not available for Linux
Correct. No question about that. For the most part programs from the Adobe/Macromedia suites were mentioned, but no doubt there are more programs for which there is no suitable Linux counterpart yet. If you use these programs intensively on a day to day basis the migration to Linux is cumbersome. But let’s not forget a few things. This problem is valid for a group of endusers, but hardly for all. Not everyone is a Dreamweaver or Photoshop power user. Most software has far more functionality than is used daily or even regularly by the largest group of users. When promoting W2L migration we should focus on the functionalities actually used, instead of on the programs as integrated packages. In those cases where the software is used intensively we have two possible solutions: emulation (which may not be called ’emulation’, so maybe ‘pretendization’ is better) and virtualization. The mentioned Adobe/Macromedia suites are in one way or another supported by Wine and CrossOver Office. Virtualization through VirtualBox, Xen or VMware gives desktop access to a Windows installation and the software that requires it. Yes, it might require an investment in extra RAM, but that is hardly expensive in exchange for replacing all other proprietary (and paid-for) software with open source counterparts.
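For the Wine route, the first experience can look something like the sketch below. It only echoes each step (the `run` wrapper prints instead of executing), and the installer name and paths are made up for illustration; drop the wrapper to actually run the steps on a machine with Wine installed.

```shell
# Dry-run sketch of installing a Windows program under Wine.
# 'run' only echoes the command; remove the wrapper to execute for real.
run() { echo "+ $*"; }

run winecfg                                  # one-time setup, creates ~/.wine
run wine ~/Downloads/setup.exe               # run the Windows installer
run wine "C:/Program Files/MyApp/myapp.exe"  # launch the installed program
```

CrossOver Office wraps roughly the same steps in a graphical installer, which is why it is often the gentler option for W2L migrators.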

(2) Linux is not a platform for games
Correct as well, but should that stop the Linux Proliferation Agreement? Vista isn’t really a gaming platform either, but who is stopped by that? Gaming is used as a benchmark and it is a multibillion-dollar industry, but in the end it is only a part of day to day computer use. And hardcore gaming only concerns a niche of users, a niche of people who don’t mind paying € 600,– for a new graphics card in order to play a game that will be released next month. No doubt an interesting group from the perspective of the hardware industry, but hardly representative of most endusers.
Now, there is something strange about the hardware requirements of PC games as well. I remember Thief 3. When it was released, it could be played (officially) on a very limited range of graphics cards (which, mind you, didn’t bother the hardcore gamers). However, the Xbox version had the same graphical effects on a console with far lower hardware specifications. Is it strange that I ask myself how this is possible? Could it be that the hardware specifications and the OS limitations are more artificial than actual?
Anyway, the number of games released for Linux is limited in comparison to Windows. Cedega provides a limited solution for some of the most popular games. But should that stop us? Should this prevent the use of Linux in schools? Should we hold back in properly educating and empowering endusers because of it?

(3) Linux needs a unified desktop
Having used Windows for years I can understand the argument. Quite a few endusers panic when the icon is no longer in the same place. Endusers are trained in using the Windows GUI, but how many users are actually able to find the various system tools and use them? The majority can’t and don’t care.
Linux is different. Linux doesn’t equate the GUI with the operating system. What you do and don’t do with the GUI is a matter of personal preference, an expression of your personal identity. A unified desktop seems like a good solution, but it isn’t. Proper education and training in the use of Linux is. It will provide the endusers with a choice and a freedom to use the computer as they see fit. As we see fit.

(4) Linux should drop the commandline stuff
I hope not. It would mean that the GUI has become the operating system and that my choices have become limited. Recently I gave a workshop “Installing software under Ubuntu Linux” to a group of about 20 endusers. They had little previous experience with Linux, but a substantial number had experience with DOS. We started the workshop with Add/Remove…, moved on to Synaptic and continued with apt-get and aptitude. The participants learned four ways to manage software on Debian-like systems. The commandline stuff could be repeated and understood more rapidly than the GUI alternatives and most went home feeling empowered to move on with Linux. The commandline is less scary when properly explained. Understanding the commandline prevents a lot of “it won’t work anymore” moments when the GUI is frozen.
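The command-line half of those four methods fits on an index card. As a sketch, here is a tiny pure-shell cheat sheet in the spirit of the workshop: it only prints the usual Debian/Ubuntu command for a task rather than running it (the package name is just an example).

```shell
# Cheat sheet as a shell function: prints the command for a task.
# (Add/Remove and Synaptic are the GUI half of the four methods.)
apt_howto() {
  case "$1" in
    refresh) echo "sudo apt-get update" ;;
    install) echo "sudo apt-get install $2" ;;
    remove)  echo "sudo apt-get remove $2" ;;
    search)  echo "aptitude search $2" ;;
    *)       echo "usage: apt_howto refresh|install|remove|search [package]" ;;
  esac
}

apt_howto install abiword   # prints: sudo apt-get install abiword
```

Four lines to memorize, and a frozen GUI stops being a catastrophe.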

(5) Linux needs better hardware support
It does. But did anyone install Vista recently? And, did it work? Did all of it work? Could you install all your software? No, we couldn’t. Yes, there is a need for improvement. There are too many accounts of monitor settings not being correct out of the box, of wifi being hard to get working, etc. Still, there are solutions for most hardware problems. In my early Linux days I had to buy a new external modem, since my winmodem wasn’t supported. Before that I didn’t even know winmodems existed. My old iMac also gave me some headaches with X, but searching the internet provided the proper solutions. See, empowerment does help.
One thing should not be forgotten. When the number of endusers reaches a critical mass this problem will be moot. We have to educate endusers to buy the proper hardware, hardware that has “Linux compatible” on the box. Market forces will take care of the rest of the problem.

(6) Linux needs to get better first
It needs to and it will. But that doesn’t change the basic skills we need to use and troubleshoot Linux on a day to day basis, hence the basic skills we need to teach new groups of endusers. Besides, with a larger group of empowered and involved endusers we will see more and better bug reports, which will aid the developers.
But what are the areas for Linux to improve? Which functionalities are now non-existent that make most applications unfit for day to day use? Yes, there are gaps, but those are not relevant to the largest groups of endusers. For some (even quite a few) the lack of a decent Thunderbird–Exchange connector is a serious problem. However, should this prevent us from promoting the use of Thunderbird in all other organizations that don’t use Exchange?

(7) Linux is too splintered
The Linux world knows many, many distributions. Software needs to be offered in various shapes and sizes (RPM, DEB, tarball etc.) and we might want to keep our fingers crossed even then. Managing software through the distribution’s own repositories won’t cause too many headaches, but a little venture outside the trodden path gives us a serious migraine. The various communities are -alas- equally divided. With religious fervor users defend their own distribution, calling down heavenly fires upon the other distributions and ostracizing those from within who are not as ‘pure’.
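The "shapes and sizes" problem boils down to the install step differing per format. As a hedged sketch (the helper and file names are made up for illustration, and it only names the usual command for each file type rather than running anything):

```shell
# Pure-shell sketch: map a downloaded package file to its usual install step.
pkg_install_cmd() {
  case "$1" in
    *.deb)          echo "sudo dpkg -i $1" ;;          # Debian, Ubuntu, Mepis
    *.rpm)          echo "sudo rpm -ivh $1" ;;         # Fedora, SUSE
    *.tar.gz|*.tgz) echo "tar xzf $1 && ./configure && make && sudo make install" ;;
    *)              echo "unknown package format: $1" ;;
  esac
}

pkg_install_cmd frozen-bubble_2.1.deb   # prints the dpkg line
```

Three formats, three rituals; the repositories hide this, the trodden path’s edge exposes it.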
Honestly, I don’t have much use for this kind of involvement. Endusers aren’t bothered with distribution A or B, or with software management tool X, Y or Z. They are doing their work and use programs to finish their tasks. Most of them hardly care whether OpenOffice.org runs on Windows, Ubuntu, Fedora, Mepis or Mac OS X. It doesn’t matter and it shouldn’t stop us from promoting the use of Linux and open source software. We do need proper IT education and that is exactly what the Linux Proliferation Agreement is about. Endusers can learn to deal with the choices Linux provides. Compare that to gaming. When was the last time hardcore gamers complained that a game could only be played on the Xbox and not on a PlayStation 3? They don’t (or at least they don’t stop gaming because of it), because they understand the system. Education it is.

(8) Linux is a philosophy
Fortunately it is. Linux is built around the principles of cooperation, transparency, participation and free distribution. Call me an idealist, but I can’t find much of a flaw in these principles. When things go wrong in the Linux world, they go wrong because individuals or groups forget these principles.

In the end I appreciate the discussion and the arguments against the viability of Linux and/or the Linux Proliferation Agreement. Most likely there are more counterarguments and I look forward to hearing them. But one thing is clear to me as well: there is nothing seriously hindering the advance of Linux from a technical perspective.

Tags: Linux


The need for the Linux Proliferation Agreement

The year of Linux on the Desktop (May 1st, 2023)
“There is something brewing in the world of Linux, because this year really seems to become the year Linux makes serious headway on the desktop. For Linus Torvalds it is the addition of synchromatic memory support into the new kernel 6.4.12-28 that is a serious and major step in the right direction. ‘We now have support for voicecontrolled systems at the kernel level’. In a few months time we will also see the first distributions with ext9 as the default filesystem, like MepubianSpire and Novell Red Hat. Mark Shuttleworth, the ever active CEO of MepubianSpire, is thrilled about the prospect. ‘Wonkey Willy will be the most stable innovative distribution we present to our users and with another delay of Microsoft Himalaya at hand this is the window of opportunity to aggressively increase our market share on the desktop’.”

So, when will it really happen? The breakthrough of Linux on the desktop has been expected for years now and each year we draw the conclusion that it didn’t happen. Yet. And 2007 looked promising. Windows Vista is way too expensive, requires a serious hardware upgrade and for what? Eye candy that was available for Linux 6 months ago! Well, it was available for the group of Linux hackers that could install the right drivers for their graphics cards and had no qualms adding the repositories for Beryl/Compiz. Linux mainstream will have to wait until Ubuntu ‘Gutsy Gibbon’, which may have that feature. Or not.

What is lacking in modern Linux distributions?
But let’s step back for a moment and ask ourselves the following question. “What is missing from the current generation of Linux distributions that prevents regular, day to day use of Linux, both at home and in organizations?” And -please- don’t start pointing at your iPods. Why not? Because the iPod is a relatively young technology and it is just a matter of time before support is added. It took time for Windows and Mac OS X to properly support the new toy, and every once in a while even Apple screws up an update. So, what is missing? Word processing? Spreadsheets? Email and groupware? Access management for users in organizations? Multimedia functionality? Security? I am sorry, but I can’t think of anything that would be needed before 90% of the current Windows userbase can migrate to Linux. In fact, I am even willing to state that Linux has been ready to break through to the desktop for a few years now. But why is it not happening?

Now, all in harmony: “Yeah, but what do you expect. Every computer you buy has Windows pre-installed”. That’s correct. Even a staunch Linux supporter such as HP sells desktop computers with Windows, not Linux. “Indeed”, say the channel partners, “but there is no demand for anything else”. Which is correct as well. No demand, no supply. And, vice versa, when there is no supply it is hard to get any demand either. When discussing the breakthrough of Linux it is usually a discussion among technophiles. For them it makes sense, gadgets will sell themselves, right?… Wrong!

The commercial disadvantage of Linux
Microsoft is not just a dominant factor when it comes to the supply chain, with or without serious discounts for the channel partners. The people in Redmond know exactly what to do to promote demand for Windows. Why do you think hardware suppliers and store owners were elated with Vista? You think it was the alleged technological superiority of the new operating system? Think again. Vista needs at least 1 GB of RAM, twice the amount of XP. Plus a graphics card with at least 256 MB of RAM on board. And… just add to your own list. Vista means dollars for producers and merchants. Oh, and the complaints of some software firms had less to do with the alleged (lack of) quality of Vista and more with the fact that they didn’t belong to the first-tier partners at the Vista launch. They wanted their own spot on the Windows marketplace. Dollars….

And now Linux? Sorry, but hardly interesting in a commercial sense. Linux runs pretty well and very fast on yesterday’s hardware. But that hardware is sold already. Linux doesn’t bring a new revenue stream or an opportunity to increase prices for the same functionality.

But.. what about consumer demand? That could change things? In theory, yes, but only when there is a critical mass of users that request Linux. But consider the following:

– At teacher training colleges teachers have the opportunity to get certificates for the European Computer Driving Licence (ECDL). Most textbooks only deal with Windows and Microsoft Office;
– For most children, the first experience with computers, at home or at school, is Windows, MSN and Microsoft Office;
– Users of internet cafés or e-centers sit down behind computers with Windows, MSN and Microsoft Office.

So, when do you think the critical mass of Linux users is reached to make a difference in the market place?

Moving towards critical mass
Let’s make a few things clear. Right now there are more than enough distributions that can be used by endusers at home and in a business environment from day to day. The remaining problems with hardware and software can be fixed quite easily. For a small group those problems can be a good reason to postpone a wholesale migration. We don’t have to, and shouldn’t, wait for the next generation of new features in Linux. And it definitely is of no use to wait for the box builders to see the light and watch them exchange Windows for Linux. Linux will not break through on the desktop just because we wish it so. What do we need to do, other than wait until 2023, for Linux to make it to the desktop? We need to do more.

The development and distribution of Linux is mostly determined by two groups: the developers and the companies that use Linux as a competitive tool against Microsoft. We can see some local and national governments contributing by ‘demanding’ the use of open standards and open source software. It’s a start, but -again- we shouldn’t wait for it nor trust in it. More than one government has reversed its decision to migrate to Linux and who knows what happens after the next elections. As long as Windows is dominant in the public domain (education, e-centers, internet cafés) we will hardly see a change in its use. We -the Linux desktop users- are but a marginal group. Some of us try to convince friends, relatives and neighbors and we are almost jubilant when we have made a ‘convert’. We almost ‘live’ in Linux forums and IRC channels to lend a hand to newbie users with problems. And please, keep on doing that.

Reaching a critical mass of Linux endusers takes time. What we need is the establishment of an international community, a movement dedicated to the worldwide promotion and distribution of Linux as a system for endusers, analogous to the international developer communities that made Linux possible as a system. Why should it only be the hackers who altruistically devote their precious time and energy? Don’t you think it is time for us -the early adopters- to follow their example in a structured and organized way? I seriously endorse launching a Linux Proliferation Agreement, a charter which will be the basis for a pro-active promotion of Linux by qualified endusers.

The Linux Proliferation Agreement
What should the Linux Proliferation Agreement entail? In short: making Linux ubiquitous in the public domain. This means establishing public e-centers, well maintained, with constant personal support, where people can get training and courses in the use of Linux and open source software. It means offering low-threshold training and courses in community centers and in schools, for children, parents and professionals in branches other than IT. It means using all sorts of media to make Linux visible, in the local media and targeted media. No general-purpose articles, but in-depth articles providing Linux and open source solutions for real day to day problems in organizations. It also means developing educational tools for LPA courses and training, but also for use in schools. We need an ECDL that is completely built around open source and with which candidates can get certified.

The Linux Proliferation Agreement shouldn’t be a ragtag band of volunteers, but should be organized analogously to the Debian Project. The LPA charter sets out a clear goal and purpose, and quality should be first and foremost. The organization and its reputation need to be built in such a way that enlisting the help of an LPA volunteer is synonymous with getting a high-quality end result. To achieve this the LPA needs a system of internal training and coaching. Plus some form of certification. Organizations should have access to an online register of certified LPA volunteers. Branding is one of the key tasks of the LPA organization, with the development of marketing kits etc. The quality and the network are required to attract funding and sponsorships.

Towards a roadmap
What does it mean for you and me? Well, maybe we are asked to assist in our local e-center a few hours each week. Or to provide extensive training in OpenOffice.org in our local community center, on Saturday afternoons, for the next 20 weeks. One thing needs to be clear from the outset: joining the Linux Proliferation Agreement brings a new set of obligations. But is it different for the developers of Linux? When there is a serious problem with the kernel, don’t we expect a solution or at least a quick fix within hours? I will work on a roadmap for the Linux Proliferation Agreement in the months ahead. Who will join me?

Tags: Linux, education, Linux Proliferation Agreement

Column: Polemic research

What may we expect from a research report? Is it possible to be objective, neutral, when doing research? While I was at university majoring in social history this was one of the fundamental questions. Can any research into social phenomena be without bias? As a blogger and columnist I present opinions, mine. But what if I wanted to report seriously about the strengths and weaknesses of developing open source software? I strongly believe in the classical scientific method: formulate a hypothesis, determine the instruments, collect data, analyze them and see where the results lead. No matter whether I like what comes up or not.

Unfortunately there is too little research of this type in IT. Commercial and/or ideological interests play too great a part, so the conclusions are written beforehand. I do think it is even worse when a blogger/columnist gives a personal opinion the odium of research. Like ‘The sorry state of open source’, which was published at Planète Béranger earlier this week.

The 25-page HTML piece reads more like a j’accuse, a polemic instead of a clear and objective analysis. Which is a shame, since the issues mentioned do deserve closer inspection.

The key argument is that the development of Linux has deviated from solid Unix traditions. As a result there is more attention to features than to functionality, to innovation than to stability, and to quick hacks than to structural solutions. PB’s examples in support of the argument are quite disturbing. It reminds me of my own thoughts when I read about the inclusion of KVM in the Linux kernel, a new technology from a complete outsider.

The drive to become userfriendly is blamed for the deviation from Unix traditions. PB pushes for education over userfriendliness. The lack of proper Unix/Linux training shows itself in the dramatically reduced quality of help and advice in forums and IRC. The last argument I have heard before, but is it really necessary to go all the way back to the man pages? I agree that online documentation should improve, but the man pages are simply not accessible enough for most endusers.

It seems -from the viewpoint of PB- that userfriendliness and quality are mutually exclusive. FreeBSD fell into that trap, and apart from Slackware most Linux distributions as well. Some releases are okay, but mostly PB encourages following the example of *BSD.

And this is where the article loses it. OpenBSD and NetBSD are not for the desktop user. Different target group, different philosophy of development. To start there and judge the (non-)value of Beryl/Compiz and the loss of performance it causes is nonsense. PB is full of disdain about Gnome versus KDE debates. You won’t find such wasteful debates among ‘serious’ users of Fluxbox and Window Maker. Yep.

I can imagine -and the article makes a solid case for it- that there is tension when releasing distributions. There are benefits to working with predetermined release moments, but this could also encourage using quick hacks as last-minute solutions. The ISOs have to be on the mirrors by tomorrow, so… There isn’t much wrong with this approach, but one wonders whether the quick hack will ever be replaced by a more permanent, solid solution. The new release date is rapidly approaching and -be honest- it is way more fun to work on a new feature. ‘Bleeding edge’ does sound so much better in the Blogosphere than ‘stable’.

When we look at the large group of Windows-to-Linux migrators (W2L) the release cycle doesn’t have to be that fast. Windows users are accustomed to working with a version of an operating system that lasts three to five years, with a service pack or two to fix the holes. The same goes for most of the desktop software they use. The largest and most important group of Linux users keeps all kinds of servers up and running, and from a professional perspective they also don’t need six-monthly releases. So -it seems to me- there won’t be much complaint when a release is one or two months late, especially when this visibly benefits the quality of the release. Sure, there will be a cry of complaint from bloggers and online tech reporters, but who do they write for anyway?

‘The sorry state of open source’ is a polemic wrapped around a number of very valid arguments. But it is not research, and the whole article doesn’t contain anything that justifies the weird copyright notice that comes with it. Promoting a more solid method of developing Linux and better documentation, both following the best Unix traditions, is fine, as is impressing the need for better enduser education. Sadly, the overall presentation might be the factor that prevents most from accepting that.

This column is part of a series I write for Digiplace.nl, a Dutch website where Ubuntu users meet and learn.

Tags: Linux, Open Source

Sorry state of open source today

Planete Beranger has quite an extensive piece on the state of open source software. One thing that rattled some cages is the copyright notice for the article:

LEGAL NOTICE: As an exception to the rest of the site, this article falls under the following specific legal terms: Copyright 2007 Radu-Cristian Fotescu and JEM Electronic Media, Inc. No reprints nor reposts without written permission from both copyright holders.

But this shouldn’t hold anyone back in actually reading the arguments. For me it will have to wait until the next weekend, but I am curious.

Tags: Linux

Column: What's his beef with Linux?

“What’s his beef with Linux?” I can imagine some people wonder if I have something against Linux. The first two contributions to Digiplace.nl were quite critical in tone. Well, to clarify one thing: I have nothing against Linux. On the contrary, I take every opportunity to promote Linux among Windows users. On April 1st one of my online buddies deleted Windows from his harddrive, after receiving step-by-step guidance through all the problems he encountered with Ubuntu. After a demonstration, a church pastor recently started promoting the use of Ubuntu Linux among church members in order to reduce the use of illegal software. And in a few weeks time I will explain the joys of software installation via Synaptic and apt-get to a group of novice users. That is how I contribute to the spread of Linux.

But I am not blind and deaf during my promotional activities. At each event I hear real-life problems and frustrations. I know there are solutions to most, if not all of them. The solutions start with the phrase “you have to do a Google search for…”, after which the majority of users are bombarded with a variety of possible solutions. Why is that a problem? Because you have to know how to phrase the real problem, to know what really is wrong.

Thanks to Microsoft and years of IT education most users are unable to do that. “It just doesn’t work anymore”. When was the last time you contacted an IT helpdesk? I won’t do that anymore. I’d like to get my assistance at my level of expertise and refuse to talk to a well-meaning assistant who first has to ask whether I plugged in the computer properly. Unfortunately, such questions are necessary for many other users and they help the helpdesk isolate the real problem step by step.

There are various methods to promote Linux. The first method -the least effective as far as I am concerned- is the RTFM approach. Let the users find their own solutions. An alternative approach is used by the likes of Ubuntu, Linspire and Xandros. They want to make W2L migration as easy as possible and simplify all management tools. Add closed source drivers and proprietary codecs to the mix and we have a first Linux experience that is not the cause of post-traumatic stress. Granted, this approach is more effective than the RTFM approach, but it is still not enough. Why not? Murphy’s Law, of course! Something will go wrong, and what do you do then? Besides, even these distributions require rudimentary skills with the commandline interface. And most W2L migrators haven’t seen a commandline since Windows 95.

If Linux needs to gain ground on the desktop, we don’t need to wait for the operating systems and applications to reach some future level of maturity. The maturity is there. I would argue that it is no problem to migrate most home and business users right now. It might require the support of Wine, Cedega, CrossOver, VirtualBox, VMware and/or Xen, but it is possible. Adding Click-and-Run to Ubuntu might look like a good idea, but it doesn’t solve the real problem: bad IT education.

And this is where we -you and me- come in, to put our effort where our mouth is. Sending around free CDs with Linux is good. Writing helpful how-tos is very useful. Offering tips and tricks in IRC and in forums, please continue with that. But we are talking proper education here, helping people to understand the operating system, teaching them problem-solving skills. Not just install fests, but educational programs about the how and why of Linux.

And we have to learn to listen, in order to send the problems and obstacles back to the developers and distro builders. To help them build real solutions. Until those solutions have materialized I consider it my obligation to point out the shortcomings. Not out of spite, but out of love for Linux and its scores of developers. Both of which deserve a better place on the computer desktop.

This column is part of a series that appears on Digiplace.nl, a Dutch weblog run by Jos Herni.

Tags: Linux, Ubuntu, Education

Column: "But mine is free"

Didn’t you ever drive behind one of those? One of those vehicles on four wheels that are legally considered cars, whose proud owner has a bumper sticker with the line: “Smile. But mine is paid”. Before the Berlin Wall came tumbling down there were proud owners of Eastern Bloc produce: Lada, Skoda and Yugo. During my years on Curaçao I had the Lada sportsmobile, the Lada Samara. It was ideal! The thing was a tank, with plating so thick no rust could get a handle on it. And when someone drove into it, the other car had the damage. You only needed to paint it army green, put a cannon on the roof and the monster was ready for the Cold War.

But what was the sales pitch for these cars from the workers’ paradise? They were cheap, functional and robust. The owners were almost elated about the barren simplicity. These cars were so simple you could repair them yourself. You had to. The market value was zero point nothing and sales were less than sluggish. Not so the Japanese cars, more rust than metal, whose sales were soaring because they did have a deluxe shine with that metallic coat of paint. Hmmm, reminds me of an operating system I know.

Fast forward to the present. The workers’ paradise is no longer behind the Iron Curtain. The communist continues life as a Linux hacker and the Communist Manifesto is transformed into the GPL. World domination is to be achieved via an operating system and applications. Do you know the main sales pitch? Linux is free, secure, stable and functional. Yes, you can, are allowed, no, even should look into the source code. Ha, you won’t see that with those capitalist pigs from Redmond.

Yes, Linux even has a female spokesperson nowadays, a slender young woman who can tell you that Linux has 30 million users. Vista reached 20 million in the first few months of its existence. Okay… put yourself in the position of a hormone-driven teenager with his modded gamerig. Neon, shining lights, black and chrome. Got that? Picture Suse on it…. Or worse, Debian Woody, default install, because it is developed by a group of bickering hackers bound to a social contract…. See the problem? A nice paint job still does miracles, even 25 years later. Yes, yes, I can hear some of you think: “What about Compiz and Beryl?” True, but by the time that has reached mainstream Linux, Vista has sold another 20 million. We have to change that.

It didn’t matter in those days to the average car buyer that the Yugo was a fine car for the money you paid for it, and it doesn’t matter to the average computer consumer that Linux is superior in terms of security, stability and price. To push its way to the desktop, both in business and at home, Linux needs a paint job. Free and reliable don’t do well in a sales pitch. They might, for the nerdy sysadmin, but not for his boss who controls the budget. By the way, I can still see the IT professionals during the employment heydays, waiting to pick out their company cars. And no, there was no Lada to be seen. The boss knows what sells.

Oh, and for those who are interested, you can download this bumper sticker for your computer: “Smile. But mine is free”

Tags: Linux

This column is part of a series I write for Digiplace.nl, where it appears in Dutch.

Gaining attention in the Blogosphere

I am a content blogger these days. Last weekend I decided to write a first impressions article on Debian and sent it to Ladislav Bodnar of DistroWatch. I like DistroWatch and have found it a valuable resource for information on current developments in Linux and BSD.

This time I was the first to have a review of Debian Etch on DistroWatch. Completely by accident, but I enjoy every moment of it. As a blogger you hope that someone actually reads what you write. With the many, many millions of weblogs, every response you get gives satisfaction. The Debian review gives a lot of it ;-).

First, there has been a lively exchange in the comments themselves. Besides that, the article has reached some other websites, like Tuxmachines, LinuxLookup, Debian News and Linux Today. Each website contributed new visitors to my website, somewhat more than 5,000 in the first two days.

Well, I just made screenshots of the various websites, as a reminder and an encouragement. Thank you, Blogosphere.

Tags: Debian, Linux

Debian Etch: first impressions

Debian deserves some extra attention. The latest release is being distributed and I have no doubt that it will be installed on quite a few machines over the coming days and weeks. Personally I want to try it on the iMac Indigo and in a virtual machine under VMware. The netinstall images were a breeze to download and that was enough for now. I did not feel like downloading three DVD images or 22 CD images at the moment.

Who should give Debian a try? Besides all the geeks and nerds I think it would be a good idea for the Ubuntu, Mepis, Linspire, Xandros and Knoppix crowds to at least take a good look at the distribution that makes their user-friendly distributions possible. Without the enormous work done by the Debian project their favorite distros might not have existed. You might compare it to various large sea and land animals that swim or move around very slowly, with little fish or birds picking the scraps from their skins or teeth. Debian moves slowly (though it did have five or six intermediate updates of Debian Sarge), but the best pieces are picked away by the fast-moving smaller distributions. Anyway, when such a huge animal crosses your path it is a grand and beautiful sight to behold. Maybe there is talk of extinction, but for now it still moves with grace and power and it deserves some quiet attention.

The install on the iMac went well, but I ran into the same problem as a few months ago: a completely frozen graphical user interface. To its credit, Debian did have all the settings in xorg.conf as they should be (in contrast to Yellow Dog and almost every other Linux distribution for the PPC I have tried). For now, I will have to ask some questions here and there to see whether there is a solution (though judging by the answer I got last time, the solution should have been implemented by now). This article will therefore only deal with my first impressions under VMware.

Installing Debian Etch

I still remember the first time I installed Debian on a computer. I knew it had a reputation of being difficult, but I was confident enough to think I could beat the odds. Remember, back then -it must be about five years ago- I was using Windows 98 as my main operating system and was barely scratching the surface of Linux. The installer asked all kinds of questions I had absolutely no clue about. Various distributions have since improved the way Linux is installed and for most W2L migrators there shouldn’t be any problems anymore. The Debian installer is still text-based, with no shiny new live desktop to get you going. This isn’t a bad thing, because the Debian installer is fast and responsive (even on the old iMac). The questions speak for themselves: language, territory, keyboard layout (where I do miss American English International). Then it is on to the network. You give your box its own name and tell it which domain it belongs to. In my case I left it blank.

The next step is to partition your hard drives. For me, it wasn’t really a big problem, having created a new virtual machine specifically for this task. The user can choose between three guided options (use the entire disk, with LVM, and with encrypted LVM) or the manual option. I chose the first option, just use the whole disk. You can select the proper disk (if you have multiple disks, that is), after which you are asked to select a partitioning scheme. There are three options: one partition, a separate /home partition, or separate partitions for /home, /usr, /var and /tmp. The first one is recommended for new users, but a separate /home partition might be better, especially if you want to change distributions later on. However, this is a matter of taste. The installer then gives you a summary of the partition scheme and you have to confirm your decision. The confirmation questions default to “No”, a wise precaution.
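To make the “separate /home” scheme concrete, the resulting /etc/fstab could look roughly like this (device names and partition numbers are hypothetical; yours will depend on your disk layout):

```
# Illustrative /etc/fstab for the "separate /home" scheme
# (device names are examples only)
/dev/hda1  /      ext3  defaults  0  1
/dev/hda2  /home  ext3  defaults  0  2
/dev/hda5  none   swap  sw        0  0
```

The point of keeping /home on its own partition is exactly what is described above: you can reinstall or switch distributions on the root partition while your personal files stay untouched.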

The next step sets up the first two accounts (root and user) and their respective passwords. Since this is a net-based install you can select a network mirror in order to get more than a basic system. The installer gives you a choice of countries and servers in those countries. If you are behind a proxy server, now is the time to enter the appropriate data.

Debian likes to know which packages you like, and the next question is whether to participate in the popularity contest, through which statistics about the use of packages are sent to the distribution developers each week. Why not, especially since it can only benefit you. The final step before the installer starts pulling down the packages is to select the predefined software collections. The standard system and the desktop environment are ticked by default. When you want to add a web, print, DNS, file or mail server or a SQL database it is a matter of ticking the other boxes. The last selection is for laptop users. Then you can sit back and wait until everything has been downloaded and installed. Almost no questions will bother you during the next hour; you do get a question about the resolutions the X server can use and one about GRUB.

Conclusion: the Debian installer is simple and straightforward. It might seem a bit more involved than the six-step process under Ubuntu, but it is almost the same. I can imagine that old-style Debian users miss an expert option, with more control over what software to install. I guess experienced users will choose to install only the standard system and proceed from there. The current installer strikes a good balance and I like it.

Image gallery (article continues below)

First boot

If you are waiting for a great-looking desktop, Debian is not really the place to be. With Debian Etch you do get a major update as far as Gnome, KDE and Xfce are concerned. The netinstall gives you the Gnome desktop. But the developers at Debian did put some work into the bells and whistles. GDM actually looks like something to show off, and the default Gnome desktop is no longer plain and boring vanilla, but almost appears bright. The mirrors also have KDE and Xfce disks for those who don’t want to boot into Gnome first. Still, I would recommend a visit to the gnome-look.org website to add some more shine to the desktop (personal preference, I know).

When you are used to Ubuntu this desktop shouldn’t pose too many difficulties. Applications from the Mozilla family have had an identity change (Iceweasel instead of Firefox, Icedove instead of Thunderbird and Iceape instead of Seamonkey). You will also miss the Install/Remove… menu item for easy access to new software. Synaptic is the tool for that (or apt-get, of course), though adding new repositories is a bit more of a challenge. One interesting thing I did find was a small window selector in the upper left corner. Very small, but nice as an alternative to ALT-TAB. The default web browser is Epiphany and Evolution is the center for your email. The rest of the usual suspects are present as well: OpenOffice.org, GIMP, GAIM, Rhythmbox and Totem. GnomeBaker for your burning needs.
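For those who haven’t done it before: adding a repository comes down to adding a `deb` line to /etc/apt/sources.list. A minimal sketch (the mirror URLs and components below are just examples, and I work on a scratch copy in /tmp so nothing on the system is touched):

```shell
# Work on a scratch copy; on a real system you would edit /etc/apt/sources.list as root
example=/tmp/sources.list.example
cat > "$example" <<'EOF'
deb http://ftp.debian.org/debian/ etch main
deb http://security.debian.org/ etch/updates main
EOF

# Example: enable the contrib and non-free components as well
echo 'deb http://ftp.debian.org/debian/ etch contrib non-free' >> "$example"

# Count the repository lines we now have
grep -c '^deb ' "$example"
```

After editing the real file, `apt-get update` refreshes the package lists and Synaptic will pick up the new repositories the next time you start it.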

Synaptic is there to expand your desktop. When you start Synaptic during a session it asks for your root password. Logical, but when you start it again later in the same session it remembers that you already gave the password. Again, a small thing that makes life and use a bit more enjoyable.

Overall, the default Debian install ‘feels’ fast. Firing up OpenOffice.org took only seconds and Abiword came up almost instantly. Synaptic, GIMP and Scribus also ‘felt’ faster than I am used to under Ubuntu. The only downside was that Scribus wasn’t added to the menu structure. Feelings are not a good measure of performance, I know, but the impression stuck that the desktop is really fast.

Image gallery

Conclusions

It is too early to draw major conclusions about this new release of Debian. I like the installer and -having used Ubuntu and Gnome since August 2006- the default Debian Gnome desktop provides a familiar environment with well-known tools. No doubt there will again be discussions on how to reduce the time between releases. The good thing about the long cycles is that you get long term support automatically. Stability isn’t necessarily a bad thing then. The way Debian is developed does make it possible for the young and shiny (like Ubuntu, Mepis and Linspire) to run ahead and push the unstable branch to the mainstream and onto the desks of a growing number of users.

For now, I will just stand aside and admire the old mastodon passing by.

Tags: Linux, Debian, Ubuntu

Debian 4.0 Etch is released

Surprise, surprise. In the end it is only a few months late. The next big Debian release is available. The official release announcement can be found here and the various possibilities to download the distribution are found here.
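Whichever download route you take, it pays to verify the image against the checksum file published on the mirror before burning it. A sketch of the idea (the filename and checksum below are stand-ins, not the real Etch values; I use a dummy file so the example is self-contained):

```shell
# Stand-in for the real netinst ISO; substitute your downloaded image
printf 'hello\n' > /tmp/debian-etch-netinst.iso

# The published value would come from the SHA1SUMS file on the mirror;
# this one is just the checksum of the dummy file above
published=f572d396fae9206628714fb2ce00f72e94f2258f

# Compute the checksum of the local file and compare
actual=$(sha1sum /tmp/debian-etch-netinst.iso | cut -d' ' -f1)
if [ "$actual" = "$published" ]; then echo OK; else echo MISMATCH; fi
```

A mismatch usually means a truncated or corrupted download, which saves you a wasted CD and a mysterious failed install.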

The wide range of supported hardware platforms and the huge, immense collection of software packages do set Debian apart. When you are using Ubuntu or any other Debian-based distribution, most likely there isn’t much that is new or exciting. But with Debian you know that it works, that it is stable and that it has long term support out of the box. It won’t look sexy, but that can be rectified by the tinkerer.

Personally I am curious whether I can get it running on my old iMac, because in this release the problems with X should be solved. But why should you try out Debian Etch? Plain and simple: out of respect for one of the biggest (if not the biggest) projects to develop Linux and open source software and the many, many hours that scores of developers have put into making this happen. Installing Debian is a sign of respect for a remarkable legacy.

Tags: Debian, Linux, Ubuntu

Throwing away years of Windows

It’s springtime. Nothing new there, but spring seems to be associated with cleaning. My wife wants to redo the bedroom: walls, woodwork, floor and some new closets. And -as most married men will know- when your wife wants something like this, it is better to get it over with. On the other hand, it is fun to spend time together and at the end of the day the bedroom is a nicer place.

Getting in the spring cleaning mood, I decided to take on our study and fix that one as well. For the statistics: it is three by four meters with 30 meters of bookshelves on three walls; we have two desks in there and four computers, three of which are mine (not counting the laptop). All those cables! Two meters of shelves were filled with zip folders packed with dozens upon dozens of CDs and DVDs gathered over the last six or seven years. I had bought some nice DJ boxes to compress the whole thing.

It was fascinating to see my IT history passing through my hands, remembering again why I stored a particular CD or why I bought a specific magazine. Some magazines don’t even exist anymore. The CDs and DVDs that accompanied the magazines were great in a time when the internet was only a 56k dial-up connection away. I loved the UK magazines because they had real software, not just trial or demo versions, but versions you could deploy and use for as long as you liked.

But after a while those magazines and their disks disappeared from the folders. I got broadband and discovered free and open source software. Plenty of stuff to play around with. I found the disks I used to write my first articles, back in 2002/2003. Besides that, Linux found its way into my collection. First Suse 7.2 and then Red Hat 7.3. I can’t even recount how many ISO files I downloaded from LinuxISO. The site doesn’t seem to exist anymore, but it was a great resource in a time when bittorrent still had to be invented.

From then on the collection branched in various directions. BeOS, BSD, Solaris, Netware. I sucked it all in. That must have been the time the book collection exploded as well. Some distributions still exist, others are gone, but I do remember the growth of Linux in the last few years.

And there they were, the CDs and DVDs with tons and tons of Windows-based software. Programs for document management: no longer needed. Accounting: likewise, no longer needed. Flowcharters: long ago replaced. Security software for Windows: no longer valid or even up to the tasks required. I cherish the memories, because those disks also represent part of my growth in knowledge and skills. But in a time of spring cleaning I had to ask myself whether I would ever use those disks again. Some even dated back to the Windows 98 days, so the answer was no.

With a few exceptions, 90% of the disks went into the trashcan. Which -now that I think of it- represents the balance between my current use of Linux-based software versus Windows-based software quite well.

Tags: Linux, Windows
