Ruminations on the Digital Realm

Jan Stedehouder

Ruminations is moving

I agree. This site has been too quiet for too long, and it is time to change that.

First, some explanation. 2008 was a great year for me as a writer, columnist and journalist. I was given the opportunity to write two books on migrating to Ubuntu Linux and one on open source and open standards. I contributed to a textbook for higher education and was co-editor of the Dutch open source yearbook 2008/2009. As editor of the online open source magazine Livre I was on top of international open source news. Regular readers know I have a thing for the BSDs, and it was an honor to contribute to the new BSD Magazine. Apart from this I continued writing pieces for Digiplace and SoftwareBus, a magazine for a Dutch computer users' group. Well, I guess you can understand why I decided a brief writing sabbatical was in order.

But now, playing time is over. I used the sabbatical to refocus my writing, deciding on the projects I wish to contribute to (as most of my writing is volunteer work, I have to spend my time wisely) and the topics I want to write about this year. One of those projects is Transparante Zaken (Transparent Affairs), which should develop into an independent news and opinion site for the open domain, as well as a platform for the Dutch open communities to make themselves heard and known. The site is an initiative of myself and Brenno de Winter, the foremost ICT journalist in the Netherlands, who is currently involved in a massive Freedom of Information Act campaign to get a grip on the actual open source/open standards policies of various governments.

Ruminations on the Digital Realm has always been my playground in English, with reviews of Linux distributions and, for instance, the two '30 days with…' series about PC-BSD and DesktopBSD. Another aspect has been my ideas on how to promote open source, like yesterday's article 'Embrace and extend'. I believe it's time to put some muscle behind Ruminations again and start writing those reviews and opinion pieces.

I did decide to say goodbye to this old blog. WordPress has been a great companion for the last three years, but the kind of articles I wish to write, and the way I want to organize them for future reference, require a different platform; in this case Joomla. The new site is ready to roll. Starting this coming Monday, the URL http://www.ruminationsonthedigitalrealm.org will forward to the new Joomla-based website. This blog will remain, albeit at a different location.

For those who follow Ruminations via newsfeeds, please update the URLs to:
http://www.ruminationsonthedigitalrealm.org/portal/rss (for the new site)
http://www.ruminationsonthedigitalrealm.org/portal/old/feed (for the old site)
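
For readers who script their feed fetching instead of using a reader, here is a minimal sketch of pulling headlines from the new feed. It uses Python with the third-party feedparser library; that choice is mine, and any feed reader or library will do just as well:

```python
import feedparser  # third-party library: pip install feedparser

# Fetch the new Ruminations feed and print the most recent headlines.
feed = feedparser.parse("http://www.ruminationsonthedigitalrealm.org/portal/rss")
print(feed.feed.get("title", "Ruminations on the Digital Realm"))
for entry in feed.entries[:5]:
    print("-", entry.title, "->", entry.link)
```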

Looking forward to meeting everyone at the new location.


Embrace and extend: a non-binary approach to open source promotion

Computers view reality in the simplest of terms. It is either yes or no, black or white, right or wrong. You can't blame computers for this; their binary view of reality is hard-coded. As Wikipedia explains, a computer's memory cells have only two states. This binary outlook on life can take us a long way. Just consider the strengths and complexities of our modern desktop computers, our mobile phones or the AI of a few popular first person shooters. We spend a lot of time interacting with systems that think and work in binary. As a Dutch saying goes, what you hang out with tends to rub off on you. And before we know it, we also begin to organize the world around us along simple binary lines: yes or no, black or white, right or wrong. In the open source world we find people who spend a lot of time behind their computers.
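
To make that concrete, here is a minimal Python illustration of the two-state worldview; the snippet is my own example, not something taken from the Wikipedia article:

```python
# Every value in memory is a row of two-state cells; bin() exposes the pattern.
value = 42
print(bin(value))             # 0b101010 -- six cells, each either 0 or 1
# Boolean logic, the machine's entire frame of reference, has the same shape:
print(True and False)         # False
print(int(True), int(False))  # 1 0 -- 'yes' and 'no' are all there is
```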

Granted, I am not an expert on human brains, but I think we can agree that our brains are somewhat more complex than the memory cells of a computer. Humans are unique beings, shaped by a package of genes passed along at birth and a collection of experiences within a social, cultural and biological context (and then some). Both 'nature' and 'nurture' collaborate to create greatly complex individuals. In order to grasp this complexity we tend to simplify reality, for instance by stereotyping. Okay, that is enough Dr. Phil psychology for now. In essence, humans and human organizations are not binary; in my opinion they cannot be binary. But in the open source world we find people who spend a lot of time behind their computers.

The downside of a binary view of reality
In this open source world we regularly see heated debates following clear and simple binary schemes. Just think about the classic vi versus emacs or GNOME versus KDE debates, but also 'my distribution' versus 'the other crap'. These flame wars are somewhat amusing and entertaining, at least for the onlookers. However, there is a downside to this binary outlook, one that seriously hinders the growth of open source software. There are various areas where this comes to the fore, but for now I'd like to focus on one: Microsoft.

Or, in broader terms, the relationship of open source communities with large corporations. In binary terms Microsoft is, of course, always wrong, always black, and we should view everything this company does with the utmost suspicion. Heck, the key motto of the Redmond Boys is 'embrace, extend and extinguish'. Another major corporation, IBM, is one of the 'good guys'. IBM has been investing in Linux for quite a few years, has released software under open source licenses and uses part of its patent portfolio to defend the interests of Linux and open source. Sun Microsystems, just to name a third example, has a more awkward relationship with the open source world. It doesn't really have the image of a stalwart defender of open source, but on the other hand it did open up serious crown jewels, not end-of-life software. Just think about OpenSolaris, OpenOffice.org, Java and ZFS. But does that mean IBM and Sun Microsystems should be put on the good side in the binary scheme of open source?

IBM
Let's take a closer look at IBM. It is an active partner in the development of Linux and has 600 people on its payroll who contribute to open source projects. Now that's a nice number, but, for a simple comparison, IBM has 388,000 people working for it; those contributors make up roughly 0.15 percent of the workforce. Another contribution to Linux was giving free access to 500 patents in 2005. True, the core philosophy of free and open source software shuns applying for and using (software) patents, but in the current corporate reality it is nice to know that patents can also be used to defend Linux and open source. 500 patents seems like a lot, and perhaps IBM has even increased its number of free-access patents since, but we are talking about the king of patents. Year after year IBM has been the largest applicant for new patents (and has obtained them). In 2008 IBM obtained more patents than Microsoft, HP, Oracle, Apple, EMC, Accenture and Google combined: 4,186. The man responsible for this patent portfolio, David Kappos, is to become head of the USPTO, the US patent office.

Does this in any way diminish the contributions IBM made and is still making to Linux and open source? Of course not, but it does show that IBM has multiple commercial interests, and Linux plays a part in only some of them.

Opening up Microsoft
Microsoft and Linux didn't really get off to a good start. Most of us still remember Steve Ballmer's words, making comparisons with rapidly growing malignant cells and such. Not nice at all. But things have changed since those dark days, and Microsoft has moved towards open source. The various steps Microsoft took are described in my latest book Open source en open standaarden. Voor niets gaat de zon op? (Open source and open standards. For free?, only available in Dutch).

No, Microsoft isn't an open source company, and the chances that it will release the source code of Windows under the GPL v3 license are virtually nil. But with CodePlex, Port 25, two OSI-approved licenses and an ambassador like Sam Ramji, Microsoft does have an open source policy. And a vision, which you can find in the white paper Participation in a world of choice. Perspectives on open source and Microsoft (PDF). You can argue about this vision, you can vehemently disagree with it, but there is a track record now.

And no, I am not blind to the patent-protection deals with some distributors of Linux, nor deaf to the threats that Linux violates a couple of hundred patents, and the recent legal battle with TomTom is still fresh in my mind (by the way, for those who read Dutch, you might find this article interesting; it's an interview I had with Keith Bergelt of the Open Invention Network earlier this year). I do wonder how useful it is to develop .NET applications for Linux through Mono, or to port Silverlight via Moonlight, but I have a similar response when reading about attempts to port KDE applications to Windows using Qt. However, the world isn't binary, people aren't binary, and that means I am under no obligation to make sense of it all.

Companies have commercial interests
Why does Microsoft get involved with open source? In my opinion for exactly the same reasons as IBM, Sun Microsystems, HP, Nokia, Google and TomTom do it: it has commercial value and contributes to their strength in the market place. Linux, open source and open standards go from strength to strength among customers, both corporate and governmental. Any corporation with strategic sense makes sure to pick up a trend like this sooner or later and thus cater to the demands of the market.

How does this involve us, as the open source world? Well, to repeat myself, the world isn't binary and human organizations aren't binary. Companies aren't monolithic structures where everyone thinks alike, especially not corporations that operate on a world scale. If we look at the aforementioned corporations, I believe we can discern at least three different groups within each of them (see, no longer binary). You will find people who really believe in the commercial value, and perhaps even the superior quality, of open source software for their company. These are not the guys and girls who spend day after day designing the darkest of strategies to kill open source as soon as possible.

At the other side of the spectrum you will find employees, managers and (vice-)presidents who completely and absolutely reject a business model where you don't sell software (or rather, licenses) but give it away. They look at the history of their company and at the units that actually make a profit, and wonder who their 'open minded' colleagues really work for. Somewhere in between you can find a large group that just goes to work.

It is a serious weakness of our binary outlook when we, as the open source world, automatically condemn open initiatives of 'suspicious' or 'wrong' corporations as being part of some 'hidden agenda'. Here is a quote from the Linux Collaboration Summit held earlier this year:

Much of the discussion related to Microsoft and its rocky relationship with open source software. Ramji, who runs Microsoft’s open source software lab and plays a role in influencing Microsoft’s open source strategy, faced some tough questions from fellow panelists and the audience. He was not flustered by the inquisition and responded with cogent thoughts and some witty retorts.

Now ask yourself whether this form of debate is really constructive. For the onlookers it might be entertaining to watch, and the participants might go home with a warm feeling that they finally told it to the Sam Ramjis of the wrong corporations (you can fill in any other name you like; it seems Miguel de Icaza is tossed into the same corner nowadays). Personally, I'd rather think about the upcoming budget meeting. "Hey Sam, they really burned you at that Linux conference last weekend. By the way, didn't you request extra budget for your department?" Simplistic reasoning? Perhaps, but any student of corporate snake pits knows how little is eschewed when it comes to power over people, money and influence.

Towards an alternative strategy: 'embrace and extend'
I believe it makes more sense to adopt an 'embrace and extend' strategy. Accept everyone who has an open source profile, even if they work for or in 'suspicious' corporations and organizations. Embrace them, make them public representatives of a worldwide open source community, the lost sons and daughters who are welcomed back into the fold. Present them as the 'good guys' who have a perfect understanding of the market place and of customer needs. But don't make them the prime targets for attacks on their employers. The campaign to promote open source software is served more by increasing the number of open source people within major corporations and allowing their influence on strategic corporate decisions to grow.

This seems a more positive and constructive use of our time as well. Pursuing a negative, almost paranoid anti-Micro$oft campaign doesn't bring new building blocks, creates no new open source software, doesn't lead to new open standards and won't convince users to switch to Linux. And then, who will have won?

First impressions: Sabayon Linux Four Oh!

Two years ago I ran into Sabayon Linux for the first time. Version 3.2 was about to be released and I gave Sabayon a spin on my laptop. The article about my experiences on my Dutch website is still attracting a lot of readers, which indicates a consistent and growing interest in this Linux distribution. Strangely enough, I wasn't very lucky with later releases, which simply refused to install. Two weeks ago Sabayon Linux Four Oh! was released. How far has Sabayon progressed over the last two years?
Read more…

The Cost of Free

Furious! That describes the response of a portion of the Dutch free and open source aficionados on hearing about the idea that OpenOffice.org might get advertisements as part of the binary package. Jonathan Schwartz (Sun Microsystems), who launched the idea on his weblog (and has already retracted it), was aware the idea could cause a furor. Strangely enough, the anger seemed limited to the Netherlands.

The newsfeed archives and the ironclad memory of internet search engines reveal that the most vociferous opposition to the idea was heard in the Netherlands. The rest of the digital realm hardly paid any attention to it. Why not? Well, perhaps the rest of the world has a better understanding that free and open can and should not be confused with 'gratis' (i.e. free as in 'free beer'). In the Netherlands, a country where being cheap is considered a point of pride, 'free beer' seems to matter more than 'free speech'. Of course, there is nothing wrong with that, but please refrain from making ludicrous statements like: "Sun (and others) don't understand the GPL license (the LGPL actually) and advertisements and commercialization are not allowed under the license". Funny, because if that were true, the GNU project wouldn't understand it either, considering the article Selling free software.

It's the attitude behind 'hmmm nice, free beer' that is flawed. Users of free and open source software, both corporate and private, need to consider the cost of free. Yes, developing, supporting and promoting the software is done by scores of volunteers. But developing the Linux kernel and bringing solid and reliable Linux distributions to the market place also involves major corporations with commercial interests and needs. Development of the web browser Firefox floats on the millions made through an agreement between the Mozilla Foundation and Google.

Using free and open source software doesn't come with freedom alone, but also with responsibilities, including the responsibility to contribute financially to its development. If you don't want that and simply voice your 'right' to use the software gratis, well, you'd better stick to your illegally downloaded proprietary software.

Open letter: independent conformance testing needed for ODF and OOXML implementations

Tineke Egyedi, senior researcher of standardization at Delft University of Technology, the Netherlands, president of the European Academy for Standardization and vice-chair of the International Cooperation for Education about Standardization, sent an open letter (PDF) to software vendors with the title Who pays for interoperability in public IT procurement. In her letter she calls upon vendors to submit their implementations of the OpenDocument standard and the Office Open XML standard in software products for independent conformance testing, and to verify their interoperability. She feels this is needed to make sure that governments and their citizens do not head into a new vendor lock-in, and to ensure vendors do not alter the open standards along the way.

The letter is as follows:

Who pays for interoperability in public IT procurement?
A public letter to the IT industry about document format standards

Delft, 16 November 2008

L.S.,
It is not uncommon for governments to voluntarily head for vendor lock-in. As a citizen, however, I have a direct stake in my government basing its public procurement of IT on open standards. This stake may be most evident for 'civil ICT standards' (Andy Updegrove), i.e., for standards that support access to government information and exchanges with government, such as document formats (e.g., sustainable digital data). However, I also have a standards-related stake in IT procured for government-internal processes because, first, in practice government-internal and -external IT processes cannot be separated. Second, because of the increasing costs that accompany vendor lock-in. Third, because government procurement accounts for 16% of the European IT market and is therefore a means towards a more competitive and sustainable IT market.
A main reason for voluntary vendor lock-in is the fear of lack of interoperability of IT products in a multi-vendor environment. Experience shows that standard-compliant products from different vendors do not necessarily interoperate. As is known, a dominant vendor may design in incompatibility to break the integrity of a standard (e.g. the Java platform). But usually incompatible standard implementations are the unhappy outcome of good intentions.

Problem of document format standards
In the field of document formats there is an additional complexity. For the external reader: ISO has ratified two competing XML-oriented standards for document formats. The first one, the Open Document Format (ODF, ISO/IEC 26300), was ratified in 2006 and stems from OASIS, a standards consortium. The second one, Office Open XML (OOXML, ISO/IEC 29500), originally stems from Ecma International, another standards consortium. Although ISO's OOXML process has been widely contested, which caused a delay in its final approval, according to the ISO website the standard is to be published shortly.
ISO's approval of a second, overlapping standard will not have lessened government fears about interoperability in a multi-vendor environment. The market has become less rather than more transparent by means of this standards effort. To re-create some transparency about the interoperability of applications and reduce the fear of post hoc expenses in public procurement, conformance and interoperability testing is needed. Plug-test events are needed to test the factual interoperability of standards-based products from different vendors. To be credible to all concerned, a neutral, independent testing centre such as ETSI may need to be involved, e.g. to develop test suites and coordinate plug-test events.

Interoperability between multi-vendor OOXML applications
Current discussions on open standards highlight that multiple implementations are an important sign that standards are really open (see presentations by Rishab Ghosh and by Thiru Balasubramaniam, The Power of Procurement). Regarding ISO's OOXML, the contention is that no company has yet implemented the full standard, not even its primary sponsor Microsoft, and that the six-thousand-page specification is too complex and too inconsistent to implement. Are these contentions true? If not, governments will want more than verbal claims to the contrary. Moreover, such claims can easily be countered with third-party conformance and interoperability tests, including a plug-test event with multiple OOXML-compliant IT vendors.

Interoperability between ODF applications
All major vendors, Microsoft included, have agreed to support ODF ISO/IEC 26300, or are already doing so. That is, the availability of multiple implementations is not a problem here. Moreover, interestingly, two weeks ago OASIS initiated a technical committee to organize conformance and interoperability tests. Given its scope, this committee will provide transparency to governments about the degree of conformance of applications to ODF and the interoperability of ODF documents. Less clear is whether the committee also intends to address interoperability between standards versions, or more generally, what policy it has on standards change. To my knowledge, such policies have not yet been defined by any standards consortium or standards body. They would befit the area of civil ICT standards.
The OASIS committee explicitly does not address “identifying or commenting on particular implementations” or any certification activities. Government procurement officers will ultimately need testing at this level and want to involve an independent third party testing centre for this purpose. Moreover, OASIS, too, might at a later stage want to involve an independent third party in order to avoid credibility problems.

Having two overlapping standards brings about its own problems, as testified by a review of current ad hoc solutions (converters, translators, plug-ins) to re-create compatibility between ODF products and Microsoft's partial implementation of the OOXML standard. Those who develop a low-quality and overlapping standard, qualifications which OOXML supporters themselves use, are not the ones who pay for the consequences. Regrettably, citizens will be paying the price for the lack of interoperability.
Although there is no formal accountability to fall back upon in standardization, those who initiated the duplicating effort may feel a – corporate social – responsibility for what happened. Their help is needed to shift interoperability costs from governments and citizens (post hoc) back to IT vendors (ex ante), the source of the interoperability problem. As a start, will they fully cooperate and support OASIS’ initiative of conformance and interoperability testing? Are they prepared to shoulder the costs of independent, third party conformance and interoperability tests, tests that are needed to assure governments that no unexpected problems will arise ex post?

Kind regards,

Tineke Egyedi
Delft University of Technology
(T.M.Egyedi at TUDelft.nl)

The letter was sent to HP, Microsoft, Sun Microsystems, IBM, Ecma and OASIS.

I completely agree with Tineke on this. I believe it is time we saw the end of vendor-sponsored ICT research on these issues; one can hardly expect independent verification of performance claims, or in this case conformance claims, from sponsored researchers. Tineke is correct to point out that the problem isn't simply open source or open standards, but also the implementation of open standards in applications. Recent research already showed that the issues open source developers run into when implementing open standards don't necessarily reach the bodies that could remedy them.

So feel free to spread the news about this open letter and forward it to whomever you think needs to hear it.

Planete Beranger observes Linux distro hating week

Even for Planete Beranger, which never shies away from a strong opinion on the weaknesses of Linux distributions, a full week of Linux distro hating is quite a lot. However, the list of items that need fixing does give one a lot to think about:

large enough repositories to satisfy both desktop and server users;

both GNOME and KDE3 should be offered as main options, alongside whatever else is the main focus of the distro (smaller DEs/WMs or KDE4);

security updates and major bug updates that don't break the system, that are provided in a timely manner and in the proper place (e.g. not in Debian's "volatile" for tzdata; not in "testing" for VL; not ignoring FF 3.0.3, as even RHEL did);

not to include functional regressions from one release to the next (in the kernel or in the major applications);

not to force the users to upgrade because a release is supported for too short a time;

not to lack major applications in such a manner that the user must either build from source or get them from several third-party repositories, thus compromising the intended advantages of using a certain distro: the quality and consistency of the repos and of the updates;

not to freeze each and every application to a fixed version for the whole supported lifetime of a release, ignoring the fact that building newer versions of the applications is possible when they don’t require newer versions of the system libraries;

not to ignore bug reports for years, especially when the fix would be easy, or especially when it's about an enterprise distro, whose number of packages is modest precisely because it's supposed to be much better maintained than a community-maintained distro;

not to break the package manager every now and then, and not to change the default package manager from a release to the next one;

to provide the full sources as a free download, not partial sources, nor just build scripts that would attempt to download the sources from upstream;

to be usable in X with only 256 MB of RAM; failing to do so is a clear sign of bloatedness, regardless of the fact that very few users have such low-end systems (see the sketch after this list);

to have a GUI version of the package manager, and this version to be usable in terms of speed on low-end systems too;

and of course: not to require hours of post-install configuration and customization by the end-user!
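
As a concrete, if hypothetical, illustration of the 256 MB item above (my own sketch, not part of Beranger's post), one can read Linux's /proc/meminfo right after logging in to X and see how much headroom a distribution actually leaves:

```python
# Hypothetical bloat check: compare memory in use against the 256 MB mark.
# Reads /proc/meminfo, so this is Linux-only.
def meminfo_mb():
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            info[key] = int(value.split()[0]) // 1024  # values are in kB
    return info

mem = meminfo_mb()
# Note: MemFree ignores caches and buffers, so this slightly overestimates
# real usage; it is still a reasonable first-order test.
used = mem["MemTotal"] - mem["MemFree"]
print(f"{used} MB in use of {mem['MemTotal']} MB total")
if used > 256:
    print("A freshly booted desktop this size would not fit in 256 MB of RAM.")
```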

clipped from beranger.org

Linux Distro Hating Week, Oct. 6-12

Observed by Planete Beranger, as a protest against the low quality of the distributions built around the Linux kernel

Linux turned 17 under shameful circumstances. Notwithstanding the decrease in quality suffered by the Linux kernel since the beginning of the 2.6 series, good quality distributions can still be built around it. Unfortunately, all of the mainstream GNU/Linux distributions fail to provide acceptable quality, usability, trustworthiness and proper support, be it paid or not. Specifically, there is no GNU/Linux distribution in the known universe that does not fail at least one of the following requirements:
For the current week, Planete Beranger will ignore whatever is related to any known Linux distro.

The three worst Linux distributions?

Well, it seems there are some more writers out there who (a) love Linux and (b) debunk distributions that fall way short of their own goals. On the Internetling blog, Gregor lists the three worst distributions he knows.

gOS comes in at number three:

I checked out the last version of gOS, and again it's a meaningless pile of installed packages already available for every other major Linux distro out there.

Zebuntu/ZevenOS is number two:

I reviewed this distro a while ago and I thought it was cool that someone was aiming to create a distro in the spirit of BeOS. Looks like the developers didn't hear the last part. It said 'philosophy', not 'theme'.

And Linux XP tops the list:

I'm still wondering whether this distro is violating the GPL. For Pete's sake, they have a 30-day TRIAL. Linux XP is a Fedora re-spin with a Vista skin, Wine and some other front-ends. It is being sold; you can also obtain a serial number.

Well, the fact is that the GPL allows commercial reselling, and there are commercial distributions that won't give you updates if you don't pay. Anyway, I like the list and I like what Gregor is doing.

Kevin Mitnick doesn't travel light

Uber hacker Kevin Mitnick recently found himself detained after a trip to Colombia. The CNet article describes how Kevin felt about it. Personally, my jaw hit the floor when the article listed what was in his luggage: two laptops, a UMPC, three or four hard drives (I wonder where the confusion comes from), three iPhones and four Nokia cell phones. That is criminal in itself.
clipped from news.cnet.com

Kevin Mitnick detained, released after Colombia trip

In his luggage, they found a MacBook Pro, a Dell XPS M1210 laptop, an Asus 900 mini-laptop, three or four hard drives, numerous USB storage devices, some Bluetooth dongles, three iPhones, and four Nokia cell phones (with different SIM cards for different countries).

Peer2peer software not automatically infringing

The Center for Democracy & Technology asks the court in a case against LimeWire not to deem peer-to-peer software illegal simply because illegal use is possible.

I would say this should be blatantly obvious, especially in a country where people wear t-shirts saying "People kill people, not guns".

clipped from www.cdt.org


Legal Brief Stresses Proper Limits of Secondary Copyright Liability
CDT, joined by groups from EFF to ITAA, told a federal court today that the law requires caution in assessing whether to impose copyright liability on the makers of multi-use technologies. In a legal brief filed in a lawsuit against the peer-to-peer file sharing service LimeWire, CDT and its allies did not take sides but rather urged the court to decide the case within the careful framework established by previous Supreme Court cases in this area. Those cases make clear that distributing a technology with "substantial noninfringing uses" should not raise copyright liability concerns, as long as the distributor does not actively promote the technology's use for infringement. Reinterpreting or expanding secondary copyright liability in ways that undermine this crucial limitation, the brief warned, could significantly chill technological innovation. September 26, 2008

• Amicus Brief [PDF], September 26, 2008

A touch of irony: uTorrent for Mac leaked on The Pirate Bay

I can only consider it a touch of irony that a program which rose to popularity because people needed a good client to download Linux distributions and legal music and movies (none of the illegal stuff, no doubt) has already been pirated. On The Pirate Bay one can find an alpha version of uTorrent for the Mac. Oh, and uTorrent isn't open source…
clipped from www.techcrunch.com

uTorrent For Mac Makes Its Way to The Pirate Bay

In what will surely make every BitTorrent lover jump for joy, a rough alpha version of uTorrent for the Mac has surfaced on The Pirate Bay and BitTorrent isn’t too happy about it.

uTorrent, which was acquired by BitTorrent in 2006, has always been a Windows-only service. But ever since the acquisition, BitTorrent has promised that uTorrent would be coming to the Mac. For almost two years, Mac users have waited for uTorrent to make an appearance and it finally has — much to the dismay of BitTorrent.

Speaking to TorrentFreak, BitTorrent’s product development VP Simon Morris said the leaked alpha version is not for public use and those that try it out should be warned that it’s still in development.

