FOSDEM is one of the largest gatherings of Free Software contributors in the world and happens each February in Brussels (Belgium). One of the developer rooms will be the CrossDesktop DevRoom, which will host Desktop-related talks.

We are now inviting proposals for talks about Free/Libre/Open-source Software on the topics of Desktop development, Desktop applications and interoperability amongst Desktop Environments. This is a unique opportunity to show novel ideas and developments to a wide technical audience.

Topics accepted include, but are not limited to: Enlightenment, Gnome, KDE, XFCE, Windows, Mac OS X, general desktop matters, applications that enhance desktops and web (when related to desktop).

Talks can be very specific, such as developing mobile applications with Qt Quick, or as general as predictions for the fusion of Desktop and web in 5 years' time. Topics that are of interest to the users and developers of all desktop environments are especially welcome. The FOSDEM 2011 schedule might give you some inspiration.

Please include the following information when submitting a proposal: your name, the title of your talk (please be descriptive, as titles will be listed alongside around 250 others from other projects) and a short abstract of one or two paragraphs.

The deadline for submissions is December 20th 2011. FOSDEM will be held on the weekend of 4-5 February 2012. Please submit your proposals to crossdesktop-devroom@lists.fosdem.org.

Also, if you are attending FOSDEM 2012, please add yourself to the KDE community wiki page so that we can organize better. We need volunteers for the booth!


Red Hat's Matthew Garrett let the cat out of the bag about a month ago: when UEFI Secure Boot is adopted by mainboard manufacturers to satisfy Microsoft Windows 8 requirements, it may very well be the case that Linux and others (BSD, Haiku, Minix, OS/2, etc) will no longer boot.

Matthew has written about it extensively and seems to know very well what the issues are (part I, part II), the details about signing binaries and why Linux does not support Secure Boot yet.

The Free Software Foundation has also released a statement and started a campaign, which is, as usual, anti-Microsoft instead of pro-solutions.

Now let me express my opinion on this matter: this is not Microsoft’s fault.

Facts

Let’s see what the facts are in this controversy:

  • Secure Boot is here to stay. In my humble opinion, the idea is good and it will prevent and/or lessen malware effects, especially on Windows.
  • Binaries need to be signed with a certificate from the binaries’ vendor (Microsoft, Apple, Red Hat, etc.); see the sketch after this list.
  • The certificate that signs those binaries needs to be installed in the UEFI BIOS.
  • Everybody wants their certificate bundled with the UEFI BIOS so that their operating system works “out of the box”.
  • Given that there are many UEFI and mainboard manufacturers, getting your certificate included is not an easy task: it requires time, effort and money.
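
As an illustration of what the signing step involves (a sketch only, using the sbsigntools command-line utilities as an example; the key and certificate file names are hypothetical): the vendor's private key signs the boot loader, and the matching certificate is what would have to be enrolled in the UEFI BIOS.

    # Sign an EFI boot loader with a vendor key (file names are hypothetical)
    sbsign --key vendor.key --cert vendor.crt --output grubx64.efi.signed grubx64.efi

    # Verify the signature against the same certificate
    sbverify --cert vendor.crt grubx64.efi.signed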

Problem

The problem stems from the fact that most Linux vendors do not have the power to get their certificates in UEFI BIOS. Red Hat and Suse will for sure get their certificates bundled in server UEFI BIOS. Debian and Ubuntu? Maybe. NetBSD, OpenIndiana, Slackware, etc? No way.

This is, in my humble opinion, a serious defect in the standard. A huge omission. Apparently, while developing the Secure Boot specification, everybody was busy talking about signed binaries, yet nobody thought for a second about how the certificates would get into the UEFI BIOS.

What should have been done

The UEFI secure boot standard should have defined an organization (a “Secure Boot Certification Authority”) that would issue and/or receive certificates from organizations/companies (Red Hat, Oracle, Ubuntu, Microsoft, Apple, etc) that want their binaries signed.

This SBCA would also be in charge of verifying the background of those organizations.

There is actually no need for a new organization: just use an existing one, such as Verisign, which already carries out this task for Microsoft for kernel-level binaries (Authenticode).

Given that there is no Secure Boot Certification Authority, Microsoft asked BIOS (UEFI) developers and manufacturers to include their certificates, which looks 100% logical to me. The fact that Linux distributions do not have such power is unfortunate, but it is not Microsoft’s fault at all.

What can we do?

Given its strong ties with Intel, AMD and others, maybe the Linux Foundation could start a task force and a “Temporary Secure Boot Certification Authority” to deal with UEFI BIOS manufacturers and developers.

This task force and TSBCA would act as a proxy for minority players such as Linux and BSD distributions.

I am convinced this is our best chance to get something done in a reasonable amount of time.

Complaining will not get us anything. Screaming at Microsoft will not get us anything. We need to propose solutions.

Wait! Non-Microsoft certificates? Why?

In addition to the missing Secure Boot Certification Authority, there is a second problem apparently nobody is talking about: what is the advantage mainboard manufacturers get from including non-Microsoft certificates?

For instance: why would Gigabyte (or any other mainboard manufacturer) include the certificate for, say, Haiku?

The benefit for Gigabyte would be negligible, and if someone with ill intentions got hold of Haiku’s certificate, malware signed with it would be installable on all of Gigabyte’s mainboards. This would lead to manufacturer-targeted malware, which would be fatal to Gigabyte: “oh, you want to be immune to the-grandchild-of-Stuxnet? Buy (a computer with) an MSI mainboard, which does not include Haiku’s certificate”

Given that 99% of desktops and laptops only run Windows, the result of this (as yet unresolved) problem is that manufacturers will only install Microsoft certificates, and will therefore be immune to malware signed with a Slackware certificate in the wild.

If we are lucky, mainboard manufacturers will give us a utility to install more certificates at our own risk.

The solution to the first problem looks easy to me. The solution to the second looks much more worrying to me.


A while ago I said Koen from Emweb made an interesting proposal at FOSDEM about emerge, the KDE Windows build tool.

Yesterday, Jarosław Staniek and I reaffirmed our commitment to ‘emerge’. Today, I’d like to go a bit further: let’s bring more developers to emerge by opening it up to other projects. Keep reading!

What is emerge, why is it important and what was Koen’s proposal?

Fact: Microsoft Windows is very different from Unix when it comes to development.

On Unix platforms (that includes Linux and Mac OS X), software is usually installed to /usr: applications in /usr/bin and /usr/sbin, libraries in /usr/lib, headers in /usr/include, common resources in /usr/share, etc. Also, dependency management is usually something you can count on: when you install kdelibs5-dev in Ubuntu, it will automatically install libqt4-dev, kdelibs5-data, libfreetype (runtime), etc. That makes setting up a development environment a very easy task: look for shared libraries, header files, etc. in the common places and you will probably find them.
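
A minimal example of what that looks like in practice on Ubuntu (package names as mentioned above; the exact dependency list pulled in may differ):

    # Installing the KDE development package automatically pulls in its dependencies
    # (Qt development headers, shared data, runtime libraries, ...)
    sudo apt-get install kdelibs5-dev

    # Inspect what would be pulled in beforehand
    apt-cache depends kdelibs5-dev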

On Windows there is nothing like that. When you want to compile an application, you need to provide (build and install) all its dependencies, and you need to tell Visual Studio where to find everything. Even CMake usually needs some help in the form of a hint for CMAKE_PREFIX_PATH. As you may imagine, building KDE, which has more than 200 third-party dependencies and tens of modules (and with the move and split to git, many more), becomes an almost insurmountable task.
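
For illustration, this is the kind of hint you end up passing to CMake on Windows (the dependency and install paths are hypothetical):

    REM Tell CMake where the pre-built dependencies live and where to install
    cmake -G "NMake Makefiles" -DCMAKE_PREFIX_PATH=C:\kderoot\deps -DCMAKE_INSTALL_PREFIX=C:\kderoot ..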

‘Emerge’ to the rescue: inspired by Gentoo's emerge, Ralf Habacker, Christian Ehrlicher, Patrick Spendrin and others (yours truly included) developed a tool which downloads the source, configures, builds, installs and packages KDE and its dependencies. It makes a world of difference when building KDE. Actually, it makes building KDE on Windows possible. Once more: thank you very much, guys, impressive tool.

There are two well-differentiated parts in emerge, the ‘engine’ and the ‘recipes’.


I have been packaging for Debian for a few years now. My first “serious” package was Wt back in 2007, but I had been backporting for Ubuntu for at least 2 years already, which means I have been doing .deb packaging for about 5 years (!).

Last week I decided it was about time to stop nagging my sponsors (Vincent Bernat, Thomas Girard and Sune Vuorela) every time I wanted to update the packages I maintain (witty, ace and libmsn), and I finally started the Debian New Maintainer process.

The main reason I had not applied for Debian Maintainer yet was that it requires some bureaucracy and, well, I’d rather spend my time coding or packaging than doing paperwork 🙂

I sent my Declaration of Intent and soon after, Thomas and Vincent replied and supported my application with very very nice and kind words. Thank you, guys! I’m flattered! 🙄

Had I known I would be buttered up so much, I would have certainly applied a long time ago! 😀

But you know what the best part of this is? It shows how open source projects take advantage of all the tools and communication channels we have (IRC, mailing lists, sprints, conferences, etc.), and make distributed development work very well: here we have a 900-developer project in which two French guys are praising a Spanish guy they have never, ever met face-to-face (only e-mail, occasional IRC, and the most important of all: code review). Meritocracy at its fullest. Have you ever seen that in a traditional 100,000-worker company with hundreds of developers working on a single project?

We just learned about the Nokia and Microsoft strategic partnership today and many people are concerned. I think the agreement is worrying for Qt, but even more worrying for Nokia. I foresee an exodus of current developers and Nokia mobile device users. Many Windows developers will come. Hopefully?

I really wonder who actually benefits from this strategic partnership.

Nokia already had many mobile devices with Symbian, one with Maemo (and there could have been many more with Maemo, but for some unknown reason there weren’t), and MeeGo could have been adopted already (but again, for some reason, it was not).

On the other hand, Microsoft had an operating system but no users. Virtually nobody wanted Windows Phone 7. It was either Android or your own solution (MeeGo, WebOS, Bada, etc).

In my opinion, the next move we will see is Microsoft buying Nokia to be able to compete one-to-one with Apple and Google. I think it will not take too long; they are probably just waiting for Nokia stock to fall a bit more.

So what change of direction should Nokia have made a few years ago?

In my opinion, when Nokia acquired TrollTech, they should have released at least 10 devices with Qtopia. Immediately. And dump Maemo.

In parallel, they could have added more and more Maemo features to Qtopia. By doing that, they would have had a good mobile operating system and applications in no time. There was even an X11 compatibility layer for Qtopia back then.

I did not understand why Nokia adopted Qt yet went on with Maemo and MeeGo based on Gtk+, and tried to keep as much backwards compatibility as possible (regarding source compatibility, development methodology, etc.) with devices which barely had users (N770, N800 and N810). I think no one understood that move. From my point of view, that was a waste of time, money and effort, which ultimately led to Nokia’s demise.

PS: Yes, I am still a KDE on Windows developer and part of the Debian Qt-KDE team.

FOSDEM ended yesterday and here I am, sitting at Charleroi Airport (also known as “Brussels South”, quite a misleading name given that it’s 80 km from Brussels).

I have already passed all controls, check-in and everything. While I wait for boarding, I am watching the shameful spectacle of airport personnel (let me reiterate that: airport personnel, not Ryanair’s) enforcing Ryanair’s 10 kg cabin baggage limit. According to Ryanair, they want to minimize the weight the plane carries to use less fuel. So far, so good.

Here is what I have seen: people who do not carry any baggage (very few; they have probably checked it in because it exceeded the size or weight limits), people who are below the 10 kg limit, and people who are way over it (and have been told to check their luggage in). I am OK with those cases.

There is still a fourth case: people who are slightly over 10 kg.

I saw a woman whose bag weighed 10.15 kg be told to pay 20 EUR to check her bag, or go back to the Ryanair desk to check the bag in. She opened her bag, took out a scarf, put it on, and now the bag was within the weight limit.

Yes, Ryanair is charging 20 EUR/kg for hand baggage from 10 kg on. What a rip-off.

A couple of Eastern European girls were about 1 kg over each. They put a couple of extra jumpers on and now their baggage was under 10 kg.

Many people were about 1 kg over. When they were told their suitcase had better get lighter or they would pay 20 to 40 EUR, most of them just took something out (camera, food, slippers, whatever) and put it in the pockets of their coats. Fortunately, Ryanair is not charging for body and clothes weight (yet?).

In all those cases the plane will end up transporting the same weight and Ryanair won’t get one more dime, so why, Ryanair? Why are you such a shameful company? Why are you enforcing ludicrous and pointless policies? Don’t you know that after passing the control everybody just puts everything back into the suitcase? Of course you do.

So after watching this ridiculous spectacle go on for a while, I had a devious idea: let’s organize a conference for fat people and fly them all there and back using Ryanair. Further, all of them should carry exactly 10 kg of hand baggage.

Another year, another FOSDEM. Yay, I’m attending again! The biggest open source event in Europe, with more than 200 talks and 5,000 people.

This year I will be talking about KDE on Windows. Sunday, 10.15, CrossDesktop Room (H.1309). If you are interested in helping us with KDE on Windows development, in making use of KDE on Windows for your application, or just want to use your favorite KDE applications on Windows, then you should come.

Do not waste my hard disk!

I recently acquired a 2 TB hard disk drive, which I immediately formatted with ext4. Given that mkfs.ext4 defaults to reserving 5% of blocks for root, that amounts to 100 GB of lost* space.

Wow. 100 GB. On a desktop machine. For root.

While mkfs belongs to e2fsprogs, I think this requires action on the distributions’ side. Let me explain.

If you are installing a server, 5% makes perfect sense: there are a lot of log files, updates, potential break-ins and other disasters-to-happen for which root-reserved space will be sorely needed. 5% is a good choice here. If 5% is too much (or too little), it will not matter: any decent sysadmin knows about mkfs.ext4 -m and tune2fs -m (if you don’t, Martin will teach you).
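
For reference, and assuming /dev/sdb1 is the partition in question (the device name is just an example), the reservation can be set at creation time or adjusted afterwards:

    # Create the filesystem with no root-reserved blocks instead of the default 5%
    mkfs.ext4 -m 0 /dev/sdb1

    # Or adjust an existing filesystem: -m takes a percentage,
    # -r an absolute number of reserved blocks (51200 x 4 KiB blocks = 200 MiB)
    tune2fs -m 0 /dev/sdb1
    tune2fs -r 51200 /dev/sdb1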

On the other hand, if you are installing Ubuntu on your laptop, you probably don’t know shit about mkfs, tune2fs, 5%, etc. You only know that by removing Windows and installing Linux you have magically lost 100 GB of space. Vanished.

So here is my request to Linux distributions: when installing a distribution in “desktop profile” (i.e. not Debian in “server” profile, RHEL, SLES, Ubuntu Server, etc.), DO NOT RESERVE 5% for root. 100 or 200 MEGABYTES should be more than enough.

* Yes, I know it’s not really “lost” but just “unusable by normal users”, but that technicality doesn’t matter to 99% of the people out there.

The screenshots.debian.net site is a public repository of screenshots taken from applications contained in the Debian GNU/Linux distribution and its derivatives, like Ubuntu.

It was created by Christoph Haas to help people get an impression of what a certain piece of software will look like on their desktop before they install it. Everybody can take screenshots and upload them through the upload form. So far there are about 3100 screenshots, which is actually a small number if you consider that Debian Squeeze contains about 15000 source packages.

After I hacked a bit on KSnapshot, the KDE screenshot tool, Sune asked me to add an “export to screenshots.debian.net” command. I had not thought of it before, but it looked like a great idea: it would make the process of take-screenshot, visit-upload-form, upload-each-screenshot so much easier! So I started to develop my first KIPI plugin.

Here is the resulting Debian Screenshots plugin, as you would use it in KSnapshot:

Being a KIPI plugin, it is available not only from KSnapshot but also from Digikam, Gwenview and the other applications which support KIPI plugins:

Which makes me think: what about having a debshots (the software powering screenshots.debian.net) installation over at kde.org, with screenshots for all the KDE applications? What do you think? Promo team? Sysadmins? Setting it up shouldn’t be difficult: I’m no Python or Pylons expert (I’m more of a Ruby and Wt C++ guy 🙂) and I got it running in a VM in an afternoon, even solving some SQLAlchemy 0.6 issues.

My name is Pau and I am a software developer at Arisnova, a Spanish company specialized in Linux and Windows software development for the supervision and control (SCADA) of industrial environments.

In Debian, I am the maintainer of:

  • witty, AKA Wt, a widget-centric C++ library for developing interactive web applications, with an API much like Qt's
  • libmsn, a library for connecting to Microsoft’s MSN Messenger service
  • (co-maintainer) ACE, the Adaptive Communication Environment, an object-oriented framework that implements many core patterns for concurrent communication software

I also happen to be a KDE developer. How’s this related to Debian? You’ll find out very soon.