
Another Take on English vs Swedish

Some years ago, I visited the library to get a C book or two to read while waiting for the one I had ordered from Amazon. The only C book I found, however, was the Swedish translation of Stephen Prata’s C++ Primer Plus. The problem was, though, that I never read computer literature in Swedish (I don’t even know the Swedish word for “encapsulation”) so procrastination ensued. I’d heard some good things about that book, though, so in the end, I had to try, Swedish or not.

But the book felt weird, as if Prata wasn’t really serious about programming. It was as if he had decided to make up the terminology as he went along, and as if I had two foreign languages (computer-Swedish and C++) to master, instead of just one.

And that, I guess, was the real problem. The whole computerised world speaks English, which means that if we need to talk about computing, we need to do it in English or risk not being understood. In Sweden, we’ll invariably borrow from English when discussing anything having to do with computers and IT, even if the main conversation is in Swedish. Sometimes the words are semi-translated and (sort of) accepted, but often we just use the English terminology directly, without even stopping to think about what the corresponding Swedish word might be. Yet nobody cares, because we all seem to agree that the Swedish words are, or at least sound, phoney.

It is a problem, though: one of the curses of modern Western civilisation, shared by every nation with a native language other than English. Precious few see it that way; most accept the situation without a second thought, with devastating consequences for their native tongues.

Icelanders have a different approach. The authorities will produce official Icelandic translations for every single new word that pops up in a book, television show or United Nations address. Every single one, and not just the English ones either. Everything: German, Spanish, Swahili, it doesn’t matter what the language is. See, the Icelandic language police regards every foreign influence as potentially harmful and acts accordingly. For example, they never use “television” or “TV”; they use “sjónvarp” (roughly “flying picture”). “Telephone” is “sími” (original meaning: “thread”), and “computer” is “tölva”.

Cool, isn’t it?

I’m not entirely sure their way is better. The Icelandic market is far too small to translate every new book, film or United Nations address, and certainly not immediately, which means that any new and important terminology isn’t translated right away either. Which in turn means that the foreign-language terminology is actually needed after all if you want to keep up, at least for a transitional period. So, by the time the authorities come up with an Icelandic translation, the English terminology might already be in use and hard to get rid of.

There are not that many Icelanders. How many of them can keep up with the advances of Western computing?

As for me, I now know how to translate “encapsulation” (“datagömning”, which, by the way, is not quite as Swedish a word as it might look to you, dear reader, but that’s another story). It’s not a word I’ll use any time soon. It sounds downright silly.

Further Unity Comments

After a couple of days with Ubuntu and the Unity desktop, I have another couple of comments.

First of all, I don’t like the global menu bar. I really don’t like it. It takes up space for no good reason: the clock and the other indicators don’t really require it, and if you turn off global menus, which I did, it is mostly empty.

Second, I hate the “Amazon Lens”. It’s not that I don’t shop on Amazon (I do, all the time), but I want to choose when to interact with Amazon or other commercial providers, without my desktop interfering. So I’ve turned that off, too.

I also don’t like how newly opened windows are placed on the desktop. It’s probably configurable using some of the tweak tools that are available, but I can’t be bothered to look it up.

This morning, I switched back to KDE and the more traditional desktop metaphor, and immediately realised that it’s rather boring, too. It’s nice, with all kinds of extras and eye candy and stuff, but it’s boring, and it ought to be possible for someone to come up with a more modern Linux desktop.

It’s just that Unity isn’t the answer.

New Distro, Part Two

After a couple of days of Kubuntu, my curiosity took the upper hand and I decided to install the Unity desktop alongside KDE.

It’s an interesting GUI, I have to admit, but I remain unconvinced. The search-oriented task bar thingy to the left is an odd bird, for example. It is as if Canonical were mixing their desktop metaphors: there must be a task bar because everyone’s got one, but they seem to have gone out of their way to make it different. I’m not sure yet if that’s a good thing or a bad one.

Worse is that the global menus do not work; as many others have pointed out, on a large screen the menus will simply be too far away from the programme window itself. I know OS X does this too, but the crucial difference is that the menus and their behaviour are consistent on a Mac, something they can never be on Linux.

Version 14.04 allows you to move the global menus to their respective windows, which solves the problem but also highlights a less serious one: the top bar, now mostly empty apart from a few icons to the right, still takes up space but provides no real benefit.

On the whole, though, the GUI looks nice, with better graphics than I remember from past Ubuntu versions. It looks like a finished product, something that, say, Debian Testing doesn’t: the XFCE desktop I briefly tried when deciding on a new distro looks ghastly. I know it’s not supposed to have the bells and whistles of a Plasma desktop, Windows 7 or even Gnome, but my god, the damned thing put me off to an extent I didn’t think possible.

Submitted My Final Balisage Edit

I submitted the final edit of my Balisage paper, Multilevel Versioning for XML Documents, the other day. While I did try to shorten it (I seem to be unable to produce a short paper) and, of course, correct problems and mistakes pointed out by reviewers, there were no radical changes, and so I am forced to draw one of two possible conclusions:

1. I am deluded and simply don’t know what I’m talking about. This is an awful feeling, and it happens to me a lot after submitting papers.

2. The paper suggests something that might actually work.

(There is a third conclusion, obviously, one that is a mix of the two, but let’s not go there.)

My paper is about a simple versioning scheme for the eXist XML database, built on top of the versioning extension that ships with it. Its main purpose is to add granularity to versioning: to give an author of XML documents a way to mark significant new versions, as opposed to the long series of saves, each of which creates a new eXist version.

On the surface of it, my scheme is a typical multilevel versioning system, with integers, decimals, centecimals, etc (1, 1.1, 1.1.1, 1.1.2, 1.1.3, 1.2, …) identifying a level of granularity. The idea is that the lowest level (centecimal, in this case) denotes actual edits while the levels above identify significant new versions. Nothing new or fancy, in other words. What is new (to me, at least; I have not seen this suggested elsewhere) is how the scheme is handled in eXist.
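As a sketch of the numbering described above (the function names and tuple representation are my own, not from the paper), the level changes can be expressed in a few lines of Python:

```python
# A version is a tuple of integers: (1, 3, 1) stands for "1.3.1".
# Level 0 is the integer level, 1 the decimal level, 2 the centecimal level.

def bump(version, level):
    """Increment `version` at `level`, discarding any finer-grained parts.
    E.g. bumping (1, 1, 3) at the decimal level yields (1, 2)."""
    parts = list(version[:level + 1])
    parts[level] += 1
    return tuple(parts)

def check_out(version):
    """Create the first version one level down, e.g. 1.3 -> 1.3.1."""
    return tuple(version) + (1,)

def label(version):
    """Render a version tuple as its dotted label, e.g. (1, 3, 1) -> "1.3.1"."""
    return ".".join(str(part) for part in version)
```

So a new edit to 1.1.2 would be `bump((1, 1, 2), 2)`, giving 1.1.3, while promoting work on 1.3 to a new edit series would be `check_out((1, 3))`, giving 1.3.1.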

I’m proposing that each level is handled in its own collection, each using eXist’s versioning extension to keep track of new versions in the respective collection. When a level change occurs (for example, when a new centecimal version such as 1.3.1 is created from 1.3), the new version is created with a simple copy operation from the decimal collection to the centecimal collection. The operation itself (in this case, a check-out from a decimal version to a centecimal version) is tracked in an XML file that logs each such operation and maps the eXist “standard” version to the new integer, decimal or centecimal revision.

A related task for the XML file is to map the name of a resource to its address: it provides a naming abstraction, so that a named resource in a specific significant version can be mapped to an address on the system. I propose using URNs, but most naming conventions should work just as well.
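To make the log-and-map idea concrete, an entry might look something like the fragment below. This is purely illustrative: the element and attribute names are invented here, the paths and URN are made up, and the paper defines its own format.

```xml
<!-- Hypothetical version-map entry: one check-out operation, 1.3 -> 1.3.1,
     copying the resource from the decimal to the centecimal collection
     and recording the eXist-internal revision it corresponds to. -->
<version-map>
  <resource urn="urn:example:doc-1">
    <checkout from="1.3" to="1.3.1"
              source="/db/versions/decimal/doc-1.xml"
              target="/db/versions/centecimal/doc-1.xml"
              exist-revision="42"/>
  </resource>
</version-map>
```

A lookup of “urn:example:doc-1, version 1.3.1” would then resolve, via this map, to the right resource in the right collection.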

Implementation-wise, the XML version map abstraction is very attractive to me as a non-programmer (or rather, someone whose toolkit of programming languages is mostly restricted to those commonly associated with XML technologies), as I believe most of the operations can be implemented in XSLT and XQuery.

But I’m not there yet. I’ve submitted the final paper and now, I have to produce a sufficiently convincing presentation on the subject.

The presentation is on Tuesday, August 5th, and I’d love to see you there.

Time for a New Distro

Recently, I upgraded my work laptop with an SSD. The laptop, a Lenovo Thinkpad T510, has been pretty reliable but is getting a bit long in the tooth. A three-year-old conventional 2.5″ disk in daily use is a cause for concern, and anyway, SSDs are amazingly fast these days. It’s almost like buying a new computer.

I should also mention the Nvidia Optimus graphics card. It’s basically two cards in one, an Intel graphics chip for the daily stuff that doesn’t require much graphics processing and an Nvidia chip for the stuff that does, the idea being that the OS switches between the two to either save battery or boost performance.

So, anyway, while I simply cloned the Windows partitions from the old disk (using Acronis software), I eventually decided to try a new Linux distro rather than fixing the cloned Debian Sid I’d been running since 2010 or so. The Debian system was spread out over several partitions, which caused problems when booting the cloned system; apparently the UUIDs changed during cloning, crashing the system.
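For context, /etc/fstab (and typically the bootloader configuration) refers to partitions by UUID, so entries like the following stop resolving once the cloned partitions get new UUIDs. The UUIDs and mount points below are made up for illustration:

```
# /etc/fstab — each line maps a filesystem to a mount point by UUID.
# After cloning, `blkid` reports the new UUIDs, and stale entries like
# these must be updated by hand before the system will boot cleanly.
UUID=2f1a9d3c-5b6e-4a7f-9c0d-1e2f3a4b5c6d  /      ext4  errors=remount-ro  0  1
UUID=8c4b7e21-0d9f-4e3a-b5c6-7d8e9f0a1b2c  /home  ext4  defaults           0  2
```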

I wanted something Debian-based, of course. Apt-get rules and all that, and Debian is pretty hard to break even if you run an unstable version of it.

First, I tried the new Linux Mint Cinnamon distro (v 17), having heard some very good things about it. The installation went without a hitch and I was soon able to boot into the desktop (what a pretty one, btw) using the open-source Nouveau display drivers. They were OK but not great, so I decided to replace them with Nvidia’s proprietary package and something called nvidia-prime that would allow me to switch between the two graphics chips. This seemed to work well, until I came to work the next morning, placed the laptop into a dock and booted using an external monitor only.

No desktop appeared, just a black screen.

Opening the laptop’s lid, I discovered that the desktop was actually there after all, but only on the laptop screen. Nvidia Settings, the helper software that allows you to configure the X server and screens, was broken, so I couldn’t use it to configure the monitors. The Cinnamon display settings would only span the desktop across the two screens, and wouldn’t let me use the external monitor alone.

Switching from the Nvidia chip to the Intel one didn’t help much, and introduced a new problem: I no longer had the option to switch back to Nvidia.

I looked around to see if there were newer Nvidia packages around, or perhaps a newer kernel, since that’s what I would always do in Debian Sid; there would frequently be something in the experimental branch that would help me. Linux Mint, however, while Debian-based, is far from Debian Sid. It is meant to be stable, and anything, um, unstable would have to come from somewhere else entirely.

I found a 3.15 kernel from an Ubuntu branch and installed that, but Linux Mint would then insist that a 3.13 kernel was actually an upgrade, so I gave up and realised Linux Mint wasn’t for me after all.

I then spent the following evening (and night) installing and testing Ubuntu 14.04 in place of Linux Mint, as a Google search suggested nvidia-prime would work out of the box in it. It did, but after a few hours of fooling around with Ubuntu, I realised I truly hated Ubuntu’s Unity desktop.

Discouraged, I wiped Ubuntu from the disk in favour of Debian’s Testing branch, but that didn’t go well. I downloaded an ISO, remembering that Debian’s installer would not support WiFi cards during the install, only to discover that a) they had switched from Gnome to XFCE as the default desktop and, more importantly, b) my WiFi card was still considered bad: its firmware was non-free according to Debian’s rather strict criteria and therefore not on the ISO, and I had no wired network hooked up to that laptop.

I could have used the Windows partition or my Macbook Pro to download the missing firmware, of course, but I got annoyed and wiped the disk again, now installing the new Kubuntu 14.04 instead.

Which is where I am now. Kubuntu also handles nvidia-prime out of the box, but it also has the (for me) familiar KDE desktop. It’s not perfect (the system fonts, for example, are ghastly and I have to do something about that very soon) but it’s good enough for now.

Now, you may be tempted to point out that Nvidia Optimus works out of the box in Windows, too, and with more finesse, but if so, you are missing the point.

Linux is fun, and the very fact that there are so many distros out there speaks in its favour. If something in Windows doesn’t work for you, you don’t have a Windows alternative. Well, you have Windows 8, but seriously?

ProXist Documentation, Etc

My XProc abstraction thingy for eXist, ProXist, is not the most well-documented open source project there is, but at least there is now something to read. It’s a little something in DocBook, just a first draft and terribly incomplete, but something I’m hoping to make more complete, given enough time.

I also feel it’s time to package ProXist as an eXist app rather than as a set of misplaced collections.

Paper Woes

I managed to submit my Balisage paper on time, in case you wondered.

Also, I still think my basic idea is a good one. It’s simple and, I think, useful. So simple, in fact, that I’m worried that everybody but me thinks it’s perfectly obvious.

::sigh::

Just A Few More Days

The Balisage paper deadline is approaching fast. The deadline is on Good Friday, which is three days from now and close enough to keep me busy until some decidedly ungodly hours come Friday night. Basically, I’m interpreting “April 18” as “early morning, April 19, GMT”.

My paper is, so far, a study in procrastination. There is an angle bracket or two, and possibly even a semi-intelligent thought, but mostly these long days remind me of my uni days and approaching exams, when any excuse, from sorting spoons (I had two) to counting dust bunnies, was enough to keep me away from the books, rather than of a serious commitment to markup theory and practice.

It is coming along, though.