Remote graphical applications with NX

I have recently been (re)made aware of NoMachine’s NX communication programs by my colleague Txema. NX technology is a way of establishing a connection from one computer to another, creating a sort of tunnel through which the displayed information (graphics) is transmitted in compressed form. The communication, of course, goes through a secure SSH connection.




Molden opening a file at Arina, a supercomputing cluster I have connected to from Bart, my computer at work, to which I have established an NX connection from Heracles, my computer at home. Screenshot taken from Heracles.

Veteran Linux users will say “So what’s the big deal?”. Remote connections via telnet, and later via SSH, have been available for a long time. Exporting the display (that is, making graphical programs opened on the remote computer appear on the local screen) has always been a simple task, and more recently SSH tunneling has also been made available for it.
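For comparison, classic display exporting over plain SSH (no NX involved) amounts to something like this (a minimal sketch; the host name and the program are just placeholders):

% ssh -X user@remote.example.com
% molden some_file     (this runs on the remote machine, but displays locally)

The -X option enables X11 forwarding, and SSH even has its own rather basic compression, enabled with the -C option.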

However, the key point here is the compression. When running an NX connection, we open a communication channel and run a chosen application on the remote machine (for example, we can open the whole desktop environment, and work as if we were sitting in front of the remote machine), and all the information is compressed, so that the responsiveness of the remote application is as close as possible to that of applications run on the local computer.

Even though the core of NoMachine’s NX code is free software, the client and the server themselves are not, I think. That is a pity, but free alternatives, such as FreeNX, are being built upon the free core. From here I wish that project the best of success.

Comments

My music collection hits 6000 songs

Following the “report” series started with my previous summary of the music collection I listen to, I will update that information in this post.

The information has been gathered in the following ways:

1) Music file count with the following Perl script:


#!/usr/bin/perl -w

use strict;

chomp(my $cc_mp3 = `find /scratch/Music/CC/ -iname "*.mp3" | wc -l`);
chomp(my $cc_ogg = `find /scratch/Music/CC/ -iname "*.ogg" | wc -l`);
chomp(my $cd_mp3 = `find /scratch/Music/CDs/ -iname "*.mp3" | wc -l`);
chomp(my $cd_ogg = `find /scratch/Music/CDs/ -iname "*.ogg" | wc -l`);
chomp(my $jam_mp3 = `find /scratch/Music/Jamendo/ -iname "*.mp3" | wc -l`);
chomp(my $jam_ogg = `find /scratch/Music/Jamendo/ -iname "*.ogg" | wc -l`);

my $cc = $cc_mp3 + $cc_ogg; # all CC music not from Jamendo
my $cd = $cd_mp3 + $cd_ogg; # all commercial music (most from CDs, some from other sources)
my $jam = $jam_mp3 + $jam_ogg; # CC music from Jamendo

my $allcc = $cc + $jam; # all CC music

my $all = $allcc + $cd; # all music

my $mp3 = $cc_mp3 + $cd_mp3 + $jam_mp3; # all music in MP3
my $ogg = $cc_ogg + $cd_ogg + $jam_ogg; # all music in OGG

printf "Files: %5i\nCommercial: %5i\nJamendo: %5i\nOther CC: %5i\n",$all,$cd,$jam,$cc;
printf "In MP3: %5i\nIn OGG: %5i\n",$mp3,$ogg;

2) Playcount and other statistics, from the music player I use to listen to music (Amarok). It also gives the file count, which I used to check the results of the script above.

3) Data in my public Last.fm profile. Visit Wikipedia to know more about Last.fm.

4) The Linux du command, for getting the disk usage.
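For the record, the disk usage figure below comes from a one-liner along these lines (the path being the same as in the script above):

% du -sh /scratch/Music/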

Now the data (in parentheses, the difference with respect to the last report, 4 months ago).

Files

Total files        6015 (+1000)
  - Commercial     4164 (+220)
  - Jamendo        1820 (+765)
  - Other CC         31 (+0)
Total playtime      16d (+2d)
Disk usage         27GB (+5GB)
Artist count        718 (+91)
Album count         515 (+102)
MP3 count             0 (-1562)
OGG count          6015 (+2547)

Last.fm

Playcount          16744
Most played artist Joaquín Sabina - 1442 plays
Most played song   La del pirata cojo (J. Sabina) - 29 plays

Amarok

Playcount          12446 (+1536)
Favorite artist    Frank Delgado - 91.5/100
Favorite song      Las cuatro y diez (L.E. Aute and S. Rodríguez) - 97/100

As you can see, I have converted all my MP3s to OGG, so I have dumped patented music file formats for good.
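In case anyone is curious, the conversion itself can be scripted with standard tools. A rough bash sketch follows (not the exact commands I used, and keep in mind that transcoding from one lossy format to another always loses some quality):

for f in *.mp3; do
  wav="${f%.mp3}.wav"
  mpg123 -w "$wav" "$f"                   # decode the MP3 to WAV
  oggenc -q 5 -o "${f%.mp3}.ogg" "$wav"   # encode the WAV to Ogg Vorbis
  rm "$wav"
done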

You can also notice that the Last.fm and Amarok playcounts are not equal. This disagreement comes from three facts: the two counters were not initialized at the same time; Last.fm is a web service that counts all the songs I play both at work and at home, whereas the Amarok count I give is the one at my office computer only; and I am not sure that the threshold for Amarok to consider a song listened to is the same as the one for Amarok to report to Last.fm that I listened to it (e.g. if I skip a song after 10 seconds of playing, maybe it counts as “listened to” in Amarok’s database, but is not played long enough for Amarok to report it as “listened to” to Last.fm).

It is also evident that my “favorite” song is not the one I have listened to most times. It has to be taken into account that a song counts as “listened to” if it is played for at least some seconds, not necessarily to the end. However, if you skip a song before it finishes, it receives negative points, even though it counts as listened to.

Comments

Peer to peer: the new distribution paradigm

This post will hardly talk about rocket science, but there’s still a lot of ignorance on the subject.

A lot of people associate p2p with “piracy”, and eMule and BitTorrent with some shady way of obtaining the miraculous software of the big companies like Adobe or Microsoft.

Well, the fact is that p2p is a really advantageous way of sharing digital information through the net. Actually, the philosophy behind p2p is applicable to any process in which information, or some other good, is spread. So what is this philosophy? Simply put, p2p contrasts a distributed way of obtaining the goods with a centralized one (see figure below).



Figure 1: Scheme of operation of a p2p network. From Wikipedia.

I use the BitTorrent p2p technology (with the KTorrent program) quite often, particularly to download Creative Commons music from Jamendo. Lately, I have used KTorrent to download some GNU/Linux CDs, particularly the 4.0 version of Debian, and the beta (and this weekend, the stable) version of Ubuntu Feisty Fawn. With the latter, I have come to feel more deeply the advantages of p2p over centralized distribution of files.

With a centralized way of downloading, there is an “official” computer (the server) that has the “original” version of the information to download, and all the people who want to get it (the clients) have to connect to that server. The result is quite predictable: if a given piece of software is in high demand, a lot of clients will flood the server, and it will not be able to provide all of them with the information they request, slowing the transmission down, or even stopping it altogether for further clients once saturation is reached. This happened with the release of the Windows Vista beta, when the high demand for the program, and the low resources Microsoft devoted to serving the files, left a lot of angry users waiting for unreasonably long times before being able to download it.

This problem could well happen with the release of Ubuntu Feisty Fawn, and in fact this morning connecting to the Ubuntu servers was hopeless. However, unlike Microsoft, Canonical decided to make use of the BitTorrent technology to serve the ISO files, and this made all the difference.

With a p2p way of serving the files, the first clients connect to the server to get the files. However, once they have downloaded a part of the files, they too become servers, and further clients can choose whether to download from the central server or from other clients/servers (usually the decision is taken automatically by the p2p program). As the net of clients grows, and the file flow is balanced, the download speed is maximized for all, and the load on the servers is kept within reasonable limits.
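From the user’s side it is all pretty transparent. For example (the file name is hypothetical, and if I recall correctly KTorrent accepts a torrent file straight from the command line):

% ktorrent ubuntu-7.04-desktop-i386.iso.torrent

Once the ISO is complete, leaving the client open for a while means you keep seeding it to other downloaders.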

The advantages are clear: each person willing to download some files (e.g. the Ubuntu ISOs) does not become a leech, imposing a burden on the server, but rather a seeder, providing others with the files and speeding up, not slowing down, their spread. It is, thus, the ideal way of distributing files.

However, it has two disadvantages that made Microsoft not use it to spread the Windows Vista beta: since there is no single server controlled by a central authority, it is not possible to assert how many copies of the files have been distributed. Moreover, since the distribution net is scalable, it cannot choke, and thus MS would not be able to claim that the demand for their product was so high that the servers were not able to meet it.

So, for promotional purposes, p2p is not very good. If your priority is the client, and making the files spread as widely and quickly as possible, then p2p is for you.

Comments

300

Yesterday I watched 300 (IMDb|FilmAffinity), the movie about the famous Spartan last stand against the Persian army of Xerxes in the Thermopylae.

The movie brings mixed feelings to me, as mixed have been its reviews. First and foremost: the movie is visually astounding. It is really well done in this respect, with all the special effects carefully polished, and a brilliant use of the camera at all times. They even use slow motion in some scenes, not very different from some Matrix scenes, and the effect is great.

Now, when we pass from the visual to the conceptual… the thing crumbles down. I can understand that most of the verisimilitude has been sacrificed on the altar of epic, glory and visual appeal, but some things were just too much. The Persians are depicted as brutal, bloodthirsty and even physically monster-like, whereas the Spartans (narrow-minded warmongers, with a Nazi-like view of discipline) are portrayed as the defenders of modern civilization in the face of Asian mysticism and ignorance. Well, maybe Greece was an example of modernity and democracy, but it was Athens, not Sparta, that was the motor of that movement. To top it all, Xerxes is depicted as a 2.5 m tall androgynous circus freak, for no reason I can gather.

Anyway, if what you expect of this movie is a frenetic show, with a lot of action and great photography, you won’t be disappointed.

Comments

Neo uses the same keyboard as I do!

They are showing the movie The Matrix on TV right now, and I was casually watching it (not really paying much attention, because I have already seen it many times), when a top view of Neo’s desk showed… this:

The image comes from a video I had recorded (the previous time they showed it on TV), and I apologize for the low quality. In any case, you have my word that the white keyboard that appears in the movie is the kind of keyboard I use at work: one of those curved ones, with two separate key areas, one for each hand. I think some go under the name “Microsoft Natural Keyboard”, and I must admit they are one of the very few things M$ got right.

As a side note: I am desperately looking for another one (or several) of those keyboards, but computer shops just won’t stock them :^(

Comments (2)

SSH connection without password (II)

About 5 months ago I wrote a post explaining how to use SSH to connect from computer A to computer B without going through the hassle of entering the password each and every time.

As it happens, my instructions were far from complete, because they relied upon not setting any passphrase, and thus saving the SSH private key unencrypted on the hard disk. That way, a malicious user who can read your account on computer A can connect in your name to computer B with no restriction (thanks agapito for pointing this out in a comment to my post).

The next step is, thus, to use passphrases, while avoiding major hassles by means of ssh-agent.

I will repeat here the instructions in my old post, and extend them. First generate a public/private key pair in computer A:

% ssh-keygen -t dsa

and answer the questions you will be asked, not forgetting to enter a passphrase.

This will create two files in your ~/.ssh/ dir: id_dsa and id_dsa.pub, with your private and public keys, respectively.

Now, you have to copy the contents of id_dsa.pub into a file named ~/.ssh/authorized_keys in computer B. From that moment on, you will be able to connect to B through SSH without being prompted for your user password on computer B. However, you will still be prompted for a password of sorts: namely, the passphrase that decrypts your private key (the one you set with ssh-keygen).
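One way of doing that copy, while you still have password access from A to B (user and B are placeholders here):

% cat ~/.ssh/id_dsa.pub | ssh user@B 'cat >> ~/.ssh/authorized_keys'

(many systems also ship an ssh-copy-id script that does essentially this for you).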

To avoid having to enter this passphrase each time we want to make a connection, we can take advantage of ssh-agent, in the following way. First, we run the agent:

% eval `ssh-agent`

Then we add our key to the agent:

% ssh-add

The above will look, by default, for ~/.ssh/id_dsa, and will ask for the passphrase we entered when generating it with ssh-keygen.

After the above, all further connections from that terminal (and its children) will benefit from passwordless SSH connections to computer B (or any number of computers that have your A computer’s public DSA key in their ~/.ssh/authorized_keys file). This benefit will be lost whenever ssh-agent stops running, of course.

OK, but I want to have passwordless connections from ALL my consoles!

Then you have to take advantage of the following syntax:

% ssh-agent command

where command and all of its child processes will benefit from ssh-agent. command could be, of course, startx, or whatever command you use to start the desktop environment. You will still have to execute ssh-add, and enter the passphrase, but only once in your whole session. You will have to enter the passphrase again only if you log out of the desktop environment and log in again.
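For example, if you start your session from the console, it boils down to:

% ssh-agent startx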

OK, but how do I make scripts benefit from this?

Sooner or later you will find yourself automating the execution of some scripts, for example putting some backups in a cron job.

For that, an ssh-agent must already be running, and you have to make the script hook into it somehow. To do so, include the following code chunks in your scripts:

Perl:

Create the following subroutine:

###################################################
#                                                 #
# Check that ssh-agent is running, and hook to it #
#                                                 #
###################################################

sub ssh_hook
{
  my $user = $_[0] or die "Specify a username!\n";

  # Get ID of running ssh-agent:
  chomp(my $ssh_id = `find /tmp/ssh* -name 'agent.*' -user $user`);
  die "No ssh-agent running!\n" unless $ssh_id;

  # Make this ID available to the whole script, through
  # environment variable SSH_AUTH_SOCK:
  $ENV{SSH_AUTH_SOCK} = $ssh_id;
};

and call it (before any SSH call in the program), like this:

&ssh_hook('username');

tcsh:

setenv SSH_AUTH_SOCK `find /tmp/ssh* -name 'agent.*' -user username`

bash:

export SSH_AUTH_SOCK=$(find /tmp/ssh* -name 'agent.*' -user username);

In all cases, username is the name of the user making the connection (the one who ran ssh-agent).
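As an illustration, a cron entry like the following (the path and schedule are made up) will work, provided backup_to_B.pl calls the ssh_hook subroutine above before its first SSH connection:

0 3 * * * /home/username/bin/backup_to_B.pl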

A lot of info was taken from this Gentoo HowTo and this HantsLUG page, and googling for “ssh without password”.

Comments (2)

How (legally) strong is the word "free"?

It seems that the answer is: a lot.

Perusing some old e-mails (I save all the e-mails I receive, except spam and stupid 2MB presentations), I found the following one, dated November 11, 2006:

Hello all,

I read in your page at:

http://www.linfo.org/index.html

That your “[…] project has the goal of providing high quality, comprehensive, accessible and free information about Linux and other free software”

How is it “free”, if the page also reads?:

“Copyright © 2004 – 2006 The Linux Information Project. All Rights reserved.”

Could you publish the information under a Creative Commons, or GNU Free
Documentation License? Either that, or remove the “free” part in the
paragraph above.

Yours sincerely,

Iñaki

As follows from my e-mail, I was concerned about the incorrect use of the adjective “free”. The reader might think they (of course) ignored my warning, because “free” is such a loose, multi-meaning, not-legally-binding word, much like “healthy”, “good”, “in a minute”, “you can do it yourself”, “natural”, “organic”… and all the jargon used in advertising to convey a positive image of the product, while still dodging potential lawsuits for misleading information.

Well, not quite. It seems that in software and information technology, “free” has a definite meaning, which linfo.org would not meet. Accordingly, you can visit their current page, which now reads:

Welcome to The Linux Information Project (LINFO)! This project is dedicated to providing high quality, comprehensive and easily accessible information about Linux and other free software.

See any missing word? Why, the “free” is gone!

Maybe it sounds petty and nit-picking, but it isn’t. There is an increasing tendency to bastardize terms like “free software”, which I ascribe to an attempt to close the gap between “free and good” and “closed, for-profit, and evil”. Corporations have noticed how some terms, like “free software”, are steadily gaining a good reputation, and they don’t want to lose ground in the ensuing war.

This war has two fronts: first, demean everything that smells of “freedom”. For example, label “free software” products as “open source software”. Why? Because it weakens their link with the freedom ideals, and conveys the idea that what makes that software different is simply that you can read the source code. You will also recognize bastards playing on this side because they will always refer to “free software” (software created and used with freedom) as “software that is free of cost” or “no-cost software”, or any other construction that tries to reduce all the benefits and characteristics of free software to it being free of cost, like mere freeware (read an example in a previous post[es]).

The second front is attaching the label “free” and/or “open” to any product that could conceivably (or inconceivably) bear it, much like “low-fat” gets attached to any food, be it naturally fatty or not (in which case it is hardly an achievement), or even to non-food (like tobacco), or “organic” to anything from food to clothes to shampoos.

In this confrontation, we start down a slippery slope of giving blurry meanings to words, and end up having blurry concepts applied: like a “low-fat” super-hamburger that can single-handedly clog all your arteries with its cholesterol, but is called “low-fat” because it has a lower fat content than another burger of similar size; or a page of information that they call “free”, but that is under burdensome copyright terms which (for example) take away from you the simplest right of copying the information and sharing it with others freely.

Comments

My uptime hits 50d

I am bored, working a little bit too much overtime, and I just realized that my work computer’s uptime exceeded 50 days today. Yes, this means that I last rebooted my computer 50 days ago.
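For the record, the figure comes simply from the uptime command, whose output looks more or less like this (numbers made up for illustration):

% uptime
 17:42:03 up 50 days,  3:12,  4 users,  load average: 0.15, 0.20, 0.18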

I realize that this number is far from impressive, but I can’t help but compare it with the case of our only fellow workmate who uses Windows (XP). He once left his computer on and unattended for a couple of weeks (he went on a trip abroad), and to be fair Windows behaved: it didn’t crash. However, when he is working he religiously turns the computer off every evening, because (he says), “otherwise it eventually slows down to a crawl”. My beloved Debian runs as smoothly as the first day, after being on (and under heavy use) for almost two months.

Comments

First spam e-mail I actually found amusing!

I just received a spam e-mail with the following subject:

Linux is covered by the GNU General Public License (GPL), which allows free distribution of the code please read the GPL in appendix

Except for the poor punctuation, the sentence makes sense, and covers a subject I could actually be interested in :^)

This e-mail is one of the things I have found most interesting recently.

Well, at least among the ones I’ve sent to complete oblivion with a key press.

Comments

Linux in the metropolitan buses of Donostia

Today I took the usual bus to the city center, and noticed that the monitors they have in the buses, showing general info and commercials, were blank. Well, not exactly blank: some white letters littered the black screens. “Oh, dear” – I thought – “another Windows crash”. Not quite: the messages the monitors were showing corresponded to GNU/Linux!!

Below you can see a photo I took. Click on the picture for the full-size version.

I also took a second pic, without flash:

I apologize for the poor quality of the pictures, but taking photos of low-luminance screens in bright surroundings, and inside a moving bus, is not trivial (and my digital camera is not the best ever).

If one strains one’s eyes, the following fragments can be read:

* search_bg_key: invalid format found in block [...]
* is_leaf: free space seems wrong: level 1
* reiserfs_read_inode2: i/o failure ocurred trying to find [...]
* /home/terminal/datos/backup/20070217-def[...]-md5
* Unable to handle paging request at virtual address [...]

From my ignorance, it seems that a hard disk failed, or maybe a connection (say, the USB cable to an external disk) was broken, or a device’s capacity was exhausted. Of course, it might well be a failure of the OS (albeit quite unlikely, this being GNU/Linux).

In any case, I was shocked to discover that the city council has decided to give Linux a go. Well done, Mr. Elorza!

Comments (3)
