Archive for April, 2007

Pérez-Reverte and his Thermopylae

I have just read Arturo Pérez-Reverte's article in this week's XLSemanal (No. 1018), and I see that he dwells on arguments already used ad nauseam by a handful of demagogues.

The article deals with the opinions voiced about the film 300, by Zack Snyder. The film, as the reader will know, depicts the epic stand at the pass of Thermopylae of 300 Spartan soldiers, under the command of King Leonidas, against the hosts of Xerxes, King of Kings of Persia, in 480 BC. Pérez-Reverte criticizes the application, by critics of the film, of present-day moral standards to a situation that is now 2,500 years old (on this I agree with him), but he goes completely overboard in mounting a staunch defense of how good the Greeks were.

I will take the liberty of quoting his words below, and commenting briefly on them:

By dying on their feet, sword in hand, they made it possible that, even after Athens was burned, Greece, its institutions, its philosophers, its ideas and the word democracy survived at Salamis, Plataea and Mycale. In time, Leonidas and his men made possible Europe, the Encyclopédie, the French Revolution, Western parliaments; that my daughter can go out into the street without a veil and without having her clitoris amputated; that I can write without being imprisoned or burned; that no king, satrap, tyrant, imam, dictator, bishop or pope decides (at least in theory, which is already something) what I must do with my thought and my life. That is why I believe that, in that respect, those three hundred men made us free. They were our own.

Pérez-Reverte commits a number of fallacies that do not require much historical knowledge to rebut, although such knowledge helps.

First, he commits the post hoc, ergo propter hoc fallacy, since he assumes that all those "good" things we have had and still have in Europe are due to those Europeans standing up to those Asians... but this is not demonstrated at all. He compounds it with the fallacy of begging the question, since he takes for granted that, had the Persians won, we would not enjoy the freedoms he mentions, which produces the circular argument that, therefore, the Greek victory is what brought us what we have.

Pérez-Reverte also conveniently forgets that the battle took place 500 years before Christianity was founded, and around 1,000 years before Islam. The popes and imams he mentions appeared much later, and are not really very connected to the expansion of Xerxes' empire.

What is more, even supposing that European history is what it has been thanks to Greece resisting Persia... why such a biased selection of only the "good" parts? Did we not have in Europe a Middle Ages full of obscurantism? Are the Spartans, then, responsible for, say, the Inquisition? Did the French Revolution not take place precisely because the previous situation (royalty, nobility and abuses) was so dreadful? And does that not also come down to us from Greece and Rome? Thanks to Leonidas, Pérez-Reverte's daughters wear no veil, but do we not then also owe Leonidas the rise of Hitler? It is a bit inconsistent. After all, had the Persians won, the Inquisition would never have appeared, nor would Hitler have had the balls to promote hatred against anyone who was not "Aryan".

On the other hand, Pérez-Reverte seems to attribute fanaticism and ignorance to present-day Iran (ancient Persia), and to deduce that such evils would afflict us in Europe had Xerxes won. And yet, 150 years after Xerxes, there was a blond, blue-eyed Macedonian lad who, before dying at the age of 33, united all of Greece and spread an empire that went on to defeat Persia, destroying its capital, Persepolis, and conquering Anatolia (Turkey), Syria, Phoenicia (Lebanon), Judea, Gaza, Egypt, Bactria (Afghanistan) and Mesopotamia (Iraq). This boy was called Alexander the Great.

Now then, could the citizens of those countries not complain bitterly that their daughters have to wear the veil, and that they can be stoned for their ideas, because of that barbarian Alexander? Could it not be argued that they would have fared better had Xerxes conquered Greece, and that whatever evil they may suffer is not the fault of the Persian Empire, but of the Macedonian one?

The answer, obviously, is no. We do not owe our Encyclopédie to Leonidas, nor does the Middle East owe its religious fanaticism to Alexander. Assigning historical blame and gratitude, going back 2,500 years, is as irresponsible an act as judging those Greeks by our present-day morals. The sad thing is that Pérez-Reverte has seen the error of the latter, but not of the former. Regrettable.

Comments (13)

Remote graphical applications with NX

I have recently been (re)made aware of NoMachine's NX communication programs by my colleague Txema. NX technology is a way of establishing a connection from one computer to another and creating a sort of tunnel through which the displayed information (graphics) is transmitted in compressed form. The communication, of course, goes over a secure SSH connection.




Molden opening a file on Arina, a supercomputing cluster I have connected to from Bart, my computer at work, to which I have established an NX connection from Heracles, my computer at home. Screenshot taken from Heracles.

Veteran Linux users will ask "So what's the big deal?". Remote connections via telnet, and later via SSH, have been available for a long time. Exporting the display (that is, making graphical programs opened on the remote computer appear on the local screen) has always been a simple task, and more recently even SSH tunneling has become available for that task.
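
For comparison, this is roughly what that traditional approach looks like. A minimal sketch, wrapped in Perl only to match the other script further down this page; the host and program names are taken from the screenshot caption and are just placeholders:

#!/usr/bin/perl -w
use strict;

# Plain X11 forwarding over SSH, the pre-NX way: launch a graphical
# program on the remote machine and have its windows drawn on the
# local display. "arina" and "molden" are placeholder names.
my $remote  = 'arina';    # remote host (assumed SSH alias)
my $program = 'molden';   # graphical program to run remotely

# -X enables X11 forwarding, so the remote program's GUI appears locally.
system('ssh', '-X', $remote, $program) == 0
    or die "ssh -X $remote $program failed (exit status $?)\n";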

However, the key point here is the compression. When running an NX connection, we open a communication channel and run a custom application on the remote machine (for example, we can open the whole desktop environment and work as if we were sitting in front of the remote machine), and all the information is compressed, so that the responsiveness of the remote application is as close as possible to that of applications run on the local computer.

Even though the core of NoMachine's NX code is free software, the client and the server themselves are not, I think. That is a pity, but free alternatives, such as FreeNX, are being built upon the free core. From here I wish that project every success.

Comments

My music collection hits 6000 songs

Following the "report" series started with my previous summary of the music collection I listen to, in this post I update that information.

The information has been gathered in the following ways:

1) Music file count with the following Perl script:


#!/usr/bin/perl -w

use strict;

chomp(my $cc_mp3 = `find /scratch/Music/CC/ -iname "*.mp3" | wc -l`);
chomp(my $cc_ogg = `find /scratch/Music/CC/ -iname "*.ogg" | wc -l`);
chomp(my $cd_mp3 = `find /scratch/Music/CDs/ -iname "*.mp3" | wc -l`);
chomp(my $cd_ogg = `find /scratch/Music/CDs/ -iname "*.ogg" | wc -l`);
chomp(my $jam_mp3 = `find /scratch/Music/Jamendo/ -iname "*.mp3" | wc -l`);
chomp(my $jam_ogg = `find /scratch/Music/Jamendo/ -iname "*.ogg" | wc -l`);

my $cc = $cc_mp3 + $cc_ogg; # all CC music not from Jamendo
my $cd = $cd_mp3 + $cd_ogg; # all commercial music (most from CDs, some from other sources)
my $jam = $jam_mp3 + $jam_ogg; # CC music from Jamendo

my $allcc = $cc + $jam; # all CC music

my $all = $allcc + $cd; # all music

my $mp3 = $cc_mp3 + $cd_mp3 + $jam_mp3; # all music in MP3
my $ogg = $cc_ogg + $cd_ogg + $jam_ogg; # all music in OGG

printf "Files: %5i\nCommercial: %5i\nJamendo: %5i\nOther CC: %5i\n",$all,$cd,$jam,$cc;
printf "In MP3: %5i\nIn OGG: %5i\n",$mp3,$ogg;

2) Playcount and other statistics from the music player I use (Amarok). It also gives the file count, which I used to check the results of the script above.

3) Data in my public Last.fm profile. Visit Wikipedia to learn more about Last.fm.

4) The Linux du command, for the disk usage (a minimal sketch of this one follows the list).
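
Item 4 hardly needs a script, but for completeness here is a sketch in the same backtick style as the counting script above; it assumes the same /scratch/Music/ tree:

#!/usr/bin/perl -w
use strict;

# Disk usage of the whole music directory, in human-readable form.
# Same backtick approach as the counting script; the path is assumed
# to be the same /scratch/Music/ tree used there.
chomp(my $du = `du -sh /scratch/Music/`);
my ($size) = split /\s+/, $du;   # du prints "<size><TAB><path>"
print "Disk usage: $size\n";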

Now the data (in parentheses, the difference with respect to the last report, 4 months ago).

Files

Total files        6015 (+1000)
  - Commercial     4164 (+220)
  - Jamendo        1820 (+765)
  - Other CC       31 (+0)
Total playtime     16d (+2d)
Disk usage         27GB (+5GB)
Artist count       718 (+91)
Album count        515 (+102)
MP3 count          0 (-1562)
OGG count          6015 (+2547)

Last.fm

Playcount          16744
Most played artist Joaquín Sabina - 1442 plays
Most played song   La del pirata cojo (J. Sabina) - 29 plays

Amarok

Playcount          12446 (+1536)
Favorite artist    Frank Delgado - 91.5/100
Favorite song      Las cuatro y diez (L.E. Aute and S. Rodríguez) - 97/100

As you can see, I have converted all my MP3s to OGG, so I have dumped patented music file formats for good.

You can also notice that the Last.fm and the Amarok playcounts are not equal. This disagreement comes from three facts: the two counters were not initialized at the same time; Last.fm is a web service that counts all the songs I play both at work and at home, whereas the Amarok count I give is the one on my office computer only; and I am not sure that Amarok's threshold for considering a song listened to is the same as the one it uses before reporting the song as "listened to" to Last.fm (e.g. if I skip a song after 10 seconds of playing, maybe it counts as listened to in Amarok's database, but the play is too short for Amarok to report it to Last.fm).

It is also evident that my "favorite" song is not the one I have listened to the most times. It has to be taken into account that a song counts as "listened to" if it is played for at least a few seconds, not necessarily to the end. However, if you skip a song before it finishes, it receives negative points, even though it still counts as listened to.

Comments

Peer to peer: the new distribution paradigm

What this post talks about is hardly rocket science, but there is still a lot of ignorance on the subject.

A lot of people associate p2p with "piracy", and eMule and BitTorrent with some shady way of obtaining the miraculous software of big companies like Adobe or Microsoft.

Well, the fact is that p2p is a really advantageous way of sharing digital information over the net. Actually, the philosophy behind p2p is applicable to any process in which information, or some other good, is spread. So what is this philosophy? Simply put, p2p sets a distributed way of obtaining the goods against a centralized one (see figure below).



Figure 1: Scheme of operation of a p2p network. From Wikipedia.

I use the BitTorrent p2p technology (with the KTorrent program) quite often, particularly to download Creative Commons music from Jamendo. Lately, I have used KTorrent to download some GNU/Linux CDs, particularly the 4.0 version of Debian, and the beta (and this weekend, the stable) version of Ubuntu Feisty Fawn. With the latter, I have come to feel more deeply the advantages of p2p over centralized distribution of files.

With a centralized way of downloading, there is an "official" computer (the server) that has the "original" version of the information to download, and all the people who want to get it (the clients) have to connect to that server. The result is quite predictable: if a given piece of software is in high demand, a lot of clients will flood the server, which will not be able to provide them all with the information they request, slowing the transmission down, or even stopping it altogether for further clients once saturation is reached. This happened with the release of the Windows Vista beta, when the high demand for the program, and the low resources Microsoft devoted to serving the files, left a lot of angry users waiting unreasonably long before they could download it.

This problem could well happen with the release of Ubuntu Feisty Fawn, and in fact this morning connecting to the Ubuntu servers was hopeless. However, unlike Microsoft, Canonical decided to make use of the BitTorrent technology to serve the ISO files, and this made all the difference.

With a p2p way of serving the files, the first clients connect to the server to get the files. However, once they have downloaded a part of the files, they too become servers, and further clients can choose whether to download from the central server or from other clients/servers (usually the decision is taken automatically by the p2p program). As the net of clients grows, and the file flow is balanced, the download speed is maximized for all, and the load on the servers is kept within reasonable limits.
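
A back-of-the-envelope calculation illustrates why the swarm scales (the numbers below are made up for illustration, not measurements): with a central server the total upload capacity is fixed, whereas in a BitTorrent-style swarm every new peer adds its own upload capacity to the pool.

#!/usr/bin/perl -w
use strict;

# Made-up numbers, just to illustrate the scaling argument.
my $server_up = 1000;   # upload capacity of the central server, Mbit/s
my $peer_up   = 1;      # average upload contributed by each peer, Mbit/s

foreach my $peers (10, 100, 1_000, 10_000) {
    my $central = $server_up;                      # fixed, shared by all downloaders
    my $swarm   = $server_up + $peers * $peer_up;  # grows with every new peer
    printf "%6d peers: central %4d Mbit/s, swarm %6d Mbit/s\n",
           $peers, $central, $swarm;
}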

The advantages are clear: each person willing to download some files (e.g. the Ubuntu ISOs) does not become a mere leech, imposing a burden on the server, but rather a seeder, providing others with the files and speeding up, not slowing down, their spread. It is, thus, the ideal way of distributing files.

However, it has two disadvantages that made Microsoft not use it to spread the Windows Vista beta. Since there is no single server controlled by a central authority, it is not possible to tell how many copies of the files have been distributed. Moreover, since the distribution network is scalable, it cannot choke, and thus MS would not be able to claim that the demand for their product was so high that the servers were not able to meet it.

So, for promotional purposes, p2p is not very good. But if your priority is the client, and making the files spread as widely and quickly as possible, then p2p is for you.

Comments