Archive for Free software and related beasts

Be nice in Wikipedia

I just read, by chance, a very interesting essay a Wikipedia administrator wrote on his personal page. It deals with people spamming or vandalizing Wikipedia, for fun or for personal promotion or economic benefit.

I’ll quote the intro paragraph:

Let’s face facts: Wikipedia has become an important force on the Internet. If you’ve got a business to run or a belief to circulate there’s a big temptation to hit that edit button and do your thing. You’re on your honor here. Most people have honor, which is why Wikipedia is huge and (usually) pretty darn good, but then there’s that thought – what if you could harness this site and make it work for you?

You aren’t the first person to get that idea. And hello there, I’m a Wikipedia administrator.

Read more in the whole essay.

Comments

Compiz Fusion under Debian Lenny on my home desktop

I recently wrote (actually, in my last post, 12 days ago) a howto of sorts, with my experience installing Compiz Fusion on my laptop. Yesterday I came back from my vacation, and repeated the feat with my desktop computer at home.

The setup is quite different:

CPU: AMD Athlon 2800+
Graphics: nVidia FX 5700 (256MB)

And the effort is also quite different: it took me much less! Partially, this was because of my previous experience, but the main reason is that the graphics card here is an nVidia. Yes, let the world know that ATI cards suck on Linux.

The problem is that ATI cards need XGL to have Compiz running, but nVidia cards make use of AIGLX natively, so the installation has only two steps: (1) installing the nVidia driver, and (2) installing the Compiz Fusion packages.

Installing the latest nVidia driver

As with the ATI card in my laptop, I decided to use the proprietary drivers from the nVidia site. The driver-selection interface is actually very similar to ATI's. I had to go Graphics Driver->GeForce FX series->Linux x86->Go!, and download this installer.

BIG WARNING: before actually installing anything, remove any previous installation of the nVidia drivers, if you installed them “the Debian way”. For that, do:

% aptitude purge nvidia-glx

I have a friend who did not do so and… OK, OK, it happened to me. If you skip this step, everything seems to work fine, but every time you reboot, the X server will crash, which gets incredibly annoying.

To perform the installation, simply run, as root:

% sh path-to-file/NVIDIA-Linux-x86-100.14.11-pkg1.run

Then, just modify your xorg.conf file to contain the following:

Section "ServerLayout"
  Identifier     "Default Layout"
  Screen       "Default Screen" 0 0
  InputDevice    "Generic Keyboard"
  InputDevice    "Configured Mouse"
  Option         "AIGLX" "true"
EndSection

...

Section "Extensions"
  Option         "RENDER" "true"
  Option         "Composite" "Enable"
  Option         "DAMAGE" "true"
EndSection

Installing Compiz Fusion packages

The procedure is exactly the same covered in my previous post. In short:

1) Add the Shame repository to your /etc/apt/sources.list:

deb http://download.tuxfamily.org/shames/debian-sid/desktopfx/unstable/ ./

2) Get the signature for the repo:

% gpg --keyserver pgpkeys.mit.edu --recv-key 11F6E468
% gpg -a --export 11F6E468 | apt-key add -

3) Update and install:

% aptitude update
% aptitude install compiz-fusion-all

Any time you want to run Compiz, just execute:

% compiz --replace -c emerald

Shorter than the ATI thing, uh?

Comments

Compiz Fusion under Debian Lenny on my laptop

I have a previous post about what I’ve done to my laptop. It’s not mentioned there, but I managed (quite a while ago) to make Beryl work under Ubuntu Dapper Drake. Dapper is getting old, but I have not had good experiences installing Edgy or Feisty on the laptop. I managed to install Debian Etch with no problem, but the wireless driver did not work properly (for me, a showstopper) until Lenny.

So now I have a Debian Lenny partition, plus three others: the original WinXP, the Ubuntu Dapper I still use as my “main” OS, and a Fedora 7 I installed just because it came on a DVD with a magazine I bought for a train trip for which I had not brought any reading material :^)

Since I am on vacation, and I have plenty of time (although I don’t want to spend all of it on my comp), I decided to give Compiz Fusion a try, mostly after seeing what it is capable of.

First things first, the specs of my laptop are:

Fujitsu-Siemens Amilo Pi1536
CPU: Intel Core 2 Duo T7200 2×2.0GHz
RAM: 2x1GB
HD: 120GB SATA
Display: 15.4" WXGA
Graphics: ATI Mobility Radeon X1400 (128MB dedicated/512MB shared)

The only relevant parts above are that it has an ATI graphics card (which, under Linux, sucks), and that it has a Core 2 CPU, which is amd64-capable (which is both great, for performance, and sucks, for driver and software compatibility). So, my next step was:

Installation of ATI drivers

If you want to take the best out of your ATI card, you have to tell your X.org graphics server to use the fglrx driver, and not the default vesa one. You can install this driver from the official Debian repositories, but for me those packages (fglrx-driver and related ones) didn’t do it.

So, I googled a bit, and followed the most widespread recommendation: to install the latest non-free (sigh) driver from the ATI site. For that, I chose the options: Linux x86_64 -> Mobility Radeon -> Mobility Radeon X1400 -> Go, reaching this page, and downloading this 38MB binary (for the record, the 32bit version of the drivers is exactly the same .run file).

Next, I followed the remaining information in this excellent thread at linuxquestions.org. Namely, I downloaded the needed packages (the code is copy-paste-able):

% aptitude install module-assistant build-essential dh-make debhelper debconf libstdc++5 linux-headers-$(uname -r) ia32-libs

Beware that the ia32-libs package is not mentioned in the linuxquestions.org thread (it assumes you already have it installed), but it is required.

Next, run the ATI binary inside a dedicated directory (I did it as root, but it is not compulsory):

% mkdir /root/fglrx
% cd /root/fglrx
% mv wherever-I-downloaded-it/ati-driver-installer-8.32.5-x86.x86_64.run .
% chmod +x ati-driver-installer-8.32.5-x86.x86_64.run
% ./ati-driver-installer-8.32.5-x86.x86_64.run --buildpkg Debian/lenny
# now rm or mv the .run file wherever you want

This generates a bunch of .debs in the /root/fglrx dir. Next, install them, and compile the driver (for this, you do need to be root):

% dpkg -i fglrx-*.deb
% cd /usr/src
% m-a prepare
% m-a a-i fglrx

The linuxquestions.org thread mentions modifying the /etc/X11/xorg.conf file in two ways. First, disable compositing by adding:

Section "Extensions"
Option "Composite" "Disable"
EndSection

to it, and then running:

% aticonfig --initial
% aticonfig --overlay-type=Xv

For me, both were superfluous, because I made a copy of my Ubuntu xorg.conf and then made minimal changes (if any). Moreover, the first change (disabling compositing) was plain wrong: if I want to use Compiz Fusion, I need compositing. Relevant excerpts from my xorg.conf:

Section "Module"
  Load  "i2c"
  Load  "bitmap"
  Load  "ddc"
  Load  "dri"
  Load  "extmod"
  Load  "freetype"
  Load  "glx"
  Load  "int10"
  Load  "type1"
  Load  "vbe"
EndSection

...

Section "Device"
  Identifier  "aticonfig-Device[0]"
  Driver      "fglrx"
  Option      "VideoOverlay" "on"
  Option      "OpenGLOverlay" "off"
EndSection

...

Section "Screen"
  Identifier "aticonfig-Screen[0]"
  Device     "aticonfig-Device[0]"
  Monitor    "aticonfig-Monitor[0]"
  DefaultDepth     24
  SubSection "Display"
    Viewport   0 0
    Depth     24
    Modes    "1280x800" "800x600" "640x480"
  EndSubSection
EndSection

...

Section "DRI"
  Mode         0666
EndSection

Section "Extensions"
  Option "Composite" "1"
EndSection

After all this fuss, and to make sure you have it all running OK, try to load the module as root:

% modprobe fglrx

Then, make sure it loads every time you reboot (include it in /etc/modules if necessary, though it shouldn’t be).
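If you do need the /etc/modules addition, it can be scripted idempotently. Here is a sketch, run against a scratch copy of the file (on the real system you would edit /etc/modules itself, as root):

```shell
# A sketch: append "fglrx" to the modules file only if it is not already there.
# $modules stands in for /etc/modules (edit the real file as root).
modules=$(mktemp)
grep -qx fglrx "$modules" || echo fglrx >> "$modules"
cat "$modules"   # now lists fglrx; re-running the grep line adds nothing
```

The `grep -qx` guard makes the command safe to run repeatedly, e.g. from a post-install script.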

Next, restart the X server, and check that it is now running the fglrx driver, by doing the following (as a normal user is fine):

% fglrxinfo

It should display something like the following:

display: :0.0 screen: 0
OpenGL vendor string: ATI Technologies Inc.
OpenGL renderer string: ATI Mobility Radeon X1400
OpenGL version string: 2.0.6650 (8.39.4)

If, instead, it says something about Mesa, it didn’t work.
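If you want to script that check, something like the following works (a sketch of my own; treating a Mesa mention in the output as a failure is just a heuristic, and the messages are made up):

```shell
# Sketch: report whether fglrx seems to be the active OpenGL driver.
if ! command -v fglrxinfo >/dev/null 2>&1; then
    echo "fglrxinfo not installed"
elif fglrxinfo | grep -qi mesa; then
    echo "fglrx NOT active (Mesa in use)"
else
    echo "fglrx active"
fi
```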

Now, the second step is…

Installing Xgl

With the standard X.org server we have a problem: we can load the fglrx driver, but we cannot activate compositing (see the last three lines of my xorg.conf above). If we activate compositing in xorg.conf, the ATI driver will not load (don’t ask me why, it just seems to happen). If we deactivate compositing, the ATI driver loads, but without compositing we cannot use Compiz.

The solution is to install Xgl, which is an X server (or, I think, a kind of layer that runs on top of the X.org server) that allows for the above trick. There seem to be two “ways” of getting proper compositing: Xgl and AIGLX. The general agreement on the net seems to be that the latter is “better”, but only the former seems to work with ATI cards (read the “AIGLX with AMD (ex-ATI) Proprietary Drivers” section of the AIGLX Wikipedia article, because it hits the problem dead-on). With Xgl I can use the fglrx driver and have compositing at the same time.

We are lucky here, because there are Debian repositories for Xgl. I found out about them through this howto at tuxmachines.org. Most of the info there was… ehem… useless (for me), but reading it I found a repo for Xgl. I just had to add the following line to my /etc/apt/sources.list (beware that the original mention in the tuxmachines.org page says “binary-i386”, which I had to change to “binary-amd64”):

deb http://www5.autistici.org/debian-xgl/debian/ binary-amd64/

I then ran aptitude update, and (of course) got an error telling me that some signatures could not be verified (read my own article about secure APT and/or the wonderful Debian wiki to learn more). I think the key is 11F6E468, and it corresponds to Francesco Cecconi (maintainer of the repo). It is downloadable from pgpkeys.mit.edu (follow the instructions in my previous post, or the ones in the Debian wiki). Also, do not skip reading the parent page of the repository.

After the keys are OK, it’s just a matter of doing (as root):

% aptitude update
% aptitude install xgl

Now you are done installing, but you still have to actually use Xgl. This gave me some headaches, not because I didn’t know where to put things, but because I didn’t know exactly what to put. I read, and followed, the instructions at freedesktop.org and (after all, the blog turns out to be useful for someone: myself) a previous post of my own.

I am using GDM, so my final setup was the following: first, generate a suitable entry in the GDM menu by creating a file named /usr/share/xsessions/xfce4-xgl.desktop (or whatever, as long as it is in that dir and ends in “.desktop”), and putting the following inside:

[Desktop Entry]
Encoding=UTF-8
Name=Xfce-Xgl
Exec=/usr/local/bin/startxgl_xfce
Icon=
Type=Application

The string after “Name=” is the one that will appear in the GDM menu, and the one after “Exec=” is what will be executed when selecting that entry.

Next, we have to create the script we promised above (/usr/local/bin/startxgl_xfce), make it executable, and put the following inside:

#!/bin/sh
# Start the Xgl server:
Xgl -fullscreen :0 -ac -accel glx:pbuffer -accel xv:pbuffer -fp /usr/share/X11/fonts/misc &
sleep 5
# Point clients at the Xgl display and start Xfce:
export DISPLAY=:0
exec xfce4-session

As you can see, I am telling Xgl to load a font path (with -fp) that was giving me headaches: the server would die complaining about a missing font when I didn’t include that option. Your mileage may vary.

Now, every time we select the entry labeled “Xfce-Xgl” in the GDM menu, we will have the Xgl server running.

Installing Compiz Fusion packages

I think the aforementioned autistici.org repo has Compiz packages, as do the default Debian Lenny repos. But net consensus seems to be that they are not the way to go. Everyone praises two repositories: Treviño’s and Shame’s. I chose the latter, adding the following line to my /etc/apt/sources.list:

deb http://download.tuxfamily.org/shames/debian-sid/desktopfx/unstable/ ./

I think I went through the same chores as above for key verification, Shame’s key being A42A6CF5.

After that, I installed the following package (it installs all of the needed packages):

% aptitude install compiz-fusion-all

After that, and inside my “Xfce-Xgl” session, I just did the following, as some googling revealed:

% compiz --replace

But… it didn’t work :^( It complained in the following manner:

Fatal: Failed test: texture_from_pixmap support
Checks indicate that it's impossible to start compiz on your system.

I found a lot of pages, threads and howtos on the net stumbling upon this same problem (for example, this one at ubuntuforums.org), but none with the answer. Really. None. The most enlightening tips were the -v, -h and --help switches of compiz. The first requests verbose output, the second shows help for the “short” options, and the third help for the “long” options. With the latter I discovered the --force-fglrx switch, which saved the day! Yes, I now use the following command to start Compiz:

% compiz --replace -c emerald --force-fglrx

I have two things to say at this point. First: Compiz Fusion is visually astonishing! It is full of great ideas, and has a lot of settings to play with. The second thing is not so nice: some glitches are present. For example, my Konsole windows get a transparent background for no reason, and the refresh is horrible (when text reaches the bottom of the terminal, it starts to overwrite itself; one must hide and un-hide the window to force a proper refresh, which is unacceptable). The latter also affects other windows, which, all in all, makes it unsuitable for comfortable use.

However, Compiz Fusion is new, hot and experimental. I love playing with it, but right now it cannot be relied upon. On the bright side, in the three days since my installation, the packages have been updated three times! I suppose a few aptitude upgrade cycles will fix the issues eventually.

And that’s it, dear reader.

Comments (6)

Compiz Fusion effects

I have no words. It is truly amazing what the guys behind Compiz/Beryl/Compiz Fusion can do, and how the contributions of free software enthusiasts can quickly surpass any lame Windows (or MacOS) “innovation”. From YouTube:

[youtube=http://www.youtube.com/watch?v=_ImW0-MgR8I]

Comments (1)

Comment on enriquedans.com

I am transcribing the text of a comment I sent to Enrique Dans’s blog, which for some reason did not come out correctly (I will link to it from there, to see if it shows up that way):

So much ignorance in one place, my God!

#15 says:

“To praise Beryl one has to try it, not post a YouTube video… Beryl is immature, immature, utterly immature…”

False. I have been using Beryl for some time, and it has matured a lot very quickly. I do not use it on my “work” computers, because I am rather the spartan-desktop type: Xfce with a single panel at the bottom and a desktop background with no icons. And no 3D effects or transparencies.

That said, when I installed Beryl for the first time it worked wonderfully, and I could do a ton of things I do not know whether Aero can do (maybe it can). For example: Alt+mouse wheel over a window, and I change its transparency from 0% to almost 100% (and can see what is underneath). I can watch two or three different videos at once, with different transparency levels, and see the lower ones through the upper ones. All that while a rain effect plays over the whole desktop. All that while rotating the cube, or placing the video windows on an edge of the cube…

And all of this before Vista hit the market, and on computers where Vista would not run, because “its high technology requires better hardware”.

What is this good for? Nothing. Just like Aero. It is simply cool, and if I want to use it, I can. Truth be told, I do not use it, but to each their own.

But the Linux desktop does not end with Beryl. In terms of functionality, the Linux desktop (GNOME, KDE, Xfce… even Fluxbox and the like) surpassed any Windows, Vista included, years ago. From multiple desktops, to ultra-customizable keyboard shortcuts, smarter automatic window placement (when an application opens), a more efficient task list, configurable panels, and “gadgets” (as the shameless people in Redmond have rebranded them) such as clocks and graphical monitors of CPU, network or disk I/O usage on the desktop.

Not to mention the sheer customizability of the interface: changing the window decorations; the style of buttons, menus, lists, etc.; the font for window titles; the menus; the icons…

I read in #37:

“Then I have to build them a client application, without writing more lines of code than a fool would, and the only two languages I have available (there is no Visual Basic) are C and Java.”

Come on, kid, are you saying you do not use Linux because there are no programming languages? If you want applications for massive computation, you have Fortran and C. If you want quick, efficient, ridiculously easy scripts, you have shell, Perl, Python, Ruby and others. If you want easy graphical applications, you have had Tcl/Tk for ages, and more recently GTK+ and Qt. Both Tk and GTK (I do not know about Qt) have an integration with Perl, Python and C that is astonishingly simple.

I would like to see what “simple” program in VB or similar garbage can do what two lines of Perl or shell, with sed and awk, can. To give you an idea, Google uses Perl for pattern matching when serving you the results of a search. On Linux machines, of course.

Take a stroll through Wikipedia and its list of programming languages by category, and you will see how many different languages there are; then count how many can be used on Windows and how many on Linux.

Then #44 says:

“That people who use Windows do so because they want to is simply not true. I have been trying to use Linux for years, and there is no way. Seven years ago I tried it for the first time with a Mandrake distro and it drove me nuts.”

Perhaps it is not fair to blame Mandrake for that last part…

Your intention to use Linux is laudable, and it is a pity you have not succeeded, but I believe your negative experience is not necessarily generalizable.

I have been using Linux for 9 years. I started with Slackware, where one did everything “by hand”. Those were the days! It was complicated at times, but I learned a great deal. Later, when I tried Mandrake, I liked it a lot, because it was so easy it was almost embarrassing.

Over time, I returned to more “technical” distros, and now I use Debian (which is like “Ubuntu for geeks”), because it gives me more flexibility than the distros “for dummies” (with all due respect), and it is much easier for me to control what the computer does than with distros that think they are smarter than I am, and “help” me do things the way they believe I want them done, rather than the way I want them done.

“Since then I have tried again several times, and I have always run into a stone wall: the Internet connection.

I have NEVER managed to connect to the Internet with a Linux distribution. Not when I used a dial-up modem, nor with an ADSL modem, nor now with a WiFi router.”

Well, you must be the only one, mate. I had problems with v.90/92, when I tried to connect with an internal winmodem. I thought Linux was a lemon, until I bought an external modem and saw that it was JUST as easy to configure as in Windows (and more reliable).

When I moved to ADSL (actually I have cable, with Euskaltel), I had NO problem with Linux. I configured it in a jiffy. And when I got WiFi, I bought the router myself (saving a few euros compared to ordering it from Euskaltel), and set it up without problems on the desktop machine (wired). The laptop, which I connect over WiFi, has given me no trouble connecting in open mode, nor with WEP encryption. It is true that for WPA I had to fiddle a bit, and that Windows makes it simpler, but only marginally so.

As a final comment, let me add that Vista does nothing but reinvent the wheel, reimplementing a thousand things that already existed on Mac and Linux (and generally much better done), and renaming them so that it looks as if they invented them (a case in point: the infamous “gadgets”; Vista is the last OS on the market, free or not, to implement them, yet they sell them as if they were the inventors).

Comments (2)

Inkscape tip: make arrow head’s color match that of its body

I have encountered the problem more than once, and it is a bit annoying, to say the least. Basically, when you draw a path/arrow in Inkscape, it starts as a black curve by default. You can edit it to put a marker at either or both ends (click on the curve, then Object->Fill and Stroke->Stroke Style) to make an arrow, for example.

Now, the problem is that if you change the color of the body of the arrow, the head will remain black, as documented, for example, in A Guide to Inkscape, by Tavmjong Bah. Not nice, uh? The solution is given in the same site, and consists of using a plugin. To do so, select: Effects->Modify Path->Color Markers to Match Stroke.

If you are a Debian user, you might encounter a problem: a window pops up saying The inkex.py module requires PyXML. This has been reported as a bug, and it also happens on Ubuntu. The solution is to install the python-xml package, which is not always installed along with Inkscape: it is only “recommended”. This means that when you install Inkscape (aptitude install inkscape), aptitude will tell you something like “The package python-xml is recommended, but it is not going to be installed”, and go on happily. If (like me) you ignore the suggestion, you will not have the python-xml package installed, and some extensions, like the one above, will not work. (On the other hand, this allows users who do not want the plugins to have a lighter installation, if they so wish.)

Comments (18)

Peer to peer: the new distribution paradigm

This post will hardly talk about rocket science, but there’s still a lot of ignorance on the subject.

A lot of people associate p2p with “piracy”, and eMule and BitTorrent with some shady way of obtaining the miraculous software of the big companies like Adobe or Microsoft.

Well, the fact is that p2p is a really advantageous way of sharing digital information over the net. Actually, the philosophy behind p2p is applicable to any process in which information, or some other good, is spread. So what is this philosophy? Simply put, p2p contrasts a distributed way of obtaining the goods with a centralized one (see figure below).



Figure 1: Scheme of operation of a p2p network. From Wikipedia.

I use the BitTorrent p2p technology (with the KTorrent program) quite often, particularly to download Creative Commons music from Jamendo. Lately, I have used KTorrent to download some GNU/Linux CDs, particularly the 4.0 version of Debian, and the beta (and this weekend, the stable) version of Ubuntu Feisty Fawn. With the latter, I have come to feel more deeply the advantages of p2p over centralized distribution of files.

With a centralized way of downloading, there is an “official” computer (the server) that has the “original” version of the information to download, and all the people who want to get the info (the clients) have to connect to that server to get it. The result is quite predictable: if a given piece of software is highly sought after, a lot of clients will flood the server, and it will not be able to provide all of them with the info they request, slowing the transmission down, or even stopping it altogether for further clients once saturation is reached. This happened with the release of the Windows Vista beta, when the high demand for the program, and the low resources Microsoft devoted to serving the files, left a lot of angry users waiting for unreasonable periods of time before being able to download it.

This problem could well happen with the release of Ubuntu Feisty Fawn, and in fact this morning connecting to the Ubuntu servers was hopeless. However, unlike Microsoft, Canonical decided to make use of the BitTorrent technology to serve the ISO files, and this made all the difference.

With a p2p way of serving the files, the first clients connect to the server to get the files. However, once they have downloaded a part of the files, they too become servers, and further clients can choose whether to download from the central server or from other clients/servers (usually the decision is taken automatically by the p2p program). As the net of clients grows, and the file flow is balanced, the download speed is maximized for all, and the load on the servers is kept within reasonable limits.

The advantages are clear: each person willing to download some files (e.g. the Ubuntu ISOs) does not become a leech, imposing a burden on the server, but rather a seeder, providing others with the files, and speeding up, not slowing down, their spread. It is, thus, the ideal way of distributing files.

However, it has two disadvantages that made Microsoft not use it to spread the Windows Vista beta. First, since there is no single server controlled by a central authority, it is not possible to assert how many copies of the files have been distributed. Second, since the distribution network is scalable, it cannot choke, and thus MS would not be able to claim that the demand for their product was so high that the servers could not cope.

So, for promotional purposes, p2p is not very good. If your priority is the client, and making the files as widely and quickly spread as possible, then p2p is for you.

Comments

SSH connection without password (II)

About 5 months ago I made a post explaining how to use SSH to connect from computer A to computer B without going through the hassle of introducing the password each and every time.

As it happens, my instructions were far from complete, because they relied upon not setting any passphrase, and thus saving the SSH private key unencrypted on the hard disk. That way, a malicious user able to read your account on computer A could connect in your name to computer B with no restriction (thanks agapito for pointing this out in a comment to my post).

The next step is, thus, to use passphrases, while avoiding major hassles by means of ssh-agent.

I will repeat here the instructions in my old post, and extend them. First generate a public/private key pair in computer A:

% ssh-keygen -t dsa

and answer the questions you will be asked, not forgetting to enter a passphrase.

This will create two files in your ~/.ssh/ dir: id_dsa and id_dsa.pub, with your private and public keys, respectively.

Now, you have to copy the contents of id_dsa.pub into a file named ~/.ssh/authorized_keys on computer B. From that moment on, you will be able to connect to B through SSH without being prompted for your user password on computer B. However, you will still be prompted for a password: namely, the passphrase that decrypts your private key (the one you set with ssh-keygen).
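The copy itself can be done in one pipe, e.g. cat ~/.ssh/id_dsa.pub | ssh user@B 'cat >> ~/.ssh/authorized_keys' (and recent OpenSSH versions ship ssh-copy-id, which does the same plus permission fixes). Below is a purely local sketch of the append, with a placeholder file and a placeholder key instead of a real setup:

```shell
# Local sketch: "auth" stands in for computer B's ~/.ssh/authorized_keys,
# and the key material below is a placeholder, not a real public key.
auth=$(mktemp)
pub="ssh-dss AAAAB3placeholder user@computerA"
echo "$pub" >> "$auth"
chmod 600 "$auth"    # sshd ignores keys in group/world-writable files
grep -c "user@computerA" "$auth"    # prints 1: the key is installed once
```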

To avoid having to introduce this passphrase each time we want to make a connection, we can take advantage of ssh-agent, in the following way. First, we run the agent:

% eval `ssh-agent`

Then we add our key to the agent:

% ssh-add

The above will look, by default, for ~/.ssh/id_dsa, and will ask for the passphrase we introduced when generating it with ssh-keygen.

After the above, all further connections from that terminal (and its children) will benefit from passwordless SSH connections to computer B (or any number of computers that have your A computer’s public DSA key in their ~/.ssh/authorized_keys file). This benefit will be lost whenever ssh-agent stops running, of course.
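You can check from any given shell whether a usable agent is visible by testing the SSH_AUTH_SOCK variable ssh-agent exports (a sketch; the messages are my own):

```shell
# Sketch: does this shell see a live agent socket?
if [ -n "$SSH_AUTH_SOCK" ] && [ -S "$SSH_AUTH_SOCK" ]; then
    echo "agent socket found: $SSH_AUTH_SOCK"
else
    echo "no agent socket in this shell"
fi
```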

OK, but I want to have passwordless connections from ALL my consoles!

Then you have to take advantage of the following syntax:

% ssh-agent command

where command and all of its child processes will benefit from ssh-agent. command could be, of course, startx, or whatever command you use to start the desktop environment. You will still have to execute ssh-add, and enter the passphrase, but only once per session. You will have to enter the passphrase again only if you log out of the desktop environment and log back in.
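For instance, under a classic startx setup, this is a one-line change (a sketch, assuming an ~/.xsession file; adapt it to however you start X):

```shell
# ~/.xsession (sketch): wrap the whole X session in ssh-agent,
# so every program in the session inherits the agent's environment
exec ssh-agent startx
```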

OK, but how do I make scripts benefit from this?

You will find yourself automating the execution of some scripts sooner or later, for example putting some backups in a cron job.

To do so, an ssh-agent must already be running, and you have to make the script hook to it somehow. Include the following code chunks in your scripts:

Perl:

Create the following subroutine:

###################################################
#                                                 #
# Check that ssh-agent is running, and hook to it #
#                                                 #
###################################################

sub ssh_hook
{
  my $user = $_[0] or die "Specify a username!\n";

  # Get ID of running ssh-agent:
  chomp(my $ssh_id = `find /tmp/ssh* -name 'agent.*' -user $user`);
  die "No ssh-agent running!\n" unless $ssh_id;

  # Make this ID available to the whole script, through
  # environment variable SSH_AUTH_SOCK:
  $ENV{SSH_AUTH_SOCK} = $ssh_id;
};

and call it (before any SSH call in the program), like this:

&ssh_hook(username);

tcsh:

setenv SSH_AUTH_SOCK `find /tmp/ssh* -name 'agent.*' -user username`

bash:

export SSH_AUTH_SOCK=$(find /tmp/ssh* -name 'agent.*' -user username);

In all cases, username is the user name of the user making the connection (the one who ran ssh-agent).

A lot of info was taken from this Gentoo HowTo and this HantsLUG page, and googling for “ssh without password”.

Comments (2)

How (legally) strong is the word "free"?

It seems that the answer is: a lot.

Perusing some old e-mails (I save all the e-mails I receive, except spam and stupid 2MB presentations), I found the following one, dated November 11, 2006:

Hello all,

I read in your page at:

http://www.linfo.org/index.html

That your “[…] project has the goal of providing high quality, comprehensive, accessible and free information about Linux and other free software”

How is it “free”, if the page also reads?:

“Copyright © 2004 – 2006 The Linux Information Project. All Rights reserved.”

Could you publish the information under a Creative Commons, or GNU Free
Documentation License? Either that, or remove the “free” part in the
paragraph above.

Yours sincerely,

Iñaki

As follows from my e-mail, I was concerned about the incorrect use of the adjective “free”. The reader might think they (of course) ignored my warning, because “free” is such a loose, multi-meaning, not-legally-binding word, much like “healthy”, “good”, “in a minute”, “you can do it yourself”, “natural”, “organic”… and all the jargon used in advertising to convey a positive image of the product, while still dodging potential lawsuits for misleading information.

Well, not quite. It seems that in software and information technology, “free” has a definite meaning, which linfo.org would not meet. As such, you can visit their current page, which now reads:

Welcome to The Linux Information Project (LINFO)! This project is dedicated to providing high quality, comprehensive and easily accessible information about Linux and other free software.

See any missing word? Why, the “free” is gone!

Maybe it sounds petty and nit-picking, but it isn’t. There is an increasing tendency to bastardize terms like “free software”, which I ascribe to an attempt to close the gap between “free and good” and “closed, for-profit, and evil”. Corporations have noticed how some terms, e.g. free software, are gaining a progressively good reputation, and don’t want to lose ground in the ensuing war.

This war has two fronts. First, demean everything that smells of “freedom”. For example, label “free software” products as “open source software”. Why? Because it weakens the link with the freedom ideals, and conveys the idea that what makes that software different is simply that you can read the source code. You will also recognize bastards playing on this side because they will always refer to “free software” (software created and used in freedom) as “software that is free of cost” or “no-cost software”, or any other construction that tries to reduce all the benefits and characteristics of free software to it being free of cost, like mere freeware (read an example in a previous post[es]).

The second front is attaching the label “free” and/or “open” to any product that could conceivably (or inconceivably) bear it, much like “low-fat” gets attached to any food, be it naturally fatty or not (in which case it is little of an achievement), or even to non-food (like tobacco), or “organic” to anything from food to clothes to shampoos.

In this confrontation, we start down a slippery slope of giving blurry meanings to words, and end up applying blurry concepts: a “low-fat” super-hamburger that can single-handedly obstruct all your arteries with its cholesterol, but is called “low-fat” because it has a lower fat content than another burger of similar size; or a page showing information they call “free”, but which is under burdensome copyright that (for example) takes from you the simplest right of copying the information and sharing it with others freely.

Comments

Linux in the metropolitan buses of Donostia

Today I took the usual bus to the city center, and noticed that the monitors they have in the buses, showing general info and commercials, were blank. Well, not exactly blank: some white letters littered the black screens. “Oh, dear” – I thought – “another Windows crash”. Not quite: the messages the monitors were showing corresponded to GNU/Linux!!

Below you can see a photo I took. Click on the picture for a full-size version.

I also took a second pic, without flash:

I apologize for the poor quality of the pictures, but taking photos of low-luminance screens in a bright environment, inside a moving bus, is not trivial (and my digital camera is not the best ever).

If one strains one’s eyes, the following fragments can be read:

* search_bg_key: invalid format found in block [...]
* is_leaf: free space seems wrong: level 1
* reiserfs_read_inode2: i/o failure ocurred trying to find [...]
* /home/terminal/datos/backup/20070217-def[...]-md5
* Unable to handle paging request at virtual address [...]

From my ignorance, it seems one hard disk failed, or maybe a connection (say, the USB cable to an external disk) broke, or a device’s capacity was exhausted. Of course, it might well be a failure of the OS (albeit quite unlikely, this being GNU/Linux).

In any case, I was shocked to discover that the city council has decided to give Linux a go. Well done, Mr. Elorza!

Comments (3)
