Monthly Archives: January 2005

simulpic: first new upstream version in more than 8 years!

I have just uploaded a new version of simulpic. This software is a simulator for Microchip PIC16F84 microcontrollers. The initial version was a student project at the University of Pisa, and development stopped 8 years ago. Well, not really stopped, as there were some patches in the Debian package.

Recently, a new upstream has decided to continue this project and to add some functionality. He has just released a new version, the first one in more than 8 years! That demonstrates the power of Free Software.

High server load

During the last few days, I experienced a high load on my server (sometimes up to 15). Each time it happened, I observed that apache was unable to serve pages. Restarting it regularly seemed to fix the problem.

Yesterday, I started to investigate the problem. Actually it was “referer spam”. The stats of my blog are generated with webdruid and are available on http://blog.aurel32.net/stats/ . Some spammers tried to increase their websites’ page rank by submitting spoofed referers. It seems that they used zombie hosts, as the requests came from many IPs. The bad thing is that the hosts didn’t close the TCP connections, tying up a lot of apache processes and leaving them unable to serve pages. It’s like a DoS, though this was not the aim.

A search on Google gave me a way to stop that. I added the following lines to /etc/wordpress/htaccess:

RewriteEngine On
RewriteCond %{HTTP_REFERER} ^http://www.spammersite1.com [OR]
RewriteCond %{HTTP_REFERER} ^http://www.spammersite2.com
RewriteRule .* - [F,L]

The load started to go down. Good! I also added a robots.txt, so that the stats pages are no longer indexed by the search engines (note to the wordpress maintainer: it would be nice to have a /etc/wordpress/robots.txt).
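For reference, a minimal robots.txt along those lines could look like this (assuming the stats live under /stats/ as in the URL above):

```
User-agent: *
Disallow: /stats/
```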

After a day, I grepped the apache logs to find all the zombie IPs, and I blacklisted all of them on my firewall with iptables, i.e. 217 IPs!
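Something along these lines does the trick — a sketch only, since the log path and the referer pattern are assumptions, and the iptables commands are echoed rather than run so you can review them first:

```shell
#!/bin/sh
# Extract the unique client IPs that sent the spoofed referer, then
# emit one iptables DROP rule per IP. Remove the "echo" to apply them.
LOG=/var/log/apache/access.log          # assumed log location
PATTERN='spammersite[0-9]\.com'         # assumed referer pattern

grep "$PATTERN" "$LOG" | awk '{print $1}' | sort -u |
while read -r ip; do
    echo iptables -A INPUT -s "$ip" -j DROP
done
```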

This event reminds me that my server doesn’t have enough RAM and that I should add some more.

Do some kind of greylisting on Debian bugs?

Over the last few months, it seems that the number of silly bugs is increasing. Of course, in most cases, the severity of such silly bugs is set to critical. I won’t list all those silly bugs (it would take too long :) ), but I’ll take two very recent examples, as they occurred yesterday.

The first one is bug #289666, titled “sane only work as root on 2.6.10 (2.6.*)”. The bug reporter explains that the module scanner.o has been removed from the kernel and thus he, and all other users, have to use the root account to scan a document. He thinks it is a critical security bug. I wonder why Julien Blache even bothered to write a README explaining how to scan as a normal user! The command is very simple: adduser user scanner.

The second annoying bug was filed today against OpenOffice.org. It is bug #289800 and is titled “openoffice.org: please enable autosave per default”. Basically, the user was writing a text in OpenOffice.org, and his machine crashed. As he hadn’t saved the document, and as autosave was not enabled, he lost it. Surely not a critical bug. Maybe the user should learn to click on the save icon instead of sending bug reports!

Maybe we should do something to avoid these annoying bugs. A kind of greylisting, which would first send back an email to the bug reporter (asking whether they have read the documentation, verified that the bug has not already been reported, and are using the latest version), and would then wait for a key contained in that email to be returned before validating the bug. This would also weed out people using invalid email addresses.
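The key round-trip itself would be trivial to implement. A minimal sketch, assuming a server-side secret and using an HMAC over the bug number and the reporter’s address (all names here are hypothetical, this is not how the Debian BTS works):

```shell
#!/bin/sh
# Derive the key sent in the challenge email, and check the one that
# comes back. SECRET is an assumed server-side value.
SECRET='example-secret'

challenge_key() {
    # $1 = bug number, $2 = reporter address
    printf '%s:%s' "$1" "$2" \
        | openssl dgst -sha256 -hmac "$SECRET" -r \
        | cut -c1-16
}

validate() {
    # $1 = bug number, $2 = reporter address, $3 = returned key
    [ "$(challenge_key "$1" "$2")" = "$3" ]
}
```

Only mail containing the right key for that bug and sender would confirm the report; everything else would simply expire.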

Yes, such a system looks a bit silly… just like some of the bug reporters.