This feed contains pages in the "site" category.

I modified the livejournal import script so that it can handle comments as well. The script page now has better instructions on how to export comments.

Now I'm done with livejournal and I can finally delete my account there. All the relevant data has been migrated over. YAY!

Posted Sat Mar 6 18:57:33 2010 Tags: site

I finally converted my old blog posts from Livejournal to ikiwiki. They have no real value to anyone but me, and converting them hasn't been a high priority anyway.

Livejournal is capable of exporting the blog posts in an XML format with HTML markup inside, so it's a rather simple procedure to compile markdown pages for ikiwiki to consume. This is mostly because you can use plain HTML inside markdown pages.

I did the conversion with XSLT; it started out as a simple template and grew a bit once I added some cleanup features (like filenames and metadata). The script itself is available for download and the archive is available for browsing.
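For the curious, here's a rough Python sketch of what the conversion boils down to. It is not the XSLT script linked above, just an illustration of the same idea, and the element names (entry, subject, eventtime, event) are my assumptions about the export format rather than a verified schema:

    #!/usr/bin/env python3
    """Sketch: turn a Livejournal XML export into ikiwiki markdown pages."""
    import re
    import xml.etree.ElementTree as ET
    from pathlib import Path

    def slugify(title: str) -> str:
        # Turn the post title into a safe file name.
        slug = re.sub(r"[^a-z0-9]+", "_", title.lower()).strip("_")
        return slug or "untitled"

    def convert(export_file: str, outdir: str = "posts") -> None:
        tree = ET.parse(export_file)
        Path(outdir).mkdir(exist_ok=True)
        # "entry", "subject", "eventtime" and "event" are assumed element
        # names; the real export format may differ.
        for entry in tree.getroot().iter("entry"):
            title = entry.findtext("subject", default="untitled")
            date = entry.findtext("eventtime", default="")
            body = entry.findtext("event", default="")
            page = Path(outdir) / f"{slugify(title)}.mdwn"
            # ikiwiki's meta directives carry the title and original date;
            # the HTML body can go in as-is, since markdown allows plain HTML.
            page.write_text(
                f'[[!meta title="{title}"]]\n'
                f'[[!meta date="{date}"]]\n\n'
                f"{body}\n",
                encoding="utf-8",
            )

    if __name__ == "__main__":
        convert("livejournal-export.xml")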

Posted Sun Feb 28 15:26:02 2010 Tags: site

It was way overdue already, but I finally enabled comments in my ikiwiki installation. What this means is that you can now comment on my blog posts. I also enabled the blogspam module, so hopefully I can avoid most of the spam.

In any case, have fun commenting.

Update: I also enabled anonymous commenting, so OpenID isn't required.

Posted Mon Mar 23 22:18:03 2009 Tags: site

It was bound to happen sooner or later. I've been hosting my own services at home for quite some time now. It has a lot of benefits, like still being able to reach all my services locally when the internet connection is down, and so on. Lately I've been thinking about what would happen if I lost the internet connection for some time. Maybe I should stop thinking.

On Monday at 11:00 my DSL went down. At first I assumed that something was wrong with the connection and it would come back after a while. After waiting for some 30 minutes I decided to call about the problem. That's when the whole horror revealed itself. The provider was unable to find a contract for my DSL. I called the office and then it hit me.

When I changed companies at the beginning of this year, I had a discussion at the old company about moving the connection over to the new company. They asked for the details and I provided them. And I made a mistake: I never verified that the connection was actually moved over. So what happened? The old company never moved the connection, and now, after 5 months, they figured out that they had a DSL line for someone who wasn't working there anymore and closed it. Even though it was their mistake for not moving the line, I can still blame only myself for it all. I should have checked and double-checked that it had been moved.

Yesterday I was able to set up a secondary host for e-mail, and today I'm fixing the website; it's not fully functional, but it will be enough until I get a proper connection. Luckily I have access to all sorts of resources, so I'm able to set things like this up quite easily. I wouldn't be hosting stuff at home if I didn't.

Posted Wed May 21 09:11:31 2008 Tags: site

Sometimes there are days when everything works as planned. Today I updated this site to ikiwiki 2.0. The major change was a switch to using directories instead of just plain html files. There are benefits in this approach so I upgraded.

What surprised me was the fact that nothing went wrong with the upgrade. At least I don't know of any problems...

Ikiwiki uses static files to store the contents, which is nice. The only problem with this approach is that sooner or later there will be migrations, and every migration needs workarounds so that old links keep working without going around fixing things all over. For this migration I had to write some redirect rules for lighttpd. It's a minor inconvenience when you consider the other benefits of static content.
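The rules themselves just express a simple URL mapping. Here's a rough Python sketch of that mapping, not the actual lighttpd rule I deployed, and it assumes every old page lived at a flat .html URL:

    import re

    # With the directory-based layout, an old flat URL like /blog/foo.html
    # becomes /blog/foo/. The pattern is illustrative only.
    def redirect_target(old_path: str) -> str | None:
        match = re.fullmatch(r"/(.+)\.html", old_path)
        return f"/{match.group(1)}/" if match else None

    assert redirect_target("/blog/ikiwiki_2_0.html") == "/blog/ikiwiki_2_0/"
    assert redirect_target("/index.html") == "/index/"  # the front page may need its own rule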

The only problem now is to decide whether or not to enable comments on this blog. I'm not sure if I want to start fighting spam on my own website, I already get way too much spam in my email.

Posted Thu May 10 23:10:42 2007 Tags: site

In the open source software world it's rather common to hear something like "Does it work just like some-other-application?" or "Why can't I use application-X instead of this one, everyone else has application-X".

In a way that's peer pressure. Microsoft is a prime example of a company that uses peer pressure for marketing. They managed to acquire a fair share of the market by getting their operating system to OEM markets and by donations to schools. It's interesting that the old saying "There is no such thing as a free lunch" applies here too. By donating something they are creating a user base that already knows a certain application or operating system, and with that user base it's possible to push your products into new markets.

That is actually brilliant marketing.

How is this related to me? Well, some time ago I decided to rebuild my old server. I decided that I would utilize the skills I've learned through my work experience, even if it is just a home server. I set up a Xen server that separated my jabber server from my firewall. And since I already had a working IPv6 tunnel and plenty of addresses for my local network, I could allow direct connections over IPv6 and forward ports over IPv4. I also knew that XMPP already used SRV records, which allowed me to define clean rules for how I wanted my servers to be contacted.
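As a rough illustration of what that discovery looks like from the other side, here's a small Python sketch of how a compliant XMPP server would look up my server-to-server SRV record. It uses the third-party dnspython package, and the domain is just a placeholder:

    import dns.resolver

    # Look up the XMPP server-to-server SRV record and list the advertised
    # targets; the caller then connects to whatever addresses (A or AAAA)
    # those targets resolve to.
    def xmpp_s2s_targets(domain: str) -> list[tuple[int, int, str, int]]:
        answers = dns.resolver.resolve(f"_xmpp-server._tcp.{domain}", "SRV")
        return sorted(
            (r.priority, r.weight, str(r.target).rstrip("."), r.port)
            for r in answers
        )

    for priority, weight, host, port in xmpp_s2s_targets("example.org"):
        print(f"{host}:{port} (priority {priority}, weight {weight})")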

But there was a problem, and it only occurred to me once I had finished the DNS configuration. It appears that Google Talk doesn't fully support SRV records: the IPv6-only record throws their servers off and I'm unable to connect to Gtalk users.

So due to peer pressure I was forced to change my preferences on how to connect to my jabber server. Suddenly it became clear to me why it is so hard to introduce new services and technologies. It's not enough to be innovative or useful; there has to be solid interoperability with the competing products and a minimal learning curve. Knowing this, it's no wonder that the e-mail system is still in place even though it's flawed by design.

Posted Sun May 6 22:28:00 2007 Tags: site

There are many tactics for fighting spam, most of them aimed at e-mail spam. Like many others, I've deployed quite a few of them myself, the most recent being greylisting, a Bayesian filter and blacklists.

One natural way to divide all the spam-fighting methods is by whether they can be deployed in an ISP environment. I've seen all sorts of methods: some I would never put into a production system, and others have no effect in a large-scale system. One of these methods is plain and simple strict standards checking, which can very easily go wrong. I agree that standards compliance is something that should be required, but since there are too many broken sites out there, there is no place for this kind of method in an ISP environment. The same goes for greylisting. People expect mail to arrive immediately, which from a technical standpoint is an insane expectation due to the nature of SMTP as a protocol (mails are allowed to be deferred), but it rules out greylisting.
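For those who haven't run into it, the whole idea of greylisting fits in a few lines. This is only a toy sketch, not any real implementation, and the delay value is arbitrary:

    import time

    # Toy greylisting policy: the first delivery attempt from an unseen
    # (client IP, sender, recipient) triplet is temporarily rejected; a
    # retry after the delay is accepted. Spamware often never retries,
    # real MTAs always do -- which is also why legitimate mail gets delayed.
    GREYLIST_DELAY = 300  # seconds, arbitrary for this sketch
    seen: dict[tuple[str, str, str], float] = {}

    def check(client_ip: str, sender: str, recipient: str) -> str:
        triplet = (client_ip, sender, recipient)
        now = time.time()
        first_seen = seen.setdefault(triplet, now)
        if now - first_seen < GREYLIST_DELAY:
            return "DEFER 450 greylisted, try again later"
        return "OK"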

There are middle grounds in all of these: whitelist & blacklist + greylist works pretty well, but not quite well enough. But there appears to be a new contender out there. Policyd-weight is one of these; it tries to make blacklists less strict (sites get blacklisted all the time) so that legitimate mail gets through, while weighing in the validity of HELO and other fields.
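To be clear, this is not policyd-weight's actual code or its real weights, but the idea behind the weighted scoring goes roughly like this (the checks and numbers are made up for illustration):

    # No single check (one DNSBL hit, an odd HELO) rejects a mail on its own;
    # each contributes a weight and only the combined score decides.
    def score_client(listed_on_dnsbls: int, helo_matches_ptr: bool,
                     sender_domain_resolves: bool) -> float:
        score = 0.0
        score += 1.5 * listed_on_dnsbls        # each blacklist hit adds a bit
        score += -1.0 if helo_matches_ptr else 2.0
        score += 0.0 if sender_domain_resolves else 2.5
        return score

    REJECT_THRESHOLD = 5.0

    # One DNSBL listing alone is not enough to reject...
    assert score_client(1, True, True) < REJECT_THRESHOLD
    # ...but several bad signals together are.
    assert score_client(2, False, False) >= REJECT_THRESHOLD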

A few days ago I disabled greylisting on my personal site; it rejected so little once I had configured policyd-weight in front of it. Looking at the statistics from my dspam, there is no change in the amount of spam hitting it. Too bad I don't have older statistics anywhere; it would have been fun to show numbers here, but I remember that the drop in spam was significant when I first deployed policyd-weight. I'm not saying that greylisting isn't effective, but policyd-weight is more effective (IMHO).

As for the question of whether policyd-weight is suitable for an ISP environment: yes, I think it is. It's a combination of all the methods currently deployed, and it gracefully handles situations where something is slightly wrong.

I'm a lone spamfighter with a new tool...

Posted Sat Feb 17 20:40:25 2007 Tags: site

Ok, so I picked ikiwiki as the website engine. It turned out to be pretty much everything I wanted from a website engine.

Here is a list of my requirements from a few years back (yes, it's been that long):

None of the current blogs suit my needs. What I want is a blog that uses a proper template (preferably XSLT) instead of the current wave of hideous multifile template systems. I also want the blog to be as static as possible; there is no need to regenerate each page when it's viewed, since most pages never change anyway. And then there is one big requirement: I don't want a database backend.

How do these relate to ikiwiki?

  1. Proper templating system: ikiwiki uses a sane template system, none of those .pre + .post + .middle style of templates.
  2. Static rendering: all of the pages served are actually static HTML files built by ikiwiki. In fact, my webserver doesn't currently have ikiwiki installed at all; I just build the website locally (see the sketch after this list).
  3. No database: YAY! There is no database required. If I want a datastore, I'll use a version control system.
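That second point boils down to a very small workflow: render locally, rsync the result. Here's a rough Python sketch of it; the paths, the host name and the setup file name are placeholders, not my actual setup:

    #!/usr/bin/env python3
    """Sketch: build the site locally with ikiwiki, then push the static
    output to the webserver with rsync."""
    import subprocess

    SETUP_FILE = "ikiwiki.setup"          # local ikiwiki configuration (placeholder)
    DEST_DIR = "public_html/"             # where ikiwiki writes the rendered site
    REMOTE = "webserver:/var/www/site/"   # the server only ever sees static files

    def build_and_deploy() -> None:
        # Rebuild every page from the source directory named in the setup file.
        subprocess.run(["ikiwiki", "--setup", SETUP_FILE, "--rebuild"], check=True)
        # Copy the static output; --delete keeps removed pages from lingering.
        subprocess.run(["rsync", "-av", "--delete", DEST_DIR, REMOTE], check=True)

    if __name__ == "__main__":
        build_and_deploy()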

All in all, ikiwiki looks like the solution for me. Previously I considered using subwiki, but it never got off the ground. I kind of like ikiwiki even though it's written in perl, which looks like a big mess once you get used to python =)

Best of all, no matter how the pages were created, I'm able to edit them with simple tools like vim or any other text editor.

Posted Sun Jan 21 19:24:38 2007 Tags: site

So, I'm following the trend set by a lot of people and have migrated to ikiwiki, which turns out to be the most elegant solution for managing my webpage.

So, in the near future I'll be importing my old blog entries from livejournal here and setting up tags for the entries. Maybe this will get me blogging some more.

Posted Sun Jan 21 00:39:10 2007 Tags: site