
Spam Slammed

I'm getting comment spammed quite a bit today. About 100 so far: far fewer than in other "bursts," but these are coming from all over. What's worse, MT-Blacklist is having a hard time keeping up. I lowered my URL limit to 2 (meaning a comment with 3 or more URLs is held for moderation), yet that doesn't stop the same URL from being posted four hundred times in a single comment. No, MT-B doesn't catch that…
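
My guess as to why, sketched below in Python (my guess at the counting logic, that is, not MT-Blacklist's actual code): the limit counts unique URLs rather than total links.

    import re

    URL_LIMIT = 2  # hypothetical stand-in for MT-B's URL-limit setting

    def needs_moderation(comment):
        urls = re.findall(r'https?://\S+', comment)
        # Counting unique URLs lets one URL pasted 400 times sail through;
        # counting len(urls) (total links) would catch it.
        return len(set(urls)) > URL_LIMIT

    spam = "http://pretendurl.com " * 400
    print(needs_moderation(spam))  # False: only one unique URL, so no moderation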

Another annoying facet of MT-B: you get 100 copies of the same comment spam, you go through and block the URLs in one, and then block more and more in the rest. Since you de-spammed the entries manually, your "hits" count never increases. Furthermore, it's difficult to modify an entry once MT-B has added it to the blacklist. For example, today I got spam from pretendurl.com, and fifteen minutes later started getting spam from the same domain under a different TLD. At that point, I just added the URLPattern \bpretendurl\b to block every TLD at once. But in doing so, I had to delete the individual entries, add the new URLPattern, and move on. It would have been much simpler to modify the blacklist entry when I loaded up the first de-spamming victim.
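
For the curious, the \b in that pattern matches a word boundary, which is why one pattern covers the domain under every TLD. A quick check in Python (pretendurl being a placeholder, of course):

    import re

    # \b is a word boundary, so the pattern matches the bare domain name
    # no matter which TLD the spammer registers it under.
    pattern = re.compile(r'\bpretendurl\b')

    for url in ('http://pretendurl.com', 'http://pretendurl.net',
                'http://pretendurl.biz'):
        assert pattern.search(url)

    # ...while domains that merely contain the string still pass:
    assert not pattern.search('http://somepretendurlish.com')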

The thing is this: I understand how regular spam is profitable. Stupid people click "unsubscribe" links or actually send in for the items advertised; the business of spam pays. But people using MovableType and other blogging packages tend to be savvy folks, and I rarely see a comment spam sitting around long enough to accumulate PageRank. How is comment spam cost-effective? I delete the shit as soon as I see it - and then the spammers are blocked.

It's a pain in the ass that, at times, has me seriously considering shutting down my blog entirely - and where would that get the comment spammers?

Regardless, I've turned on yet another MT-B feature: forced moderation of comments on posts older than 15 days.
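
The idea, roughly (my own sketch of the logic, not MT-B's implementation):

    from datetime import datetime, timedelta

    MODERATION_AGE = timedelta(days=15)

    def hold_for_moderation(post_published):
        # Comments on posts older than the cutoff are held for review;
        # spammers overwhelmingly target old, forgotten entries.
        return datetime.now() - post_published > MODERATION_AGE

    # Example: a comment on a three-week-old post gets held.
    print(hold_for_moderation(datetime.now() - timedelta(days=21)))  # True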

4 Responses to "Spam Slammed"

  1. Two days after I switched back to MT (I used Textpattern for a few months) I got hit by 400 trackback spams.

    Four bloody hundred.

    I see that MT-Blacklist finally got released for MT 3.1. Installing right away.

  2. Is it possible that they have figured out when the bots crawl a certain site, and infest it just before that magical time? You might delete yours hourly, but other people might let theirs sit for a day. Thank goodness my site's not on Movable Type... I think that's the main thing keeping the bots from spamming me. It'd be so annoying for them to have to fill out my comment form 100 times manually (please don't anybody get any bright ideas to run a bad script on me).

  3. It's not just MT. Packages like Drupal get it as well. Three weeks after setting up a Drupal-based site I had to switch to registered-user-only posting because the spammers were bearing down on it. Adding email loop confirmation and a "captcha" to the first page cut the spam to nil. It's a small pain for users to create an account, but, happily, it's a one-time thing.

    But I think Jo's right: webmasters watch the Google bots' crawl schedule closely, so it's possible the spammers know just when to hit a site. Just staying up overnight is good enough to get their link in there.

  4. ...and another of my friends, who has been running WordPress for less than two weeks, just got hit hard by spammers. They're damn quick sometimes.