Tuesday, December 29, 2009

Mythbuntu 9.10 and Hauppauge HVR-2200

I'm helping a friend set up his new MythTV Ubuntu 9.10 box with Hauppauge HVR-2200 card, and it's turned out to be not such smooth sailing.

The first challenge is getting the Hauppauge tuner card working with Linux. These are the instructions that worked for us, with one change: run "sudo make menuconfig" rather than plain "make menuconfig", to avoid permissions problems.

Then there was the issue of the mythtv-backend not starting on boot, fixed by running:

sudo update-rc.d mythtv-backend defaults 50 51

(with "defaults", the 50 and 51 are the start and stop sequence numbers; starting the backend that late in the boot sequence means MySQL is already up when it comes to start)


Mapping channels is described here

Channels 9 and SBS use MPEG, not DVB, so when you add the channels, add them as MPEG, and they (mainly) work. Glenn reports that there are some issues with HD, but if you go to the SD channel first, it then finds the HD one.

Monday, December 21, 2009

Bash, and matching dot-files with wildcards

Live and learn! For years, I've used Linux, and never known how to get * to match all files (by "all" I mean including files that start with a dot).

For example:

# du -ks * | sort -n
4 courierimapsubscribed
4 tmp
20 new
148 courierimapuiddb
348 courierimapkeywords
163356 cur
# du -ks .* | sort -n
388 .Trash
456 .Drafts
664 .ldap
58844 .2007
77644 .Sent
97852 .2008
450348 .
450364 ..



It finally irritated me enough to find out:


# shopt -s dotglob
# du -ks * | sort -n
4 courierimapsubscribed
4 tmp
20 new
148 courierimapuiddb
348 courierimapkeywords
388 .Trash
456 .Drafts
664 .ldap
58844 .2007
77644 .Sent
97852 .2008
163356 cur



So: shopt -s dotglob to turn it on, and shopt -u dotglob to turn it off again. Note that with dotglob set, * still doesn't match . or .. (the .* pattern does, which is where the last two entries in the first listing came from).
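Here's a quick way to see the effect in a scratch directory, without risking anything real (the file names are just for illustration):

```shell
# make an empty scratch directory with one normal file and one dot-file
dir=$(mktemp -d)
cd "$dir"
touch visible .hidden

shopt -s dotglob
echo *    # prints: .hidden visible  (dot-files match; . and .. still don't)

shopt -u dotglob
echo *    # prints: visible
```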

Thursday, December 17, 2009

Nagios, nsclient and nsclient++

Installed the latest NSClient++ on a Windows 2000 box, added definitions to monitor it in Nagios, reloaded Nagios, and hey presto... oh wait, lots of red bits. Why? Nagios couldn't connect using the check_nt command, even though it works fine for all the other Windows servers.

Turns out the latest version listens on port 12489 by default, which isn't what check_nt (on our version of Nagios: 2.12) is expecting. So I followed a longish process to find out what port it was expecting.

The answer is: 1248

So I edited nsc.ini, set the port to 1248, restarted the service, and suddenly we're all happy.
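For reference, the edit was roughly this (the section and key names are from the 0.3-era nsc.ini; check your own copy, since the layout varies between NSClient++ versions):

```ini
; nsc.ini -- legacy NSClient (check_nt) listener
[NSClient]
; newer NSClient++ releases default to 12489, but our Nagios 2.12
; check_nt expects the old port
port=1248
```

Then restart the NSClient++ service (via the Services control panel, or net stop / net start; the exact service name depends on the version installed).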

Another little amble down the road of "huh?"

Tuesday, December 8, 2009

Mozilla + google + squid + pubmed == pain

One of my users reported the following issue: he goes to google, types "pubmed" and clicks the first link in the search results, which is in fact for PubMed (http://www.ncbi.nlm.nih.gov/pubmed/). He then gets this error:


ERROR: 404 Not Found
NCBI C++ Exception:
Info: CGI(CCgiRequestException::Unexpected or inconsistent HTTP request) "/export/home/miller/PORTAL/2.7/src/cgi/cgiapp.cpp", line 1056: --- Prefetch is not allowed for CGIs
Error: WEB(CCgiException::eInvalid) "/export/home/miller/PORTAL/2.7/src/internal/portal/web/papp.cpp",
line 82: --- OnExceptionURL is not set


The cause turns out to be the confluence of the following:

1. Firefox already implements a soon-to-be-standard HTML feature called
pre-fetching: a page can provide a series of hints about the next page
the user is likely to click through to, along with links to resources
worth pre-fetching. It's supposed to make the load time shorter. More
info here

2. Google now provide pre-fetch hints for the top links on the search
results. View the source of your search for pubmed, and you'll see this:

<link rel=prefetch href="http://www.ncbi.nlm.nih.gov/pubmed/">

3. PubMed clearly don't like people pre-fetching their site, and have
resorted to some fairly heavy-handed tactics to combat it:
see the source here
You can see they're checking for the x-moz: prefetch header
and returning HTTP status 403, with no pragma to prevent a proxy server
from caching that response. Then you click the link, and get the cached
version from the proxy, complete with the error message. This is why
shift-reload works: it forces the proxy to fetch the page again, and
since this time there's no prefetch header, it works.

There are a couple of ways to avoid this; here they are in my
preferred order:

1. PubMed find a better way to avoid prefetches on their CGIs (e.g.
either explicitly set pragma to prevent caching by proxies, or use an
HTTP 503)

2. our users get to pubmed via a bookmark

3. you can disable the Firefox pre-fetch mechanism, but that's per-user,
per-computer, which adds a lot of overhead for IT that, frankly, I could
live without.
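For the record, option 3 comes down to a single preference, network.prefetch-next, which can be flipped in about:config or dropped into a user.js in the profile directory:

```javascript
// user.js in the Firefox profile -- disables link pre-fetching
// (network.prefetch-next is the standard Firefox preference for this)
user_pref("network.prefetch-next", false);
```

If you do have centralised Firefox configuration management, the same preference can be locked for all users via the autoconfig mechanism (lockPref in mozilla.cfg), which would at least avoid the per-user, per-computer legwork.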