20090323

How not to do it – the Microsoft way

Does Microsoft (or any technology company for that matter) really understand the concept of being business user-friendly?  Leaving aside the abomination otherwise known as Office 2007, which replaced a perfectly workable toolbar with a ribbon interface that seems to actively hide the most useful components, there’s a lot of stuff going on under the hood of any self-respecting PC and/or network that requires fixing in the default setup of Windows.

The latest annoyance in this regard is the default behaviour of Server 2008 with regard to remote sessions.  Who on earth thought it was a good idea to restrict user logins to a single session?  It’s become a joke now when I or my fellow conspirators throw one another off the administrative sessions we have so lovingly crafted, just because the system thinks ‘one user, one login’.  Actually, what’s really annoying is that this is controlled by a group policy setting that I can never recall – it’s now firmly entrenched in the ‘IT problems and gotchas’ list on our company SharePoint site, and needs to go into a build document.  But I still wonder why we should have to do this, when the old Server 2003 way worked perfectly well out of the box.
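For what it’s worth (and so I can find it here next time), the setting in question should be the Terminal Services policy ‘Restrict each user to a single session’, which I believe maps onto a single registry value – worth verifying against a live server before trusting my memory:

```
Windows Registry Editor Version 5.00

; 0 = allow multiple sessions per user (the old Server 2003 behaviour)
; 1 = the new Server 2008 default, one session per user
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Terminal Server]
"fSingleSessionPerUser"=dword:00000000
```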

Another default setting that annoys is the prejudice against NAT.  I’m tired of going through the registry edit and reboot required to get PPTP or L2TP working via a standard router to the Internet.  For bonus annoyance points, MS decided to change the required registry values between XP/2003 and Vista/2008.
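For the record, the value needed for L2TP/IPsec behind NAT is, as far as I can tell, AssumeUDPEncapsulationContextOnSendRule – and the key it lives under is precisely what changed between versions.  A sketch of the Vista/2008 flavour (a reboot is still needed afterwards):

```
Windows Registry Editor Version 5.00

; Vista / Server 2008 location; on XP / Server 2003 the same value
; goes under HKLM\SYSTEM\CurrentControlSet\Services\IPSec instead.
; 2 = allow both client and server to sit behind NAT devices.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\PolicyAgent]
"AssumeUDPEncapsulationContextOnSendRule"=dword:00000002
```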

The final target of ire this evening is the Windows Firewall.  All very nice in theory, but deplorable in practice.  We set up all these nice client PCs, get back to base – and find we can’t access them remotely.  Group policy to the rescue again, and disable that service.
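For anyone fighting the same battle, the policy we use lives somewhere along these lines (from memory, so treat the exact wording with suspicion):

```
Computer Configuration
  > Administrative Templates > Network > Network Connections
    > Windows Firewall > Domain Profile
      > Windows Firewall: Protect all network connections = Disabled
```

On an individual Vista or 2008 machine, netsh advfirewall set allprofiles state off does the same job at the command line (the XP-era equivalent was netsh firewall set opmode disable).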

I seem to spend half my time switching off, cancelling or simply bypassing roadblocks put in the way of simple, successful system management.  By all means create a ‘home user’ build for Windows, but please can we have a ‘business edition’ without all the unnecessary hindrances?

20090315

Atom’s back

Someone was working over the weekend – RSS functionality has returned to Blogger, and the feed definitions are being updated correctly.  Or at least as correctly as the Blogger style sheet will allow.  One monolithic line of code makes it difficult to be absolute about things like this.

I may be old-fashioned (OK, there’s no doubt – I am old-fashioned) and I certainly prefer human-readable text.  What does it cost to put a few line breaks and tab characters into a text file?  In this particular case, about a 2% increase in file size, which should not be a deal breaker.  For that, you get a text file that can be understood, and even parsed by the human eye.

Perhaps you might argue that these things are not intended to be read by humans: their function is simply to define a set of data for a computer to work with.  But I would respond that we should not lose sight of the need for people to understand what computers are doing.  OK, there are some time and space-constrained operations where maximum efficiency is needed, but XML code is not amongst these: by its very nature, it’s a form of text, and doesn’t aspire to terribly efficient operation.  There’s a tremendous amount of repetition, unnecessary descriptive strings, and sheer bloat in even the most tightly structured XML file.  So why not add that extra 2% and make it readable?
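To show just how cheap that readability is: Python’s standard library will re-flow a monolithic XML line in a couple of statements.  A sketch, with a made-up fragment standing in for the real feed:

```python
import xml.dom.minidom

# A stand-in for a feed served as one monolithic line of markup.
one_line = ('<feed><title>Tech Ponderings</title>'
            '<entry><title>Atom is back</title></entry></feed>')

# Re-indent it: each element lands on its own line.
pretty = xml.dom.minidom.parseString(one_line).toprettyxml(indent='  ')
print(pretty)
```

The overhead is a handful of whitespace characters per element – the few percent mentioned above.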

If you want that space back, parameterise the contents.  For instance, my atom.xml file contains my name repeated many times over – why not reduce it to a simple variable defined at the start of the file?
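As it happens, XML has had a mechanism for this since day one: an internal DTD entity is exactly the sort of variable I mean, although I suspect the Atom specification and most feed readers would take a dim view of it.  A sketch, with a placeholder name standing in for mine:

```xml
<?xml version="1.0"?>
<!DOCTYPE feed [
  <!-- hypothetical author name, defined once at the top -->
  <!ENTITY me "J. Bloggs">
]>
<feed xmlns="http://www.w3.org/2005/Atom">
  <author><name>&me;</name></author>
  <entry>
    <author><name>&me;</name></author>
  </entry>
</feed>
```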

If the standard insists on fully expanded strings, then go all the way: add those two or three extra characters per line and make them proper lines.  Apart from anything else, this ensures that mistakes can be picked up more easily – though I must admit, the original bug in the generating code, which produced empty XML files, was never going to be found quickly from a blank output file.

20090314

Don’t update software on a Saturday

What’s today’s problem?

Well, bear with me, there’s a list:

Blogger has lost my RSS feed – and that of a lot of other people as well, apparently.  The automatically generated atom.xml is being created as a zero-byte file.  So there’s a better than even chance that you won’t even see this post, at least while it’s meaningful.  There is a workaround, for anyone who needs it: use the feed identifier in the form of http://www.blogger.com/feeds/7959034/posts/default, where the number needs to be updated to your own account value.  Drop that into the template, republish, and the feed works, at least for anyone picking it up this time around.  Apparently, it’s also possible to copy the actual file to your own atom.xml, but this needs to be done each and every time the blog gets updated, so to date I simply haven’t mustered up the energy.  Blogger already know about the problem, but it’s the weekend now and I suppose they’ve taken the time off.

Twitter – someone went and changed the API or something, and my old ceTwit application on the phone gave up reading posts.  I did manage to download an update, but this doesn’t seem to show a public timeline and consistently reports a possibly broken API, leaving me wondering whether it works at all.  However, I did manage to locate, download, configure and use a different product, Twobile, which is working OK.  Since I seem to post Twitter updates even less frequently than I update this blog, however, it probably won’t break their service too soon.

For my final act of IT vandalism today, I thought I’d update the nVidia drivers for the graphics card in the workstation, a mere 8800 GT.  Wonders will never cease – this actually worked.  Flushed with success, I also thought I’d go for the Windows Live tools (including Live Writer), and this is where it seems to have gone a little wrong – various components seem to be having issues with the firewall, and I no longer know if it will work.

Still, look on the bright side: even if it does work and this post gets published, no-one will ever see it as long as RSS is broken.

Caution - low flying laptop

The driver update didn't have the desired effect.  Still no profile updates, unless I manually change the registry entry.  One possible bright spot was the discovery of a script to do that job semi-automatically, but... it's in German and designed for Windows XP, not Vista.  My translation skills weren't good enough to get it working in the fifteen minutes or so before my attention was diverted elsewhere.

There's considerable discussion about how the video drivers achieved WHQL certification with such an evident bug - and why they haven't been fixed in over a year.  I see a lot of trumpeting success in getting the certification - but not an ounce of responsibility for errors like this.  Who is really at fault here - nVidia, Microsoft or Dell?

On the other hand, one beneficial effect of the new video drivers has been to prevent the irritating loss of dual-screen functionality that occurred when the PC was locked: on return to activity, I'd see the second screen active for about five seconds, then it would switch off and push all active windows back to the primary screen, necessitating a trip to the keyboard to reactivate the monitor and then drag everything back over.  As the system still knew the second screen was there for those few moments, why did it then disable it?  Anyway, this bug at least is now fixed.  What's interesting is that it took me most of the day to notice the change in behaviour, which only goes to show how it should have worked in the first place.

20090312

Driven to distraction

I use a Dell Precision M4300 laptop for work - whilst not the lightest device around, it has sufficient horsepower and storage, and offers enough interfaces and screen real estate to cope with my various demands.

But since purchase it's been running slowly - Task Manager showed around 20-40% CPU utilization, all down to the System process.  Now this is a real can of worms: nearly everything that keeps the PC running is handled by that process.  So I lost patience today and started to research the problem.

Turns out it's a fairly well-known issue - Mark Russinovich (he of Sysinternals fame) had already seen the same problem, and (being who he is) had figured out the cause.  Wonders will never cease - it turned out to be the same driver, same symptoms and same cure for me.

This raises the question of why an issue that was known about a year ago (April 2008) has still not been addressed by Dell.  I was able to follow the Broadcom links from Mark's blog entry to an updated driver for the NIC, but Dell's support site was less than helpful.  Once the new driver was installed, it was like working on a new PC - so much faster, better response to keyboard input, sheer luxury.  Imagine how much I would have appreciated it if Dell had taken the opportunity to update their support site drivers to match.

Another issue along the same lines is the odd behaviour of the domain profile I use: apparently, it hasn't been updated on the server since around November of last year.  This means that changes I make on my desktop, for instance, are not reflected in the server copy, while files I thought I had deleted keep coming back: I have to log on to the server and delete them from the copy there.  This one is weird: it turns out to be a problem with the Nvidia display driver (for the FX 360M, a device that seems to be supported only by Dell, not Nvidia themselves) that corrupts a couple of registry entries - a problem that only becomes apparent when you use Windows Server 2008 as a domain controller.  It just so happens that we switched to Server 2008 last November...  I'm not sure if this latter problem is resolved yet: having already rebooted the PC several times to try and fix the NIC driver earlier in the day, I didn't get around to doing it again, so we'll see if the desktop reverts again tomorrow.  If so, I suspect the next step is to see if a laptop PC can develop wings in the short time it will take to fly across the office.

20090307

A site for sore eyes

I wonder if there's a degree course somewhere in website optimisation?  Even leaving aside the search engines completely, actually managing a site isn't a one-click operation.  Over time, the contents get more complex: this is true even (especially?) of a site like mine, which has been in existence for at least five years, albeit woefully lacking attention.  During this time, I occasionally discover new tools and new systems that get tried out, leaving remnants of pages and folders containing code that I have no way of identifying or managing properly.

It must take a very organised and disciplined approach to keep a site clean over time: even something as simple as making copies means duplicates are likely to proliferate as local machines are replaced or new server installations come into play.  For instance, I have at least two copies of my own site on my home machine, and no real way of keeping them synchronised: FileZilla gives some help in this regard, but it's not really able to deal with things in the same way as, for instance, SyncToy or Microsoft Live Sync.  These latter two are very nice, but I haven't managed to get them to support an FTP folder yet, so they don't quite fit the bill.
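Even without proper synchronisation, a few lines of Python would at least tell me where two copies of the site have drifted apart.  A sketch using the standard library's filecmp, with throwaway folders standing in for the real ones:

```python
import filecmp
import os
import tempfile

# Two throwaway folders standing in for the two local copies of the site.
copy_a = tempfile.mkdtemp()
copy_b = tempfile.mkdtemp()

# A page present in both copies...
for folder in (copy_a, copy_b):
    with open(os.path.join(folder, 'index.html'), 'w') as f:
        f.write('<html>home</html>')

# ...and one that only made it into the first copy.
with open(os.path.join(copy_a, 'experiment.html'), 'w') as f:
    f.write('<html>draft</html>')

diff = filecmp.dircmp(copy_a, copy_b)
print('only in copy A:', diff.left_only)
print('in both copies:', diff.common_files)
```

Note that dircmp compares names and os.stat signatures by default, so it flags drift rather than guaranteeing byte-for-byte identity.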

Once I have a local copy that I can depend upon, usually achieved by creating a new folder (don't want to risk losing old data.  Oops, there's another copy now!), I can check the contents out, and this is where I get really confused.  Some of the pages were created in MS Word (I know, I should have used Notepad, but I was feeling lazy); some were created using whatever the tool of the moment was on the hosting company's site; some are straight from Photoshop or Lightroom; some are out of Blogger.  So far, I haven't gone as far as using Expression or anything like that, but I am seriously considering junking nearly everything (bar the contents of the blog) and starting again, with a properly designed site and a tool that I can stick to.

Whatever I select will need to have certain features:

  • Photo management.  Part of my lazy approach is that I don't like emailing pictures to all and sundry.  However, neither do I trust Flickr or the like to hold my photos.  (The truth is probably that I don't reckon they'd stand up to the public scrutiny there.)  So I like to just upload batches of pictures, mail interested parties and leave it up to them.
  • Blog integration.  I like Blogger, and also Live Writer's integration.  It may not be professional, but it sure is easy.
  • Ease of experimentation.  There's no way I'll be able to resist trying something new.  Pages will find their way in which aren't published via the approved route.  A site management tool must be able to integrate these back from the web into local storage.

I think the search is on.