McAfee, Internet Explorer and a lack of quality control at Toshiba

This content is 12 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Last week, I wrote about helping my father-in-law to ensure that the insurance company wasn’t fleecing him whilst replacing his stolen laptop.  His new machine (a Toshiba Satellite C855-12G) arrived this week (although it appears to be a discontinued model, which is presumably the reason it was discounted…) and I’ve spent part of the evening on family IT support duty getting it set up for him.

Unfortunately, I also found that the webcam is faulty (at least, neither Toshiba’s webcam application, Windows Device Manager nor Skype can see it, despite having downloaded the latest drivers from the Toshiba website), suggesting that Toshiba’s quality control is pretty shoddy (this doesn’t appear to be an isolated incident – see link 1, link 2, link 3). Back in the day, Toshiba was a respected notebook PC brand but I guess I should have insisted on Lenovo, Samsung or Dell…

Anyway, the real purpose of this post was to record some of the issues (and resolutions) that I found whilst removing the “crapware” from this new PC. To be fair, I’ve seen worse and the main thing to remove (apart from a non-English version of Windows Live Essentials) was McAfee Internet Security.  It never ceases to amaze me how many people will shell out cash for this type of application when there are perfectly good free alternatives, so I replaced it with Microsoft Security Essentials.

Unfortunately, the McAfee uninstaller wouldn’t run, displaying an Internet Explorer-esque “Navigation was cancelled” screen (but without any chrome). As Skype was also having problems adding contacts, I started to suspect that something was blocking web traffic and that hunch turned out to be valid: disabling Internet Exploder 9’s Content Advisor did the trick. How anybody can use it is beyond me (I had to enter a password four times just to switch from Windows Update to Microsoft Update) but, once Content Advisor was disabled, both Skype and the McAfee uninstaller worked as they should.


Comparing PC specifications for average family use

A couple of weeks ago, my parents-in-law were unlucky enough to be burgled. Thankfully they were not at home at the time and the thieves didn’t manage to take too much.  One thing they did take though, was their laptop computer.

The insurance company made an offer for a comparable PC to replace the stolen one (new for old) but, as four years is a long time in computing, I wanted to be sure that they really were getting a similar specification in 2012 terms. I’d been careful when I bought the original for them to get something that was OK for standard web surfing, email, etc. but not too expensive. Similarly I didn’t want anything bargain basement as it would only cause me “family IT support issues” later.

My normal answer, when asked for advice on buying new PCs, etc. is to look at the PC Pro A List to see what’s currently rated. Unfortunately that doesn’t help so much when taking a bottom-up view (i.e. starting out with a proposed model and seeing if it offers everything you need, rather than a top-down approach with a purpose in mind and choosing the model to match).

So I turned to the ‘net for advice. As helpful as my Twitter followers were, “what is a decent PC spec for the average home user?” is a pretty subjective question and the answers ranged from “I love my Core i7-powered beast” to “Core i3 should be fine”, with some suggesting that i3 might not have enough grunt and I should get an i5 instead. As it happened, there was a similar Core i5 model at the same price as the i3, but with 4GB RAM instead of 6GB, so I got the insurance company to plump for the faster processor (I can add RAM later).

Wikipedia was also useful, for reading up on graphics chipsets (to work out why the Intel HD Graphics 3000 chipset was an improvement on the Intel GMA X3100 in the old PC – don’t be fooled by the smaller number, it seems), and to confirm that I wasn’t getting a modern version of a budget Celeron processor.

One website really stood out though, with great advice on the various processors and how they compared to each other. That site was notebookcheck.net (I was looking at the Intel Core i3-2350M and Core i5-2450M) and I’m pretty sure I’ll be revisiting it when I need to compare specs again in future…

Installing Ubuntu 12.04 on an old laptop without PAE

Initially perfect for young children (portable, cheap, small keyboard), my sons’ netbook now feels too restrictive because of its 1024×576 screen resolution and, with no Flash Player, some of the main websites they use (Club Penguin, CBeebies) don’t work on the iPad either. Setting up an external monitor each time they want to use the computer is not really practical, so I needed to find another option – for now that option is recycling the laptop that my wife replaced a couple of years ago (and which has been in the loft ever since…)

The laptop in question is an IBM ThinkPad T40 – a little long in the tooth but with a 1.5GHz Pentium M and 2GB of RAM it runs OK, although hundreds of Windows XP updates have left it feeling a little sluggish. Vista and 7 are too heavyweight so I decided to install Ubuntu (although I might also give ChromeOS a shot).

Unfortunately, the Ubuntu 12.04 installer stalled, complaining about a lack of hardware support:

This kernel requires the following features not present on the CPU:

pae

Unable to boot – please use a kernel appropriate for your CPU

So much for Linux being a lightweight operating system, suitable for use on old hardware (in fairness, other distributions would have worked). It turns out that this is a known issue and there are a few workarounds – the one that worked for me was to use the non-PAE mini.iso installer (I wasn’t prompted to select the generic Linux kernel, but I did have to select the Ubuntu Desktop option).
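Incidentally, you can check in advance whether a CPU advertises PAE by looking at the flags line in /proc/cpuinfo (Linux only). As a minimal sketch – the file format parsing here is the only assumption:

```python
def cpu_has_pae(cpuinfo_path="/proc/cpuinfo"):
    """Return True if a 'flags' line in the cpuinfo file lists 'pae'."""
    with open(cpuinfo_path) as f:
        for line in f:
            if line.startswith("flags"):
                # The line looks like 'flags : fpu vme ... pae ...'
                return "pae" in line.split(":", 1)[1].split()
    return False

if __name__ == "__main__":
    if cpu_has_pae():
        print("PAE present: the standard Ubuntu 12.04 installer should boot")
    else:
        print("No PAE: use the non-PAE mini.iso instead")
```

On a CPU like the T40’s Pentium M this check would come back false (some Pentium M chips don’t advertise the PAE flag), which is exactly why the stock installer refuses to boot.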

Once Ubuntu was installed, I turned to Joey Sneddon (@d0od)’s useful article on the OMG! Ubuntu site listing 10 things to do after installing Ubuntu 12.04 – this helped with things like installing codecs, Adobe Flash and better LibreOffice menu integration – now to see how the family gets on with a non-Windows OS… I suspect the kids will hardly notice the difference.

Wake on LAN braindump

I lost quite a bit of sleep over the last few nights, burning the midnight oil trying to get my Dell PowerEdge 840 (server repurposed as a workstation) to work with various Dell management utilities and enable Wake On LAN (WoL) functionality.

It seems that the various OpenManage tools were no help – indeed, many of the information sources I found for configuring the Baseboard Management Controller and kicking SOLProxy and IPMI into life seemed to be out of date, or just not applicable on Windows 7 (although ipmish.exe might be a useful tool if I get it working in future, as it can be used to send WoL packets). I did find that, annoyingly, WinRM 2.0 needs an HTTPS connection and that a self-signed certificate will not be acceptable (according to Microsoft knowledge base article 2019527). If I ever return to the topic of WinRM and IPMI, there’s a useful MSDN article on installation and configuration for Windows Remote Management.

In the end, even though my system is running Windows 7, the answer was contained in a blog post about a PowerEdge 1750, WoL and Debian:

“Pressing ‘CTRL-S’ brings us to a configuration panel which allows for enabling the Wake-On-LAN (WOL) mode of the card.”

I’d been ignoring this because the Ctrl-S boot option advertises itself as the “Broadcom NetXtreme Ethernet Boot Agent” (and I didn’t want to set the machine up to PXE boot) but, sure enough, after changing the Pre-boot Wake On LAN setting to Enable, my PowerEdge 840 started responding to magic packets.

On my WoL adventure, I’d picked up a few more hints/tips too, so I thought it worth blogging them for anyone else looking to follow a similar path…

“Windows 2000 and Windows 2003 do not require that WOL be turned on in the NIC’s or LOM’s firmware, therefore the steps using DOS outlined in the Out-of-Box and Windows NT 4.0 procedures are not necessary and should be skipped. Enabling WOL with IBAUTIL.EXE, UXDIAG.EXE or B57UDIAG.EXE may be detrimental to WOL under Windows 2000 and Windows 2003.”

  • Presumably this advice also applies to Windows XP, Vista, Server 2008, 7 and Server 2008 R2, as they are also based on the NT kernel, so there is no need to mess around with DOS images and floppy drives to try and configure the NIC…
  • I downloaded Broadcom’s own version (15.0.0.21, 19/10/2011) of the Windows drivers for my NIC (even though Windows said that the Microsoft-supplied drivers were current) and I’m pretty sure (although I can’t be certain) that the Broadcom driver exposed advanced NIC properties, not previously visible, to control Wake Up Capabilities and WoL Speed. (Incidentally, I left all three power management checkboxes selected, including “Only allow a magic packet to wake the computer”.) There’s more information on these options in the Broadcom Ethernet NIC FAQs.
  • There is a useful-sounding CLI utility called the Broadcom Advanced Control Suite that I didn’t need to download; however, its existence might be useful to others.
  • Depicus (Brian Slack) has some fantastic free utilities (and a host of information about WoL).
  • Other WoL tools exist too, although I think Depicus has the landscape pretty much covered.
  • There’s also some more information about WoL on Lifehacker.
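For the record, a WoL “magic packet” is nothing exotic: 6 bytes of 0xFF followed by the target NIC’s MAC address repeated 16 times, usually sent as a UDP broadcast (commonly to port 7 or 9). As a minimal sketch of what tools like the Depicus utilities are doing under the hood (the MAC address shown is hypothetical – substitute your own NIC’s):

```python
import socket

def build_magic_packet(mac):
    """Build a WoL magic packet: 6 bytes of 0xFF, then the MAC repeated 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("expected a 6-byte MAC address")
    return b"\xff" * 6 + mac_bytes * 16  # always 102 bytes

def send_magic_packet(mac, broadcast="255.255.255.255", port=9):
    """Broadcast the magic packet on the local subnet."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(build_magic_packet(mac), (broadcast, port))

# Example (hypothetical MAC address):
# send_magic_packet("00:1e:c9:aa:bb:cc")
```

The packet only works once the NIC’s pre-boot WoL setting is enabled, as described above – the operating system is, of course, not running to receive anything.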

Fixing a Dell server that required F1 on every boot

Last weekend, I dusted off (literally) my Dell PowerEdge 840, which was retired in favour of a low-power server a couple of years ago. My employer’s IT policies are making it harder and harder to do any personal computing from work (I know my laptop is for work but there’s a big grey area between work and play these days) and, whilst the Mac Mini is fine for music, a bit of browsing and email, I wanted something a bit more “heavy duty” for some of my home computing needs. With 8GB of RAM and a quad-core Xeon CPU, my old server is a pretty good workstation (7.0 on the Windows Experience Index for CPU and memory, 5.9 for primary hard disk, but only 1.0 for graphics!) and so it’s been brought back into service as a Windows 7 PC.

Unfortunately, every time I booted it, I had to press F1, until I worked out that it was still looking for some hard disks that I had removed.  Delving into the BIOS and switching the spare SATA ports to Off, rather than Auto, sorted out the problem and now the system boots without issue.

Getting started with Raspberry Pi (#RasPi)

Much to my manager’s disgust (he has a programming background, whilst I’m an infrastructure guy “by trade” – although I did write code in my youth!), my Raspberry Pi arrived last week. Despite the botched launch, I still think this is one of the most exciting products we’ll see this year because, well, because it’s a fully functioning computer for around £25 (Model B) and that means the potential addressable market is enormous. Actually, that’s not quite right – the Pi is around £25 (plus VAT) and then you’ll need some peripherals – although they should be relatively easy to lay your hands on:

  • A micro-USB mobile phone charger (I use the one that came with my Nokia Lumia 800 but any 5V supply that can feed a micro-USB cable will do)
  • A USB keyboard
  • (Optionally) a mouse
  • (Optionally) some speakers
  • (Optionally) a USB hub (powered)
  • A wired network connection
  • An SD card
  • A display – but watch out as Raspberry Pi supports HDMI and component out (RCA) – not VGA.

My monitors are mostly VGA (I have one that will take DVI) and my TV is far too old for HDMI (it’s a 14-year-old Sony Trinitron 32″ widescreen CRT!) so I set the Pi up to use the analogue connection to the TV.

Installing the operating system

I selected a Linux distro (the Raspberry Pi blog suggests that Fedora Remix is the recommended distro, as does the FAQ, although there is extensive discussion about whether to use Fedora or Debian; the Raspberry Pi quick start guide suggests that developers should use Debian, and there are alternative downloads too). Eventually, I managed to install the Raspberry Pi Fedora Remix on my SD card (my Ubuntu machine recognised the SD card, but the Python version of the Fedora ARM Image Installer didn’t*; meanwhile, my work laptop installed an image on the SD card but it wouldn’t boot – I suspect that’s down to the disk encryption software we use; finally, I managed to run the Windows version of the Fedora ARM Image Installer on another Windows 7 PC).

Once I had an operating system installed, I booted and the RasPi picked up an IP address from my DHCP server, registered itself in DNS (raspi.domainname) and set to work expanding its disk to fill the 8GB SD card I’m using.

*getting this installer to work involved installing the python-qt4 package in the Ubuntu Software Centre, then running ./fedora-arm-installer.

Switching displays

Unfortunately, standard definition CRT TVs are no better at working with Raspberry Pis than they are with any other computer (except a games console) – quite why I thought otherwise is a mystery…

With only part of the display visible via component out (and not exactly easy to read) I started to investigate options for use of the HDMI port.  It turns out that HDMI to VGA is too expensive, but an HDMI to DVI cable cost just £2.39 at Amazon (thanks to Chromatix, The EponymousBob and GrumpyOldGit on the Raspberry Pi forums for sharing this info). With the RasPi hooked up to my only digital monitor, everything was much easier, although I did have to plug the cable directly into the monitor and I’m now waiting for delivery of a DVI-I female to female gender changer so that it’s a bit easier to swap the monitor cable between my computing devices.

So, what’s it like to use then?

Did I mention that the Raspberry Pi is a fully functioning computer for around £25? Well then, what’s not to like? Sure, performance is not lightning fast – the Raspberry Pi FAQs suggest:

“… real world performance is something like a 300MHz Pentium 2, only with much, much swankier graphics”

but that’s plenty for a bit of surfing, email and teaching my kids to write code.

I am finding though that I’m struggling a little with my chosen distro. For example, I haven’t yet managed to install Scratch and it doesn’t seem to be one of the recognised packages so I may have to resort to compiling from source – hardly ideal for getting kids started with coding. For that reason, I might switch to Debian (I’m downloading it as I write) but for now I’ll continue to explore the options that the Fedora Remix provides.

I’m sure there will be more RasPi posts on this blog but if you’re one of the thousands waiting for yours to arrive, hopefully this post will help to prepare…

And once the educational models are available, I’ll be encouraging my sons’ school to buy a lab full of these instead of a load more netbooks running Windows XP…

Running Android on a netbook

I’ve been thinking for a while it might be an interesting experiment to get Android running on my netbook. Amazingly, it was incredibly simple, thanks to a bit of ‘net research and the Android x86 project.

Sam Cater’s Android on your netbook post covers all the basics of downloading the software and preparing a USB stick to boot it (using UNetbootin). Depending on your hardware, you may find that you need a different version – I couldn’t get the Ice Cream Sandwich (Android 4.0) RC to work on my Lenovo S10e, for instance, but a deprecated generic version of Froyo (Android 2.2) seemed to boot with no issues.

It doesn’t even seem to matter that there is no touch support in my chosen hardware – the mouse and keyboard seemed to do the job for me. It will need some more work for me to get Wi-Fi into action (this thread might help) but, for now, I’m happy that 10 minutes on the ‘net (and that’s all it was – 15 at the most) found a use for an old USB stick and gave me a chance to have a play.


Cloning a Windows system disk using nothing but free software

As part of the process of replacing the hard disk in my server at home, I needed to clone the operating system between two drives. As my Windows Server installation consists of two partitions (my C: and a 100MB system reserved partition), I couldn’t use Microsoft’s disk imaging tool (imagex.exe) as it only works on single partitions (i.e. it’s not possible to image multiple partitions in a single operation).

I could have used commercial software like Symantec Ghost but I figured there must be a legitimate, free, way to do this by now and it turns out there is – I used the open source Clonezilla utility (I also considered some alternatives but found that some needed to be installed and I wanted something that would leave no trace on the system).

I had some issues at first – for some reason my machine wouldn’t boot from the CD I created but I found the best way was to install Clonezilla on the target disk.

To do this, I put the new disk in a USB HDD docking station and created a 200MB FAT partition on it. Next, I downloaded the .ZIP version of Clonezilla and copied the files to the new disk. I then ran /utils/win32/makeboot.bat to make the disk bootable (it’s important to run makeboot.bat from the new disk, not from the .ZIP file on the local system disk). The last step (which I didn’t see in the instructions and spent quite a bit of time troubleshooting) is to make the new disk active (using Disk Management or diskpart.exe).

With Clonezilla installed on my “new” disk, I connected it to the server and booted from this disk, electing to load Clonezilla into RAM and overwrite it as part of the cloning process.

I then left it to run for a few minutes before removing the old disk and rebooting back into Windows Server.

(Quite why I’m still running a Windows Server at home, I’m not sure… I don’t really need an Active Directory and for DNS, DHCP and TFTP I really should switch to Linux… I guess Windows is just what I know best… it’s comfortable!)

Three gotchas to be aware of:

  • If you don’t make the Clonezilla partition active you won’t be able to boot from it (basic, I know, but it’s not in the instructions that I followed).
  • Clonezilla clones the partitions as they are (i.e. it’s a clone – and there is no resizing to use additional space on the disk) – it’s easy to expand the volume later, but if you’re moving to a smaller disk, you may have to shrink the existing partition(s) before cloning.
  • The AMD64 version of Clonezilla hung at the calculating bitmap stage of the Partclone process, with a seemingly random remaining time and 0% progress. I left this for several hours (on two occasions) and it did not complete (it appeared to write the partition structure to the disk, but not to transfer any data). The “fix” seems to be to use the i686 version of Clonezilla.

Using Windows to remove a Mac OS X EFI partition from a disk

The old hard drive from my Mac is destined to find a new role in my low-power server (hopefully dropping the power consumption even further by switching from a 3.5″ disk to a 2.5″ disk). Before that can happen though, I needed to wipe it and clone my Windows Server installation.

After hooking the drive up, I found that it had two partitions: one large non-Windows partition that was easily removed in Server Manager’s Disk Management snap-in; and one EFI partition that Disk Management didn’t want to delete.

The answer, it seems, is to dive into the command line and use diskpart.exe.

After selecting the appropriate disk, the clean command quickly removed the offending partition. I then initialised it in Disk Management, electing to make it an MBR disk (it was GPT).

Cloning my Mac’s hard drive to gain some extra space

My MacBook (bought in 2008, unfortunately just before the unibody MacBook Pros were introduced) has always been running with upgraded memory and storage but it was starting to creak.  Performance is okay (it’s not earth-shattering but all I do on this machine is digital photography-related workflow) and it won’t take any more RAM than the 4GB I have installed but I was constantly battling against a full hard disk.

After a recent holiday when I was unable to archive the day’s shots and had to start filling my “spare” (read old and slow) memory cards to avoid deleting unarchived images, I decided to upgrade the disk. I did briefly consider switching to a solid state solution (until I saw the price – enough to buy a new computer), then I looked at a hybrid device, before I realised that I could swap out the 320GB Western Digital SATA HDD for a 750GB model from Seagate. The disk only cost me around £73 but next day shipping bumped it up a bit further (from Misco – other retailers were offering better pricing but had no stock). Even so, it was a worthwhile upgrade because it means all of my pictures are stored on a single disk again, rather than spread all over various media.

Of course, no image really exists until it’s in at least two places (so I do have multiple backups) but the key point is that, when I’m travelling, Lightroom can see all of my images.

I didn’t want to go through the process of reinstalling Mac OS X, Lightroom, Photoshop CS4, etc. so I decided to clone my installation between the two disks.  After giving up on a rather Heath Robinson USB to IDE/SATA cable solution that I have, I dropped another £24.99 on a docking station for SATA disk drives (an emergency purchase from PC World).

I’m used to cloning disks in Windows, using a variety of approaches with both free OS deployment tools from Microsoft and third-party applications. As it happens, cloning disks in OS X is pretty straightforward too; indeed, it’s how I do my backups, using a utility called Carbon Copy Cloner (some people prefer SuperDuper!). Using this approach I created a new partition on the new disk (in Disk Utility), then cloned the contents of my old hard disk to the new partition (with Carbon Copy Cloner), then test booted with both drives in place (holding down the Alt/Option key to select the boot device), before finally swapping the disks over once I knew that the copy had been successful. Because it’s a file-level copy it took some time (just under six hours) but I had no issues with partition layouts – the software simply recreated the original file system on the partition that I specified on the new disk. There are more details of the cloning process in a blog post from Low End Mac, but it certainly saved me a lot of time compared with a complete system rebuild.

Now all I need to do is sort out those images…