NetBooks, solid state drives and file systems

This content is 16 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

Yesterday, I wrote about the new NetBook PC that I’ve ordered (a Lenovo IdeaPad S10). In that post I mentioned that I had some concerns about running Windows 7 on a PC with a solid state drive (SSD) and I wanted to clarify something: it’s not that Windows 7 (or any other version of Windows) is inherently bad on SSD, it’s just that there are considerations to take into account when making sure that you get the most out of a solid state drive.

Reading around various forums it’s apparent that SSDs vary tremendously in quality and performance. As a consequence, buying a cheap NetBook with a Linux distro on it and upgrading the SSD to a larger device (the Linux models generally ship with lower capacity SSDs than their more expensive Windows XP brethren) is not necessarily straightforward. Then there’s the issue of form factor – not all SSDs use the same size board.

Another commonly reported issue is that NTFS performance on an SSD is terrible and that FAT32 should be used instead. That rings alarm bells with me because FAT32 has no file-level access control lists and a maximum file size of 4GB – so it’s no good for storing DVD ISOs (not that you’ll fit many of those on the current generation of SSDs, and most NetBooks don’t ship with an optical drive anyway).
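For anyone wondering where that 4GB ceiling comes from: FAT32 records each file’s size in a 32-bit field, so no file can exceed 2³²−1 bytes. A quick sketch (the DVD image size is just an illustrative figure, not a measurement):

```python
# FAT32 stores file sizes in a 32-bit field, capping files just under 4GB.
FAT32_MAX_FILE = 2 ** 32 - 1   # 4,294,967,295 bytes

def fits_on_fat32(size_bytes: int) -> bool:
    """True if a file of this size can be stored on a FAT32 volume."""
    return size_bytes <= FAT32_MAX_FILE

dvd_iso = int(4.7e9)            # single-layer DVD image, roughly 4.7GB
print(fits_on_fat32(dvd_iso))   # False - too large for FAT32
```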

The reason for poor NTFS performance on SSDs may be found in a slide deck from the 2008 Windows Hardware Engineering Conference (WinHEC), where Frank Shu, a Senior Program Manager at Microsoft, highlights:

  • The alignment of NTFS partition to SSD geometry is important for SSD performance in [Windows]
    • The first Windows XP partition starts at sector #63; the middle of [an] SSD page.
    • [A] misaligned partition can degrade [the] device’s performance […] to 50% caused by read-modify-write.
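The misalignment claim is easy to verify with a little arithmetic – a sketch, assuming the common 512-byte logical sector and a 4KB SSD page (actual page sizes vary by drive):

```python
# Check whether a partition's starting sector lands on an SSD page boundary.
# 512-byte sectors and 4KB pages are typical values, not read from a device.
SECTOR_SIZE = 512        # bytes per logical sector
SSD_PAGE_SIZE = 4096     # bytes per SSD page (varies by drive)

def is_aligned(start_sector: int) -> bool:
    """True if the partition's byte offset falls on an SSD page boundary."""
    return (start_sector * SECTOR_SIZE) % SSD_PAGE_SIZE == 0

print(is_aligned(63))    # False - Windows XP's default start sector lands mid-page
print(is_aligned(2048))  # True - a 1MB offset aligns cleanly
```

Every write to a misaligned partition can straddle two SSD pages, forcing the read-modify-write cycle the slide deck mentions.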

It sounds to me as if those who are experiencing poor performance on otherwise good SSDs may have an issue with the partition alignment on their drives (whilst SSDs come in a smaller package, are resistant to shock and vibration, use less power and generate less heat than mechanical hard drives, their life and performance vary wildly). Windows 7 implements some technologies to make best use of SSD technology (read more about how Windows 7 will, and won’t, work better with SSDs in Eric Lai’s article on the subject).

In addition, at the 2007 WinHEC, Frank Shu presented three common issues with SSDs:

  • Longer setup time for command execution.
  • SSD write performance.
  • Limited write cycles for NAND flash memory (100,000 write cycles for single-level cell devices and 10,000 write cycles for multi-level cell devices).

(He also mentioned cost – although this is dropping as SSDs become more prevalent in NetBooks and other PC devices aimed at highly-mobile users).

In short, SSD technology is still very new and there are a lot of factors to consider (I’ve just scratched the surface here). I’m sure that in the coming years I’ll be putting SSDs in my PCs but, as things stand at the end of 2008, it’s a little too soon to make that jump – even for a geek like me.

Incidentally, Frank Shu’s slide decks on Solid State Drives – Next Generation Storage (WinHEC 2007: WNS-T432) and Windows 7 Enhancements for Solid-State Drives (WinHEC 2008: COR-T558) are both available on the ‘net and worth a look for anyone considering running Windows on a system with an SSD installed.

Why Lenovo’s S10 seemed like a good idea(pad) to me

I try to keep my work and home life on different computers. It doesn’t always work, but that’s the idea anyway. The problem is that, every time I’m away from home (which is when I get most of my blogging done), I find myself carrying around two laptops and, even without any peripherals (power adapters, etc.), that’s 4.5kg of luggage. Any sensible person would use an external hard disk for one of the workloads but… there you go…

I’ve been watching developments with small form-factor PCs (so-called “NetBooks”) for a while now and over the weekend I took the plunge. Tomorrow morning I’m expecting a delivery of a Lenovo IdeaPad S10 to slip in my bag alongside the Fujitsu-Siemens S7210 that I use for work.

So why did I choose the Lenovo?

  • In terms of build quality, my IBM ThinkPad is far and away the best notebook PC I’ve ever had (better than the various Toshiba, Compaq, Dell and Fujitsu-Siemens units I’ve used – and certainly better than my Apple MacBook) – I’m hoping that Lenovo have continued that quality as they’ve taken on the former IBM PC business (the reviews I’ve read certainly indicate that they have).
  • I want to use this NetBook with Windows 7 – and I know it can work (this is the model that Steven Sinofsky showed in a keynote at Microsoft’s 2008 Professional Developers Conference).
  • I was impressed with Windows 7 running on Paul Foster’s Acer Aspire One, but the keyboard is just too small for my fat fingers.
  • The Lenovo S10 has a PC Express Card slot (so it should work with my Vodafone 3G card – and yes, I know I can get a USB version but I’d need to convince my employers of the need for an upgrade, which would not be an easy sell when they give me a perfectly good laptop with a PC Express Card slot to use…).
  • I also seriously considered the Dell Mini 9 (especially when they mis-priced it on their website for £99 last week – incidentally, the resulting orders were not fulfilled) but I’m not convinced that using a pre-release operating system on a solid state hard drive is really a good idea – I could easily kill the drive within a few months. Meanwhile, the Lenovo has a traditional 160GB hard disk and the 10.2″ screen (rather than 9″) translates into more space for a larger keyboard without noticeably increasing the size of the computer (for those who still want a 9″ model, Lenovo have announced an S9 but I’ve seen no sign of it in the UK yet). Another option that I discounted was the Samsung NC10 – which has a better battery and one more USB port but no PC Express Card slot.
  • The equivalent Asus and Acer models may be less expensive but the big names (IBM, Dell, HP as well as Samsung and Toshiba) are all reducing their prices – and by waiting for the reduction in the UK’s VAT rate to take effect the price was £292.25 for the S10 at eBuyer with free shipping (although I paid another tenner for next-day delivery).

I’m sure my sons will be amused when yet another computer appears on my desk (my wife may be slightly less so…) but I’m thinking of this as an early Christmas present to myself…

Further reading

Here are some of the posts that I found useful before deciding to buy this PC:

Microsoft after hours: the sequel

A little over 18 months ago, I attended an event at Microsoft titled “Vista After Hours”. The idea was that, instead of showing us all the features of the Windows ecosystem that were relevant to daily life as an IT Professional, Microsoft would demonstrate some of the things that can be done in Windows apart from work – demonstrating that the world of Windows is not all about dull, corporate applications.

Earlier this week, I was back for more – as Viral Tarpara, Paul Foster and Jamie Burgess demonstrated some of Microsoft’s products aimed at consumers and hobbyists.

As is likely to become the norm around here for such events (so many blog posts, so little time), I won’t write it up in full but here are some of the highlights:

  • Gears of War 2 – the latest big game for the Xbox 360 and phenomenally successful (but I’m not a games guy).
  • Viral took a look at Windows Live Services – Google, Yahoo! and Microsoft (MSN/Live) are all doing similar things (although each will claim that it has the best new features!) – walking through a few of Microsoft’s offerings:
    • A new look for live.com is on its way to the UK. Personally, I like it – and you can hover the mouse over certain positions on that day’s picture to see links to potentially interesting facts.
    • Windows Live Mail: with a new Outlook-like interface and the ability to connect to multiple mail services (and choose which to send via); add your own stationery (arghh!); and it will soon include photo e-mail capabilities (e.g. select 4 photos, all resized and embedded in the e-mail – rather than as an attachment – then add a frame, make it black and white, make contrast corrections, etc.).
    • Windows Live Photo Gallery provides a gallery view for resizing, viewing/adding metadata, tagging and editing photos (preserving the original) but publishing etc. is where the Live Services come in and pictures may be published to Flickr, Windows Live Spaces, etc. The end result is highly functional software on the desktop PC, plus services in the cloud.
    • Windows Live Writer is Microsoft’s blogging software and it: integrates with various platforms (WordPress, Blogger, etc. – even SharePoint); applies the site’s stylesheet to the posts as you write; allows insertion of pictures, videos (YouTube or Soapbox), etc.
    • Windows Live Maps: whilst many people use Google Maps, Microsoft claim that Live is superior for business requirements (I prefer the Google mapping view) and it now features: a 3D view using an Internet Explorer/Firefox browser plugin (and no more page refreshes – zoom in and out – very impressive, although it’s a lot smoother on Microsoft’s Internet connection than on mine); a bird’s eye view which uses a Photosynth-like effect to select high resolution images; a free API to use and expose in your own applications; and collections of public or private searches (e.g. a walk around 3D Manhattan) using public data linked to the map (e.g. Times Square).
  • Paul demonstrated Photosynth, which works out how pictures relate to one another in three-dimensional space to build up a complete picture. Because synths only show the data that is appropriate at that moment in time it’s possible to jump around and explore the environment at a reasonable speed. Using the example of Stonehenge, even though the photos were all taken at eye level, the synther can work out where the stones stand so that it is possible to view from above (or even below!). More images help it to work out more points of view and speech recognition features such as MouseGrid can be used to navigate and scroll around.
  • Even I (the non-gamer) was impressed by the new Xbox 360 experience that Jamie demonstrated (due for worldwide release today, with a phased deployment to all Internet-connected Xbox 360s):

    • The user interface has been redesigned and blades have been replaced with a dashboard.
    • Music can be streamed from another PC to the Xbox and played over the top of games or anything else; effectively the Xbox becomes the presentation layer in the living room.
    • Avatars are a huge new feature – with more and more options coming online all the time.
    • Games may be stored on the hard drive.
    • Xbox messaging capabilities integrates with non-Xbox users of Windows Live Messenger (e.g. on PCs).
    • The interface is much more graphical/visual than previously and therefore becomes much more immersive.
  • Paul showed how Community Games allow anyone (or at least anyone who can write code) to create and publish their own games to Xbox Live (10 million people), including charging Microsoft Points and sharing the revenue with Microsoft (the approval process does require accurate rating of the game’s suitability). XNA Game Studio is used with the Express Edition development tools and the resulting games will run on Windows, Xbox, or Zune. For more information, check out the XNA UK user group, which aims to provide “a helping hand for bedroom coders throughout the land”.
  • Moving on to home automation systems, Jamie spoke about how he had run co-axial and CAT5 cabling around his parents’ house to stream content from two Sky Plus boxes to almost any room, using IR receivers in the ceiling to control everything from a single remote control. Further information on this type of setup (with Windows Media Center) can be found at The Digital Lifestyle and The Green Button. Much more tangible was Paul’s demonstration of his home automation with everything from recording and playing media content in Windows Media Center to using the mControl digital home software to remotely access CCTV feeds, set the temperature in a room and even water the plants in the garden. Based on a system of scheduling and triggers, Paul demonstrated a HomeEasy system (available from B&Q) with an RF controller and xPL software to control lights (a blog post has been promised…). More home automation products are available from Let’s Automate.
  • Viral took a look at some more of the Windows Live services and admitted that the current version of the Windows Live Homepage is not as engaging as other Web 2.0 technologies (the good thing about Viral is that he may be a ‘softie but he also admits to using alternative solutions “because that’s how real people work”) before commenting that a new version will have tighter integration with various other services (e.g. Flickr, Twitter, etc.).
  • Viral also showed off some of the new features in the latest Windows Live Messenger beta – things like assigning your own entrance sound to play on your friends’ messenger client (uh huh… that will be annoying); what’s new (see what friends are up to – a bit like a Facebook status); activities – games, calendar swap, etc.; and photosharing where you can send a series of thumbnails by messenger and recipient can browse for more detail.
  • Ethernet over powerline is a technology I considered until I replaced my wireless access point with something decent and Jamie briefly mentioned the success he’s had with a NetGear 200Mbps solution in his modern apartment (where the building construction makes Wi-Fi difficult).
  • Jamie then went on to talk about modifying his Mazda MX5 with a 7 inch touchscreen, connected to a mini-ITX PC in the boot, running a Centrafuse front end for GPS (USB-attached), radio, phone via Bluetooth, playlists, music and videos (using a USB Wi-Fi dongle for synchronisation between the car and his home whilst in the garage), OBDII diagnostic data, camera, weather, etc. Apparently, you can even have Live Mesh working on this solution too. It sounds like a neat in-car entertainment solution but it also sounds like the classic case of a rich kid putting more electronics inside his car than the car is worth… but if this sounds like something of interest then check out MP3car.com.
  • So, moving on to Live Mesh, Viral demonstrated it as a combination of social networking and synchronisation: files in Mesh-enabled folders on each connected device are synchronised so that data is accessible wherever needed (with synchronisation policies to control which contacts can see which data). Using the “Synchronising Life” video I embedded in my recent post on Windows Live FolderShare, he spoke of the potential for a Mesh-enabled picture frame and gave a real-world example of how he (in the UK) and his girlfriend (in the USA) share pictures and other information via Live Mesh, as the different timezones and work schedules mean that they may not be online at the same time.
  • Paul spoke of how he has Windows XP Pro trimmed down to 384MB and running on a USB key with a mini-ITX PC. It’s possible to do this using the evaluation tools for Windows XP Embedded/CE to strip down the operating system, although the resulting image does expire. Pico-ITX PCs are even smaller yet still offer USB support, VGA output and SATA II drives. Find out more at mini-itx.com.
  • A Microsoft Surface table is a $10,000 device based on a technology called Frustrated Total Internal Reflection (FTIR). Paul demonstrated building a DIY multitouch device using nothing more than a cardboard box, a webcam, a sheet of perspex and a sheet of paper, together with software from the Natural User Interface group. Basically, he fed the webcam through a hole in the bottom of the box (camera facing up) and used the perspex as the surface (with paper on top to block out ambient light). The NUI software will handle the view, inverting the image, removing the background, etc. but some additional coding is required in order to build multi-touch applications. I have to say that it was pretty amazing!
  • Next up – robotics. Those who were at the Windows Server 2008 launch in Birmingham earlier this year may remember Paul’s A1-DW robot (A1 = top stuff, DW = a bit of a dimwit – he needs to be told what to do) but Paul showed a video of the robot working its way around his house. A1-DW is controlled with software developed using the Microsoft Robotics Developer Studio (MRDS), which is free for non-commercial use and provides a combination of a visual programming language and physics-based simulation. In Paul’s demonstration he used a simple programme to join the SetDrivePower control on a GenericDifferentialDrive to the TriggersChanged event from an XInputController (a wireless Xbox controller) and drove it around the room – the idea being that services scattered across a home network (one big grid computer) can be used to control a less powerful robot.
  • The next demonstration was of Windows Home Server, showing how this product has a very simple user interface, designed to make it easy for consumers to set up a server in their home and manage users, shared folders, storage and websites (e.g. for sharing a photo album with friends and family). Plugins are available (e.g. mControl for home server) whilst the network status is indicated with a simple red, amber, green system which advises of any action to be taken (e.g. update anti-virus definitions, perform a backup). There is also a simple interface for setting up backups, password policies, remote access (reverse DNS is established via the Windows Live ID authentication process – upon sign in, the IP address of the server is recorded in the homeserver.com DNS zone), port forwarding (via uPnP), etc. Windows Home Server is available to system builders as an OEM product, or a fully-configured system costs around £500 (e.g. the HP EX400 MediaSmart server at £499). For more information on Windows Home Server and the digital home, see We Got Served.
  • Looking at some of the developments in Microsoft hardware, Viral demonstrated: Microsoft’s new mice with a blue LED light which can track smoothly regardless of the surface; new LifeCam devices with HD picture quality and messenger integration; and an Arc Mouse, where the end folds in for travel without the usual restrictions of a mobile mouse (i.e. its small size).
  • Finally, Paul showed off Windows 7 Ultimate Edition running on a netbook. The model he used was an Acer Aspire One with a 1.6GHz Intel Atom CPU, 1GB RAM and a 120GB hard drive (not SSD) and I was very impressed at the performance and the graphics (e.g. very smooth Flip-3D effects). For those who were confused by the apparent doublespeak in my recent post about installing Windows 7 on an old PC, it’s worth considering that this machine cost him £228 including shipping (for a Linux version) and has a Windows Experience Index of 2.3 (2.9 for the CPU, 3.3 for RAM, 2.3 for graphics, 3.0 for gaming graphics and 5.0 for disk). Having seen this, I’m almost certainly going to be buying a Dell Inspiron Mini 9 for Windows 7.

For someone who mostly concentrates on Microsoft’s business-focused products, it was interesting to spend an evening on the consumer side of the fence. In summary: an evening of geeky goodness.

Yes, you can use all the processing power on a multi-core system

I’ve heard a few comments recently about it not being worth buying multi-core processors because it’s impossible to harness all of the processing power and I have to say that is a pile of stuff and nonsense (putting it politely).

Well, it is nonsense if the operating system can recognise multiple processors (and Windows NT derivatives have had multi-processor support for as long as I can remember) but it also has a lot to do with the software in use. If everything is single-threaded (it shouldn’t be these days), then the operating system scheduler can’t spread the threads out and make the most of its processing capabilities.
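As an illustration (not from the original post), here’s a minimal sketch of spreading a CPU-bound job across all cores so the scheduler can place one worker per core – in Python that means processes rather than threads, because of the global interpreter lock. The prime-counting workload and chunk sizes are arbitrary stand-ins for any divisible task:

```python
# Split a CPU-bound task into one chunk per core and run the chunks in
# parallel worker processes; the OS scheduler spreads them across cores.
from concurrent.futures import ProcessPoolExecutor
import os

def count_primes(limit: int) -> int:
    """Naive prime count below `limit` - deliberately CPU-heavy."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # One chunk per logical CPU; watch Task Manager peg every core.
    chunks = [20_000] * (os.cpu_count() or 2)
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(count_primes, chunks))
    print(sum(results))
```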

Anyway, I’ve been maxing out a 2.2GHz Core2Duo-based notebook PC for the last couple of days with no difficulties whatsoever. My basic workload is Outlook 2007, Office Communicator 2007, Internet Explorer (probably a few windows, each with a couple of dozen tabs open) and the usual bunch of processes running in the background (anti-virus, automatic updates, etc.). Yesterday, I added three virtual machines to that mix, running on a USB2-attached hard drive (which, unlike a Firewire drive, also requires a big chunk of processor time) as well as TechSmith SnagIt, as I was testing and documenting a design that I was working on – and that did slow my system down a little (the first time there has been any significant paging on this system, which runs 64-bit Windows Server 2008 and has 4GB of RAM).

Then, today, I was compressing video using Camtasia Studio 5 (another TechSmith product) and, despite having closed all other running applications besides a couple of Explorer windows, it was certainly making full use of my system as the screenshots below show. Watch the CPU utilisation as I start to render my final video output:

Windows Task Manager showing increased CPU utilisation as video rendering commences

during rendering:

Windows Task Manager showing CPU utilisation during rendering

and after the task was completed, when CPU activity dropped to a more normal level:

Windows Task Manager showing CPU utilisation dropping back to normal after rendering completed

Of course, a lot of this would have been offloaded to the GPU if I had a decent graphics card (this PC has an Intel GMA965 controller onboard) but I think this proves that multiple processor cores can be fully utilised without too much effort…

Using packet level drivers for MS-DOS network connectivity

One of the reasons to use Windows PE for operating system deployment is that it’s built on a modern version of Windows so, at least in theory, driver support is less of an issue than it would be using MS-DOS boot disks.

Even so, there are still times when a good old MS-DOS boot disk comes in handy and networking is a particular pain point – NDIS drivers are a pain to configure so packet-level drivers are often useful for operating system deployment tasks (but not optimised for anything more substantial). Available for many of the more common Ethernet cards, they are generally 16-bit utilities for MS-DOS and so will not work in 32-bit or 64-bit operating systems.

As this is not exactly cutting edge technology, many of the useful sites are starting to drop off the ‘net (but a surprising number remain) – here’s a few links I found that might come in handy:

I have a new printer to install – but where has the OS X Printer Setup Utility gone?

After many years of faithful service, my HP LaserJet 2200dn has started printing black lines and ghosting all over the page. Because most of my printing is for work, I asked the company to finance the repairs (or to provide a replacement) and, because they are so serious about green IT (erhum…), rather than use their engineering resource to work out what was wrong and buy the appropriate consumables, they have given me a new printer (an HP Officejet 6310 All-in-One, which seems to be a nice device but it is an inkjet – so expensive to run – and an unnecessary waste of resources as the old printer could have been fixed).

Predictably, I’m having problems installing the software on 64-bit Windows Server 2008 but I’m sure I’ll get there if I do some research (which I won’t at 10pm on a Sunday), but the XP installation on another PC was straightforward (if bloated and time consuming) and the Mac installation seems to have gone reasonably well too (using Bonjour to track down the device on the network). The only catch on the Mac seems to be that the software is written for Mac OS X up to 10.4 and I’m running 10.5.3. This means that some of the hooks in the installer didn’t work – like when it was looking for the printer setup utility and it seems that utility does not exist in Leopard. Luckily, the Leopard’s lost features blog pointed me in the right direction:

“Tiger’s ‘Printer Setup Utility’ has been removed, and all printer configuration is now done and managed exclusively through the Print & Fax system preference pane.”

High volume, low cost, portable hard disk

When I bought my MacBook, I immediately upgraded the hard disk to a 320GB model (I generally avoid Western Digital, but I decided to risk it this time on the basis that as long as the data is backed up then everything should be OK).

Ever since then, I’ve been looking for a suitable USB-powered hard disk to back the MacBook up. I wanted a good-looking portable unit but upgrading the disk to match (or exceed) the internal disk was going to be problematic from a power and cooling perspective. Then I walked into PC World yesterday and saw a 320GB Western Digital My Passport Essential hard disk for £99.99. Unfortunately they only had the 320GB size in Midnight Black (my MacBook is white), so I paid a little bit more for an Arctic White one from dabs.com.

Even though the drive supports Windows and Macintosh computers (and, although it doesn’t say so, it should work with any other PC operating system as long as it can load the appropriate USB drivers), the supplied software is only for Windows. I moved the software to another disk and connected the drive to my Mac, where I reformatted it using HFS+ and a GUID partition table (the drive was supplied as FAT32 – which is great for device portability but does have some limitations on file size – and with a master boot record (MBR)). As it happens, that step was not necessary because my chosen backup software erased the disk.

After running Carbon Copy Cloner, my Mac hard disk contents were duplicated onto the external disk and I could breathe a sigh of relief, safe in the knowledge that when (not if) the internal hard disk fails at least I have a copy to work from.

There’s just one point to note about the cloning process… on my 2.2GHz MacBook with 4GB of RAM, the cloning operation started out by taking around 4 minutes per GB. With just short of 300GB to transfer that’s 20 hours, so I didn’t pay too much attention to the progress bar (which indicated that the clone was about 25% complete after about 12 minutes) – it just happens that the operating system and applications (at the front of the disk) have lots of small files whereas my data (written later) includes a lot of large media files. Even as the progress bar slowed to a crawl, the file transfer rate seemed to improve and the operation finally completed in about 6 hours and 40 minutes. Subsequent backups should be faster as they will be incremental.
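For the curious, the back-of-envelope numbers above work out like this (all figures are the ones quoted in this post – nothing new is measured here):

```python
# Sanity-check the cloning estimates quoted in the post.
def hours_to_clone(gigabytes: float, minutes_per_gb: float) -> float:
    """Estimated wall-clock hours to copy the given volume of data."""
    return gigabytes * minutes_per_gb / 60

# At the early rate of ~4 min/GB, ~300GB projects to a 20-hour job...
print(hours_to_clone(300, 4))   # 20.0

# ...but the clone actually finished in 6h40m (400 minutes), so the
# average rate over the whole disk was only ~1.33 min/GB.
print(round(400 / 300, 2))      # 1.33
```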

The no-compromise ultraportable?

As the day job has been taking over my life (reducing my time for blogging), I thought I’d finish up the week with some light-hearted humour. I’ve commented before that I think Apple’s MacBook Air ultraportable PC is overpriced and underspecced. And whilst it may be selling to the Apple fanboys and those execs with more money than sense, it’s not really much use for people who genuinely need a light PC to travel with for their work (in my opinion, as someone who travels a lot and uses standard notebook PCs – although, sadly, my employer won’t give me a ThinkPad either). Not wanting to start up the Mac vs. PC rubbish (I’ve been there before), I thought I’d post Lenovo’s view on what an ultraportable PC should be like:

This video has been floating around the web for a few days now, and some of the responses I’ve seen have been along the lines of “Yeah, but the MacBook Air does everything I need without needing to plug anything in”. Right. Of course it does. Well, if the MacBook Air is good for you, then all I have to say is “good for you”. Personally, I’ll take the ThinkPad. And if Vista is too much of a compromise (I don’t think it is) then I’ll take a normal Apple MacBook (mine is running OS X and Vista).

Comparing internal and USB-attached hard disk performance in a notebook PC

Recently, I was in a meeting with a potential business partner and their software was performing more slowly than they had expected in the virtual environment on my notebook PC. The application was using a SQL Server 2005 Express Edition database and SQL Server is not normally a good candidate for virtualisation but I was prepared to accept the performance hit as I do not want any traces of the product to remain on my PC once the evaluation is over.

Basic inspection using Task Manager showed that neither the virtual nor the physical system was stressed from a memory or CPU perspective but the disk access light was on continuously, suggesting that the application was IO-bound (as might be expected with a database-driven application). As I was also running low on physical disk space, I considered whether moving the VM to an external disk would improve performance.

On the face of it, spreading IO across disk spindles should improve performance but with SATA hard disk interfaces providing a theoretical data transfer rate of 1.5-3.0Gbps and USB 2.0 support at 480Mbps, my external (USB-attached) drive is, on paper at least, likely to result in reduced IO when compared with the internal disk. That’s not the whole story though – once you factor in the consideration that standard notebook hard drives are slow (4200 or 5400RPM), this becomes less of a concern as the theoretical throughput of the disk controller suddenly looks far less attainable (my primary hard drive maxes out at 600Mbps). Then consider that actual hard disk performance under Windows is determined not only by the speed of the drive but also by factors such as the motherboard chipset, UDMA/PIO mode, RAID configuration, CPU speed, RAM size and even the quality of the drivers and it’s far from straightforward.
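A rough sketch of the unit conversions involved (dividing by 8 to go from bits to bytes; overheads such as SATA’s 8b/10b line encoding are ignored, so these are optimistic ceilings rather than achievable figures):

```python
# Convert the quoted interface speeds (megabits per second) into
# megabytes per second for easier comparison. Encoding/protocol
# overhead is ignored, so these are theoretical upper bounds.
def mbps_to_MBps(megabits: float) -> float:
    """Megabits per second -> megabytes per second (no overhead)."""
    return megabits / 8

print(mbps_to_MBps(480))    # 60.0  - USB 2.0 ceiling
print(mbps_to_MBps(1500))   # 187.5 - SATA 1.5Gbps ceiling
print(mbps_to_MBps(600))    # 75.0  - the drive's own quoted maximum
```

Either way, a 5400RPM notebook drive struggles to saturate even the USB 2.0 figure, which is why the interface difference matters less than it first appears.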

I decided to take a deeper look into this. I should caveat this with a note that performance testing is not my forte but I armed myself with a couple of utilities that are free for non-commercial use – Disk Thruput Tester (DiskTT.exe) and HD Tune.

Both disks were attached to the same PC, a Fujitsu-Siemens S7210 with a 2.2GHz Intel Mobile Core 2 Duo (Merom) CPU, 4GB RAM and two 2.5″ SATA hard disks but the internal disk was a Western Digital Scorpio WD1200BEVS-22USTO whilst the external was a Fujitsu MHY2120BH in a Freecom ToughDrive enclosure.

My (admittedly basic) testing revealed that although the USB device was a little slower on sequential reads, and quite a bit slower on sequential writes, the random access figure was very similar:

|                   | Internal (SATA) disk | External (USB) disk |
|-------------------|----------------------|---------------------|
| Sequential writes | 25.1MBps             | 22.1MBps            |
| Sequential reads  | 607.7MBps            | 570.8MBps           |
| Random access     | 729.3MBps            | 721.6MBps           |

Testing was performed using a 1024MB file, in 1024 chunks and the cache was flushed after writing. No work was performed on the PC during testing (background processes only). Subsequent re-runs produced similar test results.

Disk throughput test results for internal disk
Disk throughput test results for external (USB-attached) disk

Something doesn’t quite stack up here though. My drive is supposed to max out at 600Mbps (not MBps) so I put the strange results down to running a 32-bit application on 64-bit Windows and ran a different test using HD Tune. This gave some interesting results too:

|                       | Internal (SATA) disk | External (USB) disk |
|-----------------------|----------------------|---------------------|
| Minimum transfer rate | 19.5MBps             | 18.1MBps            |
| Maximum transfer rate | 52.3MBps             | 30.6MBps            |
| Average transfer rate | 40.3MBps             | 27.6MBps            |
| Access time           | 17.0ms               | 17.7ms              |
| Burst rate            | 58.9MBps             | 24.5MBps            |
| CPU utilisation       | 13.2%                | 14.3%               |

Based on these figures, the USB-attached disk is slower than the internal disk but what I found interesting was the graph that HD Tune produced – the USB-attached disk was producing more-or-less consistent results across the whole drive whereas the internal disk tailed off considerably through the test.

Disk performance test results for internal disk
Disk performance test results for external (USB-attached) disk

There’s a huge difference between benchmark testing and practical use though – I needed to know if the USB disk was still slower than the internal one when it ran with a real workload. I don’t have any sophisticated load testing tools (or experience) so I decided to use the reliability and performance (performance monitor) capabilities in Windows Server 2008 to measure the performance of two identical virtual machines, each running on a different disk.

Brent Ozar has written a good article on using perfmon for SQL performance testing and, whilst my application is running on SQL Server (so the article may help me find bottlenecks if I’m still having issues later), by now I was more interested in the effect of moving the virtual machine between disks. It did suggest some useful counters to use though:

  • Memory – Available MBytes
  • Paging File – % Usage
  • Physical Disk – % Disk Time
  • Physical Disk – Avg. Disk Queue Length
  • Physical Disk – Avg. Disk sec/Read
  • Physical Disk – Avg. Disk sec/Write
  • Physical Disk – Disk Reads/sec
  • Physical Disk – Disk Writes/sec
  • Processor – % Processor Time
  • System – Processor Queue Length
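For repeatability, the same counter set can be captured from the command line with typeperf (the command-line counterpart to Performance Monitor). The sketch below just builds the command rather than running it, so it can be reviewed first; the output path is a hypothetical example and should point at a third disk, as noted below:

```python
# Perfmon counter paths matching the list above (instance names assumed;
# (*) samples all physical disks, _Total aggregates processors).
COUNTERS = [
    r"\Memory\Available MBytes",
    r"\Paging File(_Total)\% Usage",
    r"\PhysicalDisk(*)\% Disk Time",
    r"\PhysicalDisk(*)\Avg. Disk Queue Length",
    r"\PhysicalDisk(*)\Avg. Disk sec/Read",
    r"\PhysicalDisk(*)\Avg. Disk sec/Write",
    r"\PhysicalDisk(*)\Disk Reads/sec",
    r"\PhysicalDisk(*)\Disk Writes/sec",
    r"\Processor(_Total)\% Processor Time",
    r"\System\Processor Queue Length",
]

def typeperf_command(counters, interval_s=5, output=r"E:\perflogs\vmtest.csv"):
    """Return the argv list for typeperf; on Windows, pass to subprocess.run."""
    return ["typeperf", *counters, "-si", str(interval_s), "-o", output]

cmd = typeperf_command(COUNTERS)
print(" ".join(cmd))
```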

I set this up to monitor both my internal and external disks, and to log to a third external disk so as to minimise the impact of the logging on the test.

Starting from the same snapshot, I ran the VM on the external disk and monitored the performance as I started the VM, waited for the Windows Vista Welcome screen and then shut it down again. I then repeated the test with another copy of the same VM, from the same snapshot, but running on the internal disk.

Sadly, when I opened the performance monitor file that the data collector had created, the disk counters had not been recorded. I did notice, however, that the test had run for 4 minutes and 44 seconds on the internal disk and only 3 minutes and 58 seconds on the external one, suggesting that the external disk was actually faster in practice.
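Putting those two timings side by side makes the difference easier to judge:

```python
# Wall-clock times for the same VM start/stop cycle on each disk.
internal_s = 4 * 60 + 44   # 284s on the internal SATA disk
external_s = 3 * 60 + 58   # 238s on the external USB disk

saving_s = internal_s - external_s
print(f"External disk was {saving_s}s faster "
      f"({saving_s / internal_s:.0%} quicker overall)")
```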

I’ll admit that this testing is hardly scientific – I did say that performance testing is not my forte. Ideally I’d research this further and I’ve already spent more time on this than I intended to but, on the face of it, using the slower USB-attached hard disk still seems to improve VM performance because the disk is dedicated to that VM and not being shared with the operating system.

I’d be interested to hear other people’s comments and experience in this area.

So much for Apple’s legendary build quality

Readers of this blog may recall that I bitched about the time it took for Apple to deliver my new MacBook recently. It was ordered on 5 February, finally arrived on 14 February – and broke on 31 March. What did I do to break it? I rested my hands on the palm rest. Is that a user error?

Seriously, I was in the pub last night with Alex and Simon (from ascomi, who are helping me work on a new version of this site) and there was a fair degree of Mac vs. PC banter going on when all of a sudden there was a crack under my right palm and I saw that part of the top cover/keyboard assembly was split at the edge. I had only had the computer in my possession for 6 weeks and have really looked after it – to say that I was not happy is a bit of an understatement. So much for Apple’s legendary build quality.

Split top cover on nearly-new MacBook after 6 weeks of light (and careful) use

As it happens, some people regard the MacBook as the ugly stepchild of the Apple family. I disagree (which is why I bought one), but I do think that it is a little pricey and for that premium pricing I expect premium build quality. It may not be as bad as the last Dell notebook I used but it is nowhere near as good as my IBM ThinkPad T40 – and I have never had a case crack through normal use (drops and inadequate protection in transit maybe).

It seems that the MacBook case crack is a common defect. Apple refuses to acknowledge it as a design fault (it seems to occur next to the small bevel that keeps the screen and keyboard apart when the MacBook is closed, suggesting that this may place undue pressure on that part of the top case), but Brian Ford wrote about the same problem four days ago and, although getting picked up by John Gruber (Daring Fireball) will have helped, his post had 144 comments by last night. On that basis, this does not appear to be an isolated issue.

Furthermore, the problem has been around for a while now and whilst some reports suggest that Apple has changed the affected component and it does not occur on new models, I see no evidence of that as my computer is less than two months old – I call that pretty new.

I phoned AppleCare as soon as they were open this morning and spoke to a really helpful guy. He asked me if I had taken out AppleCare protection (no, but I have a warranty) and then proceeded to make an appointment with an Apple “Genius” at the Apple Store (I don’t know what’s worse – Apple’s idea that their tech support guys are all geniuses or Microsoft’s idea that there are IT departments full of heroes all across the world). When there were no slots available, I asked which store he had tried and he said “Oh, most people ask for Regent Street in London”. I said “I’d like an appointment at my local store please” and suddenly there were lots of slots free and I just needed to pick my time!

So, I set off to the Milton Keynes Apple Store, arrived a couple of minutes early, booked in, and saw my name top of the Mac queue at the Genius Bar. Then I waited, and waited, and pestered the sales staff until a (very helpful) genius called Simon came over to help. It seems that the iPod queue and the Mac queue are actually one, and that there was only one genius, who was very very busy with a lot of people to see this morning, meanwhile the shop was littered with trainers and sales staff apparently doing very little.

Thankfully, Simon the genius noted that my MacBook was in “mint” condition (although the Genius Bar Work Authorisation will only allow it to be recorded as “As New”) and there was no argument that it had been mistreated in any way. Apple will be replacing the top cover/keyboard assembly and say that it will take 5 to 7 days but why so long? It should be a 1 hour job (maximum), plus the time to obtain parts and schedule the work – so, 2 to 3 days would be more reasonable. Doubtless I will hear from support technicians who say “you try our job for a day – we work really hard” (to which I say “I’ve been there – and so do lots of people”). In the meantime, I’ll be without my MacBook for a week.

I’ve posted my picture of the issue to the Flickr group that has been set up to highlight this issue. In the meantime, if you are having similar problems, I urge you to do the same and to leave a comment on Brian Ford’s Newsvine article so that he can build enough evidence to (hopefully) get Apple to actually do something about this issue.