A quick look at Windows ReadyBoost

This content is 16 years old. I don't routinely update old blog posts as they are only intended to represent a view at a particular point in time. Please be warned that the information here may be out of date.

My netbook only came with 1GB of RAM, so I decided to see what effect the “Speed up my system using Windows ReadyBoost” option would make (presented by Windows Vista and later when removable media is inserted – more details can be found over on the Kodyaz Development Resources site).

First of all I tried a 1GB USB key that I’d been given with some presentation materials on it but Windows told me the device was not fast enough to use for ReadyBoost.

That was something of a surprise to me – I knew that not all devices were suitable for ReadyBoost but how could I tell why my device was failing? In his article, Is your flash drive fast enough for ReadyBoost?, Ed Bott explains that:

“If you get a failure message when you first insert a flash device and try to use it as a ReadyBoost drive, you can click Test Again to get a second hearing. If the drive fails several tests, you can look up the specific performance results for yourself. Open Event Viewer (Eventvwr.msc) and click the Applications And Services Logs category in the console tree on the left. Under this heading, click Microsoft, Windows, and ReadyBoost. Under this latter heading, select Operational. The log entries in the center pane include performance test results for both successful and unsuccessful attempts.”

Sure enough, checking the logs on my Windows 7 system showed messages like:

Source: ReadyBoost
EventID: 1008
Description: The device (UT163 USB Flash Disk) will not be used for a ReadyBoost cache because it does not exhibit uniform performance across the device.  Size of fast region: 0 MB.

and:

Source: ReadyBoost
EventID: 1004
Description: The device (UT163 USB Flash Disk) will not be used for a ReadyBoost cache because it has insufficient write performance: 173 KB/sec.

173KB per second is about 10% of the speed that ReadyBoost requires, so I tried again, this time using a 1GB SD card.

First I saw an event to indicate that the card exhibited the necessary performance characteristics:

Source: ReadyBoost
EventID: 1000
Description: The device (Generic- Multi-Card) is suitable for a ReadyBoost cache.  The recommended cache size is 991232 KB.  The random read speed is 3311 KB/sec.  The random write speed is 3500 KB/sec.

and then a second event recording the creation of the cache:

Source: ReadyBoost
EventID: 1010
Description: A ReadyBoost cache was successfully created on the device (Generic- Multi-Card) of size 966 MB.
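Incidentally, the same log can be read from the command line with wevtutil rather than clicking through Event Viewer. A quick sketch (I’m assuming the channel name mirrors the Event Viewer path – confirm it on your own system first):

```shell
rem Show the ten most recent ReadyBoost events, newest first, as plain text
wevtutil qe Microsoft-Windows-ReadyBoost/Operational /c:10 /rd:true /f:text
```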

So, after creating the cache, did ReadyBoost actually make a difference? It’s difficult to say. On a relatively low-powered PC (the one I used only has a 1.6GHz Intel Atom) performance is not blindingly fast and, because the USB ports (including internal ones used for devices like media card readers) rely on the main CPU for I/O processing, it could be argued that USB-attached memory would even compound the issue when the PC is running out of steam. Those with faster PCs, or faster memory devices, may see a difference.

Long Zheng has a good summary in his article which puts forward the notion that ReadyBoost works but that it’s not a miracle:

“I don’t agree with […] how ReadyBoost has been marketed and perceived by the public. ReadyBoost does not improve performance, it only improves responsiveness. It won’t make your system or [applications] run any faster, but it will make things faster to load and initialize to a working-state.

If you’re on a budget, then ReadyBoost is premium accessory that is definitely not value-for-money. You’re literally paying a price to slice milliseconds off loading times. But if you’re a professional or heavy business user, then ReadyBoost might be a cheaper, easier or the only alternative to upgrading memory.”

Long suggests that ReadyBoost is not value for money. I’d add that it may be if, like me, you have a lot of small USB keys doing nothing more than gathering dust on a shelf. It’s probably not worth investing in new hardware especially to use ReadyBoost though. Indeed, one of Long’s readers (Tomer Chachamu) makes a distinction which is extremely important to consider:

“I am using [ReadyBoost] for several weeks now and I can confirm your experiences, that it helps a lot to improve the responsivness [sic.] of the system.

So it helps to make the whole system perform faster. So isn’t it the same?

High responsiveness: the system ‘feels fast’ and you don’t have to wait for something to load when you’re about to go to a command. (Example of high responsiveness: when you logon, you immediately want to go to the start menu and launch something. The time from logon to launch is a busy wait for you.) – this is affected by readyboost [sic].

High speed: the system performs computational (or I/O) tasks fast. (Example: you are ripping a massive library of CDs. It takes about 10 minutes. If it took less time, say by offloading floating point calculations to the GPU, then that would be high speed. It’s still longer than half a minute so the system is fast, but not responsive. When you’re encoding the CDs, you can do other useful activities, so it’s a non-busy wait.) – this is not affected by readyboost [sic].”

ReadyBoost is not about high speed – it’s about responsiveness (which explains why PC World were unimpressed when they tested some ReadyBoost-capable USB flash drives on Windows Vista).

In the end, I decided to buy some more RAM but, for those considering using ReadyBoost, it’s worth checking out Tom Archer’s ReadyBoost FAQ.

Configuring Windows Mail for Tiscali’s IMAP servers

Last weekend, I set up my father-in-law’s new PC and needed to get Windows Mail (the built-in client in Windows Vista that replaces Outlook Express) working with his ISP’s mail server.  The ISP in question is Tiscali, but I still wanted the messages to be available on the server for webmail access, so I wanted to use IMAP and not POP to collect e-mail.

Tiscali’s instructions seem to be for Outlook Express and POP3 (at least the ones I found were) but I decided to see if they offered an IMAP service and it seems they do – all I needed to provide to Windows was some basic account information (name, username and password), the incoming server name (imap.tiscali.co.uk) and outgoing server (smtp.tiscali.co.uk).  At first sight, some of the mail folders were missing but they were easily made visible by selecting IMAP Folders… from the Tools menu.

Show/hide IMAP folders in Windows Mail

Finally, to tidy up the experience, I remapped the special folders in the account properties to use Trash/Spam/Sent instead of the Microsoft defaults of Deleted Items/Junk E-mail/Sent Items, and hid the unused ones from view.

Remapping IMAP folders in Windows Mail

Managing stored credentials from the Windows command prompt using cmdkey

I’ve been meaning to blog about a command which is a reasonably recent addition to Windows for a few weeks now – cmdkey.exe (thanks to John Craddock for highlighting this at a recent XTSeminars event).

Microsoft’s cmdkey, introduced with Windows Server 2003 (and not to be confused with Jason Hood’s companion for cmd.exe), is used to create, list and delete stored security credentials.

For example, I back up the notebook PC that I use for work to my Netgear ReadyNAS using SyncToy. My ReadyNAS does not support Active Directory but it does provide SMB/CIFS access. This means that I can authenticate directly against a share but the username and password do not match the cached domain credentials on the notebook PC.

Supplying credentials each time I need to connect (or forgetting to before attempting a sync) is inconvenient, so I used cmdkey to store the username and password that I use to connect to the share:

cmdkey /add:computername /user:username /pass:password

In this case cmdkey responded as follows:

CMDKEY: Credential added successfully.

Typing:

cmdkey /list

returns:

Currently stored credentials:

Target: computername
Type: Domain Password
User: username

and I can connect to a share without supplying any credentials:

net use h: \\computername\sharename

The command completed successfully.

Furthermore this drive mapping (and stored credentials) persists on reboot – when the computer is restarted, H: is visible as a disconnected drive in Windows Explorer but as soon as I double-click it I connect without a prompt to supply credentials.
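If the stored credential later needs checking or removing, cmdkey handles that too – a quick sketch (computername is a placeholder, as above):

```shell
rem Show any credential stored for this target
cmdkey /list:computername

rem Remove the stored credential when it is no longer needed
cmdkey /delete:computername
```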

Camera raw support in Windows Vista and later

Most of my digital photography workflow takes place on a Mac, where I use Adobe Camera Raw and Bridge/Photoshop CS3 to handle camera raw images.  With my recent purchase of a netbook (which is small enough and light enough to take out with me on a shoot – and less expensive than a dedicated storage device like an Epson P-7000), it would be useful to view the images in Windows but the Microsoft Raw Image Thumbnailer and Viewer for Windows XP has not been updated since 2005 and is not compatible with Windows Vista or later.

I did wonder if the technology had been absorbed into Windows Explorer and it seems it has… I found a forum post that suggests using Windows Photo Gallery and then installing some codecs (this post has more information on raw support in Windows Vista) but it turns out that the camera raw codecs are also available for direct download (i.e. with no need for Windows Photo Gallery) and after installation the raw file contents are available in thumbnails, previews and applications.

Unfortunately the major manufacturers (Canon and Nikon) do not produce codecs for 64-bit Windows (i.e. for people running high-end workstations with lots of memory for editing large images…) but the 32-bit codecs are fine for my little netbook with 2.5GB of RAM and there is 64-bit support for Adobe digital negatives (.DNG).

During installation, the Canon codecs complained that the screen resolution was not high enough on the netbook (1024×576) and refused to install but that was easily overcome by connecting to an external monitor with a higher resolution (no such issue with the Nikon codecs).

Incidentally, whilst I was researching this blog post I found that Microsoft also has an interesting program called Pro Photo Tools, which includes the ability to geotag photos, edit metadata, convert between raw formats, TIFF, JPEG and HD Photo; and work with Sidecar (.XMP) files (for interoperability with Adobe products – i.e. Bridge).  It too relies on the installation of the relevant raw codecs but should fit in quite nicely for some basic metadata tagging on the netbook whilst still in the field before transferring the images to the MacBook for any final tweaks when I get home.
Nikon raw image viewed in Microsoft Pro Photo Tools

Securely wiping hard disks using Windows

My blog posts might be a bit sporadic over the next couple of weeks – I’m trying to squeeze the proverbial quart into a pint pot (in terms of my available time) and am cramming like crazy to get ready for my MCSE to MCITP upgrade exams.

I’m combining this Windows Server 2008 exam cramming with a review of John Savill’s Complete Guide to Windows Server 2008 and I hope to publish my review of that book soon afterwards.

One of the tips I picked up from the book this morning, as I tried to learn as much as I could about BitLocker drive encryption in an hour, was John’s tip for securely wiping hard drives using a couple of Windows commands:

format driveletter: /fs:ntfs /x

will force a dismount if required and reformat the drive, using NTFS.

cipher /w:driveletter:

will remove all data from the unused disk space on the chosen drive.
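Putting the two commands together, a minimal sketch for wiping a data drive (X: is a placeholder – be certain you have the right drive letter, as the format is destructive):

```shell
rem Force a dismount if required and reformat the volume as NTFS
format X: /fs:ntfs /x

rem Overwrite the now-free space (cipher makes three passes: zeroes, ones, then random data)
cipher /w:X:
```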

I don’t know how this compares with third party products that might be used for this function but I certainly thought it was a useful thing to know. This is not new to Windows Server 2008 either – it’s certainly available as far back as Windows XP and possibly further.

For more tips like this, check out the NTFAQ or John’s site at Savilltech.com.

Windows Vista and Server 2008 SP2 beta is opened up to the public, target release date announced

After the storm of announcements from Microsoft at PDC, WinHEC and TechEd EMEA it’s been a quiet few weeks but, for those who haven’t seen, Microsoft announced that the Windows Vista and Server 2008 Service Pack 2 beta will be opened up to a wider audience, starting with TechNet and MSDN subscribers at 14:00 tomorrow (I guess that’s Redmond time, so 22:00 here in the UK) and then via a broader customer preview programme (CPP) on Thursday (4 December).

This release is intended for technology enthusiasts, developers, and administrators who would like to test SP2 in their environments and with their applications prior to final release and, for most customers, Microsoft’s advice is to wait until the final release prior to installing this update.

Full details of the changes in the SP2 beta may be found in Microsoft’s Windows Server TechCenter.

Microsoft also announced the date that they are aiming for (not a firm commitment) – SP2 should be expected in the first half of 2009.

Installing Windows from a USB drive

Last week I downloaded the milestone 3 build of Windows 7 and installed it in a virtual machine. Then I heard how Windows 7 has been tuned (compared with Vista) to run on lower-specification hardware so I decided to install it on my aging Compaq D510SFF, which is not going to give me blinding performance (particularly for graphics) but does at least have a 2.4GHz Pentium 4 CPU, 2GB of RAM and a 320GB hard disk so it shouldn’t be too bad either.

I downloaded the 32-bit version (previously I’d used 64-bit), burned a DVD, popped it in the drive and booted:

  • Problem #1 – this PC has a CD-R drive and I have a DVD ISO.

The only DVD drives I had available were in my server (which I don’t want to take down right now) and in my work laptop (a slimline drive – with a strange connector on the back) so I went shopping for hardware:

  • Problem #2 – my local branch of Maplin had sold out of DVD drives and PC World didn’t have any brown box ones (just the overpriced ones in a pretty box).
  • (Problem #2a – markwilson.it has been spending too much on hardware recently and the bank balance is not looking too good. Spending money on components for an aging PC does not make too much sense.)

Back to the drawing board. I could PXE boot to a Windows Deployment Services server but I didn’t really want to go to the effort of setting all that up so, after checking I hadn’t missed anything obvious with my trusted colleagues Dave and Garry, I turned my attentions to USB booting the PC.

  • Problem #3 – the largest USB drive I have is 1GB – and a DVD .ISO is much bigger than that.

I decided to see if I could use a USB hard disk and it turns out I can – this is how it works. The advice is based on Vista but it works for later releases of Windows too:

  1. Make some space on a hard disk for a new partition. I shrank the existing volume in Disk Management to give me 32GB of free space but I could have just wiped the drive too.
  2. Dive into the command line and fire up diskpart.exe, issuing the following commands:
    • list disk (to see the available disks and see which one I had just created 32GB of free space on)
    • select disk number
    • clean (skip this if you do not want to wipe the disk clean – i.e. if you want to keep data on other partitions)
    • create partition primary
    • select partition number
    • active
    • format fs=fat32 (I later read that NTFS would work too but FAT32 worked for me on a relatively small partition like this)
    • assign
    • exit
  3. Copy the contents of the Windows installation DVD to the new partition with xcopy dvddrive:\*.* /s /e /f harddrive:\
  4. According to the blog post from Kurt Shintaku that I used for reference, that should be enough but that doesn’t actually create a boot sector. Dave Glover’s post on the subject alerted me to the presence of the bootsect.exe utility from the \boot folder on the installation DVD and bootsect /nt60 harddrive: successfully updated the bootcode on my USB hard drive.
  5. Boot the PC from USB and install Windows.
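The diskpart steps above can also be driven from a script file, which is less error-prone if you need to repeat them. A sketch, assuming the USB disk appeared as disk 1 in list disk (check on your own system before running – clean wipes the selected disk):

```shell
rem usb-prep.txt – feed this file to diskpart with: diskpart /s usb-prep.txt
select disk 1
clean
create partition primary
select partition 1
active
format fs=fat32 quick
assign
exit
```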

And so does Windows 7 run well on that old PC? I wish I could tell you but, unlike everyone who got their copy from PDC, those of us signed up via Microsoft Connect are under NDA… grrr. What I can say is that, if you’re not bothered about high-end graphics, then even Vista will run on a PC like this… and based on what’s already been said by Microsoft I wouldn’t expect 7 to be any worse and it may even be slightly better.

Trusting a self-signed certificate in Windows

All good SSL certificates should come from a well-known certification authority – right? Not necessarily (as Alun Jones explains in defence of the self-signed certificate).

I have a number of devices at home that I access over HTTPS and for which the certificates are not signed by Verisign, Thawte, or any of the other common providers. And, whilst I could get a free or inexpensive certificate for these devices, why bother when only I need to access them – and I do trust the self-signed cert!

A case in point is the administration page for my NetGear ReadyNAS – this post describes how I got around it with Internet Explorer (IE) but the principle is the same for any self-signed certificate.

First of all, I added the address to my trusted sites list. As the ReadyNAS FAQ describes, this is necessary on Windows Vista in order to present the option to install the certificate and the same applies on my Windows Server 2008 system. Adding the site to the trusted sites list won’t stop IE from blocking navigation though, telling me that:

There is a problem with this website’s security certificate.

The security certificate presented by this website was not issued by a trusted certificate authority.

Security certificate problems may indicate an attempt to fool you or intercept any data you send to the server.

We recommend that you close this webpage and do not continue to this website.

Fair enough – but I do trust this site, so I clicked the link to continue to the website regardless of Microsoft’s warning. So, IE gave me another security warning:

Security Warning

The current webpage is trying to open a site in your Trusted sites list. Do you want to allow this?

Current site: res://ieframe.dll
Trusted site: https://mydeviceurl

Thank you IE… but yes, that’s why I clicked the link (I know, we have to protect users from themselves sometimes… but the chances are that they won’t understand this second warning and will just click the yes button anyway). After clicking yes to acknowledge the warning (which was a conscious choice!) I could authenticate and access the website.

Two warnings every time I access a site is an inconvenience, so I viewed the certificate details and clicked the button to install the certificate (if the button is not visible, check the status bar to see that IE has recognised the site as from the Trusted Sites security zone). This will launch the Certificate Import Wizard but it’s not sufficient to select the defaults – the certificate must be placed in the Trusted Root Certification Authorities store, which will present another warning:

Security Warning

You are about to install a certificate from a certification authority (CA) claiming to represent:

mydeviceurl

Windows cannot validate that the certificate is actually from “certificateissuer”. You should confirm its origin by contacting “certificateissuer”. The following number will assist you in this process:

Thumbprint (sha1): thumbprint

Warning:

If you install this root certificate, Windows will automatically trust any certificate issued by this CA. Installing a certificate with an unconfirmed thumbprint is a security risk. If you click “Yes” you acknowledge this risk.

Do you want to install this certificate?

Yes please! After successfully importing the certificate and restarting my browser, I could go straight to the page I wanted with no warnings – just the expected authentication prompt.
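As an aside, the same import can be scripted from an elevated command prompt with certutil, assuming the certificate has first been exported to a file (device.cer is a hypothetical name):

```shell
rem Add the exported certificate to the Trusted Root Certification Authorities store
certutil -addstore Root device.cer

rem Confirm that it is now present in the store
certutil -verifystore Root mydeviceurl
```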

Incidentally, although I used Internet Explorer (version 8 beta) to work through this, once the certificate is in the store, any browser that uses the Windows certificate store should act in the same manner (some browsers, e.g. Firefox, implement their own certificate store and will not be affected). To test this, I fired up Google Chrome and it was able to access the site I had just trusted with no issue but, if I went to another, untrusted, address with a self-signed certificate (e.g. my wireless access point), Chrome told me that:

The site’s security certificate is not trusted!

You attempted to reach mydeviceurl but the server presented a certificate issued by an entity that is not trusted by your computer’s operating system. This may mean that the server has generated its own security credentials, which Google Chrome cannot rely on for identity information, or an attacker may be trying to intercept your communications. You should not proceed, especially if you have never seen this warning before for this site.

Chrome also has some excellent text at a link labelled “help me understand” which clearly explains the problem. Unfortunately, although Chrome exposes Windows certificate management (in the options, on the Under the Hood page, under Security), it doesn’t allow adding a site to the trusted sites zone (which is an IE concept) – and that means the option to install the certificate is not available in Chrome. I imagine it’s similar in Firefox or Opera (or Safari – although I’m not sure who would actually want to run Safari on Windows).

Before signing off, I’ll mention that problems may also occur if the certificate is signed with invalid details – for example the certificate on my wireless access point applies to another URL (www.netgear.com) and, as that’s not the address I use to access the device, that certificate will still be invalid. The only way around a problem like this is to install another, valid, certificate (self-signed or otherwise).

The Windows Blog

I don’t know about you, but I’m getting confused with all the Windows blogs coming out of Microsoft – last week I wrote about two new Windows 7 blogs (one for developers and another for IT pros) and those who are watching the Windows Vista Team blog may have noticed that it moved to a new site today.

The Windows Blog is a new site, hosted outside the normal TechNet/MSDN domain names and it features a combined feed from several Windows blogs, including the Windows Vista team blog, the Windows Experience blog and the new Windows 7 team blog.

Once the dust has settled (and after tomorrow’s PDC keynote), I expect to see lots of new content appearing on the new site.

Windows Vista (and Server 2008) SP2 beta announced

Next week’s Professional Developers Conference should see lots of news from Microsoft around Windows 7, Windows Server 2008 R2 and Microsoft’s cloud computing strategy but, for those who are looking for something a little sooner, the Windows Vista team’s announcement that a beta of service pack 2 is just around the corner will probably be of interest.

As seems to be the norm these days, the service pack will include new functionality (including Windows Search 4.0, native Blu-ray support and updated Bluetooth and Wi-Fi connectivity options) but, even though some of these features are client-focused, it is intended that a single service pack will apply to both client and server versions of Windows (quite how that works, only time will tell – the Windows Server team is focusing on including the RTM version of Hyper-V and power management improvements in SP2 – perhaps it will be a single service pack, but two different versions?).

No news yet as to an intended release date for the final service pack – Microsoft’s Mike Nash wrote:

“The final release date for Windows Vista SP2 will be based on quality. So we’ll track customer and partner feedback from the beta program before setting a final date for the release.”

Windows Vista SP2 beta will be available to a limited group of testers from 29 October.